Provided by: spark_2012.0.deb-11build1_amd64
NAME
       spark - examines SPARK programs and generates verification conditions
SYNOPSIS
       spark [OPTIONS] [ FILE_LIST or @METAFILE ]
DESCRIPTION
       The Examiner for SPARK, spark, analyses the given source files for
       errors and violations of the SPARK subset and (optionally) generates
       the verification conditions and dead path conjectures necessary for
       proof of absence of run-time exceptions and of partial correctness.
       This manual page only summarises the spark command-line flags; please
       refer to the full Examiner manual for further information.
OPTIONS
       These options do not quite follow the usual GNU command-line syntax:
       all options start with a single dash instead of the usual two, and
       they can be abbreviated as long as the abbreviation is unique. For
       example, -help can be abbreviated to -he but not to -h, as that
       clashes with -html.

       -source_extension=file_type
              Specifies the source file extension (default 'ada').

       -noindex_file, -index_file=file_spec
              Specifies the index file. By default no index file is used.

       -nowarning_file, -warning_file=file_spec
              Specifies the warning control file. By default no warning
              control file is used.

       -noconfig_file, -config_file=file_spec
              Specifies the Examiner configuration file. By default no
              configuration file is used.

       -noswitch
              Ignore the spark.sw file, if it exists.

       -nolistings, -listing_extension=file_type
              By default all listing files have the 'lst' extension. These
              options can be used to disable listing file generation or to
              change the default extension.

       -noreport_file, -report_file=file_spec
              By default the report will be named 'SPARK.REP'. These options
              can be used to change the default name or to disable report
              generation.

       -html, -html=dir_spec
              Generate HTML listings and report file.

       -output_directory=dir_spec
              Generate report, listing and proof files in the specified
              directory.

       -plain_output
              No dates, line or error numbers will appear in the output
              files.

       -language=L
              This can be one of 83, 95 (the default) or 2005.

       -profile=P
              Choose between the sequential (the default) or ravenscar
              language profile.

       -noduration
              Do not predefine Standard.Duration.

       -syntax_check
              Perform a syntax check only; no semantic checks.

       -flow_analysis=TYPE
              Choose between information or data flow analysis.

       -policy=TYPE
              Select the security or safety policy for flow analysis.

       -vcg   Generate VCs.

       -dpc   Generate DPCs.

       -rules=CHOICE
              Select the policy for generation of composite constant proof
              rules. Can be one of none (the default), lazy, keen or all.

       -annotation_character=CHAR
              Select an alternative annotation character. The default is
              '#'.
       -noecho
              Suppress screen output.

       -nosli Do not generate SLI files.

       -sparklib
              Use the standard SPARK library.

       -nostatistics, -statistics
              Append Examiner table usage statistics to the report file. By
              default this is not done.

       -fdl_identifiers=OPTION
              Control the treatment of FDL identifiers found in SPARK
              source. Can be one of 'reject' (the default), 'accept' or
              <string>.

       -version
              Print the Examiner banner and statistics and then exit.

       -help  Print a summary of the command line and its options.

       -original_flow_errors
              Print information flow errors in the original, less compact
              format.

       -error_explanations=SETTING
              Print explanations after error messages. SETTING can be off
              (the default), first_occurrence or every_occurrence.

       -justification_option=TYPE
              Select the policy for justification of errors. Values can be
              full (the default), brief or ignore.

       -casing, -casing=CHOICE
              Check casing of identifier references and casing of package
              Standard identifiers. It is also possible to specify i or s to
              check only one of these.
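EXAMPLES
       A few representative invocations, sketched using only the flags
       documented above; the file names (main.adb, spark.idx, gnat.cfg, out)
       are placeholders, not files shipped with the tool.

       Syntax check only, with no semantic analysis:

              spark -syntax_check main.adb

       Information flow analysis using an index file and a configuration
       file:

              spark -flow_analysis=information -index_file=spark.idx \
                    -config_file=gnat.cfg main.adb

       Generate verification conditions, with HTML output written to a
       separate directory:

              spark -vcg -html -output_directory=out main.adb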
SEE ALSO
       sparksimp(1), spadesimp(1), zombiescope(1), victor(1), pogs(1),
       sparkformat(1), sparkmake(1)
AUTHOR
       This manual page was written by Florian Schanda <email@example.com>
       for the Debian GNU/Linux system (but may be used by others).
       Permission is granted to copy, distribute and/or modify this document
       under the terms of the GNU Free Documentation License, Version 1.3 or
       any later version published by the Free Software Foundation; with no
       Invariant Sections, no Front-Cover Texts and no Back-Cover Texts.

                                22 March 2011                       spark(1)