SPARK Examiner

Release 5.0 - Release Note

EXM/RN

Issue: 1.0

Status: Definitive

27th June 2000

 

Originator:     Peter Amey

Approver:       Mel Jackson

Contents

1       Introduction

2     Principal Examiner changes

3     SPARK language changes

4     Enhanced proof support

4.1  Abstract proof

4.2     Other proof-related enhancements

5     HTML output

6     Other significant changes

6.1     Better support for common Ada file naming conventions

6.2     User-selectable annotation character

6.3  Improved suppression of analysis where results might otherwise be misleading

6.4     Static expression evaluation in proof contexts

6.5  Singleton enumeration types

6.6     Revised SPARK_IO package

7       Miscellaneous corrections and improvements

7.1  Common to all Platforms

7.2  Platform-specific

8     Backward incompatibilities

9     User manual change summary

9.1  Examiner user manual

9.2  Generation of verification conditions from SPARK programs

9.3  Generation of run-time checks from SPARK programs

9.4     POGS user manual

10   Limitations and known errors

10.1       Tool limitations

10.2  Known error summary

11   Change summary from Release 2.0

11.1  Release 2.0 - November 1995

11.2  Release 2.1 - July 1996

11.3  Release 2.5 - March 1997

11.4  Release 3.0 - September 1997

11.5  Release 4.0 - December 1998

12   Operating system compatibility

12.1  VAX/VMS

12.2  Solaris

12.3  Windows NT

1                   Introduction

The SPARK[1] Examiner has undergone considerable development and revision since Release 4.0.  The largest component of this work involved extending the Examiner’s proof capability, especially the proof of programs which contain “abstract” state; however, some general improvements to the Examiner have also resulted. 

This document describes changes in the behaviour of all variants of the SPARK Examiner Release 5.0 compared with Release 4.0.

Advice on the contents of this document may be obtained:

            by phone  +44 (0)1225 466991

            by FAX    +44 (0)1225 469006

            by Email  sparkinfo@praxis-cs.co.uk, pna@praxis-cs.co.uk

2                   Principal Examiner changes

The following features are new or significantly improved in Release 5.0:

·         Enhanced proof support:

§         facilities for proof of programs containing “abstract state”;

§         addition of quantified expressions;

§         proof rule generation for enumeration types;

§         identification of the kind and source of each VC;

§         suppression of trivially true VCs;

§         Proof Obligation Summariser tool

·         Optional HTML output files with hyperlinks that can be “browsed” interactively

·         Better support for common Ada file naming conventions

·         User-selectable annotation character

·         Improved suppression of analysis where results might otherwise be misleading

·         Static expression evaluation in proof contexts

·         Singleton enumeration types

·         Revised SPARK_IO package

·         Error numbering

3                   SPARK language changes

There are no SPARK language changes associated with this Examiner release.  The Examiner does now accept enumeration types consisting of a single literal; this has been legal SPARK since the publication of [1, 2] but has not previously been accepted by the Examiner.  See section 6.5.

4                   Enhanced proof support

4.1              Abstract proof

The formal verification capabilities of the Examiner have been extended to encompass proof of programs containing “abstract” state, i.e. data items in package bodies represented by an “own variable” that is not of a concrete Ada type.  This development removes a fundamental limitation on the kinds of program that can be proved.  In particular, it means that the kinds of encapsulation and hierarchical state decomposition advocated in [3] are no longer a barrier to program proof.

4.1.1        Outline

The new facilities are described in [4]; however, a brief outline is presented here.  Essentially the approach relies on being able to declare “abstract proof types” which can be used as the types of abstract own variables.  Once abstract own variables have a named type it becomes possible to write proof functions describing their behaviour.  Subprogram constraints are placed on both the subprogram specification (in abstract form) and body (in terms of the abstract own variable’s refinement constituents).   The Examiner generates a data refinement based on the stated refinement constituents of the own variable to bind the two sets of constraints together.  In addition to the normal VCs, two VCs are generated which serve to show that the abstract and refined constraints are consistent.

4.1.2        Example

Consider part of a simple abstract state machine providing a stack object.

package Stack

--# own State;

is

  procedure Push(X : in Integer);

  --# global in out State;

  --# derives State from State, X;

  --# pre  how do we express “not full” here

  --# post and what goes here

 

end Stack;

The difficulty with expressing the desired pre- and postconditions is that the Push procedure operates on own variable State which has no visible type.  With Release 5.0 the Examiner provides a mechanism for giving State a proof type which can be used to express the subprogram constraints.

package Stack

--# own State : StackType; -- own variable type announcement

is

  -- new syntax allowing declaration of an abstract proof type

  --# type StackType is abstract;

 

  -- the new proof type can be used to declare proof functions

  --# function NotFull(S : StackType) return Boolean;

  --# function Append (S : StackType;

  --#                    X : Integer) return StackType;

 

  procedure Push(X : in Integer);

  --# global in out State;

  --# derives State from State, X;

  -- the new proof functions can be used to express the subprogram constraints

  --# pre  NotFull(State); 

  --# post State = Append(State~, X);

 

end Stack;

The above information is sufficient for the stack package to be used and for proofs of those uses to be constructed.  We now need to ensure that the behaviour described by this abstract view is equivalent to the behaviour of the actual implementation.

The body of the stack package might be refined thus:

package body Stack

--# own State is Vector, Ptr; -- data refinement

is

  MaxStackSize : constant := 100;

  type Ptrs is range 0..MaxStackSize;

  subtype Indexes is Ptrs range 1..MaxStackSize;

  type Vectors is array(Indexes) of Integer;

 

  Ptr    : Ptrs    := 0;

  Vector : Vectors := Vectors'(Indexes => 0);

 

 


  procedure Push(X : in Integer)

  --# global in out Vector, Ptr;

  --# derives Ptr    from * &

  --#         Vector from *, Ptr, X;

  -- a second set of constraints is expressed in concrete terms

  --# pre  Ptr < MaxStackSize; 

  --# post Ptr = Ptr~ + 1 and

  --#      Vector = Vector~[Ptr => X];

  is

  begin

    Ptr := Ptr + 1;

    Vector(Ptr) := X;

  end Push;

 

end Stack;

The body of Push can easily be proved against the refined pre- and postconditions.  To show that abstract and refined constraints are consistent we need three things:

1.       A data refinement linking State with refinement constituents Vector and Ptr.

2.       A VC sufficient to show that “abstract precondition” -> “refined precondition”

3.       A VC sufficient to show that “refined postcondition” -> “abstract postcondition”

The data refinement takes the form of an FDL record declaration that is produced automatically from the SPARK own variable refinement clause.  The stack example produces the following FDL declaration:

type stacktype = record

       ptr : integer;

       vector : vectors;

     end;

type vectors = array [integer] of integer;

type ptrs = integer;

The precondition validity takes the form of a VC which uses the abstract and refined preconditions together with the data refinements thus:

H1:     notfull(state) .

H2:     vector = fld_vector(state) .

H3:     ptr = fld_ptr(state) .

         ->

C1:     ptr < maxstacksize .

This VC is readily proved given a suitable definition of proof function notfull.


The postcondition validity also takes the form of a VC.  This includes initial and final states of the data refinement combined with the abstract and refined postconditions thus:

H1:     vector~ = fld_vector(state~) .

H2:     vector = fld_vector(state) .

H3:     ptr~ = fld_ptr(state~) .

H4:     ptr = fld_ptr(state) .

H5:     ptr = ptr~ + 1 .

H6:     vector = update(vector~, [ptr], x) .

         ->

C1:     state = append(state~, x) .

Again proof depends on a suitable definition for proof function append.
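Such definitions are typically supplied as user proof rules.  The sketch below is illustrative only (the rule file name and the rule bodies are assumptions, not Examiner output); it defines notfull and append in terms of the stacktype data refinement, using the FDL field-access (fld_) and field-update (upf_) functions:

```
stack_rules(1): notfull(S) may_be_replaced_by fld_ptr(S) < maxstacksize.

stack_rules(2): append(S, X) may_be_replaced_by
     upf_vector(upf_ptr(S, fld_ptr(S) + 1),
                update(fld_vector(S), [fld_ptr(S) + 1], X)).
```

Given rules of this form, notfull and append can be unfolded during proof and the two refinement VCs shown above discharged by simple reasoning over ptr and vector.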

Proofs of uses of the stack package (which would of course be extended to allow Pop operations as well as Push!) use only the abstract constraints.  For example, we can readily prove operations such as (N.B. preconditions omitted for simplicity of illustration):

procedure DoNothing(X : in out Integer)

--# global in out Stack.State;     

--# derives Stack.State, X from Stack.State, X;

--# post Stack.State = Stack.State~ and X = X~;

is

begin

  Stack.Push(X);

  Stack.Pop(X);

end DoNothing;

and the slightly more useful:

procedure Swap(X, Y : in out Integer)

--# global in out Stack.State;     

--# derives Stack.State, X, Y from Stack.State, X, Y;

--# post Stack.State = Stack.State~ and

--#      X = Y~                     and

--#      Y = X~;

is

begin

  Stack.Push(X);

  Stack.Push(Y);

  Stack.Pop(X);

  Stack.Pop(Y);

end Swap;


4.1.3        SPARK Rules for Abstract Proof

New rules concerning proof involving abstract own variables are:

1.       An own variable cannot be announced as being of a proof type declared in another package; this is because the proof type is considered to be a record based on the SPARK own variable refinement which is not visible outside the package in which it occurs.  Thus --# own State : P.T; is illegal if P.T is an abstract proof type declared in package P.

2.       Only one own variable may be declared as being of each abstract proof type.  Thus
--# own State1, State2 : StateType; is illegal if StateType is an abstract proof type.  This is because the data refinement that is used creates a record from the refinement constituents of the own variable.  Since both State1 and State2 cannot be refined to the same constituents there would be no valid definition of the StateType record.

3.       If an own variable is to be implemented as a concrete Ada variable it must be type announced as being of that type e.g. --# own State : Integer;.  This rule is to prevent an own variable apparently changing from abstract to concrete simply depending on whether its package body happens to have been encountered.  Note that this rule does not prevent the use of a one-to-one refinement which is intended to hide the concrete type of an own variable. For example:

package Accumulator

--# own State : AccuType;

is

  --# type AccuType is abstract;

 

  ...

end Accumulator;

 

package body Accumulator

--# own State is X;

is

  X : Integer;

 

  ...

end Accumulator;

provides a satisfactory way of preventing the actual concrete type of State being seen outside the body of package Accumulator.

4.1.4        Abstract own variables which do not have a type announcement

From rule 3 of section 4.1.3 it is clear that the Examiner regards any own variable lacking a type announcement as being abstract.   Where no name is provided for the abstract proof type the Examiner creates an FDL declaration for it from the own variable name by appending __type to it.   All the features of abstract proof described above are unaffected by the absence of a user-provided name for the abstract proof type.  The absence of such a name does, however, fundamentally limit what can be achieved because it means that it is not possible to define proof functions to describe the behaviour of operations in abstract terms.
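For example (an illustrative sketch; the package and its contents are hypothetical), an own variable declared without a type announcement:

```ada
package Store
--# own State;                 -- no type announcement: State is abstract
is
  procedure Clear;
  --# global out State;
  --# derives State from ;
end Store;
```

Here the Examiner would use the FDL type name state__type for State; because this type has no user-declared name, proof functions over it, such as the NotFull and Append of section 4.1.2, cannot be declared.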

4.2              Other proof-related enhancements

4.2.1        Quantified expressions

It is often necessary to express constraints about array data structures, and quantified expressions are a convenient way of doing this.  The grammar of annotation expressions is extended to include quantified expressions:

quantified_expression :

     for all | for some identifier in type_mark [ range arange ] => ( predicate );

The syntax has been chosen to be as similar as possible to that of for loops and should thus be familiar to SPARK programmers.  The grammar term identifier is a user-chosen name for the quantified variable; this, as is also the case for for loop control variables, must not already be visible.  The quantified variable has a scope that extends only to the end of the quantified expression, so the same name may be re-used in later expressions.

For example given an array of type Table which may or may not contain some sought value, we can express the return condition of a function that reports whether the value is present thus:

function ThreePresent(A      : Table;

                      Sought : Integer) return Boolean;

--# return for some M in Index => ( A(M) = Sought );

A more complex example is a function which, given that Sought does appear somewhere in the table, returns the index of its first occurrence.  The precondition of the function ensures that it is only called if Sought is definitely present, and the postcondition uses the implicit form of function return annotation together with a quantifier to capture the property that the first occurrence of Sought is returned.

function FindThree(A      : Table;

                   Sought : Integer) return Index;

--# pre for some M in Index => ( A(M) = Sought );

--# return Z => (( A(Z) = Sought) and

--#    (for all M in Index range

--#        1 .. (Z - 1) => (A(M) /= Sought)));

4.2.2        Proof rule generation for enumeration types

When carrying out proofs involving enumeration types it is often useful to have rules relating, for example, an enumeration literal with its position number.  Where such rules are generated manually there is a risk that they will get out of step with the enumeration type they represent.  Release 5.0 of the Examiner generates a useful set of enumeration type rules automatically.  For example, use of the enumeration type:

type Colour is (Red, Amber, Green);

generates the following rules:

 

...

nextcolour_rules(3): colour__first may_be_replaced_by red.

nextcolour_rules(4): colour__last may_be_replaced_by green.

nextcolour_rules(5): colour__base__first may_be_replaced_by red.

nextcolour_rules(6): colour__base__last may_be_replaced_by green.

nextcolour_rules(7): colour__pos(colour__first) may_be_replaced_by 0.

nextcolour_rules(8): colour__pos(red) may_be_replaced_by 0.

nextcolour_rules(9): colour__val(0) may_be_replaced_by red.

nextcolour_rules(10): colour__pos(amber) may_be_replaced_by 1.

nextcolour_rules(11): colour__val(1) may_be_replaced_by amber.

nextcolour_rules(12): colour__pos(green) may_be_replaced_by 2.

nextcolour_rules(13): colour__val(2) may_be_replaced_by green.

nextcolour_rules(14): colour__pos(colour__last) may_be_replaced_by 2.

nextcolour_rules(15): colour__pos(succ(X)) may_be_replaced_by

     colour__pos(X) + 1

     if [X <= green, X <> green].

nextcolour_rules(16): colour__pos(pred(X)) may_be_replaced_by

     colour__pos(X) - 1

     if [X >= red, X <> red].

nextcolour_rules(17): colour__pos(X) >= 0 may_be_deduced_from

     [red <= X, X <= green].

nextcolour_rules(18): colour__pos(X) <= 2 may_be_deduced_from

     [red <= X, X <= green].

nextcolour_rules(19): colour__val(X) >= red may_be_deduced_from

     [0 <= X, X <= 2].

nextcolour_rules(20): colour__val(X) <= green may_be_deduced_from

     [0 <= X, X <= 2].

nextcolour_rules(21): succ(colour__val(X)) may_be_replaced_by

     colour__val(X+1)

     if [0 <= X, X < 2].

nextcolour_rules(22): pred(colour__val(X)) may_be_replaced_by

     colour__val(X-1)

     if [0 < X, X <= 2].

nextcolour_rules(23): colour__pos(colour__val(X)) may_be_replaced_by X

     if [0 <= X, X <= 2].

nextcolour_rules(24): colour__val(colour__pos(X)) may_be_replaced_by X

     if [red <= X, X <= green].

nextcolour_rules(25): colour__pos(X) <= colour__pos(Y) & X <= Y are_interchangeable

     if [red <= X, X <= green].

nextcolour_rules(26): colour__val(X) <= colour__val(Y) & X <= Y are_interchangeable

     if [0 <= X, X <= 2].

4.2.3        Identification of the kind and source of each VC

Labelling of VCs in .VCG files has been improved by identifying the source of each VC.  Each VC is labelled to indicate whether it is:

§         a user-defined check statement;

§         a user-defined assert statement (e.g. a loop invariant);

§         a run-time check (e.g. a range check);

§         a precondition check (generated by a subprogram call);

§         a default loop invariant inserted by the run-time checker; or

§         a postcondition (finish) check.

4.2.4        Suppression of trivially true VCs

When doing proof of absence of run time errors, we often get VCs of the form:

a_very_long_list_of_hypotheses

->

true .

Although the SPADE Simplifier readily discharges the proof obligation, it has to read and parse the long list of hypotheses before finding they are not needed; this wastes a significant amount of time.  The Examiner now identifies such trivially true VCs and replaces the whole VC with:

*** true .   /* trivially true VC removed by Examiner */

This format is recognised by the Simplifier which does not process the VC further.

4.2.5        Proof Obligation Summariser tool

The new POGS tool is intended to assist with management of the proof process by constructing summaries of which VCs have been generated and which discharged.  Operation of POGS is described in a separate user manual [5].

5                   HTML output

The Examiner continues to produce simple text files of analysis results such as the report file and listing files.  These are easy to control and manage and provide auditable evidence of the analysis that has been conducted.

When using the Examiner interactively, during program development, it is convenient to have some means of quickly navigating the analysis output and identifying the source of errors.  Release 5.0 meets this need by providing an option to produce browsable HTML output files in addition to the normal text file output.  HTML output is selected by a new command line option (-html or /HTML) and the output files can be browsed using any frame-capable, HTML 4.0 compliant browser.  The output is platform-independent so, for example, HTML files can be generated on a VAX but browsed on a PC.

Browsing is centred on an HTML version of the report file (SPARK.REP) which contains links to the analysis performed and associated source, index and listing files.  Explanations for error messages are also available as hyperlinks.

A full description of the generation of, and navigation through, HTML output files is contained in [6].

6                   Other significant changes

6.1              Better support for common Ada file naming conventions

6.1.1        GNAT convention

This common convention uses the file extension .ads for package specifications and .adb for package bodies and subunits.  The Examiner, by default, generates listing file names by replacing the source file extension with .lst.  Unfortunately this means that the Examiner command line:

(Sun)                spark p.ads p.adb  

(VAX/NT)             spark p.ads, p.adb

results in the listing file for package specification p being overwritten by the listing from the body of p.  Although the problem can be avoided by using the Examiner’s listing_file argument option (see example below), this is somewhat cumbersome and inconvenient.

(Sun)         spark p.ads -listing=p_spec.lst p.adb -listing=p_body.lst

(VAX/NT) spark p.ads/listing=p_spec.lst, p.adb/listing=p_body.lst

Release 5.0 of the Examiner provides an alternative solution to this difficulty.  The existing listing_extension command line option is extended to allow “wild card” characters in the extension provided.  The underscore character preserves a character from the source file extension, which can be used to disambiguate the listing file name.  For example:

(Sun)          spark -listing_extension=ls_    p.ads  p.adb

(VAX/NT) spark /listing_extension=ls_    p.ads, p.adb

would generate listing files p.lss and p.lsb respectively.  The command line option can conveniently be placed in the default switch file; see [6].

The Examiner will not generate a listing file that is identical to the source file but there are no other restrictions on the names that can be generated using this new feature.

6.1.2        Apex convention

The Rational Apex compiler uses a filename convention in which file names contain multiple dot characters.  Source files for package specifications take the form p.1.ada and for package bodies p.2.ada.  Because of a decision made when only the VAX was supported, the Examiner cannot handle file names containing multiple dots cleanly.  Nor is it possible to correct this in a way which is completely neutral for existing users; this is because it is not possible to distinguish between p.1 where the intended file extension is 1 and p.1 where the intention was to use the default source extension to give a file name p.1.ada.

With Release 5.0 we have ensured that the Examiner handles this file name convention identically on Solaris and Windows NT (the platforms where Apex may be in use).  The User Manual [6] gives advice on how these names should be handled.

6.1.3        Spaces in file names

Windows NT permits spaces in file names.  Release 5.0 will accept such file names provided they are quoted.  For example:

spark "c:\program files\my project\h \the main program.ada"

6.2              User-selectable annotation character

The comment convention used to introduce SPARK annotations occasionally clashes with other tools.  For example, Rational Rose uses --# for its own purposes.  A new command line option, annotation_character, allows a convenient selection of a character other than “#” to introduce annotations.  The new option takes a single character argument and this character becomes the annotation start character.  For example:

spark -annotation_character=$ ...

would cause annotations to begin --$ rather than the default --#.

On the Sun, the chosen character may need to be “escaped” if it is significant to the shell in use.  For example: -anno=\*.
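As an illustrative sketch (the package shown is hypothetical), a unit analysed with -annotation_character=$ would be written:

```ada
package Counter
--$ own Count : Integer;       -- annotations now introduced by --$
is
  procedure Inc;
  --$ global in out Count;
  --$ derives Count from Count;
end Counter;
```

Ordinary Ada comments beginning -- are, of course, unaffected by this option.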

6.3              Improved suppression of analysis where results might otherwise be misleading

The Examiner assumes the correctness of subprogram annotations when analysing a procedure call statement (the veracity of the annotation is, of course, confirmed when the body of the called procedure is itself analysed).  If a remote subprogram has static semantic errors in its interface the flow analysis of the caller may be incorrect.  Although the Examiner will always reveal errors of this kind in its report file, they are not visible at the point of call and it is possible that they might be missed.  Release 5.0 of the Examiner adds a new warning at the point of call where the interface (parameter list and annotations) of the called subprogram is malformed.  The caller will not be analysed if it depends on a malformed interface.

For example, given the (incorrect) subprogram:

procedure Inc(X : in out Integer)

--# derives X from Y;

is

begin

  X := X + 1;

end Inc;

and the call to it:

procedure SetX(X : out Integer)

--# derives X from ;

is

begin

  Inc(X);

end SetX;

Prior to Release 5.0 analysis of SetX would show no errors; this is because the well-formed portion of the signature of Inc happens to be exactly what is required by SetX.  A check of the overall analysis presented in the report file would reveal the errors in Inc:

9    procedure Inc(X : in out Integer)

10   --# derives X from Y;

                        ^1,2

*** (  1)  Semantic Error    :  1: The identifier Y is

           either undeclared or not visible at this

           point.

*** (  2)  Semantic Error    :504: Parameter X is of

       mode in out and must appear as an import.

 

11   is

12   begin

13     X := X + 1;

14   end Inc;

15 

but there is no indication within the listing of SetX that there is a problem.

With Release 5.0 a suitable warning is given at the point of call, thus:

17    procedure SetX(X : out Integer)

18    --# derives X from ;

19    is

20    begin

21      Inc(X);

        ^3

  --- (  3)  Warning       :399: The called subprogram

         has semantic errors in its interface

         (parameters and/or annotations) which

         prevent flow analysis of this call.

 

22    end SetX;

Furthermore, flow analysis of SetX is not carried out. 

6.4              Static expression evaluation in proof contexts

The Examiner now evaluates static expressions in proof contexts (e.g. postconditions) and will report expressions that are meaningless because they are out of type range.  For example:

9    procedure K(X : out Index)

10    --# derives X from ;

11    --# post X = Index'Succ(Index'Last);

                         ^2

*** (  2)  Semantic Error    :399: Range error in annotation

           expression.

6.5              Singleton enumeration types

The SPARK language definition permits enumeration types comprising a single literal.  This construct is now accepted by Release 5.0 of the Examiner.  The intended use is in incremental development.  For example:

type SystemStates is (Normal); -- add others later

 

...

 

case S is

  when Normal => OperateNormally;

  -- add other cases later

end case;

6.6              Revised SPARK_IO package

The package SPARK_IO, which is supplied with the Examiner, provides a legal SPARK interface to the standard package Text_IO.  Since the publication of The INFORMED Design Method [3] we have realised that SPARK_IO does not meet our own recommended form for annotating “boundary packages”.  Release 5.0 includes a revised SPARK_IO which is functionally identical to that supplied before but annotated in accordance with the principles of [3].  The essential change is that the package now has three own variables rather than one; this maintains separation between input streams, output streams and the essential state of SPARK_IO.  As a result, annotations of programs that use SPARK_IO should become much clearer, although they may also be a little longer.  The old version of SPARK_IO is also supplied with Release 5.0 for reasons of backward compatibility.
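The general shape of the revised annotation is sketched below; the own variable names are illustrative only (consult the supplied package for the actual names):

```ada
package SPARK_IO
--# own Inputs,        -- state of all input streams
--#     Outputs,       -- state of all output streams
--#     State;         -- internal state of SPARK_IO itself
is
  -- subprogram declarations as before, with each global annotation
  -- naming only the own variable(s) the operation actually affects
end SPARK_IO;
```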

7                   Miscellaneous corrections and improvements

7.1              Common to all Platforms

7.1.1        Proof related

§         The Examiner can now generate a VC model for Boolean operations between one-dimensional arrays of Booleans.  Such operations are modelled by creating a proof function with a name created from the array type and the operator being used.  Suitable proof rules are generated.  Given X and Y of array type T, the expression X and Y would be modelled as t__and(x, y), where t__and is an automatically-generated proof function with the FDL signature function t__and(t, t) : t;

§         When generating run-time checks the Examiner provides a default loop invariant if one is not provided by the user.  Release 5.0 strengthens this default invariant for for loops by including an assertion that the loop counter is in its type.

§         The VC generator no longer splits long identifiers across lines when word wrapping.

§         VCs of array attributes involving arrays of records no longer cause the VCG to crash.

§         The Examiner now produces a VC model of case choice with a range constraint such as:  when Index range 1..10 =>

§         An extra run-time check is generated for each for loop to show that the bounds of the loop are consistent with the type of the loop variable.

§         The VCG now handles membership tests involving structured variables correctly.

§         The expression not Boolean'Last no longer causes the VCG to malfunction.

§         The VCG now correctly handles array attributes with arguments.

§         The VCG now correctly handles attributes of array objects as well as of array types.
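The first of these items can be illustrated with a short sketch (the type and object names are hypothetical):

```ada
--  assuming: type Flags is array (Index) of Boolean;
--            X, Y, Z : Flags;
Z := X and Y;
--  modelled in the VCs as:  z = flags__and(x, y)
--  where flags__and has the FDL signature
--    function flags__and(flags, flags) : flags;
--  together with automatically generated proof rules for it
```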

7.1.2        Other

§         A spurious warning is no longer generated concerning the validity of global annotation refinement when only data flow analysis is being carried out.

§         All error messages generated by the Examiner now include a number as well as a text description.  Error message explanations in the user manual are arranged in numerical order to simplify looking them up.

§         The Examiner no longer malfunctions if the number of parameters on a subprogram body exceeds the number on its specification by more than 2.

§         If an embedded package contains only subprograms containing a pragma interface or pragma import then the Examiner no longer requires a package body to be provided.

7.2              Platform-specific

None.

8                   Backward incompatibilities

As usual we have tried hard to minimise any backward incompatibilities with earlier tool versions.  There are three changes in behaviour that might affect existing programs; however, these will only occur when proof work is being carried out.

1.       Mandatory type announcement of own variables which are implemented as concrete Ada variables (see Section 4.1.3).  When generating VCs from existing programs that contain own variables implemented using concrete Ada types, it will be necessary to add a suitable type announcement to each.  This is a straightforward text edit of the own variable annotation and has no other knock-on effects.
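For instance (a sketch; the names are illustrative), an existing annotation would be edited as follows:

```ada
--  Release 4.0 (own variable implemented as a concrete Integer):
--# own Count;
--  Release 5.0 (type announcement now mandatory):
--# own Count : Integer;
```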

2.       New SPARK 83 reserved identifier “abstract”.  The facility to declare named abstract proof types (see Section 4.1.1) makes use of the Ada 95 reserved word “abstract”; however, abstract is not a reserved word of Ada 83.  To allow its use for the declaration of proof types in SPARK 83, with minimum impact on existing users, abstract has been made a reserved FDL identifier in SPARK 83.  This means that abstract is not available in SPARK 83 as a user-defined identifier; however, it can be made available, if proof is not being carried out, by use of the nofdl_identifiers command line option.  The status of abstract in SPARK 83 is thus exactly the same as that of “some”, which was introduced to allow quantifiers of the form “for some...”.  SPARK 83 users who are conducting proof and have used abstract as a user-defined identifier will have to substitute a new identifier.  It is hoped that this affects only a relatively small subset of SPARK users, for whom the new abstract proof facilities will prove adequate compensation for the effort.

3.       Proof annotations must be placed with the first appearance of the subprogram with which they are associated.  Earlier releases of the Examiner were very relaxed about where proof annotations could be placed.  With the introduction of the new abstract proof facilities the rules for SPARK proof contexts are essentially the same as those for the core global and derives annotations: the pre- and postcondition for a subprogram should be placed with the first occurrence of the declaration of the subprogram.  If the subprogram body manipulates refinement constituents of an abstract own variable then a second set of constraints may be placed on its body.  Existing programs that have subprogram constraints on the body but not on the exported specification of the subprogram will need to be corrected.

9                   User manual change summary

9.1              Examiner user manual

The principal changes are:

§         Section 3 (excluding VAX) contains a note on Rational Apex file names.

§         Section 3 (NT only) contains a note on spaces in file names.

§         Section 3.1 has been divided to show the command line options in functional groups.  Explanations of the new command line options are included.

§         Section 4.6 has a revised explanation of the file names generated by the VCG.

§         Section 4.7 is new and explains the generation and browsing of HTML output.

§         Sections 6.2, 6.3, 6.4 and 8.3 now present the error messages in numerical rather than alphabetical order.

§         Section 8.2 is new and provides an explanation of the data flow analysis option for SPARK 95.

9.2              Generation of verification conditions from SPARK programs

The principal changes are:

§         Section 1 includes a structural overview of the document to assist in navigating it.

§         A new section 2.7 provides an introduction to refinement proofs.

§         A new section 3.2.3 describes quantified expressions.

§         A new section 3.3.2.2 describes declaration of abstract proof types and 3.3.3.2 is extended to show how own variables may be “announced” as being of those proof types.

§         Section 3.4 is extended to show when a subprogram may have a second, refined, set of constraints.

§         Section 3.7 is revised to include the new forms of quantified expression included in Release 5.0.

§         Section 6.4.6 is extended to describe Boolean operations between Boolean array types.

§         A new section 6.6 describes refinement proofs.

§         A new section 6.7 describes the Examiner’s removal of trivially true VCs.

§         A new section 6.8 describes the revised labelling of VCs designed to simplify tracing them to their source.

§         A new section 8.6 provides an example of quantification.

§         A new section 8.7 provides an example of abstract/refined proof.

9.3              Generation of run-time checks from SPARK programs

§         Section 4.4.1 now lists the flow analysis error messages that might invalidate run-time check proofs.

§         Section 4.5.2 describes the addition to the default loop invariant provided for for loops.

9.4              POGS user manual

§         This manual is completely new.

10              Limitations and known errors

10.1          Tool limitations

10.1.1    General

      1          Array aggregate and case statement completeness checking: in some rare cases involving very large type ranges in the absence of an others clause, a warning may be produced that the Examiner cannot check the completeness of the aggregate or case statement.

      2          Target data files can be used only to set values for Integer'Last and Integer'First.  The values supplied can only be literals.  It is intended to extend the range of machine-specific values that can be supplied and to allow the use of static expressions rather than literals.

      3          The flow analyser does not take account of for loops which must be entered because they are looping over a type mark which, in SPARK, cannot be empty.  The flow analysis is identical to the case where the loop range might be empty.
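For example (the type and names below are invented for this sketch), the loop must execute at least once because, in SPARK, the type mark Index cannot denote an empty range; the flow analyser may nevertheless report that the export Max is possibly undefined:

```ada
type Index is range 1 .. 10;

procedure Last_Value (Max : out Index)
--# derives Max from ;
is
begin
   for I in Index loop   -- Index cannot be empty, so the body must
      Max := I;          -- execute; the flow analyser nevertheless
   end loop;             -- treats the loop as possibly not entered
end Last_Value;
```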

      4          The Examiner uses a simplification of the Ada type system designed to avoid the need for overload resolution.  As a result it may occasionally fail to detect illegally typed expressions made up of literal values; such expressions will be rejected by an Ada compiler.

      5          The Examiner does not permit the declaration of subprograms in the private part of a package; this limitation will be reviewed in a future Examiner release since the introduction of child packages may make such subprograms useful.

      6          The SPARK 95 language definition removes the distinction between initial and later declarative items; this distinction remains in force in the Examiner which requires SPARK 83 declaration orders even in SPARK 95 mode.

      7          The Examiner does not yet permit the use of 8-bit characters in SPARK 95 user-defined identifiers.

10.1.2    Verification Condition Generation and Run-time Checks

      1          Ada string inequality is not modelled.

      2          Array type conversions are not modelled.

      3          VCs involving string catenation which includes the character ASCII.NUL will be incorrect.

      4          Aggregates of multi-dimensional arrays cannot be modelled although aggregates of arrays of arrays can.
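The distinction can be illustrated as follows (all declarations invented for this sketch):

```ada
type Row  is array (1 .. 2) of Integer;
type Grid is array (1 .. 2) of Row;               -- array of arrays

G : constant Grid := Grid'(Row'(1, 2),            -- this aggregate
                           Row'(3, 4));           -- can be modelled

type Matrix is array (1 .. 2, 1 .. 2) of Integer; -- multi-dimensional

M : constant Matrix := Matrix'((1, 2),            -- this aggregate
                               (3, 4));           -- cannot be modelled
```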

      5          Verification conditions involving real numbers are evaluated using infinite precision or perfect arithmetic; this allows the correctness of an algorithm to be shown but cannot guard against the effects of cumulative rounding errors for example.

      6          The Examiner does not generate VCs for package initialization parts.  Statically-determinable constraint errors will be detected during well-formation checking of package initialization.

      7          Run-time checks for real numbers are not available by default.  Where they are made available, one specific run-time check is currently omitted: the expression X ** Y will raise a constraint error if X is a real number of value 0.0 and Y is negative, and the Examiner does not create a run-time check to show the absence of this case.
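The failing case can be shown as follows (the subprogram and parameter names are invented); no VC is currently generated to exclude it:

```ada
procedure Raise_To (X : in Float; Y : in Integer; Z : out Float)
--# derives Z from X, Y;
is
begin
   Z := X ** Y;   -- with X = 0.0 and Y negative this raises
end Raise_To;     -- Constraint_Error, but no run-time check VC
                  -- is produced for the case
```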

      8          Recursion in meta files is not detected.  If a meta file contains recursive entries the Examiner will loop until it reaches an internal static limit.

      9          Recursion in superindex entries is not detected.

10.2          Known error summary

      1          The Examiner permits unnecessary repeated prefixes in annotation expressions.  For example, P.P.P.X where P.X was intended.

      2          The SPARK rule that array actual parameters must have the same bounds as the formal parameter is not checked for function parameters where the actual parameter is a subtype of an unconstrained array type.  Since subtype bounds are static in SPARK, errors of this kind should be detected by an Ada compiler; if not, an unconditional run-time error will occur.

      3          The Examiner permits the body of a subprogram to be made up entirely of proof statements, thus breaching the Ada rule that at least one Ada statement must be present.

      4          The Examiner does not check the Ada rule which requires record field names to be distinct.

      5          Where a package declares two or more private types the Examiner permits mutual recursion between their definitions in the private part of the package.
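For example (declarations invented for this sketch), the following private part is wrongly accepted, although an Ada compiler would reject the mutually recursive definitions:

```ada
package P is
   type T1 is private;
   type T2 is private;
private
   type T1 is record F : T2; end record;   -- wrongly permitted by the
   type T2 is record G : T1; end record;   -- Examiner: mutual recursion
end P;
```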

      6          The Examiner does not “freeze” for loop bounds properly when generating VCs; for example, given for I in Integer range 1 .. F(X) loop where the value of F(X) changes in the body of the loop.  To avoid this problem, assign the value of F(X) to a suitable variable and use this as the loop bound.
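The suggested workaround looks like this (F, X, Process and Bound are invented names for the sketch):

```ada
Bound := F (X);                         -- value of F (X) frozen here
for I in Integer range 1 .. Bound loop
   Process (I);                         -- may change the value that
end loop;                               -- F (X) would yield, but the
                                        -- loop bound is unaffected
```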

      7          The Examiner does not take due account of a range constraint when determining the subtype of a loop variable; this affects completeness checking of case statements within the loop.  For example, for I in Integer range 1..4 loop would require only the values 1, 2, 3 and 4 to be covered by the case statement.
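On the reading of this item taken here, a case statement such as the following should need to cover only the values 1 to 4, but because the range constraint is not taken into account a when others alternative may be demanded unnecessarily (the called procedure names are invented):

```ada
for I in Integer range 1 .. 4 loop
   case I is
      when 1     => Do_One;
      when 2     => Do_Two;
      when 3 | 4 => Do_Rest;
      -- should be complete without a "when others" alternative
   end case;
end loop;
```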

      8          A for loop over type Boolean with a specific range produces an invalid VCG model, for example for I in Boolean range False..True loop.  A semantically identical loop of the form for I in Boolean loop is modelled correctly.
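Such a loop can simply be rewritten without the explicit range (Handle is an invented name for this sketch):

```ada
for B in Boolean loop   -- modelled correctly; semantically identical
   Handle (B);          -- to "for B in Boolean range False .. True"
end loop;
```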

      9          The Examiner wrongly demands that a private type that has limited private components must also be limited.

   10          When summarising the counts of pragmas found during an analysis, the totals may depend on whether units are selected via the command line (or a metafile) or via the index mechanism.  The difference affects only pragmas placed between program units: placing a file name on the command line causes the entire file to be analysed, whereas selecting it using indexes causes only the required unit to be read from the file.

   11          A string literal still requires type qualification when it is an actual parameter and the formal parameter is a constrained string subtype.
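For example (declarations invented for this sketch), the second call should be legal SPARK but is still rejected unless the qualification is supplied:

```ada
subtype Name is String (1 .. 5);

procedure Show (N : in Name);

--  at a call site:
Show (Name'("Hello"));   -- accepted
Show ("Hello");          -- wrongly requires type qualification
```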

   12          Error messages may appear out of position (at the top of the listing file) if a package elaboration part is hidden and there are completeness errors (e.g. a missing subprogram body) to report.

11              Change summary from Release 2.0

Each Examiner Release is accompanied by a release note detailing changes from the previous version; this section simply summarises the various changes that have been made. 

11.1          Release 2.0 - November 1995

Release 2.0 added:

·         static expression evaluation;

·         variable initialization at declaration;

·         full-range scalar subtypes; and

·         operator renaming in package specifications.

11.2          Release 2.1 - July 1996

Release 2.1 added:

·         facilities for proof of absence of run-time errors.

11.3          Release 2.5 - March 1997

Release 2.5 was distributed with “High Integrity Ada - the SPARK Approach” and provided initial facilities for SPARK 95.

11.4          Release 3.0 - September 1997

Windows NT was supported for the first time with this release.  Release 3.0 also added:

·         additional SPARK 95 support;

·         flow analysis of record fields;

·         command line meta files;

·         named numbers;

·         unqualified string literals;

·         moded global annotations; and

·         optional information flow analysis.

11.5          Release 4.0 - December 1998

With Release 4.0 we upgraded all users to a single product standard supporting SPARK 83, SPARK 95 and analysis options up to and including proof facilities.  New features were:

·         full implementation of public and private child packages;

·         default switch file; and

·         provision of the INFORMED design document.

12              Operating system compatibility

12.1          VAX/VMS

Release 5.0 of the Examiner should operate with any version of VAX/VMS later than 5.5-2.  Previous Examiners, up to and including Release 4.0, have been operated successfully on VMS versions up to and including 7.0.

12.2          Solaris

With Release 5.0 of the Examiner we no longer supply a SunOS variant.  The Examiner is compatible with Solaris 5.4 through to 5.7 including 5.7 with 64-bit kernel.

12.3          Windows NT

Release 5.0 of the Examiner is compatible with Windows NT 4.0.  The executables are known also to work on Windows 95, Windows 98 and Windows 2000; however, the FlexLM license management server software must be run on Windows NT.  Use of the Examiner on Windows 95, 98 and 2000 is unsupported.


Document Control and References

Praxis Critical Systems, 20 Manvers Street, Bath BA1 1PX, UK.

Copyright Praxis Critical Systems Ltd 1998.  All rights reserved.

Changes history

Issue 0.1        (15/6/00):  First draft

Issue 1.0  (27/6/00):   After inspection by Rod Chapman

Changes forecast

Nil

File under

S.P0468.73.72

Document references

    1            SPARK - The SPADE Ada Kernel - Edition 3.3, August 1997

    2            SPARK 95 - The SPADE Ada Kernel - Edition 1.0, October 1999

    3            INFORMED Design Method for SPARK - Issue 1.0

    4            Generation of Verification Conditions for SPARK Programs

    5            POGS User Manual

    6            SPARK Examiner - User Manual - Release 5.0



[1] Note:  The SPARK programming language is not sponsored by or affiliated with SPARC International Inc. and is not based on SPARC™ architecture.