Model-Driven Software Verification
1. Model-Driven Software Verification (MDSV)
Gerard J. Holzmann & Rajeev Joshi
JPL Laboratory for Reliable Software, Caltech
(Formal Methods for Developing Reliable Software Systems)
Master in Software Engineering & Artificial Intelligence
Computer Science Department
University of Malaga
Juan Antonio Martin Checa
2011
2. “Learn from yesterday, live for today, hope for tomorrow. The important thing is not to stop questioning.”
- Albert Einstein
4. Index of contents
1. Introduction
2. Model Checking with Embedded C Code
3. Two Sample Applications
4. Related Work
5. Conclusions
Model-Driven Software Verification J.A.Martin Checa - CIS Dep, University of Malaga 4
5. 1. Introduction
Software Engineering
Software Verification (SV): does the software satisfy its requirements?
- Dynamic verification (testing, experimentation): checks the software's behaviour during execution
Test in the small: a single function/class
Test in the large: a group of classes (module, integration, system)
Acceptance testing: functional, non-functional
- Static verification (analysis): checks that the software meets its requirements by inspection, without executing it
Code conventions verification
Bad-practice (anti-pattern) detection
Software metrics calculation
Formal verification (e.g. MODEL CHECKING)
6. 1. Introduction
Model Checking: classical approach
System model:
high-level / manually generated
model abstraction ↓ complexity
same language as the model checker
requires knowledge of both the model checker and the application:
↓ knowledge of the model checker limits the scope of the verification
↓ knowledge of the application undermines the validity of the verification
8. 1. Introduction
Model Checking: Model-Driven Software Verification approach
Model extraction:
automatic system model (SM) generation
SM = ‘black box’ (no knowledge of its internals needed)
model abstraction ↓ complexity
Test harness:
same language as the model checker
drives the application through all relevant states
mapping table (AM): source statements & data manipulations relevant to the test
test driver (TD): input / output
requirements (SP): properties to check
12. 1. Introduction
Goal:
“A powerful extension of the SPIN model checker that
allows the user to directly define data abstractions
in the logic verification of application level
programs.”
“A new verification method that [...] can avoid the
need to manually construct a verification model,
while still retaining the capability to define
powerful abstractions that can be used to reduce
verification complexity.”
13. 2. Model Checking with Embedded C Code
2.1. Introduction
2.2. Tracking Without Matching
2.3. Validity of Abstractions
2.4. Sufficient Conditions for Soundness
14. 2.1. Introduction
SPIN 4.0+ allows embedded C/C++ code within the model
primitives: connect the application to the system model (SM)
c_decl c_track
c_code c_expr
(model) state info:
- within primitives
- within C source code (optional)
16. 2.1. Introduction
Primitives:
c_decl: defines state. introduces types/names of external C data objects
referred to in the model.
c_track: defines state. defines which of the C data objects should be
considered to hold state info relevant to the verification process.
c_code: defines state transitions. encloses an arbitrary fragment of C code
used to effect the desired state transition.
c_expr: defines state transitions. evaluates an arbitrary side-effect free
expression in C to compute a boolean truth value used to
determine the executability of the statement itself.
17. Primitives: example
c_decl {
	extern float x;
	extern void fiddle(void);
};
c_track "&x" "sizeof(float)";
init {
	do
	:: c_expr { x < 10.0 } -> c_code { fiddle(); }
	:: else -> break
	od
}
18. 2.1. Introduction
Primitives (extensions):
more flexible than PROMELA
allow introducing new datatypes into verification models
SPIN is used as usual
c_track:
state tracking: allows restoring the values of data objects to their previous states when backtracking during the depth-first search.
state matching: allows recognizing when a state is revisited during the search.
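The state-tracking half of c_track can be mimicked in plain C to make the idea concrete: before a transition the verifier snapshots the tracked bytes onto its search stack, and on backtrack it copies them back. This is only an illustrative sketch (the fixed-depth snapshot stack and the fiddle step are invented, not SPIN internals):

```c
#include <assert.h>
#include <string.h>

/* One tracked external data object, like a c_track'd float in SPIN. */
static float x;

/* Search stack of byte snapshots of x (illustrative, fixed depth). */
#define MAX_DEPTH 64
static unsigned char snap[MAX_DEPTH][sizeof x];
static int depth = 0;

/* Before exploring a transition: save the tracked bytes. */
static void push_state(void) {
    assert(depth < MAX_DEPTH);
    memcpy(snap[depth++], &x, sizeof x);
}

/* On backtrack: restore the tracked bytes to their previous value. */
static void pop_state(void) {
    assert(depth > 0);
    memcpy(&x, snap[--depth], sizeof x);
}

/* Stand-in for an embedded c_code transition that mutates x. */
static void fiddle(void) { x = x * 2.0f + 1.0f; }
```

After push_state(); fiddle(); pop_state(); the tracked object is back at its pre-transition value, which is exactly what a depth-first search needs to explore alternative successors.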
19. 2.2. Tracking Without Matching
c_track:
- state tracking: restore the values of data objects.
- state matching: recognize revisited states.
Cases of interest (EDO = External Data Object):
- the EDO contains no state info
- the EDO contains state info, but in too much detail:
state tracking: retain all details necessary for backtracking
state matching: use abstractions
c_track extension (SPIN 4.1+):
c_track "&x" "sizeof(float)" "Matched";
c_track "&x" "sizeof(float)" "UnMatched";
[Diagram: a Matched object's value is saved both on the search stack (SS) and in the state descriptor (SD); an UnMatched object's value is saved only on the search stack (SS), not in the SD.]
21. 2.3. Validity of Abstractions
c_track extension (SPIN 4.1+):
c_track "&x" "sizeof(float)" "Matched"; (value of the EDO saved to SS and SD)
c_track "&x" "sizeof(float)" "UnMatched"; (value of the EDO saved to SS only)
allows including data in the model that is relevant for execution but not for verification
2 main uses:
- data hiding: track data without saving it to the SD (hide the data from the SD): UnMatched
- data hiding + abstraction: UnMatched + Abstraction + Matched
1. hide selected data from the SD
2. add abstraction functions ≡ abstract data representations
3. track the new, abstract data, saving it to the SD
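The data hiding + abstraction recipe can be illustrated outside SPIN: the concrete value is tracked but hidden from state matching, while an abstraction function computes the representation that is actually compared. A minimal C sketch (the sign-of-threshold abstraction and the type names are invented for illustration):

```c
#include <assert.h>

/* Concrete external data: relevant for execution, hidden from the SD. */
typedef struct { float x; } concrete_t;

/* Abstract representation: the only part saved to the state descriptor.
   Here the (illustrative) abstraction keeps just whether x < 10.0. */
typedef struct { int below_ten; } abstract_t;

static abstract_t abstraction(const concrete_t *c) {
    abstract_t a;
    a.below_ten = (c->x < 10.0f);
    return a;
}

/* State matching compares abstract values only: two concrete states
   with the same abstraction are treated as revisits of one state. */
static int match(const concrete_t *s1, const concrete_t *s2) {
    abstract_t a1 = abstraction(s1), a2 = abstraction(s2);
    return a1.below_ten == a2.below_ten;
}
```

With this scheme, states {x = 3.0} and {x = 7.5} collapse into one abstract state, shrinking the state space, while {x = 12.0} remains distinct.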
24. 2.4. Sufficient Conditions for Soundness
Model checker (MC): from the initial state, explore the set of reachable states and check the property(ies)
explore a state:
1. enumerate the state's successors (Suc)
2. determine which states in Suc are potentially relevant
3. record newly encountered states in a data structure (stack)
Symmetric-states relation (∼)
Relevant state: s is relevant iff none of its symmetric states have been visited yet
Encountered states: those already visited at least once
Explored states: encountered states whose complete successor set Suc has been visited
NOTE: with no abstraction, the symmetric-states relation is the identity (=), and s is relevant iff it has not been encountered yet
25. 2.4. Sufficient Conditions for Soundness
When conditions (1) and (2) are both satisfied, the abstraction preserves logical soundness:
∀ w, y, z: (w∼y) ∧ (y→z) ⇒ (∃ x: (w→x) ∧ (x∼z))   (1)
∀ x, y: P(x) ∧ (x∼y) ⇒ P(y)   (2)
P( ): proposition
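For a finite system, conditions (1) and (2) can be checked by brute force. This self-contained C sketch does so for a toy 4-state transition system (the system, the relation ∼, and the property P are invented for illustration; they are not from the paper):

```c
#include <assert.h>

#define N 4
/* Tiny transition system: 0→2 and 1→3 (illustrative). */
static const int trans[N][N] = {
    {0,0,1,0},
    {0,0,0,1},
    {0,0,0,0},
    {0,0,0,0},
};
/* Symmetry relation ~: identity plus 0~1 and 2~3. */
static const int sym[N][N] = {
    {1,1,0,0},
    {1,1,0,0},
    {0,0,1,1},
    {0,0,1,1},
};
/* Property P(s): holds on states 2 and 3, which ~ does not split. */
static int P(int s) { return s >= 2; }

/* Condition (1): ∀ w,y,z: (w~y) ∧ (y→z) ⇒ ∃ x: (w→x) ∧ (x~z). */
static int cond1(void) {
    for (int w = 0; w < N; w++)
        for (int y = 0; y < N; y++)
            for (int z = 0; z < N; z++) {
                if (!(sym[w][y] && trans[y][z])) continue;
                int found = 0;
                for (int x = 0; x < N; x++)
                    if (trans[w][x] && sym[x][z]) { found = 1; break; }
                if (!found) return 0;
            }
    return 1;
}

/* Condition (2): ∀ x,y: P(x) ∧ (x~y) ⇒ P(y). */
static int cond2(void) {
    for (int x = 0; x < N; x++)
        for (int y = 0; y < N; y++)
            if (P(x) && sym[x][y] && !P(y)) return 0;
    return 1;
}
```

Here both checks succeed: symmetric states can mimic each other's moves (1), and P never distinguishes two symmetric states (2), so exploring one representative per ∼-class is sound.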
26. 3. Two Sample Applications
3.1. Tic Tac Toe
3.2. JPL’s Mars Exploration Rovers (MER)
27. 3.1. Tic Tac Toe
Goal: obtain a line of three marks, all “x” or all “o”
29. 3.1. Tic Tac Toe
Key idea: introduce an abstraction to exploit board symmetries
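One way to exploit the board's symmetries is to match states on a canonical representative: map each board to the lexicographically smallest of its 8 images under rotation and mirroring, so all symmetric boards collapse into one matched state. This canonicalization is an illustrative choice, not necessarily the paper's exact abstraction function:

```c
#include <string.h>

/* Board: 9 chars, row-major; '.' empty, 'x', 'o'. */
static void rotate(const char in[9], char out[9]) {   /* 90° clockwise */
    for (int r = 0; r < 3; r++)
        for (int c = 0; c < 3; c++)
            out[c * 3 + (2 - r)] = in[r * 3 + c];
}
static void mirror(const char in[9], char out[9]) {   /* left-right flip */
    for (int r = 0; r < 3; r++)
        for (int c = 0; c < 3; c++)
            out[r * 3 + (2 - c)] = in[r * 3 + c];
}

/* Canonical representative: minimum over all 8 symmetric images. */
static void canonical(const char in[9], char out[9]) {
    char cur[9], tmp[9];
    memcpy(cur, in, 9);
    memcpy(out, in, 9);
    for (int m = 0; m < 2; m++) {
        for (int i = 0; i < 4; i++) {      /* 4 rotations per mirror side */
            if (memcmp(cur, out, 9) < 0) memcpy(out, cur, 9);
            rotate(cur, tmp);
            memcpy(cur, tmp, 9);
        }
        mirror(cur, tmp);                  /* after 4 rotations cur == start */
        memcpy(cur, tmp, 9);
    }
}
```

Any two boards related by a symmetry get the same canonical form, so tracking the canonical board (Matched) while hiding the concrete one (UnMatched) divides the searched state space by up to a factor of 8.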
30. 3.2. JPL’s Mars Exploration Rovers (MER)
31. 3.2. JPL’s Mars Exploration Rovers (MER)
- System: a module from the flight software of JPL’s MER
- Threads: 11 (each dedicated to a specific application)
- Resources: 15 (shared; access controlled by an arbiter)
- Target of verification: ARBITER
- ARBITER:
prevents potential conflicts between resource requests
enforces priorities (e.g. communication > driving)
32. 3.2. JPL’s Mars Exploration Rovers (MER)
ARBITER: priority(User U0) > priority(User U1)
33. 3.2. JPL’s Mars Exploration Rovers (MER)
ARBITER:
- Source code: 3,000 lines (ANSI C)
- Lookup table: conflicting combinations of resource requests; priorities
- Problem: 11 users (threads) and 15 resources ⇒ ↑↑ complexity (large search spaces)
Full-scale exhaustive verification:
- Solution-1 (limited): SPIN bitstate (supertrace) search, pruned to fit in physical memory
- Solution-2 (exhaustive): Divide & Conquer; the global problem is not verified as a whole
34. 3.2. JPL’s Mars Exploration Rovers (MER)
Solution-2 (exhaustive): Divide & Conquer
- Key idea: subproblem = {3 users + 3 resources}, chosen at random
- with enough repetitions of subproblems, coverage approximates the global problem
1. hand-built SPIN model: 245 lines (PROMELA) + 77 lines (arbiter lookup table)
2. state info (4,400 bytes) tracked & matched (c_track), no abstraction; a hand-built test harness of 110 lines (PROMELA) surrounds the arbiter code; with the hashcompact state compression option enabled: 127 bytes / state
3. abstraction (restriction: only essential state info recorded)
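The scale of the divide-and-conquer step can be sanity-checked with a little arithmetic: choosing 3 of the 11 user threads and 3 of the 15 resources gives C(11,3) × C(15,3) possible subproblems. The counting code below is illustrative back-of-the-envelope arithmetic; the paper samples subproblems randomly rather than enumerating them all:

```c
/* Binomial coefficient C(n, k) via the multiplicative formula.
   Each intermediate division is exact: after step i the running
   value equals C(n - k + i, i), an integer. */
static long long choose(int n, int k) {
    long long r = 1;
    for (int i = 1; i <= k; i++)
        r = r * (n - k + i) / i;
    return r;
}
```

So there are 165 × 455 = 75,075 candidate {3 users, 3 resources} subproblems, each far smaller than the full 11 × 15 configuration; random sampling with many repetitions approximates coverage of the whole.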
35. 4. Related Work
Several different approaches to direct verification of implementation-level C code.
Examples:
1. VeriSoft:
- partial-order reduction theory (state-less search)
- the search along a given path is stopped when a user-defined depth is reached
- pro: ↓↓ memory
- con: code instrumentation
- con: no state space is maintained, so no state-space techniques are available (systematic depth-first search, verification of liveness properties, abstraction)
2. CMC tool:
- captures as much state info as possible
- pro: stores it in the state space (and stack) using aggressive compression
- con: does not distinguish relevant state info
36. 5. Conclusions
Goal:
“A powerful extension of the SPIN model checker that
allows the user to directly define data abstractions
in the logic verification of application level
programs.”
“A new verification method that [...] can avoid the
need to manually construct a verification model,
while still retaining the capability to define
powerful abstractions that can be used to reduce
verification complexity.”
37. 5. Conclusions
Model Checking: classical approach vs. Model-Driven Software Verification approach
38. 5. Conclusions
SPIN 4.0+ allows embedded C/C++ code within the model
primitives: connect the application to the system model
c_decl c_track
c_code c_expr
(model) state info:
- within primitives
- within C source code (optional)
42. 5. Conclusions
Tic Tac Toe / JPL’s Mars Exploration Rovers (MER)
43. References
Holzmann, G.J., Joshi, R. Model-Driven Software Verification. In Proc. 11th SPIN Workshop, Barcelona, Spain, pages 77-92, 2004.
45. References
Computer Science, Caltech. http://www.cs.caltech.edu/people.html
Formal Verification. Wikipedia. http://en.wikipedia.org/wiki/Formal_verification
Gerard J. Holzmann. http://spinroot.com/gerard/
JPL Laboratory for Reliable Software (LaRS). http://lars-lab.jpl.nasa.gov/
Model checking. Wikipedia. http://en.wikipedia.org/wiki/Model_checking
NASA JPL - Jet Propulsion Laboratory. http://www.jpl.nasa.gov/
Rajeev Joshi. http://rjoshi.org/bio/
Software verification. Wikipedia. http://en.wikipedia.org/wiki/Software_verification
SPIN model checker. Wikipedia. http://en.wikipedia.org/wiki/SPIN_model_checker
Tic-tac-toe. Wikipedia. http://en.wikipedia.org/wiki/Tic-tac-toe
47. You might be thinking...
Now you can impress your friends talking about MDSV ... please, ask!
48. “By learning you will teach, by teaching you will learn.”
- Latin Proverb
“You see things; and you say, 'Why?' But I dream things that never were; and I say, ‘Why not?’”
- George Bernard Shaw
50. Model-Driven Software Verification (MDSV)
Gerard J. Holzmann & Rajeev Joshi
JPL Laboratory for Reliable Software, Caltech
(Formal Methods for Developing Reliable Software Systems)
Master in Software Engineering & Artificial Intelligence
Computer Science Department
University of Malaga
Juan Antonio Martin Checa
2011