Automated Testing of Hybrid Simulink/Stateflow Controllers
1. Automated Testing of Hybrid Simulink/Stateflow Controllers
Industrial Case Studies
Reza Matinnejad
Shiva Nejati
Lionel Briand
SnT Centre/University of Luxembourg
2. Software Development in Automotive
• Software development is largely model-based
• Automotive software models have dynamic behaviors
  • Mathematical models capturing plants/hardware
  • Controllers
3. Two Types of Controllers
• Open loop controllers
• Closed loop controllers
[Diagrams: an open-loop controller (Input → Controller → Actuator → Plant, with Disturbances acting on the Plant) and a closed-loop controller, which adds a Sensor feeding the Plant's output back to the Controller]
4. Open Loop vs Closed Loop
• Closed loop controllers – PID controllers
  • More expensive
  • More accurate and self-adaptive
  • Always present in large and critical cyber-physical systems
• Open loop controllers – state-based models
  • Less expensive
  • Often control timing behaviors
  • Typically combined with closed-loop controllers
5. Simulink/Stateflow Models
• Heterogeneous
• Continuous behavior
• Used for
  • simulation
  • algorithm design and testing
  • code generation
[Diagram: a time-continuous Simulink model connected to a hardware model and a network model]
6. Existing Simulink Testing Tools
• Control theory techniques
  • Synthesis of linear PID controllers
• Automated test case generation
  • Based on (formal) assertions or structural code coverage
• Automated verification
  • Model checking or theorem proving
7. Limitations
• Automotive models are rarely linear
• Automatable test oracles may not be available or sufficient
  • Test oracles are in many cases manual
• Structural coverage may not help reveal faults
• White box approaches have incompatibility issues
• Scalability issues
8. Black Box Search-Based Testing of Simulink
[Diagram: an iterative search loop. Solution Generation (explorative and exploitative) derives candidate input signals from the model's input specification; Model Simulation runs the Simulink model on those signals and computes fitness functions on the output signal(s), feeding the fitness values back to guide the next generation]
[SSBSE 2013, ASE 2014, ESEC/FSE 2015, IST J 2015, ICSE 2016]
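The generate-simulate-evaluate loop on this slide can be sketched as a simple search. Everything below is a toy stand-in: the `fitness` function substitutes a trivial score (the signal's peak value) for actually simulating a Simulink model, and the explore/exploit split is the simplest possible alternation.

```python
import random

def fitness(signal):
    # Stand-in for simulating the Simulink model and scoring its output;
    # here we simply reward signals whose peak value is large.
    return max(signal)

def random_signal(length=10, lo=-1.0, hi=1.0):
    # Explorative step: sample a fresh input signal uniformly.
    return [random.uniform(lo, hi) for _ in range(length)]

def tweak(signal, step=0.1, lo=-1.0, hi=1.0):
    # Exploitative step: perturb one point of the best signal so far.
    mutated = list(signal)
    i = random.randrange(len(mutated))
    mutated[i] = min(hi, max(lo, mutated[i] + random.uniform(-step, step)))
    return mutated

def search(iterations=200, seed=0):
    random.seed(seed)
    best = random_signal()
    best_fit = fitness(best)
    for it in range(iterations):
        # Alternate exploration (random restart) and exploitation (local tweak).
        candidate = random_signal() if it % 2 == 0 else tweak(best)
        cand_fit = fitness(candidate)
        if cand_fit > best_fit:
            best, best_fit = candidate, cand_fit
    return best, best_fit
```

A real instantiation would replace `fitness` with one simulation of the model per candidate, which is why keeping the number of iterations low matters for time performance.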
9. Fitness Functions – Closed Loop
• Generic requirements
  • Stability, responsiveness, and smoothness
• Maximizing (quantitative) fitness functions to generate failures
[Plot: the desired value (input) steps from InitialDesired (ID) to FinalDesired (FD) at time T/2; the actual value (output) tracks it. Annotations mark where stability, responsiveness, and smoothness are measured]
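One plausible way to discretize the three generic requirements over a sampled output signal. The tolerance, tail fraction, and exact formulas below are our simplifications for illustration, not the definitions from the authors' papers.

```python
import statistics

def responsiveness(actual, desired, tol=0.05):
    # Time step at which the output first enters a tolerance band
    # around the desired value (smaller is more responsive).
    for t, x in enumerate(actual):
        if abs(x - desired) <= tol:
            return t
    return len(actual)  # never reached the band

def stability(actual, tail=0.5):
    # Standard deviation of the output over the final portion of the
    # simulation; a stable controller settles, so this should be small.
    start = int(len(actual) * (1 - tail))
    return statistics.pstdev(actual[start:])

def smoothness(actual, desired):
    # Worst overshoot past the desired value (0.0 if never exceeded).
    return max(0.0, max(actual) - desired)
```

For failure-directed search one would maximize the stability and smoothness values and the responsiveness time, since large values indicate requirement violations.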
10. Fitness Functions – Closed Loop
• Specific requirements
  • E.g., "The contact between caliper and disk should occur within 32ms"
[Diagram: caliper position x moving toward disk position x0]

◇[0,32] □((x ≤ x0 + ε) ∧ (x ≥ x0 − ε))

Fitness: Min over t0 ∈ [0, 32] of Max over t ∈ [t0, T] of Max{ |x(t) − (x0 + ε)|, |x(t) − (x0 − ε)| }

Translation [Abbas et al., TECS 2013]
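A direct reading of the Min/Max translation on this slide over a sampled position trace. This is a hypothetical discretization: sample indices stand in for milliseconds, and `contact_fitness` is our name, not one from the papers.

```python
def contact_fitness(x, x0, eps, deadline):
    # For each candidate contact step t0 up to the deadline, take the
    # worst deviation of x from the tolerance band [x0 - eps, x0 + eps]
    # over the remainder of the trace, then keep the best candidate:
    #   Min over t0 of Max over t >= t0 of
    #       Max{ |x(t) - (x0 + eps)|, |x(t) - (x0 - eps)| }
    best = float("inf")
    for t0 in range(min(deadline, len(x))):
        worst = max(
            max(abs(x[t] - (x0 + eps)), abs(x[t] - (x0 - eps)))
            for t in range(t0, len(x))
        )
        best = min(best, worst)
    return best
```

A trace that reaches and holds the disk position within the deadline yields a small value; maximizing this fitness steers the search toward traces that violate the requirement.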
11. Fitness Functions – Open Loop
• Failure patterns
• Output diversity
[Plots: two output signals over time (0–2.0s), one oscillating between −1.0 and 1.0 and one ranging over 0.0–1.0, illustrating a failure pattern and output diversity]
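Output diversity can be illustrated with a minimal pairwise-distance objective over sampled output signals. This is only a sketch of the idea; the authors' papers define richer vector-based and feature-based variants of the distance.

```python
def signal_distance(a, b):
    # Euclidean distance between two equally-sampled output signals.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def output_diversity(outputs):
    # Sum of pairwise distances between output signals. Maximizing this
    # spreads the test suite's outputs apart, increasing the chance that
    # some output exhibits a failure pattern such as instability.
    total = 0.0
    for i in range(len(outputs)):
        for j in range(i + 1, len(outputs)):
            total += signal_distance(outputs[i], outputs[j])
    return total
```

Unlike the closed-loop fitness functions, this objective needs no requirement at all, which is why it suits open-loop controllers whose expected outputs are judged against failure patterns.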
12. Output of Our Approach
• Failure Explanation
  • A characterization of the input space showing under what input conditions the system is likely to fail
  • Visualized by diagrams or regression trees
• Failure Detection
  • Individual test cases revealing failures
  • A set of test input signals
13. Case Studies
• A mix of closed loop and open loop controllers, and plant models
  • Developed by BOSCH
  • Publicly available
• A large plant model – a continuous mathematical model
  • Developed by an automotive company
15. Failure Explanation – Regression Tree

All Points: Count 2384, Mean 1.016e+10, Std Dev 4.898e+11
- c_gear >= 1.0279: Count 1997, Mean 25167.822, Std Dev 135651.79
  - t0 >= 0.0029462: Count 1631, Mean 4550.4502, Std Dev 55698.046
  - t0 < 0.0029462: Count 366, Mean 117044.69, Std Dev 276423.68
- c_gear < 1.0279: Count 387, Mean 6.257e+10, Std Dev 1.216e+12
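A tree like the one on this slide can be grown by greedily choosing splits that reduce the variance of the fitness values. Below is a one-step sketch; the rows are made-up data keyed like the tree's variables (`c_gear`, `t0`), with `fit` as the fitness value, and `best_split` is our illustrative name.

```python
import statistics

def best_split(rows, value_key, feature_keys):
    # Greedy step of regression-tree learning: pick the (feature,
    # threshold) pair that minimizes the size-weighted variance of
    # the values in the two resulting child nodes.
    best = None
    for key in feature_keys:
        for threshold in sorted({r[key] for r in rows}):
            left = [r[value_key] for r in rows if r[key] < threshold]
            right = [r[value_key] for r in rows if r[key] >= threshold]
            if not left or not right:
                continue  # a split must produce two non-empty children
            cost = (len(left) * statistics.pvariance(left)
                    + len(right) * statistics.pvariance(right))
            if best is None or cost < best[0]:
                best = (cost, key, threshold)
    return best  # (weighted variance, feature, threshold), or None
```

Applying this recursively to each child, and reporting count, mean, and standard deviation per node, yields exactly the kind of input-space characterization shown above: regions whose mean fitness is high are the ones where the system is likely to fail.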
17. Failure Detection – Open Loop
[Plots: two test input signals over time (0–2.0s), one oscillating and one diverging]
Test inputs exhibiting "instability" and "grow to infinity" failure patterns
18. Summary of Lessons Learned
• Generating test cases is not enough
• It is important to help engineers with input space exploration and failure explanation
[Regression tree from slide 15, repeated]
19. Summary of Lessons Learned
• Engineers do not always have specific and precise requirements at hand
• We generate test cases that reveal violations of both specific requirements and (estimated) failure patterns
[Plots: input space exploration results and a generated test input signal over time]
20. Summary of Lessons Learned
• Incompatibility issues and manual overhead are a major obstacle to the adoption of current Simulink testing tools
• Our approach is black box and incurs no such overhead
• Time performance of our approach is acceptable