Model-Based Testing:
  Why, What, How
           Bob Binder
   System Verification Associates

 Juniper Systems Testing Conference
          November 9, 2011
Overview
• What is Model-Based Testing?
• Testing Economics
• Case Studies
  – Automated Derivatives Trading
  – Microsoft Protocol Interoperability
• Product Thumbnails
• Real Testers of …
• Q&A

                  Model-Based Testing: What, Why, How   2
Why?
• For Juniper:
  – Reduce cost of testing
  – Reduce time to market
  – Reduce cost of quality
  – Increase competitive advantage
• For you:
  – Focus on System Under Test (SUT), not test hassles
  – Engineering discipline with rigorous foundation
  – Enhanced effectiveness and prestige
  – Future of testing
WHAT IS MODEL-BASED TESTING?
“All Testing is Model-Based”
• Patterns for test design
  – Methods
  – Classes
  – Package and System Integration
  – Regression
  – Test Automation
  – Oracles
• 35 patterns, each a test meta-model
What is a Test Model?
[Figure: side-by-side UML models. Left, the SUT's design model: class diagrams and statecharts for TwoPlayerGame and ThreePlayerGame, with states α, Game Started, Player 1/2/3 Served, Player 1/2/3 Won, and ω, and transitions such as p1_WinsVolley( ) [this.p1_Score( ) < 20] / this.p1AddPoint( ) simulateVolley( ). Right, the corresponding test model, produced by applying the Mode Machine test design pattern to ThreePlayerGame.]

SUT: Design Model and Test Model
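A statechart like the one above can be encoded directly as data. The sketch below (hypothetical Python, not from the talk) captures the TwoPlayerGame machine as a transition table that a test generator can walk:

```python
# Hypothetical encoding of the TwoPlayerGame statechart: each state maps
# to its outgoing (event, guard, target) transitions, transcribed from
# the diagram above.
TWO_PLAYER_GAME = {
    "alpha":         [("TwoPlayerGame()", None,               "GameStarted")],
    "GameStarted":   [("p1_Start()",      None,               "Player1Served"),
                      ("p2_Start()",      None,               "Player2Served")],
    "Player1Served": [("p1_WinsVolley()", "p1_Score() < 20",  "Player1Served"),
                      ("p1_WinsVolley()", "p1_Score() == 20", "Player1Won"),
                      ("p2_WinsVolley()", None,               "Player2Served")],
    "Player2Served": [("p2_WinsVolley()", "p2_Score() < 20",  "Player2Served"),
                      ("p2_WinsVolley()", "p2_Score() == 20", "Player2Won"),
                      ("p1_WinsVolley()", None,               "Player1Served")],
    "Player1Won":    [("p1_IsWinner()",   None,               "Player1Won"),
                      ("~()",             None,               "omega")],
    "Player2Won":    [("p2_IsWinner()",   None,               "Player2Won"),
                      ("~()",             None,               "omega")],
}

def events_from(state):
    """Events the model accepts in a given state: what a test may call next."""
    return sorted({event for event, _, _ in TWO_PLAYER_GAME.get(state, [])})
```

Once the model is plain data like this, input generation, coverage measurement, and oracle checks all become table lookups rather than hand-written test logic.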
Model-based Test Suite
• N+ Strategy
  – Start at α
  – Follow transition path
  – Stop if ω or visited
  – Three loop iterations
  – Assumes state observer
  – Try all sneak paths

[Figure: N+ Test Suite — a transition tree from alpha through GameStarted to the Player 1/2/3 Served and Player 1/2/3 Won states and on to omega, exercising these numbered transitions:]

1 ThreePlayerGame( )
2 p1_Start( )
3 p2_Start( )
4 p3_Start( )
5 p1_WinsVolley( )
6 p1_WinsVolley( ) [this.p1_Score( ) < 20]
7 p1_WinsVolley( ) [this.p1_Score( ) == 20]
8 p2_WinsVolley( )
9 p2_WinsVolley( ) [this.p2_Score( ) < 20]
10 p2_WinsVolley( ) [this.p2_Score( ) == 20]
11 p3_WinsVolley( )
12 p3_WinsVolley( ) [this.p3_Score( ) < 20]
13 p3_WinsVolley( ) [this.p3_Score( ) == 20]
14 p1_IsWinner( )
15 p2_IsWinner( )
16 p3_IsWinner( )
17 ~( )
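The N+ traversal lends itself to a short sketch. The walker below is hypothetical Python, not from the talk; the machine is the TwoPlayerGame statechart from the previous slide, re-declared here so the example is self-contained. It starts at α, follows each transition, and ends a path at ω or at a state already visited on that path:

```python
# TwoPlayerGame statechart as data: state -> [(event, guard, target), ...]
GAME = {
    "alpha":         [("TwoPlayerGame()", None,               "GameStarted")],
    "GameStarted":   [("p1_Start()",      None,               "Player1Served"),
                      ("p2_Start()",      None,               "Player2Served")],
    "Player1Served": [("p1_WinsVolley()", "p1_Score() < 20",  "Player1Served"),
                      ("p1_WinsVolley()", "p1_Score() == 20", "Player1Won"),
                      ("p2_WinsVolley()", None,               "Player2Served")],
    "Player2Served": [("p2_WinsVolley()", "p2_Score() < 20",  "Player2Served"),
                      ("p2_WinsVolley()", "p2_Score() == 20", "Player2Won"),
                      ("p1_WinsVolley()", None,               "Player1Served")],
    "Player1Won":    [("p1_IsWinner()",   None,               "Player1Won"),
                      ("~()",             None,               "omega")],
    "Player2Won":    [("p2_IsWinner()",   None,               "Player2Won"),
                      ("~()",             None,               "omega")],
}

def n_plus_paths(machine, start="alpha", end="omega"):
    """Enumerate round-trip paths: start at alpha, follow each transition,
    stop a path at omega or when it would revisit a state on the path."""
    paths = []
    def walk(state, path, seen):
        if state == end:
            paths.append(path)
            return
        for event, guard, target in machine.get(state, []):
            label = event if guard is None else f"{event} [{guard}]"
            if target in seen:
                paths.append(path + [label])   # loop closed: record and stop
            else:
                walk(target, path + [label], seen | {target})
    walk(start, [], {start})
    return paths
```

Each returned path is one test sequence; replaying loop-closing paths the slide's three times, and attempting the transitions a state does not accept, gives the loop-iteration and sneak-path checks.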
Automated Model-based Testing
• Software that represents an SUT so that test
  inputs and expected results can be computed
  –   Useful abstraction of SUT aspects
  –   Algorithmic test input generation
  –   Algorithmic expected result generation
  –   Many possible data structures and algorithms

• SUT interface for control and observation
  – Abstraction critical
  – Generated and/or hand-coded
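As a toy illustration of "test inputs and expected results can be computed": the model program below is invented for this sketch (an abstract counter, not anything from the talk), but it shows the shape of the idea, since the same abstraction both enumerates inputs and predicts outputs.

```python
import itertools

class CounterModel:
    """Toy model program: an abstract counter whose state lets us compute
    the expected result of any input sequence (hypothetical example)."""
    def __init__(self):
        self.count = 0

    def apply(self, event):
        if event == "inc":
            self.count += 1
        elif event == "reset":
            self.count = 0
        return self.count  # expected observable after this event

def generate_tests(length):
    """Algorithmic generation: enumerate every event sequence of the given
    length, pairing each input with the model-computed expectation."""
    for seq in itertools.product(["inc", "reset"], repeat=length):
        model = CounterModel()
        yield list(seq), [model.apply(e) for e in seq]
```

A real MBT tool replaces exhaustive enumeration with smarter search over a much larger model, but the division of labor is the same: the model computes, the adapter delivers, the SUT is observed.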
How MBT Improves Quality
[Figure: the MBT feedback loop. Requirements (possibly ambiguous, missing, contradictory, incorrect, obscured, or incomplete) feed both development and the model; model errors and omissions surface here. From the model, test inputs (test sequences) and expected outputs (the test oracle) are generated. The inputs control the SUT, its behavior is observed, and evaluation compares actual against expected, revealing SUT bugs. By-products: coverage of requirements, model, and code, plus a reliability estimate.]

Stobie et al., © 2010 Microsoft. Adapted with permission.
Typical Test Configuration

  Test Suite                                           Control
                                                        Agent
   Adapter
 Adapter                                               System
                                                      Under Test

    Transport
  Transport                                            Transport
                                                      Transport
Test Suite Host OS                                      SUT OS

                Model-Based Testing: What, Why, How                10
Typical MBT Environment
[Figure: the MBT Tool draws on a Reqmts DB, Design DB, Bug DB, and Code Stack to produce the Test Suite, which runs as in the previous slide: Adapter and Transport on the Test Suite Host (Test Host OS), Transport and Control Agent on the SUT OS, ending at the System Under Test. A Test Manager coordinates the run; everything rests on the Development Environment and Configuration Management.]
TESTING ECONOMICS
Show Me the Money




How much of this … for one of these?
Testing by Poking Around

Manual “exploratory” testing of the System Under Test.

+ No tooling costs
+ No testware costs
+ Quick start
+ Opportunistic
+ Qualitative feedback

- Subjective, wide variation
- Low coverage
- Not repeatable
- Can’t scale
- Inconsistent
Manual Testing
Manual test design, manual test inputs, and manual test-results evaluation, plus test setup, against the System Under Test.

+ Flexible, no SUT coupling
+ Systematic coverage
+ No tooling costs
+ No testware cost
+ Usage validation

- 1 test per hour
- Usually not repeated (or repeatable)
- Not scalable
- Inconsistent
- Tends to “sunny day” tests
Hand-coded Test Driver



Manual test design feeds hand-coded test driver programming against the System Under Test.

+ 10+ tests per hour
+ Repeatable
+ Predictable
+ Consistent
+ Continuous Integration, TDD

- Tooling costs
- Testware costs
- Brittle, high maintenance cost
- Short half-life
- Technology focus
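A minimal hand-coded driver, to make the trade-offs concrete. The stand-in SUT function and all cases below are hypothetical; the point is that every expectation is computed and maintained by hand.

```python
def score_volley(p1_score, winner):
    """Stand-in SUT (hypothetical rule): player 1 gains a point only when
    she wins the volley."""
    return p1_score + 1 if winner == "p1" else p1_score

# Hand-coded cases: fixed inputs paired with hand-computed expectations.
# Repeatable and consistent, but every line must be revised by hand when
# the SUT's interface or rules change -- the brittleness noted above.
CASES = [
    ((0, "p1"), 1),
    ((5, "p2"), 5),
    ((19, "p1"), 20),
]

def run_suite():
    """Run all cases; return the list of failures (empty means pass)."""
    failures = []
    for args, expected in CASES:
        actual = score_volley(*args)
        if actual != expected:
            failures.append({"args": args,
                             "expected": expected,
                             "actual": actual})
    return failures
```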
Model-based Testing


Modeling and automated test generation feed automated setup and execution against the System Under Test.

+ 1,000+ tests per hour
+ Maintain model (not testware)
+ Intellectual control
+ Explore complex space
+ Consistent coverage

- Tooling costs
- Training costs
- Paradigm shift
- Still need manual, coded tests
Test Automation Envelope
[Figure: reliability in nines (effectiveness) plotted against productivity in tests per hour (efficiency, log scale from 1 to 10,000). Manual testing sits near 1 test/hour and 1 nine; automated drivers reach roughly 10–100 tests/hour and 2–3 nines; model-based testing extends the envelope to 1,000–10,000 tests/hour and 4–5 nines.]
CASE STUDY:
REAL TIME DERIVATIVES TRADING
Real Time Derivatives Trading
• “Screen-based trading” over private network
  – 3 million transactions per hour
  – 15 billion dollars per day
• Six development increments
  – 3 years
  – 3 to 5 months per iteration
  – Testing cycle shadows dev increments
• QA staff test productivity
  – One test per hour

System Under Test
•   Unified process
•   About 90 use-cases, 650 KLOC Java
•   CORBA/IDL distributed object model
•   HA Sun server farm
•   Multi-host Oracle DBMS
•   Many interfaces
    – GUI (trading floor)
    – Many high speed program trading users
    – Many legacy input/output interfaces

MBT: Challenges and Solutions
• Challenge: a one-time sample is not effective, but fresh test suites are too expensive to hand-build → Solution: simulator generates a fresh, accurate sample on demand
• Challenge: too expensive to develop expected results → Solution: oracle generates expected results on demand
• Challenge: too many test cases to evaluate → Solution: comparator automates checking
• Challenge: profile/requirements change → Solution: incremental changes to the rule base
• Challenge: SUT interfaces change → Solution: common agent interface
Test Input Generation
• Simulation of users
  – Use case profile
  – 50 KLOC Prolog
• Load Profile
  – Time domain variation
  – Orthogonal to event profile
• Each generated event assigned a "port" and submit time
• 1,000 to 750,000 unique tests for a 4-hour session

[Figure: two charts. Top: event counts per use case (log scale, 1 to 10,000,000) across 12 use-case categories. Bottom: load profile, events per second (0 to 3,500) over a session of roughly 25,000 seconds.]
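A greatly simplified sketch of profile-driven input generation. The use-case names and weights below are invented; the production simulator was about 50 KLOC of Prolog and did far more.

```python
import random

# Invented operational profile: relative frequency of each use case.
PROFILE = {
    "enter_order": 0.60,
    "cancel_order": 0.25,
    "query_book": 0.15,
}

def generate_events(n, session_seconds, ports, seed=0):
    """Draw n events under the profile, assigning each a port and a submit
    time within the session; return them in submit-time order."""
    rng = random.Random(seed)   # seeded so a run can be reproduced
    names = list(PROFILE)
    weights = [PROFILE[name] for name in names]
    events = []
    for _ in range(n):
        events.append({
            "event": rng.choices(names, weights)[0],
            "port": rng.randrange(ports),
            "submit_time": rng.uniform(0.0, session_seconds),
        })
    return sorted(events, key=lambda e: e["submit_time"])
```

Varying the draw rate over time (the load profile) is orthogonal to the event profile: the same use-case mix can be replayed at ramp-up, peak, and close-of-session rates.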
Automated Evaluation
• Oracle
  – Processes all test inputs
  – About 500 unique rules
  – Generates end of session “book”
• Comparator
  – Compares SUT “book” to oracle “book”
• Verification
  – “Splainer” rule backtracking
  – Rule/Run coverage analyzer
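A toy oracle-and-comparator pair in this style. The "book" rules below are invented for the sketch; the real oracle had about 500 unique rules.

```python
def build_book(events):
    """Toy oracle: replay all test inputs through invented rules to produce
    an end-of-session "book" of open quantity per trader."""
    book = {}
    for trader, action, qty in events:
        if action == "buy":
            book[trader] = book.get(trader, 0) + qty
        elif action == "cancel":
            book.pop(trader, None)
    return book

def compare_books(sut_book, expected_book):
    """Comparator: report every entry that differs between the SUT's book
    and the oracle's; an empty report is a passing run."""
    keys = set(sut_book) | set(expected_book)
    return {k: (sut_book.get(k), expected_book.get(k))
            for k in keys
            if sut_book.get(k) != expected_book.get(k)}
```

Checking one end-of-session state instead of every intermediate response is what makes evaluating hundreds of thousands of events per run tractable; the "Splainer" then backtracks the rules behind any mismatch.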
Test Harness
[Figure: the Simulator drives the SUT through a set of Adapters and feeds the same inputs to the Oracle. The Comparator checks the SUT's output against the Oracle's, with the "Splainer" available for rule backtracking, and produces test verdicts and run reports.]
Technical Achievements
• AI-based user simulation generates test suites
• All inputs generated under operational profile
• High volume oracle and evaluation
• Every test run (about 200) unique and realistic
• Evaluated functionality and load response with
  fresh tests
• Effective control of many different test agents
  (COTS/ custom, Java/4Test/Perl/Sql/proprietary)

Technical Problems
• Stamp coupling
   – Simulator, Agents, Oracle, Comparator
• Re-factoring rule relationships, Prolog limitations
• Configuration hassles
• Scale-up constraints
• Distributed schedule brittleness
• Horn Clause Shock Syndrome

Results
• Revealed about 1,500 bugs over two years
  – 5% showstoppers
• Five person team, huge productivity increase
  – 1 TPH versus 1,800 TPH
• Achieved proven high reliability
  – Last pre-release test run: 500,000 events in two hours,
    no failures detected
  – No production failures
• Abandoned by successor QA staff

CASE STUDY:
MICROSOFT PROTOCOL INTEROPERABILITY
Challenges
• Prove interoperability to Federal Judge and
  court-appointed scrutineers
• Validation of documentation, not as-built
  implementation
• Is each TD all a third party needs to develop:
  – A client that interoperates with an existing service?
  – A service that interoperates with existing clients?
• Only use over-the-wire messages
Microsoft Protocols
• Remote API for a service
• All product groups
  – Windows Server
  – Office
  – Exchange
  – SQL Server
  – Others
• 500+ protocols
  – Remote Desktop
  – Active Directory
  – File System
  – Security
  – Many others
Microsoft Technical Document (TD)
• Publish protocols as “Technical Documents”
• One TD for each protocol
• Black-box spec – no internals
• All data and behavior specified with text




                 Model-Based Testing: What, Why, How   32
Published Technical Docs
http://msdn.microsoft.com/en-us/library/cc216513(PROT.10).aspx




                    Model-Based Testing: What, Why, How          33
Validating Interoperability with MBT
• Workflow: Technical Document → Analysis (extract data and
  behavior statements) → Test Requirements Specification →
  Modeling → Model-based Test Suite → Test Execution
• Model assertions generate tests and check the responses of
  actual Windows services (WS 2000, WS 2003, WS 2008)
• Approximates third party implementation
• Validates consistency with actual Windows implementation

                                                 Stobie et al, © 2010 Microsoft. Adapted with permission.
                          Model-Based Testing: What, Why, How                                                  34
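A minimal sketch of this generate-and-check loop, assuming a made-up two-state "session" protocol. `MODEL`, `SutSession`, and the action names are all invented for illustration; they are not from any Microsoft TD or tool.

```python
from itertools import product

# Model: allowed transitions of a hypothetical two-state protocol,
# as a technical document might specify them.
MODEL = {
    ("closed", "open"): "open",
    ("open", "send"): "open",
    ("open", "close"): "closed",
}

class SutSession:
    """Stand-in for an actual service under test."""
    def __init__(self):
        self.state = "closed"

    def step(self, action):
        # A faithful implementation; a buggy one would diverge here.
        if (self.state, action) in MODEL:
            self.state = MODEL[(self.state, action)]
            return self.state
        return "error"

def run_test(actions):
    """Drive the SUT and check every response against what the
    model predicts (the model is the oracle)."""
    model_state, sut = "closed", SutSession()
    for a in actions:
        expected = MODEL.get((model_state, a), "error")
        if sut.step(a) != expected:
            return False          # interoperability bug found
        if expected != "error":
            model_state = expected
    return True

# Exhaustively enumerate short action sequences as a crude test suite.
suite = list(product(["open", "send", "close"], repeat=3))
assert all(run_test(seq) for seq in suite)
```

The same loop scales to real protocols by replacing `SutSession` with an over-the-wire adapter.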
Protocol Quality Assurance Process
TD versions v1 … vn from Authors; test suite developers work in phases:
• Study: scrutinize TD; define test strategy
• Plan: complete test requirements; high-level test plan
• Design: complete model; complete adapters
• Final: generate & run test suite; prepare user doc
Each phase ends with a review by Reviewers:
• Study: TD ready? Strategy OK?
• Plan: test requirements OK? Plan OK?
• Design: model OK? Adapter OK?
• Final: coverage OK? Test code OK?
                                 Model-Based Testing: What, Why, How                                  35
Productivity
"On average, model-based testing took 42% less time than
hand-coding tests" (Grieskamp et al.)

Avg Hrs Per Test Requirement
Task                           Hours
Document review                  1.1
Test requirement extract         0.8
Model authoring                  0.5
Traditional test coding          0.6
Adapter coding                   1.2
Test case execution              0.6
Final adjustments                0.3
Total, all phases                5.1

Threshold result
• Nearly all requirements had fewer than three tests
• Much greater gain for full coverage
                      Model-Based Testing: What, Why, How                   36
Results
• Published 500+ TDs, ~150,000 test requirements
• 50,000+ bugs, most identified before tests run
• Many Plugfests, many 3rd party users
• Released high interest test suites as open source
• Met all regulator requirements, on time
  – Judge closes DOJ anti-trust case May 12, 2011

• ~20 MSFT product teams now using Spec Explorer
                   Model-Based Testing: What, Why, How   37
TOOL THUMBNAILS
All product or company names mentioned herein may be trademarks or registered
trademarks of their respective owners.
CertifyIt
Smartesting
Model          Use cases, OCL; custom test stereotypes;
               keyword/action abstraction
Notation       UML 2, OCL, custom stereotypes, UML Test Profile
UML Support    Yes
Requirements   Interface to DOORS, HP QC, others
Traceability
Generation     Constraint solver selects minimal set of boundary values
Oracle         Post conditions in OCL, computed result for test point
Adapter        Natural language option; HP GUI drivers
Typical SUT    Financial, Smart Card
Notable        Top-down formally defined behavior; data stores; GUI
               model
                        Model-Based Testing: What, Why, How             39
Conformiq
Designer
Model          State machines with coded event/actions

Notation       Statecharts, Java
UML Support    Yes
Requirements   Integrated requirements, traceability matrix
Traceability
Generation     Graph traversal: state, transition, 2-switch
Oracle         Model post conditions, any custom function
Adapter        Output formatter, TTCN and user-defined
Typical SUT    Telecom, embedded
Notable        Timers; parallelism and concurrency; on-the-fly mode

                         Model-Based Testing: What, Why, How          40
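Graph-traversal generation of the kind listed above (all-transitions coverage) can be sketched with a breadth-first walk. The telephone-call state machine below is invented for illustration; this is not Conformiq's actual algorithm.

```python
from collections import deque

# Toy telephone-call state machine: (state, event) -> next state.
TRANSITIONS = {
    ("idle", "dial"): "connecting",
    ("connecting", "answer"): "talking",
    ("connecting", "hangup"): "idle",
    ("talking", "hangup"): "idle",
}

def all_transition_tests(start="idle"):
    """Breadth-first walk from the start state; emit a test (a path
    of transitions) whenever it reaches a not-yet-covered transition."""
    uncovered = set(TRANSITIONS)
    tests = []
    queue = deque([(start, [])])
    while uncovered and queue:
        state, path = queue.popleft()
        for (src, event), dst in TRANSITIONS.items():
            if src != state:
                continue
            step = path + [(src, event, dst)]
            if (src, event) in uncovered:
                uncovered.discard((src, event))
                tests.append(step)
            if len(step) < len(TRANSITIONS):
                queue.append((dst, step))
    return tests

tests = all_transition_tests()
covered = {(s, e) for t in tests for (s, e, _) in t}
assert covered == set(TRANSITIONS)  # all-transitions coverage achieved
```

Stronger criteria (1-switch, 2-switch) extend the same walk to cover pairs or triples of consecutive transitions.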
MaTeLo
All4Tec
Model          State machine with transition probabilities (Markov);
               data domains, event timing
Notation       Decorated State Machine
UML Support    No
Requirements   Integrated requirements and trace matrix; import from
Traceability   DOORS, others
Generation     Most likely path, user defined, all transitions, Markov
               simulation; subset or full model
Oracle         User conditions; Matlab and Simulink
Adapter        EXAM mappers; Python output formatter
Typical SUT    Hardware-in-the-loop; automotive, rail
Notable        Many standards-based device interfaces;
               supports software reliability engineering
                         Model-Based Testing: What, Why, How             41
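A Markov usage model of the MaTeLo kind can be sketched in a few lines. The login/browse chain and its probabilities below are made up for illustration; MaTeLo's actual generation strategies are richer.

```python
import random

# Invented usage chain: state -> [(event, next_state, probability)].
CHAIN = {
    "start": [("login_ok", "home", 0.9), ("login_fail", "end", 0.1)],
    "home": [("browse", "home", 0.3), ("logout", "end", 0.7)],
}

def most_likely_path(state="start", end="end"):
    """'Most likely path' generation: always follow the
    highest-probability transition until the end state."""
    path = []
    while state != end:
        event, state, _ = max(CHAIN[state], key=lambda t: t[2])
        path.append(event)
    return path

def random_walk(state="start", end="end", seed=0):
    """Markov simulation: sample one usage scenario under the
    operational profile encoded by the probabilities."""
    rng = random.Random(seed)
    path = []
    while state != end:
        events, states, probs = zip(*CHAIN[state])
        i = rng.choices(range(len(events)), weights=probs)[0]
        path.append(events[i])
        state = states[i]
    return path

assert most_likely_path() == ["login_ok", "logout"]
```

Because every walk is drawn from the operational profile, failure counts from such runs feed directly into software reliability estimates.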
Automatic Test Generation
IBM/Rational
Model          Sequence diagrams, flow charts, statecharts, codebase

Notation       UML, SysML, UML Testing Profile
UML Support    Yes
Requirements   DOORS integration; design model traceability
Traceability
Generation     Parses generated C++ to generate test cases; reaches
               states, transitions, operations, events for modeled classes
Oracle         User code
Adapter        User code, merge generation
Typical SUT    Embedded
Notable        Part of systems engineering tool chain
                        Model-Based Testing: What, Why, How            42
Spec Explorer
Microsoft
Model          C# class with “action” method pre/post condition;
               regular expressions define “machine” of classes/actions
Notation       C#
UML Support    Sequence diagrams
Requirements   API for logging user defined requirements
Traceability
Generation     For any machine, constraint solver finds feasible short or
               long path of actions; generates C# runtime
Oracle         Action post conditions; any custom function
Adapter        User code
Typical SUT    Microsoft protocols, APIs, products
Notable        Pairwise data selection; on-the-fly mode; use any
               .NET capability
                        Model-Based Testing: What, Why, How            43
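The pre/postcondition action style can be imitated in a few lines of Python (Spec Explorer itself works on C# models explored via Cord machines). The bounded-counter "protocol" and its action names below are invented for the example.

```python
# Bounded counter "protocol" modeled with guarded actions.
MAX_STATE = 2

ACTIONS = {
    # action name: (precondition on state, state update)
    "inc": (lambda s: s < MAX_STATE, lambda s: s + 1),
    "dec": (lambda s: s > 0, lambda s: s - 1),
}

def explore(state=0, depth=3):
    """Unwind the model into every feasible action sequence up to
    the given length, firing only actions whose precondition holds."""
    if depth == 0:
        return [[]]
    sequences = []
    for name, (pre, update) in ACTIONS.items():
        if pre(state):
            for tail in explore(update(state), depth - 1):
                sequences.append([name] + tail)
    return sequences or [[]]   # dead state: stop early

seqs = explore()
# Only two length-3 sequences are feasible from state 0:
assert seqs == [["inc", "inc", "dec"], ["inc", "dec", "inc"]]
```

An explorer then turns each feasible sequence into executable test code, with the postconditions serving as the oracle at each step.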
T-Vec/RAVE
T-Vec
Model          Boolean system with data boundaries; SCR types and
               modules; hierarchic modules
Notation       SCR-based, tabular definition; accepts Simulink
UML Support    No
Requirements   RAVE requirements management, interface to DOORS,
Traceability   others
Generation     Constraint solver identifies test points
Oracle         Solves constraints for expected value
Adapter        Output formatter; html, C++, java, perl, others
Typical SUT    Aerospace, DoD
Notable        Simulink for input, oracle, model checking; MCDC model
               coverage; non-linear and real-valued constraints
                         Model-Based Testing: What, Why, How        44
Close Cousins
• Data Generators
    – Grammar based
    – Pairwise, combinatoric
    – Fuzzers
•   TTCN-3 Compilers
•   Load Generators
•   Model Checkers
•   Model-driven Development tool chains

                   Model-Based Testing: What, Why, How   45
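For a flavor of the pairwise cousin, a greedy generator can be sketched as below. The parameter names and values are invented; production tools use more sophisticated strategies than this exhaustive greedy pick.

```python
from itertools import combinations, product

# Hypothetical configuration space (names invented for illustration).
PARAMS = {
    "os": ["junos", "linux"],
    "proto": ["ipv4", "ipv6"],
    "mtu": ["1500", "9000"],
}

def pairwise_suite(params):
    """Greedily pick rows until every value pair of every two
    parameters appears in at least one chosen row."""
    names = list(params)
    candidates = [dict(zip(names, row)) for row in product(*params.values())]
    needed = {
        (a, va, b, vb)
        for a, b in combinations(names, 2)
        for va in params[a]
        for vb in params[b]
    }

    def covers(row):
        # Pairs this row would newly cover.
        return {(a, row[a], b, row[b]) for a, b in combinations(names, 2)} & needed

    suite = []
    while needed:
        best = max(candidates, key=lambda row: len(covers(row)))
        needed -= covers(best)
        suite.append(best)
    return suite

suite = pairwise_suite(PARAMS)
assert len(suite) == 4   # vs. 8 rows for the full cross product
```

Even on this tiny space the suite halves; the savings grow combinatorially with more parameters and values.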
REAL TESTERS OF …
MBT User Survey
• Part of 1st Model-based Testing User
  Conference
  – Offered to many other tester communities
• In progress
• Preliminary analysis of responses to date
• https://www.surveymonkey.com/s/JSJVDJW



                 Model-Based Testing: What, Why, How   47
MBT Users, SUT Domain
[Bar chart, 0% to 40%: Transaction Processing, Embedded, Software
Infrastructure, Communications, Supercomputing, Other, Social
Media, Gaming]

                               Model-Based Testing: What, Why, How                           48
MBT User, Company Size
[Bar chart, 0% to 35%, by employee count: 1-10, 11-100, 101-500,
501-1000, 1001-10000, 10000+]

                       Model-Based Testing: What, Why, How                     49
MBT Users, Software Process
[Bar chart, 0% to 25%: Agile, CMMI level 2+, XP/TDD, Incremental,
Spiral, Waterfall, Ad Hoc, Other]

                      Model-Based Testing: What, Why, How               50
How Used?
• What stage of adoption? [Bar chart, 0% to 60%: Evaluation,
  Pilot Project, Rollout, Routine use]
• Who is the tool provider? [Bar chart, 0% to 80%: In House,
  Open Source, Commercial]

                                 Model-Based Testing: What, Why, How                        51
What is the Overall MBT Role?
• At what scope is MBT used? [Bar chart, 0% to 80%: Unit,
  Component, System]
• What is overall test effort for each testing mode? [Bar chart,
  25% to 40%: Manual, Hand-coded, Model-based]

                          Model-Based Testing: What, Why, How                      52
How Long to be Proficient?
Median: 100 hours of training/use to become proficient
[Histogram, 0% to 50%, by hours: 1-40, 80-120, 160+]

                    Model-Based Testing: What, Why, How                        53
How Bad are Common Problems?
[Stacked bar chart, 0% to 100%, each rated Worse than expected /
Not an issue / Better than expected:]
• Misses bugs
• Can't integrate with other test assets
• Developing SUT interfaces too hard
• Inadequate coverage
• Developing test models is too difficult
• Oracle ineffective
• Too difficult to update model
• Model "blows up"

                              Model-Based Testing: What, Why, How                           54
MBT Effect on Time, Cost, Quality?
Percent change from baseline: e.g., 35% fewer escaped bugs,
0% more bugs
[Bar chart, Better vs. Worse, for: Bugs Escaped, Overall Testing
Costs, Overall Testing Time]

                       Model-Based Testing: What, Why, How                      55
MBT Traction
• Overall, how effective is MBT?
   – No effect: 4%
   – Slightly: 13%
   – Moderately: 42%
   – Extremely: 42%
• How likely are you to continue using MBT?
   – Not at all: 0%
   – Slightly: 4%
   – Moderately: 21%
   – Very: 38%
   – Extremely: 38%

                        Model-Based Testing: What, Why, How                 56
CONCLUSIONS
What Have We Learned?
•   Test engineering with rigorous foundation
•   Global best practice
•   Broad applicability
•   Mature commercial offerings
•   Many proof points
•   Commitment and planning necessary
•   10x to 1,000x improvement possible

                  Model-Based Testing: What, Why, How   58
Q&A




                           rvbinder@gmail.com


Model-Based Testing: What, Why, How             59
Image Credits
Unless noted below, all content herein Copyright © Robert V. Binder, 2011.

•   Pensive Boy: Resource Rack, http://sites.google.com/site/resourcerack/mental
•   Isoquant Chart: MA Economics Blog, http://ma-economics.blogspot.com/2011/09/optimum-
    factor-combination.html
•   Derivatives Trading Floor: Money Mavens,
    http://medillmoneymavens.com/2009/02/11/cboe-and-cbot-a-story-in-two-floors/
•   Barrett Pettyman US Federal Courthouse: Earth in Pictures,
    http://www.earthinpictures.com/world/usa/washington,_d.c./e._barrett_prettyman_united_
    states_courthouse.html
•   Server Room: 1U Server Rack, http://1userverrack.net/2011/05/03/server-room-4/
•   Utility Knife: Marketing Tenerife, http://marketingtenerife.com/marketing-tools-in-tenerife/
•   Software Tester: IT Career Coach, http://www.it-career-coach.net/2010/02/14/the-job-of-
    software-testing-quality-assurance-career
•   Conclusion: European Network and Information Security Agency (ENISA),
    http://www.enisa.europa.eu/media/news-items/summary-of-summer-
    school/image/image_view_fullscreen


                                 Model-Based Testing: What, Why, How                          60

 
Anypoint Code Builder , Google Pub sub connector and MuleSoft RPA
Anypoint Code Builder , Google Pub sub connector and MuleSoft RPAAnypoint Code Builder , Google Pub sub connector and MuleSoft RPA
Anypoint Code Builder , Google Pub sub connector and MuleSoft RPA
 
UiPath Studio Web workshop series - Day 7
UiPath Studio Web workshop series - Day 7UiPath Studio Web workshop series - Day 7
UiPath Studio Web workshop series - Day 7
 
UiPath Platform: The Backend Engine Powering Your Automation - Session 1
UiPath Platform: The Backend Engine Powering Your Automation - Session 1UiPath Platform: The Backend Engine Powering Your Automation - Session 1
UiPath Platform: The Backend Engine Powering Your Automation - Session 1
 
UiPath Studio Web workshop series - Day 6
UiPath Studio Web workshop series - Day 6UiPath Studio Web workshop series - Day 6
UiPath Studio Web workshop series - Day 6
 
Computer 10: Lesson 10 - Online Crimes and Hazards
Computer 10: Lesson 10 - Online Crimes and HazardsComputer 10: Lesson 10 - Online Crimes and Hazards
Computer 10: Lesson 10 - Online Crimes and Hazards
 
20200723_insight_release_plan
20200723_insight_release_plan20200723_insight_release_plan
20200723_insight_release_plan
 
Linked Data in Production: Moving Beyond Ontologies
Linked Data in Production: Moving Beyond OntologiesLinked Data in Production: Moving Beyond Ontologies
Linked Data in Production: Moving Beyond Ontologies
 
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyesHow to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
 
Apres-Cyber - The Data Dilemma: Bridging Offensive Operations and Machine Lea...
Apres-Cyber - The Data Dilemma: Bridging Offensive Operations and Machine Lea...Apres-Cyber - The Data Dilemma: Bridging Offensive Operations and Machine Lea...
Apres-Cyber - The Data Dilemma: Bridging Offensive Operations and Machine Lea...
 
AI Fame Rush Review – Virtual Influencer Creation In Just Minutes
AI Fame Rush Review – Virtual Influencer Creation In Just MinutesAI Fame Rush Review – Virtual Influencer Creation In Just Minutes
AI Fame Rush Review – Virtual Influencer Creation In Just Minutes
 
UiPath Community: AI for UiPath Automation Developers
UiPath Community: AI for UiPath Automation DevelopersUiPath Community: AI for UiPath Automation Developers
UiPath Community: AI for UiPath Automation Developers
 
9 Steps For Building Winning Founding Team
9 Steps For Building Winning Founding Team9 Steps For Building Winning Founding Team
9 Steps For Building Winning Founding Team
 
Meet the new FSP 3000 M-Flex800™
Meet the new FSP 3000 M-Flex800™Meet the new FSP 3000 M-Flex800™
Meet the new FSP 3000 M-Flex800™
 
UiPath Studio Web workshop series - Day 8
UiPath Studio Web workshop series - Day 8UiPath Studio Web workshop series - Day 8
UiPath Studio Web workshop series - Day 8
 
Bird eye's view on Camunda open source ecosystem
Bird eye's view on Camunda open source ecosystemBird eye's view on Camunda open source ecosystem
Bird eye's view on Camunda open source ecosystem
 
Cybersecurity Workshop #1.pptx
Cybersecurity Workshop #1.pptxCybersecurity Workshop #1.pptx
Cybersecurity Workshop #1.pptx
 
Secure your environment with UiPath and CyberArk technologies - Session 1
Secure your environment with UiPath and CyberArk technologies - Session 1Secure your environment with UiPath and CyberArk technologies - Session 1
Secure your environment with UiPath and CyberArk technologies - Session 1
 

Model-Based Testing: Why, What, How

• 6. What is a Test Model?
  [Diagram: side-by-side SUT design model and test model. The left side shows UML class diagrams for TwoPlayerGame and ThreePlayerGame (operations such as p1_Start( ), p1_WinsVolley( ), p1_IsWinner( ), p1_Points( ), and the corresponding p2_/p3_ members). The right side shows the test model built with the Mode Machine test design pattern: state machines running from α through Game Started, Player 1/2/3 Served, and Player 1/2/3 Won to ω, with guarded transitions such as p1_WinsVolley( ) [this.p1_Score( ) < 20] / this.p1AddPoint( ); simulateVolley( ) and p1_WinsVolley( ) [this.p1_Score( ) == 20].]
• 7. Model-based Test Suite
  • N+ Strategy
    – Start at α
    – Follow transition path
    – Stop if ω or visited
    – Three loop iterations
    – Assumes state observer
    – Try all sneak paths
  [Diagram: the 17 numbered transitions of the ThreePlayerGame test model (1 ThreePlayerGame( ); 2 p1_Start( ); 3 p2_Start( ); 4 p3_Start( ); 5-7 p1_WinsVolley( ) and its guarded variants; 8-10 p2_WinsVolley( ) variants; 11-13 p3_WinsVolley( ) variants; 14 p1_IsWinner( ); 15 p2_IsWinner( ); 16 p3_IsWinner( ); 17 ~( )) expanded into an N+ test suite tree from alpha through Game Started and the Served/Won states to omega.]
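The N+ traversal above can be sketched in a few lines. The sketch below uses a simplified, hypothetical fragment of the TwoPlayerGame state machine (not the full model on the slide, and with invented event labels) and produces one test sequence per round-trip path, ending each path at ω or at a state that has already been expanded:

```python
from collections import deque

# Toy state machine: (state, event) -> next state. A simplified, hypothetical
# fragment of the TwoPlayerGame test model; guards are folded into event names.
TRANSITIONS = {
    ("alpha", "TwoPlayerGame()"): "GameStarted",
    ("GameStarted", "p1_Start()"): "Player1Served",
    ("GameStarted", "p2_Start()"): "Player2Served",
    ("Player1Served", "p1_WinsVolley()"): "Player1Served",  # loop: score < 20
    ("Player1Served", "p2_WinsVolley()"): "Player2Served",
    ("Player2Served", "p2_WinsVolley()"): "Player2Served",  # loop: score < 20
    ("Player2Served", "p1_WinsVolley()"): "Player1Served",
    ("Player1Served", "p1_Wins[score==20]"): "Player1Won",
    ("Player2Served", "p2_Wins[score==20]"): "Player2Won",
    ("Player1Won", "~()"): "omega",
    ("Player2Won", "~()"): "omega",
}

def round_trip_paths(start="alpha", final="omega"):
    """N+-style traversal: exercise every transition; end a test sequence
    at the final state or at a state that has already been expanded."""
    expanded = set()
    paths, queue = [], deque([(start, [])])
    while queue:
        state, path = queue.popleft()
        if state == final or state in expanded:
            paths.append(path)  # one generated test sequence
            continue
        expanded.add(state)
        for (src, event), dst in TRANSITIONS.items():
            if src == state:
                queue.append((dst, path + [event]))
    return paths
```

Each returned path is an abstract test sequence; detecting sneak paths would additionally try every event in every state and expect rejection.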
• 8. Automated Model-based Testing
  • Software that represents an SUT so that test inputs and expected results can be computed
    – Useful abstraction of SUT aspects
    – Algorithmic test input generation
    – Algorithmic expected result generation
    – Many possible data structures and algorithms
  • SUT interface for control and observation
    – Abstraction critical
    – Generated and/or hand-coded
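To make "control and observation" concrete, here is a minimal sketch with all names invented for illustration: an adapter maps abstract model actions onto a concrete SUT (control) and maps SUT state back into model terms (observe), so generated tests never touch SUT details directly.

```python
class CounterSUT:
    """Stand-in 'system under test' (hypothetical)."""
    def __init__(self):
        self.value = 0

    def increment(self):
        self.value += 1

class CounterAdapter:
    """Maps abstract model actions to concrete SUT calls (control)
    and concrete SUT state back to model state (observe)."""
    def __init__(self, sut):
        self.sut = sut

    def control(self, action):
        if action == "inc":
            self.sut.increment()

    def observe(self):
        return {"count": self.sut.value}

def run_test(adapter, actions, expected):
    """Drive the SUT through the adapter, then compare observed state
    against the model-computed expected result."""
    for action in actions:
        adapter.control(action)
    return adapter.observe() == expected
```

Because the test sequence and expected result are expressed only in model terms, a regenerated suite survives SUT interface changes by updating the adapter alone.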
• 9. How MBT Improves Quality
  [Diagram: Develop produces Requirements (missing, incorrect faults) and a Model (ambiguous, missing, contradictory, incorrect, obscured, incomplete; model errors and omissions). Generate derives Inputs (test sequences) and Expected Outputs (test oracle) from the model. The suite Controls and Observes the SUT; Evaluate yields bugs and a reliability estimate. Coverage is measured against requirements, model, and code.]
  Stobie et al, © 2010 Microsoft. Adapted with permission.
• 10. Typical Test Configuration
  [Diagram: Test Suite and Control on the Test Suite Host, connected through Agent and Adapter layers to the System Under Test, with transport layers between the Test Suite Host OS and the SUT OS on each side.]
• 11. Typical MBT Environment
  [Diagram: the test configuration of slide 10 surrounded by its supporting systems: Requirements DB, Design DB, and Bug DB feeding the MBT Tool; the Code Stack on the SUT side; Test Manager and Test Host on the suite side; all on top of the Development Environment and Configuration Management.]
• 13. Show Me the Money
  How much of this … for one of these?
• 14. Testing by Poking Around
  Manual "Exploratory" Testing
  + No tooling costs
  + No testware costs
  + Quick start
  + Qualitative feedback
  - Subjective, wide variation
  - Low coverage
  - Not repeatable
  - Opportunistic
  - Can't scale
  - Inconsistent
• 15. Manual Testing
  Manual Test Design → Test Setup → Manual Test Input → System Under Test → Manual Test Results Evaluation
  + Flexible, no SUT coupling
  + Systematic coverage
  + No tooling costs
  + No testware cost
  + Usage validation
  - 1 test per hour
  - Usually not repeated
  - Not scalable
  - Inconsistent
  - Tends to "sunny day" tests
• 16. Hand-coded Test Driver
  Manual Test Design → Test Driver Programming → System Under Test
  + 10+ tests per hour
  + Repeatable
  + Predictable
  + Consistent
  + Continuous Integration, TDD
  - Tooling costs
  - Testware costs
  - Brittle, high maintenance cost
  - Short half-life
  - Technology focus
• 17. Model-based Testing
  Modeling, Automated Generation → Automated Setup and Execution → System Under Test
  + 1000+ tests per hour
  + Maintain model (not testware)
  + Intellectual control
  + Explore complex space
  + Consistent coverage
  - Tooling costs
  - Training costs
  - Paradigm shift
  - Still need manual, coded tests
• 18. Test Automation Envelope
  [Chart: Reliability (effectiveness, 1 to 5 "nines") versus productivity (tests per hour, 1 to 10,000, log scale). Manual testing sits near 1 test/hour and 1 nine; automated drivers reach roughly 10-100 tests/hour and 2-3 nines; model-based testing reaches 1,000+ tests/hour and 4-5 nines.]
  • 19. CASE STUDY: REAL TIME DERIVATIVES TRADING
• 20. Real Time Derivatives Trading
  • "Screen-based trading" over private network
    – 3 million transactions per hour
    – 15 billion dollars per day
  • Six development increments
    – 3 years
    – 3 to 5 months per iteration
    – Testing cycle shadows dev increments
  • QA staff test productivity
    – One test per hour
• 21. System Under Test
  • Unified Process
  • About 90 use cases, 650 KLOC Java
  • CORBA/IDL distributed object model
  • HA Sun server farm
  • Multi-host Oracle DBMS
  • Many interfaces
    – GUI (trading floor)
    – Many high-speed program trading users
    – Many legacy inputs/outputs
• 22. MBT: Challenges and Solutions
  • One-time sample not effective, but fresh test suites too expensive → Simulator generates a fresh, accurate sample on demand
  • Too expensive to develop expected results → Oracle generates expected results on demand
  • Too many test cases to evaluate → Comparator automates checking
  • Profile/requirements change → Incremental changes to rule base
  • SUT interfaces change → Common agent interface
• 23. Test Input Generation
  • Simulation of users
    – Use case profile
    – 50 KLOC Prolog
  • Load profile
    – Time domain variation
    – Orthogonal to event profile
  • Each generated event assigned a "port" and submit time
  • 1,000 to 750,000 unique tests for a 4-hour session
  [Charts: log-scale use-case frequency profile and an events-per-second load curve over a roughly 25,000-second session.]
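The idea of profile-driven input generation can be sketched as follows. The profile weights, event names, and port scheme are invented for illustration; the production generator was a 50 KLOC Prolog simulator, not this toy.

```python
import random

# Hypothetical operational profile: relative frequency of each use case.
PROFILE = {"new_order": 60, "cancel": 25, "modify": 10, "query": 5}

def generate_session(n_events, ports, duration_s, seed=42):
    """Draw events under the operational profile; assign each event a
    'port' and a submit time, then order the session by submit time."""
    rng = random.Random(seed)  # seeded so a run can be reproduced
    names, weights = list(PROFILE), list(PROFILE.values())
    events = []
    for _ in range(n_events):
        events.append({
            "event": rng.choices(names, weights=weights)[0],
            "port": rng.choice(ports),
            "submit_at": round(rng.uniform(0, duration_s), 3),
        })
    return sorted(events, key=lambda e: e["submit_at"])
```

Varying the time-domain distribution of `submit_at` independently of the event weights mirrors the slide's point that the load profile is orthogonal to the event profile.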
• 24. Automated Evaluation
  • Oracle
    – Processes all test inputs
    – About 500 unique rules
    – Generates end-of-session "book"
  • Comparator
    – Compares SUT "book" to oracle "book"
  • Verification
    – "Splainer" rule backtracking
    – Rule/run coverage analyzer
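A minimal sketch of the oracle/comparator pairing, with an invented event format and a single illustrative rule (net position per instrument) standing in for the roughly 500 production rules:

```python
def oracle_book(events):
    """Oracle: apply the rule set (here, one illustrative rule) to all
    test inputs to compute the expected end-of-session book."""
    book = {}
    for e in events:
        qty = e["qty"] if e["side"] == "buy" else -e["qty"]
        book[e["symbol"]] = book.get(e["symbol"], 0) + qty
    return book

def compare_books(expected, actual):
    """Comparator: report every position where the SUT's book disagrees
    with the oracle's book, as symbol -> (expected, actual)."""
    symbols = set(expected) | set(actual)
    return {s: (expected.get(s, 0), actual.get(s, 0))
            for s in symbols if expected.get(s, 0) != actual.get(s, 0)}
```

An empty comparator result is a pass; any entry is a candidate bug to be explained, which is where a rule-backtracking tool like the "Splainer" earns its keep.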
• 25. Test Harness
  [Diagram: the Simulator drives both the SUT and the Oracle through adapters; the Comparator checks the SUT run against the Oracle's expected results, with the Splainer attached for rule backtracking; the output is a set of test verdict reports.]
• 26. Technical Achievements
  • AI-based user simulation generates test suites
  • All inputs generated under operational profile
  • High-volume oracle and evaluation
  • Every test run unique and realistic (about 200)
  • Evaluated functionality and load response with fresh tests
  • Effective control of many different test agents (COTS/custom, Java/4Test/Perl/SQL/proprietary)
• 27. Technical Problems
  • Stamp coupling
    – Simulator, Agents, Oracle, Comparator
  • Re-factoring rule relationships, Prolog limitations
  • Configuration hassles
  • Scale-up constraints
  • Distributed schedule brittleness
  • Horn Clause Shock Syndrome
• 28. Results
  • Revealed about 1,500 bugs over two years
    – 5% showstoppers
  • Five-person team, huge productivity increase
    – 1 TPH versus 1,800 TPH
  • Achieved proven high reliability
    – Last pre-release test run: 500,000 events in two hours, no failures detected
    – No production failures
  • Abandoned by successor QA staff
  • 29. CASE STUDY: MICROSOFT PROTOCOL INTEROPERABILITY
• 30. Challenges
  • Prove interoperability to Federal Judge and court-appointed scrutineers
  • Validation of documentation, not as-built implementation
  • Is each TD all a third party needs to develop:
    – A client that interoperates with an existing service?
    – A service that interoperates with existing clients?
  • Only use over-the-wire messages
• 31. Microsoft Protocols
  • Remote API for a service
  • All product groups
    – Windows Server
    – Office
    – Exchange
    – SQL Server
    – Others
  • 500+ protocols
    – Remote Desktop
    – Active Directory
    – File System
    – Security
    – Many others
• 32. Microsoft Technical Document (TD)
  • Publish protocols as "Technical Documents"
  • One TD for each protocol
  • Black-box spec – no internals
  • All data and behavior specified with text
• 34. Validating Interoperability with MBT
  [Diagram: Technical Document → Analysis → Test Requirements (data and behavior specification statements) → Modeling → Model → generate Model-based Test Suite → Test Execution against actual Windows services (WS 2000, WS 2003, WS 2008). The model approximates a third-party implementation; model assertions check the responses of the actual Windows implementation, validating consistency between TD and product.]
  Stobie et al, © 2010 Microsoft. Adapted with permission.
• 35. Protocol Quality Assurance Process
  [Diagram: four phases, each closed by a review gate, applied across TD versions v1, v2, … vn by TD authors, test suite developers, and reviewers:]
  • Study: scrutinize TD; define test strategy. Review: TD ready? Strategy OK? Plan OK?
  • Plan: complete test requirements; high-level test plan. Review: test requirements OK?
  • Design: complete model; complete adapters. Review: model OK? Adapter OK?
  • Final: generate and run test suite; prep user doc. Review: coverage OK? Test code OK?
• 36. Productivity
  "On average, model-based testing took 42% less time than hand-coding tests"
  Avg Hrs Per Test Requirement:
    Document review             1.1
    Test requirement extract    0.8
    Model authoring             0.5
    Traditional test coding     0.6
    Adapter coding              1.2
    Test case execution         0.6
    Final adjustments           0.3
    Total, all phases           5.1
  Threshold result:
  • Nearly all requirements had less than three tests
  • Much greater gain for full coverage
  Grieskamp et al.
• 37. Results
  • Published 500+ TDs, ~150,000 test requirements
  • 50,000+ bugs, most identified before tests run
  • Many Plugfests, many 3rd-party users
  • Released high-interest test suites as open source
  • Met all regulator requirements, on time
    – Judge closes DOJ anti-trust case May 12, 2011
  • ~20 MSFT product teams now using Spec Explorer
• 38. TOOL THUMBNAILS
  All product or company names mentioned herein may be trademarks or registered trademarks of their respective owners.
• 39. CertifyIt (Smartesting)
  Model: Use cases, OCL; custom test stereotypes; keyword/action abstraction
  Notation: UML 2, OCL, custom stereotypes, UML Testing Profile
  UML support: Yes
  Requirements traceability: Interface to DOORS, HP QC, others
  Generation: Constraint solver selects minimal set of boundary values
  Oracle: Post-conditions in OCL, computed result for test point
  Adapter: Natural language option; HP GUI drivers
  Typical SUT: Financial, smart card
  Notable: Top-down formally defined behavior; data stores; GUI model
• 40. Conformiq Designer
  Model: State machines with coded events/actions
  Notation: Statecharts, Java
  UML support: Yes
  Requirements traceability: Integrated requirements, traceability matrix
  Generation: Graph traversal: state, transition, 2-switch
  Oracle: Model post-conditions, any custom function
  Adapter: Output formatter, TTCN and user-defined
  Typical SUT: Telecom, embedded
  Notable: Timers; parallelism and concurrency; on-the-fly mode
• 41. MaTeLo (All4Tec)
  Model: State machine with transition probabilities (Markov); data domains, event timing
  Notation: Decorated state machine
  UML support: No
  Requirements traceability: Integrated requirements and trace matrix; import from DOORS, others
  Generation: Most likely path, user-defined, all transitions, Markov simulation; subset or full model
  Oracle: User conditions; Matlab and Simulink
  Adapter: EXAM mappers; Python output formatter
  Typical SUT: Hardware-in-the-loop; automotive, rail
  Notable: Many standards-based device interfaces; supports software reliability engineering
• 42. Automatic Test Generation (IBM/Rational)
  Model: Sequence diagrams, flow charts, statecharts, codebase
  Notation: UML, SysML, UML Testing Profile
  UML support: Yes
  Requirements traceability: DOORS integration; design model traceability
  Generation: Parses generated C++ to generate test cases; reach states, transitions, operations, events for modeled classes
  Oracle: User code
  Adapter: User code, merge generation
  Typical SUT: Embedded
  Notable: Part of systems engineering tool chain
• 43. Spec Explorer (Microsoft)
  Model: C# class with "action" method pre/post-conditions; regular expressions define "machine" of classes/actions
  Notation: C#
  UML support: Sequence diagrams
  Requirements traceability: API for logging user-defined requirements
  Generation: For any machine, constraint solver finds feasible short or long path of actions; generates C# runtime
  Oracle: Action post-conditions; any custom function
  Adapter: User code
  Typical SUT: Microsoft protocols, APIs, products
  Notable: Pairwise data selection; on-the-fly mode; use any .NET capability
• 44. T-VEC/RAVE (T-VEC)
  Model: Boolean system with data boundaries; SCR types and modules; hierarchic modules
  Notation: SCR-based, tabular definition; accepts Simulink
  UML support: No
  Requirements traceability: RAVE requirements management, interface to DOORS, others
  Generation: Constraint solver identifies test points
  Oracle: Solves constraints for expected value
  Adapter: Output formatter; HTML, C++, Java, Perl, others
  Typical SUT: Aerospace, DoD
  Notable: Simulink for input, oracle, model checking; MC/DC model coverage; non-linear and real-valued constraints
• 45. Close Cousins
  • Data generators
    – Grammar-based
    – Pairwise, combinatoric
    – Fuzzers
  • TTCN-3 compilers
  • Load generators
  • Model checkers
  • Model-driven development tool chains
• 47. MBT User Survey
  • Part of 1st Model-based Testing User Conference
    – Offered to many other tester communities
  • In progress
  • Preliminary analysis of responses to date
  • https://www.surveymonkey.com/s/JSJVDJW
• 48. MBT Users, SUT Domain
  [Bar chart, 0-40%: Transaction Processing, Embedded, Software Infrastructure, Communications, Supercomputing, Other, Social Media, Gaming.]
• 49. MBT Users, Company Size
  [Bar chart, 0-35%, employees: 1-10, 11-100, 101-500, 501-1000, 1001-10000, 10000+.]
• 50. MBT Users, Software Process
  [Bar chart, 0-25%: Agile, CMMI level 2+, XP/TDD, Incremental, Spiral, Waterfall, Ad Hoc, Other.]
• 51. How Used?
  [Charts: stage of adoption (Evaluation, Pilot Project, Rollout, Routine Use; 0-60%) and tool provider (In House, Open Source, Commercial; 0-80%).]
• 52. What is the Overall MBT Role?
  [Charts: scope at which MBT is used (System, Component, Unit; 0-80%) and share of overall test effort by testing mode (Manual, Hand-coded, Model-based; 25-40%).]
• 53. How Long to be Proficient?
  Median: 100 hours
  [Bar chart, hours of training/use to become proficient: 1-40, 80-120, 160+; 0-50%.]
• 54. How Bad are Common Problems?
  [Chart: each potential problem rated "worse than expected", "not an issue", or "better than expected": misses bugs; can't integrate with other test assets; developing SUT interfaces too hard; inadequate coverage; developing test models is too difficult; oracle ineffective; too difficult to update model; model "blows up".]
• 55. MBT Effect on Time, Cost, Quality?
  [Chart: percent change from baseline (e.g., 35% fewer escaped bugs, 0% more bugs) for bugs escaped, overall testing costs, and overall testing time; "better" bars cluster around 28-36%, "worse" bars around 0-23%.]
• 56. MBT Traction
  [Pie charts: overall effectiveness of MBT (Not at all 0%, Slightly, Moderately, Very, Extremely) and likelihood of continuing to use MBT (No effect, Slightly, Moderately, Extremely); most responses fall in the moderately-to-extremely range.]
• 58. What Have We Learned?
  • Test engineering with rigorous foundation
  • Global best practice
  • Broad applicability
  • Mature commercial offerings
  • Many proof points
  • Commitment and planning necessary
  • 10x to 1,000x improvement possible
• 59. Q&A
  rvbinder@gmail.com
• 60. Image Credits
  Unless noted below, all content herein Copyright © Robert V. Binder, 2011.
  • Pensive Boy: Resource Rack, http://sites.google.com/site/resourcerack/mental
  • Isoquant Chart: MA Economics Blog, http://ma-economics.blogspot.com/2011/09/optimum-factor-combination.html
  • Derivatives Trading Floor: Money Mavens, http://medillmoneymavens.com/2009/02/11/cboe-and-cbot-a-story-in-two-floors/
  • Barrett Prettyman US Federal Courthouse: Earth in Pictures, http://www.earthinpictures.com/world/usa/washington,_d.c./e._barrett_prettyman_united_states_courthouse.html
  • Server Room: 1U Server Rack, http://1userverrack.net/2011/05/03/server-room-4/
  • Utility Knife: Marketing Tenerife, http://marketingtenerife.com/marketing-tools-in-tenerife/
  • Software Tester: IT Career Coach, http://www.it-career-coach.net/2010/02/14/the-job-of-software-testing-quality-assurance-career
  • Conclusion: European Network and Information Security Agency (ENISA), http://www.enisa.europa.eu/media/news-items/summary-of-summer-school/image/image_view_fullscreen