Software Testing:
Models, Patterns, Tools
  Guest Lecture, UIC CS 540
    November 16, 2010
      Robert V. Binder
Overview
•   Test design pattern fly-by
•   Levels of testing
•   Case study: CBOE Direct
•   Q&A
TEST DESIGN PATTERNS
Test Design Patterns
• Software testing, c. 1995
    – A large and fragmented body of knowledge
    – Few ideas about testing OO software
• Challenges
    –   Re-interpret wealth of knowledge for OO
    –   Address unique OO considerations
    –   Systematic presentation
    –   Uniform analytical framework
• Patterns looked like a useful schema
    – Existing templates didn’t address unique testing
      issues
Some Footprints
1995  Design Patterns
1995  Beizer, Black Box Testing
1995  Firesmith PLOOT
1995  McGregor
1999  TOOSMPT
2000  Tutorial Experiment
2001  POST Workshops (4)
2003  Briand's Experiments
2003  Dot Net Test Objects
2003  Microsoft Patterns Group
2004  Java Testing Patterns
2005  JUnit Anti Patterns
2007  Test Object Anti Patterns
2007  Software QS-TAG
Test Design Patterns
• Pattern schema for test design
    – Methods
    – Classes
    – Package and System Integration
    – Regression
    – Test Automation
    – Oracles
Test Design Patterns
• Pattern schema for test design
    – Name/Intent
    – Context
    – Fault Model
    – Strategy: Test Model, Test Procedure, Oracle, Automation
    – Entry Criteria
    – Exit Criteria
    – Consequences
    – Known Uses
Test Design Patterns
• Method Scope
    – Category-Partition
    – Combinational Function
    – Recursive Function
    – Polymorphic Message
• Class/Cluster Scope
    – Invariant Boundaries
    – Modal Class
    – Quasi-Modal Class
    – Polymorphic Server
    – Modal Hierarchy

© 2000 Robert V. Binder, all rights reserved
Modal Class:
Implementation and Test Models

[Figure: UML class diagrams and statecharts for TwoPlayerGame and ThreePlayerGame. TwoPlayerGame (+TwoPlayerGame( ), +p1_Start( ), +p1_WinsVolley( ), -p1_AddPoint( ), +p1_IsWinner( ), +p1_IsServer( ), +p1_Points( ), the p2_* counterparts, and +~( )) moves from alpha through Game Started to Player 1 Served / Player 2 Served; pN_WinsVolley( ) [this.pN_Score( ) < 20] / this.pNAddPoint( ) keeps play going, [this.pN_Score( ) == 20] reaches Player N Won, pN_IsWinner( ) / return TRUE, and ~( ) reaches omega. ThreePlayerGame extends TwoPlayerGame with the p3_* events and adds Player 3 Served / Player 3 Won states to the machine.]
Test Plan and Test Size
• K events
• N states
• With LSIFs: K × N tests
• No LSIFs: K × N³ tests

Events for ThreePlayerGame:
 1  ThreePlayerGame( )
 2  p1_Start( )
 3  p2_Start( )
 4  p3_Start( )
 5  p1_WinsVolley( )
 6  p1_WinsVolley( ) [this.p1_Score( ) < 20]
 7  p1_WinsVolley( ) [this.p1_Score( ) == 20]
 8  p2_WinsVolley( )
 9  p2_WinsVolley( ) [this.p2_Score( ) < 20]
10  p2_WinsVolley( ) [this.p2_Score( ) == 20]
11  p3_WinsVolley( )
12  p3_WinsVolley( ) [this.p3_Score( ) < 20]
13  p3_WinsVolley( ) [this.p3_Score( ) == 20]
14  p1_IsWinner( )
15  p2_IsWinner( )
16  p3_IsWinner( )
17  ~( )

[Figure: transition tree from alpha through GameStarted and the Player 1/2/3 Served states to the Player 1/2/3 Won states and omega, with each branch labeled by an event number from the table above.]
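To make the modal-class idea concrete, here is a minimal sketch of the player-1 slice of TwoPlayerGame, reconstructed from the statechart above, together with a driver that walks one alpha-to-Player-1-Won path. The code is a hypothetical reconstruction, not the book's or CBOE's implementation.

```java
// TwoPlayerGameTest.java -- sketch of a modal (state-dependent) class and a
// transition test. Only the p1_* events are modeled; names follow the slide,
// the implementation details are illustrative assumptions.
public class TwoPlayerGameTest {
    static class TwoPlayerGame {
        private int p1Score = 0;
        private boolean started = false, p1Won = false;

        void p1_Start() { started = true; }          // alpha -> Player 1 Served

        void p1_WinsVolley() {
            if (!started || p1Won)                   // event illegal in this mode
                throw new IllegalStateException("illegal event in current state");
            if (p1Score < 20) p1Score++;             // [p1_Score < 20] / p1AddPoint
            else p1Won = true;                       // [p1_Score == 20] -> Player 1 Won
        }

        boolean p1_IsWinner() { return p1Won; }      // Player 1 Won / return TRUE
        int p1_Points()       { return p1Score; }
    }

    public static void main(String[] args) {
        TwoPlayerGame g = new TwoPlayerGame();
        g.p1_Start();
        for (int i = 0; i < 21; i++) g.p1_WinsVolley();  // drive to Player 1 Won
        System.out.println("winner: " + g.p1_IsWinner() + ", points: " + g.p1_Points());
    }
}
```

A modal-class test suite exercises both the legal transition paths (as above) and the illegal ones, e.g. checking that p1_WinsVolley( ) before p1_Start( ) is rejected.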
Test Design Patterns
• Subsystem Scope
    – Class Associations
    – Round-Trip Scenarios
    – Mode Machine
    – Controlled Exceptions
• Reusable Components
    – Abstract Class
    – Generic Class
    – New Framework
    – Popular Framework
Test Design Patterns
• Intra-class Integration
    – Small Pop
    – Alpha-Omega Cycle
• Integration Strategy
    – Big Bang
    – Bottom up
    – Top Down
    – Collaborations
    – Backbone
    – Layers
    – Client/Server
    – Distributed Services
    – High Frequency
Test Design Patterns
• System Scope
    – Extended Use Cases
    – Covered in CRUD
    – Allocate by Profile
• Regression Testing
    – Retest All
    – Retest Risky Use Cases
    – Retest Profile
    – Retest Changed Code
    – Retest Within Firewall
Test Oracle Patterns
• Smoke Test
• Judging
    – Testing By Poking Around
    – Code-Based Testing
    – Post Test Analysis
• Pre-Production
• Built-in Test
• Gold Standard
    – Custom Test Suite
    – Random Input Generation
    – Live Input
    – Parallel System
• Reversing
• Simulation
• Approximation
• Regression
• Voting
• Substitution
• Equivalency
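To ground one of these names, here is a minimal sketch of the Gold Standard / Parallel System idea: the same input goes to the implementation under test and to a trusted reference, and the verdict is agreement. All names and the toy functions are illustrative assumptions, not code from the talk.

```java
// GoldStandardOracle.java -- sketch of a Gold Standard oracle: a trusted
// "parallel system" supplies expected results for each generated input.
// (Everything here is illustrative; nothing comes from the talk's codebase.)
import java.util.function.IntUnaryOperator;

public class GoldStandardOracle {
    // Verdict for one test input: pass iff the SUT and the reference agree.
    static boolean check(IntUnaryOperator sut, IntUnaryOperator reference, int input) {
        return sut.applyAsInt(input) == reference.applyAsInt(input);
    }

    public static void main(String[] args) {
        IntUnaryOperator reference = x -> x * x;               // trusted parallel system
        IntUnaryOperator sut       = x -> x * x;               // implementation under test
        IntUnaryOperator buggySut  = x -> x * x + (x == 3 ? 1 : 0);

        System.out.println(check(sut, reference, 3));          // true
        System.out.println(check(buggySut, reference, 3));     // false: oracle catches the bug
    }
}
```

The same comparison shape underlies the Regression and Voting oracles; only the source of the expected value changes (a prior version's output, or a majority of independent implementations).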
Test Automation Patterns
• Test Case Implementation
    – Test Case/Test Suite Method
    – Test Case/Test Suite Class
    – Catch All Exceptions
• Test Control
    – Server Stub
    – Server Proxy
• Test Drivers
    – TestDriver Super Class
    – Percolate the Object Under Test
    – Symmetric Driver
    – Subclass Driver
    – Private Access Driver
    – Test Control Interface
    – Drone
    – Built-in Test Driver
Test Automation Patterns
• Test Execution
    – Command Line Test Bundle
    – Incremental Testing Framework (e.g. JUnit)
    – Fresh Objects
• Built-in Test
    – Coherence idiom
    – Percolation
    – Built-in Test Driver
Percolation Pattern
• Enforces Liskov Substitutability
• Implement with No Code Left Behind

[Class diagram: Base, with Derived1 extending Base and Derived2 extending Derived1.]
Base
    + Base( ), ~Base( ), foo( ), bar( )
    # invariant( ), fooPre( ), fooPost( ), barPre( ), barPost( )
Derived1
    + Derived1( ), ~Derived1( ), foo( ), bar( ), fum( )
    # invariant( ), fooPre( ), fooPost( ), barPre( ), barPost( ), fumPre( ), fumPost( )
Derived2
    + Derived2( ), ~Derived2( ), foo( ), bar( ), fee( )
    # invariant( ), fooPre( ), fooPost( ), barPre( ), barPost( ), feePre( ), feePost( )
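The two bullets above can be sketched in code. In percolation, every invariant/precondition/postcondition check first delegates to its superclass version, so a subclass can only strengthen the contract; a subclass object that breaks the base contract fails at test time, which is what enforces Liskov substitutability. The assertion wiring below is an illustrative assumption, not the book's exact implementation.

```java
// Percolation.java -- minimal sketch of the Percolation pattern.
// (Class/method names Base/Derived1/foo follow the slide's diagram; the
// boolean-check plumbing is hypothetical.)
public class Percolation {
    static class Base {
        protected int score = 0;

        public final void foo() {
            if (!fooPre()) throw new AssertionError("foo() precondition");
            score++;                                   // the method's real work
            if (!(fooPost() && invariant()))
                throw new AssertionError("foo() postcondition/invariant");
        }

        // Contract checks: subclasses override these but call super first,
        // "percolating" the base contract up the hierarchy.
        protected boolean invariant() { return score >= 0; }
        protected boolean fooPre()    { return true; }
        protected boolean fooPost()   { return score > 0; }
    }

    static class Derived1 extends Base {
        @Override protected boolean invariant() {
            return super.invariant() && score <= 21;   // strengthen, never replace
        }
    }

    public static void main(String[] args) {
        Base d = new Derived1();                       // substitutable for Base
        d.foo();
        System.out.println("invariant holds: " + (d.score == 1));
    }
}
```

Calling foo( ) enough times to push score past 21 makes the Derived1 invariant fail while the Base invariant still holds, so the violation is caught exactly where the subclass strengthened the contract.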
Ten Years After …
• Many new design patterns for hand-crafted test automation
    – Elaboration of Incremental Test Framework (e.g. JUnit)
    – Platform-specific or application-specific
    – Narrow scope
• Few new test design patterns
• No new oracle patterns
• Attempts to generate tests from design patterns
• To date 10,000+ copies of TOOSMPT
What Have We Learned?
• Test patterns are effective for articulating insight and practice
    – Require discipline to develop
    – Support research and tool implementation
• They do not “work out of the box”
    – Require discipline in application
    – Need enabling factors
• Irrelevant to the uninterested and undisciplined
    – Low incremental benefit
    – Readily available substitutes
• Broadly influential, but not compelling
TEST AUTOMATION LEVELS
What is good testing?
 • Value creation (not technical merit)
        – Effectiveness (reliability/quality increase)
        – Efficiency (average cost per test)
 • Levels
        – 1: Testing by poking around
        – 2: Manual Testing
        – 3: Automated Test Script/Test Objects
        – 4: Model-based
        – 5: Full Test Automation
            Each Level 10x Improvement
© 2004 mVerify Corporation
Level 1: Testing by Poking Around
• Manual “exploratory” testing against the system under test
• Low coverage
• Not repeatable
• Can’t scale
• Inconsistent
Level 2: Manual Testing
• Manual test design/generation; manual test input, test setup, and test results evaluation against the system under test
• 1 test per hour
• Not repeatable
Level 3: Automated Test Script
• Manual test design/generation; test script programming drives the system under test
• 10+ tests per hour
• Repeatable
• High change cost
Level 4: Automated Model-based
• Model-based test design/generation; automatic test execution against the system under test
• 1000+ tests per hour
• High fidelity
Level 5: Total Automation
• Model-based test design/generation; automatic test execution; automated test setup and test results evaluation
• 10,000 tests per hour
MODEL-BASED TESTING OF
CBOE DIRECT
CBOE Direct ®
• Electronic technology platform built and maintained in-house by
  Chicago Board Options Exchange (CBOE)
   – Multiple trading models configurable by product
   – Multiple matching algorithms (options, futures, stocks, warrants,
     single stock futures)
   – Best features of screen-based trading and floor-based markets
• Electronic trading on CBOE, the CBOE Futures Exchange (CFX), the CBOE Stock Exchange (CBSX), and others
• As of April 2008:
   –   More than 188,000 listed products
   –   More than 3.8 billion industry quotes handled from OPRA on peak day
   –   More than two billion quotes on peak day
   –   More than 684,000 orders on peak day
   –   More than 124,000 peak quotes per second
   –   Less than 5 ms response time for quotes
Development
• Rational Unified Process
    – Six development increments
    – 3 to 5 months each
    – Test design/implementation in parallel with app dev
• Three-plus years; version 1.0 live Q4 2001
• About 90 use cases, 650 KLOC Java
• CORBA/IDL distributed objects
• Java (services and GUI), some XML
• Oracle DBMS
• HA Sun server farm
• Many legacy interfaces
Test Models Used
• Extended Use Case
    – Defines feature usage profile
    – Input conditions, output actions
• Mode Machine
    – Use case sequencing
• Invariant Boundaries

Stealth Requirements Engineering
Behavior Model
 • Extended Use Case pattern
  – Test input conditions (for automatic test input generation):
    Widget 1 = Query, Widget 2 = Set Time, Widget 3 = DEL,
    Host Name Pick = Valid, Host Name Enter = Host Name
  – Logic combinations control test input data selection
  – Required actions (for automatic result checking):
    Host Name Display (No Change / Deleted / Added),
    Host Time Display (No Change / Host Time),
    CE Time Display (Last Local Time / Host Time), Error Message
  – Usage profile controls the statistical distribution of test cases;
    relative frequencies of the five test-case variants: 0.35, 0.20, 0.30, 0.10, 0.05
  [Five-column decision table omitted; each column pairs T/F/don't-care input
  conditions with the actions required for that variant.]
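The usage-profile row of the behavior model drives which variant each generated test exercises. A minimal sketch of profile-driven selection using the slide's relative frequencies (the variant numbering and sample size are illustrative, not from the deck):

```python
import random

# Relative frequencies of the five Extended Use Case variants,
# taken from the usage-profile row of the behavior model table.
variants = [1, 2, 3, 4, 5]
weights = [0.35, 0.20, 0.30, 0.10, 0.05]

def sample_variants(n, seed=None):
    """Draw n test-case variants according to the usage profile."""
    rng = random.Random(seed)
    return rng.choices(variants, weights=weights, k=n)

# Over a large run, each variant's share approaches its profile weight:
counts = {v: 0 for v in variants}
for v in sample_variants(10_000, seed=42):
    counts[v] += 1
```

Sampling with a fixed seed makes a run reproducible while still giving a statistically fresh suite each time the seed changes.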
Load Model
• Vary input rate, any quantifiable pattern
   –   Arc
   –   Flat
   –   Internet Fractal
   –   Negative ramp
   –   Positive ramp
   –   Random
   –   Spikes
   –   Square wave
   –   Waves
                                            Actual “Waves” Load Profile
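Any of these rate patterns can be expressed as a function from elapsed time to target input rate. A hedged sketch of two such profiles; the base rate, amplitude, period, and slope are invented for illustration (the slide gives no parameters):

```python
import math

def waves_rate(t_minutes, base=1000.0, amplitude=600.0, period=30.0):
    """'Waves' load profile: events per minute oscillate sinusoidally
    around a base rate with the given period."""
    return base + amplitude * math.sin(2 * math.pi * t_minutes / period)

def ramp_rate(t_minutes, start=200.0, slope=50.0):
    """'Positive ramp' profile: rate grows linearly with time."""
    return start + slope * t_minutes

# Crest and trough of the wave (sin == +1 at t=7.5, sin == -1 at t=22.5):
crest = waves_rate(7.5)
trough = waves_rate(22.5)
```

The profile function is orthogonal to the operational profile: it decides *how many* events to emit per interval, while the usage profile decides *which* events they are.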
MBT Challenges/Solutions
• One-time sample not effective, but fresh test suites too expensive
     → Simulator generates a fresh, accurate sample on demand
• Too expensive to develop expected results
     → Oracle generates expected results on demand
• Too many test cases to evaluate
     → Comparator automates checking
• Profile/requirements change
     → Incremental changes to rule base
• SUT interfaces change
     → Common agent interface
Simulator
• Discrete event simulation of user behavior
• 25 KLOC, Prolog
      – Rule inversion
      – “Speaks”
• Load Profile
      – Time domain variation
      – Orthogonal to operational profile
• Each event assigned a "port" and submit time
Test Environment
 • Simulator, etc. on typical desktop
 • Dedicated, but reduced server farm
 • Live data links
 • ~10 client workstations for automatic test agents
        – Adapter, each System Under Test (SUT) Interface
        – Test Agents execute independently
 • Distributed processing/serialization challenges
        – Loosely coupled, best-effort strategy
        – Embed server-side serialization monitor

Automated Run Evaluation
 •    Post-process evaluation
 •    Oracle accepts output of simulator
 •    About 500 unique rules (20 KLOC Prolog)
 •    Verification
        – Splainer: result/rule backtracking tool (Prolog, 5 KLOC)
        – Rule/Run coverage analyzer
 • Comparator (Prolog, 3 KLOC)
        – Extract transaction log
        – Post run database state
        – End-to-end invariants
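The actual CBOE rule base (about 500 rules) is not shown; the sketch below illustrates the comparator idea with one invented conservation-style invariant over a transaction log and post-run state:

```python
from collections import Counter

def check_invariants(submitted_ids, transaction_log, final_state):
    """Hedged sketch of a post-run comparator check.  The invariant is
    hypothetical: every submitted event appears exactly once in the
    transaction log, and the final state accounts for each event as
    either 'done' or 'open'.  Returns violation messages (empty == pass)."""
    violations = []
    log_counts = Counter(rec["id"] for rec in transaction_log)
    for eid in submitted_ids:
        if log_counts.get(eid, 0) != 1:
            violations.append(f"event {eid}: logged {log_counts.get(eid, 0)} times")
    accounted = set(final_state.get("done", [])) | set(final_state.get("open", []))
    for eid in submitted_ids:
        if eid not in accounted:
            violations.append(f"event {eid}: missing from final state")
    return violations

ok = check_invariants(
    ["a1", "a2"],
    [{"id": "a1"}, {"id": "a2"}],
    {"done": ["a1"], "open": ["a2"]},
)  # -> [] (no violations)
```

Post-processing like this scales to very large runs because it never needs to pause the system under test.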

Daily Test Process
   • Plan each day's test run
          – Load profile, total volume
          – Configuration/operational scenarios
   • Run Simulator
          – 100,000 events per hour
          – FTP event files to test agents
   • Test agents submit
   • Run Oracle/Comparator
   • Prepare bug reports
           1,000 to 750,000 unique tests per day
Technical Achievements
 • AI-based user simulation generates test suites
 • All inputs generated under operational profile
 • High volume oracle and evaluation
 • Every test run unique and realistic (about 200)
 • Evaluated functionality and load response with
   fresh tests
 • Effective control of many different test agents
   (COTS/custom, Java/4Test/Perl/SQL/proprietary)

Technical Problems
• Stamp coupling
      – Simulator, Agents, Oracle, Comparator
• Re-factoring rule relationships, Prolog limitations
• Configuration hassles
• Scale-up constraints
• Distributed schedule brittleness
• Horn Clause Shock Syndrome

Results
• Revealed about 1,500 bugs over two years
      – ~ 5% showstoppers
• Five person team, huge productivity increase
• Achieved proven high reliability
      – Last pre-release test run: 500,000 events in two
        hours, no failures detected
      – No production failures

Q&A

Más contenido relacionado

La actualidad más candente

Overview on TDD (Test Driven Development) & ATDD (Acceptance Test Driven Deve...
Overview on TDD (Test Driven Development) & ATDD (Acceptance Test Driven Deve...Overview on TDD (Test Driven Development) & ATDD (Acceptance Test Driven Deve...
Overview on TDD (Test Driven Development) & ATDD (Acceptance Test Driven Deve...Zohirul Alam Tiemoon
 
Test Automation: A Roadmap For Sucesss
Test Automation: A Roadmap For SucesssTest Automation: A Roadmap For Sucesss
Test Automation: A Roadmap For SucesssDavid O'Dowd
 
Unit Testing Concepts and Best Practices
Unit Testing Concepts and Best PracticesUnit Testing Concepts and Best Practices
Unit Testing Concepts and Best PracticesDerek Smith
 
TDD (Test Driven Design)
TDD (Test Driven Design)TDD (Test Driven Design)
TDD (Test Driven Design)nedirtv
 
Test Driven Development (TDD) Preso 360|Flex 2010
Test Driven Development (TDD) Preso 360|Flex 2010Test Driven Development (TDD) Preso 360|Flex 2010
Test Driven Development (TDD) Preso 360|Flex 2010guest5639fa9
 
SOFTWARE TESTING UNIT-4
SOFTWARE TESTING UNIT-4  SOFTWARE TESTING UNIT-4
SOFTWARE TESTING UNIT-4 Mohammad Faizan
 
SOLID Principles and Design Patterns
SOLID Principles and Design PatternsSOLID Principles and Design Patterns
SOLID Principles and Design PatternsGanesh Samarthyam
 
Unit and integration Testing
Unit and integration TestingUnit and integration Testing
Unit and integration TestingDavid Berliner
 
ATDD Using Robot Framework
ATDD Using Robot FrameworkATDD Using Robot Framework
ATDD Using Robot FrameworkPekka Klärck
 
Test Driven Development (TDD)
Test Driven Development (TDD)Test Driven Development (TDD)
Test Driven Development (TDD)David Ehringer
 
Unit Testing And Mocking
Unit Testing And MockingUnit Testing And Mocking
Unit Testing And MockingJoe Wilson
 
A Top Down Approach to End-to-End Testing
A Top Down Approach to End-to-End TestingA Top Down Approach to End-to-End Testing
A Top Down Approach to End-to-End TestingSmartBear
 
Unit Test Presentation
Unit Test PresentationUnit Test Presentation
Unit Test PresentationSayedur Rahman
 

La actualidad más candente (20)

Overview on TDD (Test Driven Development) & ATDD (Acceptance Test Driven Deve...
Overview on TDD (Test Driven Development) & ATDD (Acceptance Test Driven Deve...Overview on TDD (Test Driven Development) & ATDD (Acceptance Test Driven Deve...
Overview on TDD (Test Driven Development) & ATDD (Acceptance Test Driven Deve...
 
Gray box testing
Gray box testingGray box testing
Gray box testing
 
Istqb foundation level day 1
Istqb foundation level   day 1Istqb foundation level   day 1
Istqb foundation level day 1
 
Test Automation: A Roadmap For Sucesss
Test Automation: A Roadmap For SucesssTest Automation: A Roadmap For Sucesss
Test Automation: A Roadmap For Sucesss
 
Unit Testing Concepts and Best Practices
Unit Testing Concepts and Best PracticesUnit Testing Concepts and Best Practices
Unit Testing Concepts and Best Practices
 
Testing ppt
Testing pptTesting ppt
Testing ppt
 
7 testing principles
7 testing principles7 testing principles
7 testing principles
 
TestNG
TestNGTestNG
TestNG
 
TDD (Test Driven Design)
TDD (Test Driven Design)TDD (Test Driven Design)
TDD (Test Driven Design)
 
Test Driven Development (TDD) Preso 360|Flex 2010
Test Driven Development (TDD) Preso 360|Flex 2010Test Driven Development (TDD) Preso 360|Flex 2010
Test Driven Development (TDD) Preso 360|Flex 2010
 
SOFTWARE TESTING UNIT-4
SOFTWARE TESTING UNIT-4  SOFTWARE TESTING UNIT-4
SOFTWARE TESTING UNIT-4
 
Sanity testing and smoke testing
Sanity testing and smoke testingSanity testing and smoke testing
Sanity testing and smoke testing
 
SOLID Principles and Design Patterns
SOLID Principles and Design PatternsSOLID Principles and Design Patterns
SOLID Principles and Design Patterns
 
Unit and integration Testing
Unit and integration TestingUnit and integration Testing
Unit and integration Testing
 
ATDD Using Robot Framework
ATDD Using Robot FrameworkATDD Using Robot Framework
ATDD Using Robot Framework
 
Test Driven Development (TDD)
Test Driven Development (TDD)Test Driven Development (TDD)
Test Driven Development (TDD)
 
Unit Testing And Mocking
Unit Testing And MockingUnit Testing And Mocking
Unit Testing And Mocking
 
A Top Down Approach to End-to-End Testing
A Top Down Approach to End-to-End TestingA Top Down Approach to End-to-End Testing
A Top Down Approach to End-to-End Testing
 
Integration test
Integration testIntegration test
Integration test
 
Unit Test Presentation
Unit Test PresentationUnit Test Presentation
Unit Test Presentation
 

Destacado

Software Test Patterns: Successes and Challenges
Software Test Patterns: Successes and ChallengesSoftware Test Patterns: Successes and Challenges
Software Test Patterns: Successes and ChallengesBob Binder
 
Patterns in Test Automation
Patterns in Test AutomationPatterns in Test Automation
Patterns in Test AutomationAnand Bagmar
 
Refactoring to Patterns
Refactoring to PatternsRefactoring to Patterns
Refactoring to PatternsAngel Nuñez
 
web 2.0 universidad autonoma de campeche
web 2.0 universidad autonoma de campecheweb 2.0 universidad autonoma de campeche
web 2.0 universidad autonoma de campecheaiderick
 
CS6201 Software Reuse - Design Patterns
CS6201 Software Reuse - Design PatternsCS6201 Software Reuse - Design Patterns
CS6201 Software Reuse - Design PatternsKwangshin Oh
 
Test Patterns - What is a Pattern?
Test Patterns - What is a Pattern?Test Patterns - What is a Pattern?
Test Patterns - What is a Pattern?Rafael Pires
 
Design Pattern - Singleton Pattern
Design Pattern - Singleton PatternDesign Pattern - Singleton Pattern
Design Pattern - Singleton PatternMudasir Qazi
 
Singleton design pattern
Singleton design patternSingleton design pattern
Singleton design pattern11prasoon
 

Destacado (10)

Software Test Patterns: Successes and Challenges
Software Test Patterns: Successes and ChallengesSoftware Test Patterns: Successes and Challenges
Software Test Patterns: Successes and Challenges
 
Patterns in Test Automation
Patterns in Test AutomationPatterns in Test Automation
Patterns in Test Automation
 
Refactoring to Patterns
Refactoring to PatternsRefactoring to Patterns
Refactoring to Patterns
 
web 2.0 universidad autonoma de campeche
web 2.0 universidad autonoma de campecheweb 2.0 universidad autonoma de campeche
web 2.0 universidad autonoma de campeche
 
CS6201 Software Reuse - Design Patterns
CS6201 Software Reuse - Design PatternsCS6201 Software Reuse - Design Patterns
CS6201 Software Reuse - Design Patterns
 
Test Patterns - What is a Pattern?
Test Patterns - What is a Pattern?Test Patterns - What is a Pattern?
Test Patterns - What is a Pattern?
 
GUI Test Patterns
GUI Test PatternsGUI Test Patterns
GUI Test Patterns
 
Design Pattern - Singleton Pattern
Design Pattern - Singleton PatternDesign Pattern - Singleton Pattern
Design Pattern - Singleton Pattern
 
JavaScript Patterns
JavaScript PatternsJavaScript Patterns
JavaScript Patterns
 
Singleton design pattern
Singleton design patternSingleton design pattern
Singleton design pattern
 

Más de Bob Binder

How to Release Rock-solid RESTful APIs and Ice the Testing BackBlob
How to Release Rock-solid RESTful APIs and Ice the Testing BackBlobHow to Release Rock-solid RESTful APIs and Ice the Testing BackBlob
How to Release Rock-solid RESTful APIs and Ice the Testing BackBlobBob Binder
 
Lessons learned validating 60,000 pages of api documentation
Lessons learned validating 60,000 pages of api documentationLessons learned validating 60,000 pages of api documentation
Lessons learned validating 60,000 pages of api documentationBob Binder
 
Model-based Testing: Taking BDD/ATDD to the Next Level
Model-based Testing: Taking BDD/ATDD to the Next LevelModel-based Testing: Taking BDD/ATDD to the Next Level
Model-based Testing: Taking BDD/ATDD to the Next LevelBob Binder
 
Model-based Testing: Today And Tomorrow
Model-based Testing: Today And TomorrowModel-based Testing: Today And Tomorrow
Model-based Testing: Today And TomorrowBob Binder
 
Mobile App Assurance: Yesterday, Today, and Tomorrow.
Mobile App Assurance: Yesterday, Today, and Tomorrow.Mobile App Assurance: Yesterday, Today, and Tomorrow.
Mobile App Assurance: Yesterday, Today, and Tomorrow.Bob Binder
 
Popular Delusions, Crowds, and the Coming Deluge: end of the Oracle?
Popular Delusions, Crowds, and the Coming Deluge: end of the Oracle?Popular Delusions, Crowds, and the Coming Deluge: end of the Oracle?
Popular Delusions, Crowds, and the Coming Deluge: end of the Oracle?Bob Binder
 
MTS: Controllable Test Objects
MTS: Controllable Test ObjectsMTS: Controllable Test Objects
MTS: Controllable Test ObjectsBob Binder
 
Achieving Very High Reliability for Ubiquitous Information Technology
Achieving Very High Reliability for Ubiquitous Information Technology Achieving Very High Reliability for Ubiquitous Information Technology
Achieving Very High Reliability for Ubiquitous Information Technology Bob Binder
 
The Tester’s Dashboard: Release Decision Support
The Tester’s Dashboard: Release Decision SupportThe Tester’s Dashboard: Release Decision Support
The Tester’s Dashboard: Release Decision SupportBob Binder
 
Performance Testing Mobile and Multi-Tier Applications
Performance Testing Mobile and Multi-Tier ApplicationsPerformance Testing Mobile and Multi-Tier Applications
Performance Testing Mobile and Multi-Tier ApplicationsBob Binder
 
Testing Object-Oriented Systems: Lessons Learned
Testing Object-Oriented Systems: Lessons LearnedTesting Object-Oriented Systems: Lessons Learned
Testing Object-Oriented Systems: Lessons LearnedBob Binder
 
mVerify Investor Overview
mVerify Investor OverviewmVerify Investor Overview
mVerify Investor OverviewBob Binder
 
MDD and the Tautology Problem: Discussion Notes.
MDD and the Tautology Problem: Discussion Notes.MDD and the Tautology Problem: Discussion Notes.
MDD and the Tautology Problem: Discussion Notes.Bob Binder
 
Mobile Reliability Challenges
Mobile Reliability ChallengesMobile Reliability Challenges
Mobile Reliability ChallengesBob Binder
 
Experience with a Profile-based Automated Testing Environment
Experience with a Profile-based Automated Testing EnvironmentExperience with a Profile-based Automated Testing Environment
Experience with a Profile-based Automated Testing EnvironmentBob Binder
 
Testability: Factors and Strategy
Testability: Factors and StrategyTestability: Factors and Strategy
Testability: Factors and StrategyBob Binder
 
Test Objects -- They Just Work
Test Objects -- They Just WorkTest Objects -- They Just Work
Test Objects -- They Just WorkBob Binder
 
A Million Users in a Box: The WTS Story
A Million Users in a Box: The WTS StoryA Million Users in a Box: The WTS Story
A Million Users in a Box: The WTS StoryBob Binder
 
ISSRE 2008 Trip Report
ISSRE 2008 Trip ReportISSRE 2008 Trip Report
ISSRE 2008 Trip ReportBob Binder
 
Assurance for Cloud Computing
Assurance for Cloud ComputingAssurance for Cloud Computing
Assurance for Cloud ComputingBob Binder
 

Más de Bob Binder (20)

How to Release Rock-solid RESTful APIs and Ice the Testing BackBlob
How to Release Rock-solid RESTful APIs and Ice the Testing BackBlobHow to Release Rock-solid RESTful APIs and Ice the Testing BackBlob
How to Release Rock-solid RESTful APIs and Ice the Testing BackBlob
 
Lessons learned validating 60,000 pages of api documentation
Lessons learned validating 60,000 pages of api documentationLessons learned validating 60,000 pages of api documentation
Lessons learned validating 60,000 pages of api documentation
 
Model-based Testing: Taking BDD/ATDD to the Next Level
Model-based Testing: Taking BDD/ATDD to the Next LevelModel-based Testing: Taking BDD/ATDD to the Next Level
Model-based Testing: Taking BDD/ATDD to the Next Level
 
Model-based Testing: Today And Tomorrow
Model-based Testing: Today And TomorrowModel-based Testing: Today And Tomorrow
Model-based Testing: Today And Tomorrow
 
Mobile App Assurance: Yesterday, Today, and Tomorrow.
Mobile App Assurance: Yesterday, Today, and Tomorrow.Mobile App Assurance: Yesterday, Today, and Tomorrow.
Mobile App Assurance: Yesterday, Today, and Tomorrow.
 
Popular Delusions, Crowds, and the Coming Deluge: end of the Oracle?
Popular Delusions, Crowds, and the Coming Deluge: end of the Oracle?Popular Delusions, Crowds, and the Coming Deluge: end of the Oracle?
Popular Delusions, Crowds, and the Coming Deluge: end of the Oracle?
 
MTS: Controllable Test Objects
MTS: Controllable Test ObjectsMTS: Controllable Test Objects
MTS: Controllable Test Objects
 
Achieving Very High Reliability for Ubiquitous Information Technology
Achieving Very High Reliability for Ubiquitous Information Technology Achieving Very High Reliability for Ubiquitous Information Technology
Achieving Very High Reliability for Ubiquitous Information Technology
 
The Tester’s Dashboard: Release Decision Support
The Tester’s Dashboard: Release Decision SupportThe Tester’s Dashboard: Release Decision Support
The Tester’s Dashboard: Release Decision Support
 
Performance Testing Mobile and Multi-Tier Applications
Performance Testing Mobile and Multi-Tier ApplicationsPerformance Testing Mobile and Multi-Tier Applications
Performance Testing Mobile and Multi-Tier Applications
 
Testing Object-Oriented Systems: Lessons Learned
Testing Object-Oriented Systems: Lessons LearnedTesting Object-Oriented Systems: Lessons Learned
Testing Object-Oriented Systems: Lessons Learned
 
mVerify Investor Overview
mVerify Investor OverviewmVerify Investor Overview
mVerify Investor Overview
 
MDD and the Tautology Problem: Discussion Notes.
MDD and the Tautology Problem: Discussion Notes.MDD and the Tautology Problem: Discussion Notes.
MDD and the Tautology Problem: Discussion Notes.
 
Mobile Reliability Challenges
Mobile Reliability ChallengesMobile Reliability Challenges
Mobile Reliability Challenges
 
Experience with a Profile-based Automated Testing Environment
Experience with a Profile-based Automated Testing EnvironmentExperience with a Profile-based Automated Testing Environment
Experience with a Profile-based Automated Testing Environment
 
Testability: Factors and Strategy
Testability: Factors and StrategyTestability: Factors and Strategy
Testability: Factors and Strategy
 
Test Objects -- They Just Work
Test Objects -- They Just WorkTest Objects -- They Just Work
Test Objects -- They Just Work
 
A Million Users in a Box: The WTS Story
A Million Users in a Box: The WTS StoryA Million Users in a Box: The WTS Story
A Million Users in a Box: The WTS Story
 
ISSRE 2008 Trip Report
ISSRE 2008 Trip ReportISSRE 2008 Trip Report
ISSRE 2008 Trip Report
 
Assurance for Cloud Computing
Assurance for Cloud ComputingAssurance for Cloud Computing
Assurance for Cloud Computing
 

Último

DevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsDevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsSergiu Bodiu
 
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptxThe Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptxLoriGlavin3
 
Artificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptxArtificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptxhariprasad279825
 
WordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your BrandWordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your Brandgvaughan
 
Generative AI for Technical Writer or Information Developers
Generative AI for Technical Writer or Information DevelopersGenerative AI for Technical Writer or Information Developers
Generative AI for Technical Writer or Information DevelopersRaghuram Pandurangan
 
"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr BaganFwdays
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 3652toLead Limited
 
From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .Alan Dix
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfAlex Barbosa Coqueiro
 
A Journey Into the Emotions of Software Developers
A Journey Into the Emotions of Software DevelopersA Journey Into the Emotions of Software Developers
A Journey Into the Emotions of Software DevelopersNicole Novielli
 
Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Commit University
 
Digital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptxDigital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptxLoriGlavin3
 
The Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptx
The Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptxThe Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptx
The Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptxLoriGlavin3
 
Sample pptx for embedding into website for demo
Sample pptx for embedding into website for demoSample pptx for embedding into website for demo
Sample pptx for embedding into website for demoHarshalMandlekar2
 
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024BookNet Canada
 
Developer Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLDeveloper Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLScyllaDB
 
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxUse of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxLoriGlavin3
 
"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii SoldatenkoFwdays
 
DSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine TuningDSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine TuningLars Bell
 
Take control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test SuiteTake control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test SuiteDianaGray10
 

Último (20)

DevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsDevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platforms
 
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptxThe Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
The Fit for Passkeys for Employee and Consumer Sign-ins: FIDO Paris Seminar.pptx
 
Artificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptxArtificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptx
 
WordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your BrandWordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your Brand
 
Generative AI for Technical Writer or Information Developers
Generative AI for Technical Writer or Information DevelopersGenerative AI for Technical Writer or Information Developers
Generative AI for Technical Writer or Information Developers
 
"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365
 
From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdf
 
A Journey Into the Emotions of Software Developers
A Journey Into the Emotions of Software DevelopersA Journey Into the Emotions of Software Developers
A Journey Into the Emotions of Software Developers
 
Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!
 
Digital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptxDigital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptx
 
The Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptx
The Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptxThe Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptx
The Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptx
 
Sample pptx for embedding into website for demo
Sample pptx for embedding into website for demoSample pptx for embedding into website for demo
Sample pptx for embedding into website for demo
 
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
 
Developer Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLDeveloper Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQL
 
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxUse of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
 
"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko
 
DSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine TuningDSPy a system for AI to Write Prompts and Do Fine Tuning
DSPy a system for AI to Write Prompts and Do Fine Tuning
 
Take control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test SuiteTake control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test Suite
 

Software Testing: Models, Patterns, Tools

  • 8. Test Design Patterns • Method Scope: Category-Partition, Combinational Function, Recursive Function, Polymorphic Message • Class/Cluster Scope: Invariant Boundaries, Modal Class, Quasi-Modal Class, Polymorphic Server, Modal Hierarchy. © 2000 Robert V. Binder, all rights reserved
Modal Class: Implementation and Test Models
[Diagram: class and state-machine models for TwoPlayerGame and ThreePlayerGame. TwoPlayerGame offers pN_Start( ), pN_WinsVolley( ), pN_IsWinner( ), pN_IsServer( ), and pN_Points( ); its state machine runs alpha → Game Started → Player 1/2 Served → Player 1/2 Won → omega, with transitions such as p1_WinsVolley( ) [this.p1_Score( ) < 20] / this.p1AddPoint( ), firing simulateVolley( ). ThreePlayerGame extends it with p3_* members and Player 3 Served/Won states.]
9
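A modal class accepts each message only in certain states, so its test model is a state machine. A minimal sketch of the TwoPlayerGame half of the model above (Python stand-in; the 21-point win threshold matches the slide's [score == 20] / addPoint guards, while the exception type and method names are assumptions):

```python
class IllegalEvent(Exception):
    """Raised when a message arrives in a state where it is not accepted."""


class TwoPlayerGame:
    """Modal class sketch: responses depend on state, per the state machine."""

    def __init__(self):
        self.state = "GameStarted"
        self.p1_score = 0
        self.p2_score = 0

    def p1_start(self):
        # p1_Start( ) is accepted only in Game Started.
        if self.state != "GameStarted":
            raise IllegalEvent(f"p1_start in {self.state}")
        self.state = "Player1Served"

    def p1_wins_volley(self):
        # In Player 1 Served the server scores; at 21 points the game is won.
        if self.state == "Player1Served":
            self.p1_score += 1
            if self.p1_score == 21:
                self.state = "Player1Won"
        elif self.state == "Player2Served":
            # The receiver winning a volley only takes the serve, no point.
            self.state = "Player1Served"
        else:
            raise IllegalEvent(f"p1_wins_volley in {self.state}")

    def p1_is_winner(self):
        return self.state == "Player1Won"
```

A Modal Class test suite then exercises every event in every state, checking both the accepted transitions and the illegal-event responses.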
Test Plan and Test Size
• K events — the ThreePlayerGame model numbers 17: ThreePlayerGame( ), p1_Start( ), p2_Start( ), p3_Start( ), pN_WinsVolley( ) with their [score < 20] and [score == 20] guarded variants, pN_IsWinner( ), and ~( )
• N states — alpha, GameStarted, Player 1/2/3 Served, Player 1/2/3 Won, omega
• With LSIFs: K × N tests
• No LSIFs: K × N³ tests
[Diagram: transition tree expanding the state machine into round-trip test paths.]
10
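Both bounds are simple products, so they are easy to sanity-check. A quick calculation with the slide's numbers (Python; K = 17 events as listed, N = 9 states is my reading of the diagram and should be treated as illustrative):

```python
# Test-suite size bounds for state-based testing (Modal Class pattern).
K = 17  # events, including guarded variants (slide's event list 1..17)
N = 9   # states: alpha, GameStarted, 3x Served, 3x Won, omega (assumed count)

with_lsifs = K * N        # state can be set/inspected directly: one test per (event, state)
without_lsifs = K * N**3  # otherwise each test also needs setup and verification paths

print(with_lsifs, without_lsifs)  # prints: 153 12393
```

The N³ blow-up is why the pattern pushes for state set/inspect interfaces (LSIFs) in the design.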
Test Design Patterns
• Subsystem Scope
  – Class Associations
  – Round-Trip Scenarios
  – Mode Machine
  – Controlled Exceptions
• Reusable Components
  – Abstract Class
  – Generic Class
  – New Framework
  – Popular Framework
© 2000 Robert V. Binder, all rights reserved
11
Test Design Patterns
• Intra-class Integration
  – Small Pop
  – Alpha-Omega Cycle
• Integration Strategy
  – Big Bang
  – Bottom up
  – Top Down
  – Collaborations
  – Backbone
  – Layers
  – Client/Server
  – Distributed Services
  – High Frequency
12
Test Design Patterns
• System Scope
  – Extended Use Cases
  – Covered in CRUD
  – Allocate by Profile
• Regression Testing
  – Retest All
  – Retest Risky Use Cases
  – Retest Profile
  – Retest Changed Code
  – Retest Within Firewall
13
Test Oracle Patterns
• Smoke Test
  – Testing By Poking Around
  – Code-Based Testing
  – Post Test Analysis
• Judging
• Approximation
• Regression
• Voting
• Substitution
  – Custom Test Suite
  – Random Input Generation
  – Live Input
• Reversing
• Simulation
• Pre-Production
• Built-in Test
• Gold Standard
  – Parallel System
• Equivalency
14
Test Automation Patterns
• Test Case Implementation
  – Test Case/Test Suite Method
  – Test Case/Test Suite Class
  – Catch All Exceptions
• Test Control
  – Server Stub
  – Server Proxy
  – Test Control Interface
  – Built-in Test Driver
• Test Drivers
  – TestDriver Super Class
  – Percolate the Object Under Test
  – Symmetric Driver
  – Subclass Driver
  – Private Access Driver
  – Drone
15
Test Automation Patterns
• Test Execution
  – Command Line Test Bundle
  – Incremental Testing Framework (e.g. JUnit)
• Built-in Test
  – Coherence idiom
  – Percolation
  – Built-in Test Driver
  – Fresh Objects
16
Percolation Pattern
• Enforces Liskov Substitutability
• Implement with No Code Left Behind
[Class diagram: Base declares public foo( )/bar( ) and protected invariant( ), fooPre( )/fooPost( ), barPre( )/barPost( ); Derived1 and Derived2 override these and add pre/post checks for their own methods (fum( ), fee( )).]
17
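The pattern can be sketched in a few lines: the base class wraps each method in invariant and pre/post checks, and every override percolates its parent's conditions — weakening preconditions with or, strengthening postconditions with and, as substitutability requires. A hypothetical Python sketch (the condition logic is invented for illustration, not taken from the book):

```python
class Base:
    """Percolation sketch: each public method runs invariant + pre/post checks."""

    def invariant(self):
        return True  # class invariant; subclasses AND in their own condition

    def foo_pre(self, x):
        return x >= 0  # made-up precondition

    def foo_post(self, x, result):
        return result >= x  # made-up postcondition

    def foo(self, x):
        # Assertion wrapper: checks dispatch to whatever subclass self is.
        assert self.invariant() and self.foo_pre(x), "foo pre/invariant violated"
        result = self._do_foo(x)
        assert self.invariant() and self.foo_post(x, result), "foo post/invariant violated"
        return result

    def _do_foo(self, x):
        return x + 1


class Derived(Base):
    """A substitutable subtype: precondition weakened (or), postcondition strengthened (and)."""

    def foo_pre(self, x):
        return super().foo_pre(x) or x == -1  # also accepts -1

    def foo_post(self, x, result):
        return super().foo_post(x, result) and result <= x + 10

    def _do_foo(self, x):
        return x + 2
```

Every call through foo( ) now checks the whole inherited contract, so a subclass that silently strengthened a precondition or weakened a postcondition would fail its superclass's tests immediately.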
Ten Years After …
• Many new design patterns for hand-crafted test automation
  – Elaboration of Incremental Test Framework (e.g. JUnit)
  – Platform-specific or application-specific
  – Narrow scope
• Few new test design patterns
• No new oracle patterns
• Attempts to generate tests from design patterns
• To date, 10,000+ copies of TOOSMPT
18
What Have We Learned?
• Test patterns are effective for articulating insight and practice
  – Require discipline to develop
  – Support research and tool implementation
• They do not “work out of the box”
  – Require discipline in application
  – Enabling factors matter
• Irrelevant to the uninterested, undisciplined
  – Low incremental benefit
  – Readily available substitutes
• Broadly influential, but not compelling
19
What is good testing?
• Value creation (not technical merit)
  – Effectiveness (reliability/quality increase)
  – Efficiency (average cost per test)
• Levels
  – 1: Testing by poking around
  – 2: Manual testing
  – 3: Automated test script/test objects
  – 4: Model-based
  – 5: Full test automation
• Each level is a 10x improvement
© 2004 mVerify Corporation
21
Level 1: Testing by Poking Around
• Manual “exploratory” testing of the System Under Test
  – Low coverage
  – Not repeatable
  – Can’t scale
  – Inconsistent
22
Level 2: Manual Testing
• Manual test design/generation, manual test input
• 1 test per hour
• Not repeatable
[Diagram: test setup → manual input → System Under Test → test results evaluation]
23
Level 3: Automated Test Script
• Manual test design/generation, test script programming
• 10+ tests per hour
• Repeatable
• High change cost
[Diagram: test setup → scripted input → System Under Test → test results evaluation]
24
Level 4: Automated Model-based
• Model-based test design/generation, automatic execution
• 1000+ tests per hour
• High fidelity
[Diagram: test setup → generated input → System Under Test → test results evaluation]
25
Level 5: Total Automation
• Automated test setup, model-based test design/generation, automatic execution
• Automated test results evaluation
• 10,000 TPH
[Diagram: fully automated pipeline around the System Under Test]
26
CBOE Direct ®
• Electronic technology platform built and maintained in-house by the Chicago Board Options Exchange (CBOE)
  – Multiple trading models configurable by product
  – Multiple matching algorithms (options, futures, stocks, warrants, single stock futures)
  – Best features of screen-based trading and floor-based markets
• Electronic trading on CBOE, the CBOE Futures Exchange (CFX), the CBOE Stock Exchange (CBSX), and others
• As of April 2008:
  – More than 188,000 listed products
  – More than 3.8 billion industry quotes handled from OPRA on a peak day
  – More than two billion quotes on a peak day
  – More than 684,000 orders on a peak day
  – More than 124,000 peak quotes per second
  – Less than 5 ms response time for quotes
Development
• Rational Unified Process
  – Six development increments, 3 to 5 months each
  – Test design/implementation in parallel with app dev
• Three-plus years; version 1.0 live Q4 2001
• About 90 use cases, 650 KLOC Java
• CORBA/IDL distributed objects
• Java (services and GUI), some XML
• Oracle DBMS
• HA Sun server farm
• Many legacy interfaces
29
Test Models Used
• Extended Use Case
  – Defines feature usage profile
  – Input conditions, output actions
• Mode Machine
  – Use case sequencing
• Invariant Boundaries
• Stealth Requirements Engineering
30
Behavior Model
• Extended Use Case pattern
[Decision table with five test-case variants (columns 1–5):
  – Condition rows: Widget 1 = Query, Widget 2 = Set Time, Widget 3 = DEL, Host Name Pick = Valid, Host Name = Enter Host Name
  – Action rows: Host Name Display (No Change/Deleted/Added Host), Host Time Display, CE Time Display (Last Local Time/Host Time), Error Message
  – Relative Frequency row: 0.35, 0.20, 0.30, 0.10, 0.05]
• Conditions for logic combinations control test input data selection
• Required actions drive automatic test result checking
• Usage profile controls the statistical distribution of test cases
31
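The Relative Frequency row is what makes generated suites follow the operational profile: variants are drawn in proportion to expected field usage. A sketch of profile-weighted selection (Python; the frequencies are the table's, the variant keys and function names are placeholders):

```python
import random
from collections import Counter

# Relative Frequency row from the Extended Use Case table (variants 1-5).
PROFILE = {1: 0.35, 2: 0.20, 3: 0.30, 4: 0.10, 5: 0.05}


def draw_variants(n, rng=None):
    """Sample n test-case variants in proportion to the operational profile."""
    rng = rng or random.Random()
    variants = list(PROFILE)
    weights = [PROFILE[v] for v in variants]
    return rng.choices(variants, weights=weights, k=n)


suite = draw_variants(10_000, random.Random(42))
mix = Counter(suite)  # empirical mix approximates the profile
```

With enough draws the empirical mix converges on the table's frequencies, so reliability measured over the suite estimates reliability in use.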
Load Model
• Vary input rate with any quantifiable pattern:
  – Arc
  – Flat
  – Internet Fractal
  – Negative ramp
  – Positive ramp
  – Random
  – Spikes
  – Square wave
  – Waves
[Chart: actual “Waves” load profile]
32
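Each named shape is just a function from elapsed time to input rate. A sketch of two of them (Python; base rate, amplitude, and period are made-up parameters, not values from the deck):

```python
import math


def waves_rate(t, base=100.0, amplitude=80.0, period=60.0):
    """'Waves' profile: events/sec at time t follows a sine around a base rate."""
    return base + amplitude * math.sin(2 * math.pi * t / period)


def positive_ramp_rate(t, start=10.0, slope=2.0):
    """'Positive ramp' profile: rate grows linearly with elapsed time."""
    return start + slope * t


# One-second samples over a single wave period give a submit-rate schedule.
schedule = [waves_rate(t) for t in range(60)]
```

Because the time-domain shape is orthogonal to the operational profile, the same generated event mix can be replayed under any of these load curves.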
MBT Challenges/Solutions
• One-time sample not effective, but fresh test suites too expensive → Simulator generates a fresh, accurate sample on demand
• Too expensive to develop expected results → Oracle generates expected results on demand
• Too many test cases to evaluate → Comparator automates checking
• Profile/requirements change → Incremental changes to rule base
• SUT interfaces change → Common agent interface
Simulator
• Discrete event simulation of user behavior
• 25 KLOC, Prolog
  – Rule inversion
  – “Speaks”
• Load Profile
  – Time domain variation
  – Orthogonal to operational profile
• Each event assigned a “port” and submit time
34
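Because every event carries a port (the test-agent channel that will submit it) and a submit time, agents can replay a run independently. A minimal Python stand-in for the Prolog simulator's output (field names, event types, and the exponential inter-arrival assumption are all mine):

```python
import random
from dataclasses import dataclass


@dataclass
class Event:
    submit_time: float  # seconds from run start
    port: int           # test-agent channel that submits this event
    action: str         # e.g. "enter_order" -- hypothetical event type


def generate_run(n_events, n_ports, mean_gap=0.01, rng=None):
    """Sketch of discrete-event generation: a time-ordered event stream
    where each event is assigned a port and a submit time."""
    rng = rng or random.Random()
    t, run = 0.0, []
    for _ in range(n_events):
        t += rng.expovariate(1.0 / mean_gap)  # exponential inter-arrival gaps
        run.append(Event(t, rng.randrange(n_ports), "enter_order"))
    return run


run = generate_run(1000, n_ports=10, rng=random.Random(7))
```

Splitting the stream by port yields one submit file per agent, which matches the FTP-to-agents step in the daily process described later in the deck.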
Test Environment
• Simulator, etc. on a typical desktop
• Dedicated, but reduced, server farm
• Live data links
• ~10 client workstations for automatic test agents
  – Adapter for each System Under Test (SUT) interface
  – Test agents execute independently
• Distributed processing/serialization challenges
  – Loosely coupled, best-effort strategy
  – Embed server-side serialization monitor
35
Automated Run Evaluation
• Post-process evaluation
• Oracle accepts output of simulator
  – About 500 unique rules (20 KLOC Prolog)
• Verification
  – Splainer: result/rule backtracking tool (Prolog, 5 KLOC)
  – Rule/run coverage analyzer
• Comparator (Prolog, 3 KLOC)
  – Extract transaction log
  – Post-run database state
  – End-to-end invariants
36
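The comparator's core job reduces to a keyed diff of oracle-predicted results against the extracted transaction log, plus invariant checks over the whole run. A toy sketch (Python stand-in for the 3 KLOC Prolog tool; the record shape and invariant are invented for illustration):

```python
def compare_run(expected, actual):
    """Return mismatches between oracle-predicted and logged transactions, keyed by id."""
    mismatches = []
    for txn_id, exp in expected.items():
        act = actual.get(txn_id)
        if act is None:
            mismatches.append((txn_id, "missing from SUT log"))
        elif act != exp:
            mismatches.append((txn_id, f"expected {exp}, got {act}"))
    for txn_id in actual.keys() - expected.keys():
        mismatches.append((txn_id, "unexpected transaction"))
    return mismatches


def conservation_invariant(actual):
    """End-to-end invariant sketch: total bought quantity equals total sold quantity."""
    bought = sum(t["qty"] for t in actual.values() if t["side"] == "buy")
    sold = sum(t["qty"] for t in actual.values() if t["side"] == "sell")
    return bought == sold
```

Keyed diffing tolerates out-of-order logs, which matters given the loosely coupled, best-effort serialization strategy described for the test environment.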
Daily Test Process
• Plan each day’s test run
  – Load profile, total volume
  – Configuration/operational scenarios
• Run Simulator
  – 100,000 events per hour
  – FTP event files to test agents
• Test agents submit
• Run Oracle/Comparator
• Prepare bug reports
• 1,000 to 750,000 unique tests per day
37
Technical Achievements
• AI-based user simulation generates test suites
• All inputs generated under operational profile
• High-volume oracle and evaluation
• Every test run unique and realistic (about 200 runs)
• Evaluated functionality and load response with fresh tests
• Effective control of many different test agents (COTS/custom, Java/4Test/Perl/SQL/proprietary)
38
Technical Problems
• Stamp coupling
  – Simulator, Agents, Oracle, Comparator
• Re-factoring rule relationships, Prolog limitations
• Configuration hassles
• Scale-up constraints
• Distributed schedule brittleness
• Horn Clause Shock Syndrome
39
Results
• Revealed about 1,500 bugs over two years
  – ~5% showstoppers
• Five-person team, huge productivity increase
• Achieved proven high reliability
  – Last pre-release test run: 500,000 events in two hours, no failures detected
  – No production failures
40
Q&A
41