4. Test Design Patterns
• Software testing, c. 1995
– A large and fragmented body of knowledge
– Few ideas about testing OO software
• Challenges
– Re-interpret wealth of knowledge for OO
– Address unique OO considerations
– Systematic presentation
– Uniform analytical framework
• Patterns looked like a useful schema
– Existing templates didn’t address unique testing issues
5. Some Footprints
• 1995 Design Patterns
• 1995 Beizer, Black Box Testing
• 1995 Firesmith PLOOT
• 1995 McGregor
• 1999 TOOSMPT
• 2000 Tutorial Experiment
• 2001 POST Workshops (4)
• 2003 Briand’s Experiments
• 2003 Dot Net Test Objects
• 2003 Microsoft Patterns Group
• 2004 Java Testing Patterns
• 2005 JUnit Anti Patterns
• 2007 Test Object Anti Patterns
• 2007 Software QS-TAG
6. Test Design Patterns
• Pattern schema for test design
– Methods
– Classes
– Package and System Integration
– Regression
– Test Automation
– Oracles
7. Test Design Patterns
• Pattern schema for test design
– Name/Intent
– Context
– Test Model
– Fault Model
– Test Procedure
– Strategy
– Oracle
– Entry Criteria
– Automation
– Exit Criteria
– Consequences
– Known Uses
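The schema's sections can be captured as a plain data structure. The sketch below is illustrative only: the field names mirror the slide's schema, but the instance values are invented for the example, not taken from TOOSMPT.

```python
from dataclasses import dataclass

# Hypothetical sketch: the slide's test design pattern schema as a
# simple record type. One field per schema section.
@dataclass
class TestDesignPattern:
    name_intent: str
    context: str
    test_model: str        # representation the tests are derived from
    fault_model: str       # faults the strategy is intended to reveal
    test_procedure: str
    strategy: str
    oracle: str            # how expected results are produced
    entry_criteria: str
    automation: str
    exit_criteria: str
    consequences: str
    known_uses: str

# Example instance (illustrative values only)
modal_class = TestDesignPattern(
    name_intent="Modal Class Test",
    context="Class whose behavior depends on its state",
    test_model="Statechart of the class under test",
    fault_model="Missing or incorrect transitions, sneak paths",
    test_procedure="Exercise each transition from each state",
    strategy="Round-trip path coverage of the statechart",
    oracle="Expected resultant state and return values",
    entry_criteria="Class compiles; statechart validated",
    automation="Driver replays event sequences",
    exit_criteria="All planned transition sequences pass",
    consequences="High transition coverage; model upkeep cost",
    known_uses="TOOSMPT",
)
print(modal_class.name_intent)  # -> Modal Class Test
```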
9. Modal Class: Implementation and Test Models

TwoPlayerGame (class diagram)
+TwoPlayerGame( )
+p1_Start( )
+p1_WinsVolley( )
-p1_AddPoint( )
+p1_IsWinner( )
+p1_IsServer( )
+p1_Points( )
+p2_Start( )
+p2_WinsVolley( )
-p2_AddPoint( )
+p2_IsWinner( )
+p2_IsServer( )
+p2_Points( )
+~( )

ThreePlayerGame (class diagram; extends TwoPlayerGame)
+ThreePlayerGame( )
+p3_Start( )
+p3_WinsVolley( )
-p3_AddPoint( )
+p3_IsWinner( )
+p3_IsServer( )
+p3_Points( )
+~( )

[Figure: statecharts serving as the test models for both classes. States: alpha, Game Started, Player 1/2/3 Served, Player 1/2/3 Won, omega. Representative transitions: the constructor takes alpha to Game Started; pN_Start( ) / simulateVolley( ) enters Player N Served; pN_WinsVolley( ) [this.pN_Score( ) < 20] / this.pNAddPoint( ) loops on Player N Served; pN_WinsVolley( ) [this.pN_Score( ) == 20] / this.pNAddPoint( ) enters Player N Won; pN_WinsVolley( ) from another player's Served state changes the serve without scoring; pN_IsWinner( ) / return TRUE loops on Player N Won; ~( ) leads to omega.]
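A minimal sketch of the modal class behind the statechart, assuming racquetball-style scoring (only the server scores; the guard [pN_Score( ) == 20] means the winning volley brings the score to 21). This is illustrative code, not the book's implementation; simulateVolley( ) is stubbed out.

```python
class TwoPlayerGame:
    """Modal class: which events are legal, and what they do,
    depends on the current state of the statechart."""

    def __init__(self):
        # alpha --TwoPlayerGame()--> Game Started
        self.state = "GameStarted"
        self.p1_score = 0
        self.p2_score = 0

    def _simulate_volley(self):
        pass  # stand-in for the diagram's simulateVolley() action

    def p1_Start(self):
        if self.state != "GameStarted":
            raise RuntimeError("p1_Start() illegal in " + self.state)
        self.state = "Player1Served"
        self._simulate_volley()

    def p2_Start(self):
        if self.state != "GameStarted":
            raise RuntimeError("p2_Start() illegal in " + self.state)
        self.state = "Player2Served"
        self._simulate_volley()

    def p1_WinsVolley(self):
        if self.state == "Player2Served":
            self.state = "Player1Served"   # serve changes hands, no point
            self._simulate_volley()
        elif self.state == "Player1Served":
            self.p1_score += 1             # this.p1AddPoint()
            if self.p1_score > 20:         # guard [p1_Score() == 20] fired
                self.state = "Player1Won"
            else:                          # guard [p1_Score() < 20]
                self._simulate_volley()
        else:
            raise RuntimeError("p1_WinsVolley() illegal in " + self.state)

    def p2_WinsVolley(self):
        if self.state == "Player1Served":
            self.state = "Player2Served"
            self._simulate_volley()
        elif self.state == "Player2Served":
            self.p2_score += 1             # this.p2AddPoint()
            if self.p2_score > 20:
                self.state = "Player2Won"
            else:
                self._simulate_volley()
        else:
            raise RuntimeError("p2_WinsVolley() illegal in " + self.state)

    def p1_IsWinner(self):
        return self.state == "Player1Won"

    def p2_IsWinner(self):
        return self.state == "Player2Won"

# Drive one path through the statechart: serve, 20 scoring self-loops,
# then the winning volley.
game = TwoPlayerGame()
game.p1_Start()
for _ in range(21):
    game.p1_WinsVolley()
print(game.p1_IsWinner())  # True
```

The point of the sketch: a state-based test suite must exercise each event in each state, including illegal pairings (e.g. p1_WinsVolley( ) before p1_Start( )), which is what drives the test-size figures on the next slide.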
10. Test Plan and Test Size
• K events
• N states
• With LSIFs
– K × N tests
• No LSIFs
– K × N³ tests

[Figure: transition tree for ThreePlayerGame, from alpha through Game Started, Player 1/2/3 Served, Player 1/2/3 Won, to omega, with the numbered event alphabet:
1 ThreePlayerGame( )
2 p1_Start( )
3 p2_Start( )
4 p3_Start( )
5 p1_WinsVolley( )
6 p1_WinsVolley( ) [this.p1_Score( ) < 20]
7 p1_WinsVolley( ) [this.p1_Score( ) == 20]
8 p2_WinsVolley( )
9 p2_WinsVolley( ) [this.p2_Score( ) < 20]
10 p2_WinsVolley( ) [this.p2_Score( ) == 20]
11 p3_WinsVolley( )
12 p3_WinsVolley( ) [this.p3_Score( ) < 20]
13 p3_WinsVolley( ) [this.p3_Score( ) == 20]
14 p1_IsWinner( )
15 p2_IsWinner( )
16 p3_IsWinner( )
17 ~( )]
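The two bounds are easy to check numerically. Assuming the ThreePlayerGame alphabet of K = 17 events and, say, N = 9 states (counting alpha and omega; the exact state count is an assumption here), the gap between the two strategies is large:

```python
# Illustrative arithmetic for the slide's test-size bounds.
K = 17   # events in the numbered alphabet (1..17)
N = 9    # assumed state count, including alpha and omega

with_lsifs = K * N        # state checkable directly via LSIFs
without_lsifs = K * N**3  # states must be distinguished by event sequences

print(with_lsifs)      # 153
print(without_lsifs)   # 12393
```

The cubic blow-up is why state-reporting capability (LSIFs) in the implementation pays for itself in test size.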
18. Ten Years After …
• Many new design patterns for hand-crafted test automation
– Elaboration of Incremental Test Framework (e.g. JUnit)
– Platform-specific or application-specific
– Narrow scope
• Few new test design patterns
• No new oracle patterns
• Attempts to generate tests from design patterns
• To date, 10,000+ copies of TOOSMPT
19. What Have We Learned?
• Test patterns are effective for articulating insight and practice
– Require discipline to develop
– Support research and tool implementation
• They do not “work out of the box”
– Require discipline in application
– Depend on enabling factors
• Irrelevant to the uninterested and undisciplined
– Low incremental benefit
– Readily available substitutes
• Broadly influential, but not compelling
28. CBOE Direct ®
• Electronic technology platform built and maintained in-house by
Chicago Board Options Exchange (CBOE)
– Multiple trading models configurable by product
– Multiple matching algorithms (options, futures, stocks, warrants,
single stock futures)
– Best features of screen-based trading and floor-based markets
• Supports electronic trading on CBOE, the CBOE Futures Exchange (CFX), the CBOE Stock Exchange (CBSX), and others
• As of April 2008:
– More than 188,000 listed products
– More than 3.8 billion industry quotes handled from OPRA on peak day
– More than two billion quotes on peak day
– More than 684,000 orders on peak day
– More than 124,000 peak quotes per second
– Less than 5 ms response time for quotes
33. MBT Challenges/Solutions
• Challenge: a one-time sample is not effective, but fresh test suites are too expensive
– Solution: simulator generates a fresh, accurate sample on demand
• Challenge: too expensive to develop expected results
– Solution: oracle generates expected results on demand
• Challenge: too many test cases to evaluate
– Solution: comparator automates checking
• Challenge: profile/requirements change
– Solution: incremental changes to the rule base
• Challenge: SUT interfaces change
– Solution: common agent interface
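The slide's pipeline can be sketched as a loop of three cooperating parts: a simulator that produces fresh inputs on demand, an oracle (a trusted executable model of the rule base) that computes expected results, and a comparator that runs the SUT against the oracle. All names and the toy "matching rule" below are assumptions for illustration, not CBOE components.

```python
import random

def simulator(n, seed=None):
    """Generate a fresh sample of n test inputs on demand."""
    rng = random.Random(seed)
    return [(rng.randint(1, 100), rng.randint(1, 100)) for _ in range(n)]

def oracle(x, y):
    """Trusted executable model: the expected result.
    A stand-in rule (minimum of the two values) for illustration."""
    return min(x, y)

def sut(x, y):
    """System under test; here a deliberately equivalent stand-in."""
    return y if y < x else x

def comparator(sample):
    """Run SUT against oracle over the sample; return mismatches."""
    return [(x, y) for x, y in sample if sut(x, y) != oracle(x, y)]

failures = comparator(simulator(1000, seed=42))
print(len(failures))  # 0: SUT agrees with the oracle on this sample
```

Because the sample is regenerated (not stored), a profile change only requires changing the simulator's distribution, and a rule change only requires changing the oracle, which is the incremental-maintenance point the slide makes.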