The Science of Software Testing - Experiments, Evolution & Emergence (2011)
- 1. The Science of Software Testing: Experiments, Evolution & Emergence, via Value Flow
SIGiST – Specialist Interest Group in Software Testing, 21 Jun 2011
v1.0
Neil Thompson, Thompson information Systems Consulting Ltd
©Thompson information Systems Consulting Ltd
- 2. In the beginning testing was Methods (or Psychology?) – then came the Arts & Crafts movement(s)!
Contrary to popular belief, the first book devoted to software testing was 1973, ed. Bill Hetzel:
• (technically, a conference proceedings, Chapel Hill, North Carolina, 1972)
• contains the first V-model?!
But NB Jerry Weinberg had written in 1961 & 1971 of testing as intriguing, a puzzle, a psychological problem.
Glenford Myers (1979) The Art...:
• but 1976 “unnatural, destructive process”... “problem in economics”
Brian Marick (1995) The Craft...:
• specifically for subsystem testing & object-oriented
Paul Jorgensen (1995) ...A Craftsman’s Approach:
• “Mathematics is a descriptive device that helps us better understand software to be tested”
- 3. More about Gerald M. Weinberg
• Ph.D. in Psychology (dissertation 1965, “Experiments in Problem Solving”)
• In 1961’s Computer Programming Fundamentals, with Herbert Leeds (revised 1966 & 1970):
– “testing... is by far the most intriguing part of programming”
– “seldom a step-by-step procedure”... “normally must circle around”
• 1967 Natural Selection as applied to Computers & Programs (!)
• 1971 The Psychology of Computer Programming:
– “testing is first and foremost a psychological problem”
– “one way to guard against... stopping testing too soon... is to prepare the tests in advance of testing and, if possible in advance of coding”
• General Systems Thinking (1975):
– the science of modelling and simplifying complex, open systems
• Systems Thinking (1992):
– “observe what’s happening and ...understand the significance”
– feedback loops, locking on to patterns, controlling & changing
Acknowledgement to James Bach’s summary in “The Gift of Time” (2008 essays in honour of Jerry Weinberg on his 75th birthday)
- 4. But what is software testing now?
Engineering? Graham Bath & Judy McKay (2008):
• but “engineering” isn’t in the glossary, or even the index!
• “What is a Test Analyst? Defining a role at the international level is not easy...”
Profession?
• EuroSTAR 2010, Isabel Evans and others
• magazine(s)
Context-Driven? Yes, but:
• Context-Driven school and artistic? C-D and engineering???
• “Craft” is often used in Context-Driven discussions
• plus science & passion!
• also, more later about the “schools” of software testing!
- 5. Defining Quality is even more difficult!
Robert M. Pirsig:
• Zen and the Art of Motorcycle Maintenance – an Inquiry into Values (Bodley Head 1974; also see http://en.wikipedia.org/wiki/Zen_and_the_Art_of_Motorcycle_Maintenance )
• Lila – an Inquiry into Morals (Bantam 1991; also see http://en.wikipedia.org/wiki/Lila:_An_Inquiry_into_Morals )
- 6. “Quality is value to some person(s)”
Jerry Weinberg, Quality Software Management, 1992
Yes, but... (summit image: each person on the mountain says “Quality is value to me”)
“Summit” image from www.topnews.in
- 7. “The Science of Software Testing” isn’t a book yet, but...
• Boris Beizer: (1984) experimental process, and (1995) falsifiability (the well-known Popper principle)
• Rick Craig & Stefan Jaskiel (2002): black-box science & art, “white-box” science
• Marnie Hutcheson (2003): software art, science & engineering
Kaner, Bach & Pettichord (2002), explicit science:
• theory that software works, experiments to falsify
• testers behave empirically, think sceptically, recognise limitations of “knowledge”
• testing needs cognitive psychology, inference, conjecture & refutation
• all testing is based on models
- 8. Some bloggers have been more specific and detailed
Paul Carvalho (www.staqs.com) – testing skills include:
• learning / relearning the scientific method (multiple sources!)
• knowledge of probability & statistics
Randy Rice (www.riceconsulting.com) – science rigour decreasing? but:
• testing analogies with observation, experiment, hypothesis, law etc
David Coutts (en.wikipedia.org/wiki/User:David_Coutts) – yes, “context”, but:
• both science & software testing have “right / wrong answers”, so...
• a test passes/fails based on its requirements. However, science & testing...
• go beyond falsificationism to economy (few theories explaining many observations), consistency, maths foundation & independent verification
• retesting is a different theory
BJ Rollison (blogs.msdn.com/b/imtesty/ – note this is on his old blog):
• too many hypotheses to test in a reasonable time
• debugging is scientific also
Cem Kaner (www.kaner.com) – software testing as a social science:
• software is for people, we should measure accordingly
• bad theories & models give blind spots & impede trade-offs
Apologies to anyone I’ve so far missed!
- 9. Part A: Experiments
- 10. Why should it be useful to treat testing as a science?
Testing: the System Requirements / Specification / Design give the Expected result; the Product gives the Test result via the Test.
• Test result = Expected? Y: test “passes”; N: test “fails”
Science: the Hypothesis gives the Expected result; part of the cosmos gives the Experiment result via the Experiment.
• Experiment result = Expected? Y: hypothesis confirmed; N: hypothesis rejected
Note: this is starting with the “traditional” views of testing & science
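The comparison on slide 10 is simple enough to sketch as code. A minimal illustration in Python (the `double` function stands in for the product under test; all names here are invented for the example):

```python
def check(actual, expected):
    """Compare a test result with the expected result derived from the
    specification - the analogue of comparing an experiment's result
    with the result a hypothesis predicts."""
    return "passes" if actual == expected else "fails"

def double(x):
    # Hypothetical product function under test.
    return x * 2

print(check(double(3), 6))   # "passes"
print(check(double(-1), 2))  # "fails": double(-1) is -2, not 2
```

The point of the parallel is that both columns of the slide reduce to the same comparison step; the interesting differences come later, in how the expected result is chosen.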
- 11. What is software testing? Definitions through the ages
PERIOD | EXEMPLAR | OBJECTIVES | SCOPE | APPROACH
Pre-1957: DEBUGGING (Psychology) | Weinberg (1961 & 71) | Test + Debug | Programs | Think, Iterate
1957: DEMONSTRATION (Method) | Hetzel (1972) | Show meets requirements | Programs | Verify, + maybe Prove, Validate, “Certify”
1976: DESTRUCTION (Art) | Myers (1976 & 79) | Find bugs | Programs, Sys, Acceptance | + Walkthroughs, Reviews & Inspections
1983: EVALUATION | ? | Measure quality | |
1984: PREVENTION (Craft?) | Beizer (1984) | Find bugs, show meets requirements, + prevent bugs | + Integration |
2000: SCHOOL(S) | Kaner et al (1988 & 99) | Find bugs, in service of improving quality, for customer needs | | Realistic, pragmatic, normal
2011: Neo-Holistic? | | | | Science? Experiment & Evolve?
Overall periods developed after Gelperin & Hetzel, “The Growth of Software Testing”, 1988 CACM 31 (6), as quoted on Wikipedia
- 12. So, how would these “methods” look if we adopt Myers & Popper?
Testing: the System Requirements / Specification / Design set the aim, “aim to find bugs”; the Product gives the Test result via the Test.
• Test result = “as aim”? Y: test is “successful”; N: test is so far “unsuccessful”
Science: the Hypothesis sets the aim, “aim to falsify hypothesis”; part of the cosmos gives the Experiment result via the Experiment.
• Experiment result = “as aimed”? Y: falsification confirmed; N: hypothesis not yet falsified
Note: this is starting with the “traditional” views of testing & science
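Slide 12’s Popperian reframing can also be sketched: instead of confirming one expected value, the test sweeps inputs trying to falsify the general claim that the product meets its spec. A minimal illustration (the deliberately buggy `absolute` function and the claim are invented for the example):

```python
def absolute(x):
    # Deliberately buggy product: wrong for negative inputs.
    return x if x > 0 else 0

def try_to_falsify(claim, inputs):
    """Run many experiments; return the first counterexample, or None.
    Finding none does NOT prove the claim - it is merely 'not yet
    falsified', as on the slide."""
    for x in inputs:
        if not claim(x):
            return x
    return None

spec = lambda x: absolute(x) == abs(x)  # the claim: product meets its spec
print(try_to_falsify(spec, range(0, 6)))   # None: not yet falsified
print(try_to_falsify(spec, range(-5, 6)))  # -5: claim falsified
```

Note how the asymmetry of the slide carries over: a run of “passes” leaves the claim only provisionally standing, while a single counterexample settles the matter.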
- 13. A current hot topic: testing versus “just checking”
Checking: the System Requirements / Specification / Design give the Expected result; the Product gives the Check result via the “Check”.
• Check result = Expected? Y: check “passes”; N: check “fails”
Testing: the inputs widen to include other oracles, other quality-related criteria and ways the system could fail, as well as the System Requirements / Specification / Design.
• Test result appropriate? Y: quality-related info; N: info on quality issues
- 14. Exploratory testing is more sophisticated than pre-designed, and does not demand a system specification
It draws on context, heuristics, epistemology, cognitive psychology and abductive inference – as well as other oracles, other quality-related criteria, ways the system could fail, and any System Requirements / Specification / Design.
• Test result appropriate? Y: quality-related info; N: info on quality issues (plus bug advocacy)
• Test Framing:
– context, mission, requirements, principles, oracles, risks
– models, value ideas, skills, heuristics, cost/value/time “issues”
– mechanisms, techniques, procedures, execution methods
– deliverables
- 15. Science should help to understand overlapping models, and to derive better test models
• Development models & test models each cover subsets of actual & potential reality: the DEV MODEL (expected), the TEST MODEL (verified / validated), the SOFTWARE (observed) and the REAL WORLD (desired)
• Examples of development model techniques:
– entity relationships
– state transitions
• Examples of test model techniques – the above plus:
– equivalence partitioning, domain testing & boundaries
– transaction, control & data flows
– entity life history (CRUD)
– classification trees; decision tables
– timing
– opportunity for more?
(after Software Testing: A Craftsman’s Approach, Paul Jorgensen)
• In “checking”, the test model tries to cover the development model
• Testing (sapient) should expand its model beyond that, as far into actual/potential real behaviour as stakeholders want and can pay for
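Of the test-model techniques listed on slide 15, equivalence partitioning and boundary analysis are the easiest to make concrete. A minimal sketch, assuming a hypothetical rule that valid ages run from 18 to 65 inclusive:

```python
def is_valid_age(age):
    # Hypothetical rule under test: valid ages are 18..65 inclusive.
    return 18 <= age <= 65

def boundary_values(low, high):
    """Classic boundary selection: each boundary plus its neighbours,
    where off-by-one bugs cluster."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

# One representative value per equivalence partition
# (too young / valid / too old)...
partitions = {10: False, 40: True, 70: False}
for value, expected in partitions.items():
    assert is_valid_age(value) == expected

# ...plus the boundary values.
for value in boundary_values(18, 65):
    print(value, is_valid_age(value))
```

The test model here (three partitions, six boundary values) is deliberately larger than the one-line development model, which is exactly the superset relationship the slide argues for.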
- 16. Heuristics, patterns & techniques – and scientific analogues?
Heuristics †:
• art of discovery in logic
• education method in which the student discovers for self
• principle used in making decisions when all possibilities cannot be fully explored!
Patterns *:
• catchy title
• description of the problem addressed
• solution to the problem
• context in which the pattern applies
• some examples.
Techniques †:
• method of performance
• manipulation
• mechanical part of an artistic performance!
Possible scientific analogues:
Conjectures (w):
• proposition that is unproven but is thought to be true and has not been disproven
Hypotheses (w):
• testable statement based on accepted grounds
Theories (w) (+ Laws!):
• proposed explanation of empirical phenomena, made in a way consistent with scientific method and satisfactorily tested or proven
Tests ↔ Experiments?
† Definitions based on Chambers 1981
* Definitions based on Software Testing Retreat #2, 2003
(w) Definitions based on Wikipedia 2011
- 17. But wait! Is there one, agreed, “scientific method”?
• No! These are the first dozen I found (unscientifically*)
• Only two are near-identical, so here are eleven variants, all with significant differences! (extent, structure & content)
• The philosophy of science has evolved (see later slides)
* Images from various websites, top-ranking of Google image search, May 2011
- 18. So... a post-Popper view of theories: how science could help coverage
• Coverage is multi-dimensional – so General Systems Thinking helps analyse dimensions, eg choose 2-D projections
• And models need to map this multi-dimensional space – not quite a hierarchy:
– Development Model, of which...
– Test Model should be a superset, of which...
– Real World is an unattainable superset – but Test should get as close as appropriate
• Hybridise & innovate new techniques based on heuristics & patterns – and taking inspiration from conjectures, hypotheses, theories...
• Remember multiple ways software can fail (+ see various Bug Taxonomies):
– inputs: intended inputs, configuration and system resources, from other cooperating processes, clients or servers, program state, system state
– outputs: monitored outputs (including uninspected outputs), impacts on connected devices / system resources, to other cooperating processes, clients or servers, program state, system state
Sources: Neil Thompson EuroSTAR 1993; Doug Hoffman via www.testingeducation.org
- 19. Part B: Evolution & Value Flow ScoreCards
- 20. Traditional Darwinian evolution (ie biological)
• Nearly everyone is familiar with this, but...
Image from www.qwickstep.com
- 21. ...arguably Darwinian evolutionary principles apply beyond biology
• There is a cascade (& approx symmetry!):
– Biology depends on Organic Chemistry
– Organic chemistry depends on the special properties of Carbon
– Chemical elements in the upper part of the periodic table come from supernovae
– Elements in the lower part of the periodic table come from ordinary stars
– Elements are formed from protons, neutrons, electrons (Physics)
– ... quarks... string theory?? etc
• It just so happens that humans are about equidistant in scale from the smallest things we can measure to the largest (Ouroboros: Greek Οὐροβόρος or οὐρηβόρος, from οὐροβόρος ὄφις, "tail-devouring snake”)
• Humans have evolved to use tools, build societies, read, invent computers...
• So, it is possible to think of pan-scientific evolution as a flow of value: inventions by humans, eg Social Sciences
• Now, back to software lifecycles...
Sources: Daniel Dennett, “Darwin’s Dangerous Idea”; “cosmic Ouroboros” (Sheldon Glashow, Primack & Abrams, Rees etc). Image from http://www.aaas.org/spp/dser/03_Areas/cosmos/perspectives/Essay_Primack_SNAKE.GIF
- 22. The software lifecycle as a flow of value
• Working systems have value; documents in themselves do not; so the quickest route from RAW MATERIALS to FINISHED PRODUCT is: stated requirements → demonstrations & acceptance tests → programming
• SDLCs are necessary, but introduce impediments to value flow: misunderstandings, disagreements... documents are like inventory/stock, or “waste” – implicit requirements, documented requirements, acceptance tests, meetings / escalation to agree, intermediate documentation!
- 23. To improve value flow: agile methods following principles of lean manufacturing
Levels of documentation, pushed by specifiers:
• Requirements + Test Specifications
• + Func Spec
• + Technical Design
• + Unit / Component specifications
Flow of fully-working software, pulled by customer demand:
• Unit / Component-tested → Integrated → System-tested → Accepted
- 24. But any lifecycle should be improvable by considering the value flow through it
• The context influences what deliverables are mandatory / optional / not wanted
• Use reviews to find defects & other difficulties fast
• Do Test Analysis before Test Design (again, this finds defects early, before a large pile of detailed test scripts has been written)
• Even if pre-designed testing is wanted by stakeholders, do some exploratory testing also
• “Agile Documentation”*:
– use tables & diagrams
– consider wikis etc
– care with structure
* These points based on a book of that name, by Andreas Rüping
- 25. Testing has a hierarchy, eg...
Levels of specification → levels of stakeholders → levels of system & service integration testing → risks & responsibilities:
• Business Requirements → Business, Users, Business Analysts, Acceptance Testers → Acceptance Testing (+ business processes) → users may be unhappy (so generate confidence)
• Functional & NF specifications → Architects, “independent” testers → System Testing → system may contain bugs not found by lower levels (so seek bugs of type z)
• Technical spec, hi-level design → Designers, integration testers → Integration Testing → units may not interact properly (so seek bugs of type y)
• Detailed designs → Developers, unit testers → Unit Testing → individual units may malfunction (so seek bugs of type x)
Remember: not only for waterfall or V-model SDLCs; rather, iterative / incremental go down & up through layers of stakeholders, specifications & system integrations
- 26. ...Quality and Science can also be seen as hierarchies, which testing can parallel
Layers of quality (static values: Intellectual, Social, Biological, Inorganic) sit alongside the levels of stakeholders and of system & service integration from the previous slide, and parallel the layers of science:
• Business, Users, Business Analysts, Acceptance Testers (+ business processes) ↔ Philosophy, Social sciences
• Architects, “independent” testers ↔ Biology (& systems thinking)
• Designers, integration testers ↔ Chemistry: Organic and Inorganic
• Developers, unit testers ↔ Physics
- 27. Value flows down through, then up through, these layers
Desired quality flows down through the levels of stakeholders (Business, Users, Business Analysts, Acceptance Testers; Architects, “independent” testers; Designers, integration testers; Developers, unit testers) and the levels of system & service integration (+ business processes); Tested (“known”) quality flows back up.
- 28. So, test appropriately to your scale
Understanding of the solution runs from Physics (quantum end) to Physics (gravity end):
• Unit Testing ↔ Physics (quantum end)
• Integration Testing ↔ Chemistry: Inorganic & Organic
• System Testing ↔ Biology, systems thinking
• Acceptance Testing ↔ Social sciences
- 29. How different sciences can inspire different levels of testing
• First, Unit/Component Testing (Physics):
– think quanta (smallest things you can do to the software), equivalence partitions, data values
• For Integration Testing (Chemistry):
– think about interactions, what reactions should be, symmetry, loops, valencies, performance of interfaces
• For System Testing (Biology):
– fitness for purpose, entity life histories, ecosystems, palaeontology (historic bugs)
• For Acceptance Testing:
– think “Social Sciences”: what are the contractual obligations?
• For each test level, consider and tune the value which that level is adding...
- 30. Value Flow ScoreCards
(...have been presented previously, so these slides are for background and will be skimmed through quickly in the presentation)
• Based on Kaplan & Norton’s Balanced Business Scorecard and other “quality” concepts
• Value chain ≈ Supply chain:
– in the IS SDLC, each participant should try to ‘manage their supplier’
– for example, development supplies testing (in trad lifecycles, at least!)
– we add the supplier viewpoint to the other 5, giving a 6th view of quality
• So, each step in the value chain can manage its inputs, outputs and other stakeholders
The six views: Supplier (upward management, information gathering); Process (compliance eg ISO9000, repeatability, VERIFICATION, risks, – mistakes); Product (test coverage, – faults, – failures); Customer (VALIDATION, risks, benefits, acceptance, satisfaction, – complaints); Financial (efficiency, productivity, on-time, in budget, – cost of quality); Improvement (eg TPI/TMM..., predictability, learning, innovation)
- 31. Value Flow ScoreCards can be cascaded (...but you don’t necessarily need all of these!)
• Business Analysts → Requirements Reviewers → Acceptance Test Analysts → AT Designers & Scripters → Acceptance Testers
• Architects → Func Spec Reviewers → Sys Test Analysts → ST Designers & Scripters → Sys Testers
• Designers → Tech Design Reviewers → Int Test Analysts → IT Designers, Scripters & Executers
• Developers → Component Test Analysts, Designers & Executers? (via pair programming?)
Pieces of a jig-saw. In addition to “measuring” quality information within the SDLC:
• can use to align SDLC principles with higher-level principles from the organisation
- 32. The Value Flow ScoreCard in action
Columns: Supplier | Process | Product | Customer | Financial | Improv’t
• Yes – it’s just a table! ...Into which we can put useful things...
• We start with repositionable paper notes, then can put in spreadsheet(s)
- 33. Use #1, Test Policy: all views included? Why-What-How (G-Q-M) thought through?
From the organisation’s Goals & Objectives, each scorecard column (Supplier, Process, Product, Customer, Financial, Improvement & Infrastructure) is worked through Why (GOAL → Objectives), What (QUESTION → Measures) and How (METRIC → Targets & Initiatives). Example entries:
• Objectives: IS actively supports employees; products to satisfy specified requirements and be fit for purpose; Proj Mgr is responsible for quality; Bus Mgt is responsible for enforcing Test Policy; testing prioritised & managed; independence increases with test type; detect defects early; staff must be certified; constant improv’t of dev & test processes; use TestFrame for test analysis & execution; automate regression tests as much as possible
• Measures: both static & dynamic testing; planning, preparation & evaluation; (comprehensive scope); Defect Detection Percentage; defect source analysis; product risks; importance of req’ts; ISTQB; TMM levels; freq of process adjustments heeding metrics
• Targets: software & related work products; advisors Expert, managers Advanced, analysts Foundation; TMM level 2 at least now, TMM level 3 within 2 years; twice per year
Source: summarised from an example in TestGrip by Marselis, van Royen, Schotanus & Pinkster (CMG, 2007)
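The Goal-Question-Metric chain of slide 33 can be represented as plain data; a minimal sketch with entries paraphrased from the TestGrip example (the structure is illustrative, not any tool’s format):

```python
# One scorecard view holds a Why -> What -> How (Goal-Question-Metric) chain.
test_policy = {
    "Customer": {
        "goal": "Products to be fit for purpose",
        "question": "What are the product risks?",
        "metric": "Importance of requirements",
    },
    "Improvement & Infrastructure": {
        "goal": "Constant improvement of dev & test processes",
        "question": "Which TMM level have we reached?",
        "metric": "TMM level 3 within 2 years",
    },
}

def views_missing_a_level(policy):
    """'All views included? Why-What-How thought through?' - report any
    view whose chain lacks a goal, question or metric."""
    required = {"goal", "question", "metric"}
    return [view for view, chain in policy.items()
            if not required <= chain.keys()]

print(views_missing_a_level(test_policy))  # []
```

Holding the policy as data like this makes the slide’s two checks mechanical: a missing column is a missing key, and a half-thought-through column is a chain with a gap.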
- 34. Use #2, test coverage: test conditions as measures & targets (not test cases!)
(From the LEVEL TEST PLAN and TEST BASES of the Treble-V model, plus info from other levels)
• Objectives: test items (level of integration), features to be tested / not tested, test basis references, product benefits, constraints
• Measures: product risks across all six views – areas we could cover
• Targets: the test conditions we intend to cover – agreed with stakeholders
• Initiatives: test cases (to test design & execution) and objectives for the next level of sys design
- 35. Use #3: process improvement, eg via Goldratt’s Theory of Constraints
“Swimlane” symptoms, causes & proposed remedies across the six scorecard views:
• Objectives: CURRENT ILLS
• Measures: CONFLICT RESOLUTION and FUTURE REMEDIES
• Targets: PRE-REQUISITES
• Initiatives: TRANSITION
Note: this is similar to Kaplan & Norton’s “Strategy Maps” (Harvard Business School Press 2004). When cause-effect branches form feedback loops, this becomes part of Systems Thinking.
- 36. Use #4a: context-driven testing, eg Goldratt conflict resolution on process areas with choices
From context / circumstances (the CURRENT SITUATION):
• Legal: regulation, standards; Moral: safety
• Process constraints, eg quality mgmt, configuration mgmt
• Application characteristics, sector, culture, technology, technical risks, business risks
• Job type & size; resources: money (skills, environments), time
CHOICE AREAS (about 30 categories), eg test specifications and handover & acceptance criteria, each on a range from informal to formal:
• Measures: where in the range (specific aspects) – CONFLICT RESOLUTION
• Targets: the DESIRED SITUATION
• Initiatives: appropriate testing in this context / circumstances
- 37. Use #4b: lifecycle methodology selection / design, Value Flow ScoreCard as unifying framework
• Objectives: risks in each of the six views – BALANCE
• Measures: Game Theory... “methodology per project”... (any other approaches?)
• Targets: conflicts & balances – one hand, the other
• Initiatives: appropriate lifecycle methodology in this context / circumstances
- 38. A further use, #4c??
• Could Value Flow ScoreCard ideas help discuss & bridge the (arguably) growing divide between traditional & agile software practitioners, eg:
– waterfall, V-model, W-model, iterative, incremental...?
– “schools” of software testing, eg Analytic, Standard, Quality, Context-Driven, Agile... Factory, Oblivious...?
– scripted (or at least pre-designed) & exploratory testing?
• To attempt this, let’s extend the evolution & value flow concepts to...
- 39. Part C: Emergence & Value Flow Science
- 40. Evolution as Sophistication plotted against Diversity
(Graph: sophistication against diversity)
Source: Daniel Dennett, “Darwin’s Dangerous Idea”
- 41. Punctuated equilibria
“Gradual” Darwinism plots a smooth rise of sophistication against diversity. Punctuated equilibria instead show periods of equilibrium interrupted by events:
• “explosion” in species, eg Cambrian
• mass extinction, eg Dinosaurs
• spread into a new niche, eg Mammals
The number of species over time shows the same pattern.
“Punctuated equilibria” idea originated by Niles Eldredge & Stephen Jay Gould. Images from www.wikipedia.org
- 42. Evolution of Science overall
(Stack, bottom to top: Physics → Chemistry: Inorganic, Organic → Biology → Social sciences)
- 43. Not only Evolution, but Emergence: progress along the order-chaos edge
• For best innovation & progress, need neither too much order nor too much chaos
• “Adjacent Possible”
(Stack: Physics → Chemistry → Biology → Social sciences)
Extrapolation from various sources, esp. Stuart Kauffman, “The Origins of Order”, “Investigations”
- 44. OK, what’s this got to do with software testing?
(Progression: Social sciences → Tools → Language → Books → Computers)
• We have an important and difficult job to do here!
- 45. ...and computers are evolving, in both sophistication and diversity, faster than software testing?
(Progression: 1GL → 2GL → 3GL → 4GL → Object Orientation → Internet, Mobile devices → Artificial Intelligence?!)
• Are we ready to test AI??
- 46. The Philosophy of Science is also evolving!
(Progression: Classical → Empiricism → Positivism → Logical Positivism → Popper → Kuhn → Lakatos → Laudan → Bayesianism, Grounded Theory...)
• So, perhaps the Philosophy of Software Testing could learn from this; perhaps it’s also evolving?...
- 47. Part D: Platforms & Cranes (Genes to Memes)
- 48. Biological reproduction & evolution is controlled by Genes
(Sophistication vs diversity, driven by replication & selection, with mutation)
Images from www.qwickstep.com and schools.wikipedia.org
- 49. Memes as an extension of the Genes concept
Biological evolution (replication & selection, with mutation) is the platform on which mental, social & cultural evolution – the replication & selection of memes – is built, with Cranes above Platforms:
• symbols, ideas, beliefs, practices
• other imitable phenomena: gestures, rituals, speech, writing
Theme developed from Daniel Dennett, “Darwin’s Dangerous Idea”. Image from www.salon.com; taxonomy from www.wikipedia.org
- 50. Some candidates for Memes in software testing
[Diagram: a map of candidate testing memes under the headings Effectiveness, Efficiency, Risk management, Quality management, Insurance/Assurance and “Always-consider”]
• Risk management: risks – list & evaluate; prioritise tests based on risks; plan based on priorities & constraints; quantify residual risks & confidence; insurance – define & detect errors (UT, IT, ST); assurance – give confidence (AT); plan early, then rehearse-run acceptance tests; V-model – what testing is against
• Quality management: assess where errors were originally made; decide process targets & improve over time; be pragmatic over quality targets; define & use metrics; use handover & acceptance criteria; W-model – quality management
• Always-consider: tailor risks & priorities etc to factors; use independent system & acceptance testers; use appropriate skills mix; define & agree roles & responsibilities; refine test specifications progressively; design flexible tests to fit; use appropriate techniques & patterns; allow appropriate script format(s); define & measure test coverage; allow & assess for coverage changes; use synthetic + lifelike data; use appropriate tools; document execution & management procedures; optimise efficiency; distinguish problems from change requests; prioritise urgency & importance; measure progress & problem significance; distinguish retesting from regression testing
Source: Neil Thompson, STAREast 2003
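Several of these memes (“risks – list & evaluate”, “prioritise tests based on risks”, “plan based on priorities & constraints”) combine into a simple scoring exercise. A hypothetical sketch: the test names, the 1–5 likelihood/impact scales and the time budget are all invented for illustration:

```python
# Each candidate test: (name, likelihood 1-5, impact 1-5, minutes to run)
candidates = [
    ("payment rounding", 4, 5, 30),
    ("login lockout",    3, 4, 10),
    ("report layout",    2, 2, 20),
    ("audit trail",      5, 3, 15),
]

def prioritise(tests, budget_minutes):
    """Rank tests by risk exposure (likelihood x impact), then greedily
    fill the available time budget from the top of the ranking."""
    ranked = sorted(tests, key=lambda t: t[1] * t[2], reverse=True)
    plan, spent = [], 0
    for name, likelihood, impact, minutes in ranked:
        if spent + minutes <= budget_minutes:
            plan.append(name)
            spent += minutes
    return plan

print(prioritise(candidates, budget_minutes=60))
```

With a 60-minute budget the low-exposure “report layout” check drops out, which is the point of the meme: the constraint decides what is left untested, and that residue is what remains to be reported as residual risk.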
- 51. Four, five, six... schools of software testing?
(Updated version, March 2007. Copyright © 2003-2007 Bret Pettichord. Permission to reproduce granted with attribution)
• Analytic: emphasis on analytical methods for assessing the quality of the software, including improvement of testability by improved precision of specifications and many types of modeling
• Factory: emphasis on reduction of testing tasks to routines that can be automated or delegated to cheap labour
• Quality (Control): emphasis on standards and processes that enforce or rely heavily on standards; emphasis on policing developers and acting as “gatekeeper”
• Context-Driven: emphasis on adapting to the circumstances under which the product is developed and used
• Agile (Test-Driven): emphasis on code-focused testing by programmers
• Others? Oblivious / Groucho?; Axiomatic? (like C-D); Neo-Holistic?
Annotations by Neil Thompson after the Bret Pettichord ppt (blue text), the list in Cem Kaner’s blog, December 2006 (black text), and other sources (red text)
- 52. Learning from the “Schools” situation
• Think in terms of memes: evolution and transmission
• Separate “what people have been taught” from:
– what their bosses say they want (or should want??)
– what their personalities push them towards
• Is school behaviour volatile? (context-driven!?)
• Things we could adapt from other disciplines:
– see various conference talks, eg oil exploration
– but what about insurance, and their actuaries?!
• Preparing for the future, eg testing Artificial Intelligence:
– what happened to Genetic Algorithms?
– what’s the latest Bayesian application?
- 53. Conclusions & Summary
- 54. Conclusions
• The Ouroboros looks better with software testing at the top! Value flows upwards:
– (right) from the Big Bang to planet Earth and human habitations; and
– (left) from subatomic particles to humans
– (the “origin”, Quantum Gravity?, yet to be agreed)
• Value Flow ScoreCards are useful, but this talk is more about applying the layered principles of science to IT quality
• Humans now evolve in terms of technology-aided Memes, and we can use that to understand & develop the future of software testing
- 55. Recap of messages for Testing & Quality
• When strategising, planning and performing testing:
– test according to your scale, using analogies from different sciences to help “frame” your tests
– use Value Flow Scorecards to understand and balance your stakeholders
– design experiments to seek different bug types at different levels (don’t just “falsify” the opposite experiment)
• When considering your position & future in the testing industry:
– it’s not just teaching but also psyche, and what bosses want
– “stand on the shoulders of giants”, ie make use of the platforms which give huge leverage (eg exploratory automation)
- 56. Some wider advice
• When reading new material, use the Adjacent Possible – consider reading two authors at once (or maybe three):
– either different representations of similar opinions, or
– apparently opposing opinions
• (I’ve been quoting pairs of books on Twitter, “Things To Read Together”)
- 57. References
• Already quoted slide-by-slide – for a summary of main sources see the associated article in “The Tester”
• Stop Press (since preparing this talk) – see also related views:
– The Software Testing Timeline, www.testingreferences.com
– Stuart Reid, “Lines of Innovation in Software Testing”, 2011 paper
– Jurgen Appelo, “Management 3.0” (2011 book and website, about agile leadership practices – makes significant use of complexity theory, eg Kauffman)
- 58. Thanks for listening!
• Questions?