2. Case study
Who: Large financial institution
Where: Johannesburg, South Africa
When: 2006 to 2008
What: Adoption of test automation with the aim of reducing costs, increasing coverage and significantly increasing productivity
3. Contents
• The challenges facing the organisation
• The approach to solving the problem
• How automation tools are categorised, and why
• The automation process
• The automation framework
• Contemporary automation issues
• Automation benefit calculations
• Lessons learnt
4. Introduction
“On the island of testers they were forever doomed to search for what should not and could not exist, knowing to succeed would bring misery to the gods.”
Lessons Learned in Software Testing: Cem Kaner, James Bach, Bret Pettichord

“A fool with a tool is still a fool.”
5. Key Challenges
• No defined end-to-end automation process.
• Insufficient re-use of testing artefacts including the tools themselves.
• Testing was involved late in the process.
• No accurate documentation.
• Resistance to change.
• Difficult to prove automation benefits and to cost-justify the automation tools.
• Dependency on key resources for specific application knowledge.
6. Solving the problem
• Formal project launched (“Project Testing”).
• The project had to be based on sound financials and commit to significant savings.
• The project business case was approved at a high level in the
organisation.
• Failure was not an option.
• Resourced with specialist skilled resources.
• Driven by a person passionate about software quality and productivity.
7. “Project Testing”
• The project launched in May 2006.
• Project had an appropriate budget.
• A detailed business case was documented to facilitate return-on-investment measurement.
• Steering committee tracked progress on a monthly basis.
• The macro project was broken down into specialist workgroups, one of which was automation.
• Dedicated group responsible for communication and change
management.
8. “Project Testing” work-streams
• Test process work-stream.
• Test governance work-stream.
• Mechanisation and tools work-stream.
• Test repository work-stream.
• Test environment work-stream.
• Test change management, communication and training work-stream.
9. Mechanisation & tools work-stream
• Automation process.
• Automation process integrated with SDLC.
• Tool selection, procurement and training were key.
• Automation framework.
• Tool classification.
• Quantification of automation benefits.
11. Analysis and design
• Understand the solution's test requirements.
• Recommend an automated-testing solution approach.
• Prototype the solution.
[Process cycle diagram: Analysis & Design → Scripting & Configuration → Parameter definition → Parameter management → Scenario collection → Validation → Testing of scripts → Script execution → Review of results → Result communication]
12. Scripting & Configuration
• Scripts configured with solution
requirements.
• Activities in this phase could
include test data discovery and
creation, capturing test
scenarios, scripting and where
necessary, tool configuration.
• Internal testing of automated
test solution.
13. Parameter identification
• Scripts assessed against user-defined scenarios.
• Components identified for parameterisation (a minimal sketch follows below).
• Components categorised based on nature and function.
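A minimal sketch of what parameterisation could look like here, assuming Python-style scripts; do_transfer, the accounts and the field names are hypothetical stand-ins for the real application calls, not the institution's actual toolset.

```python
from dataclasses import dataclass

@dataclass
class Result:
    status: str

def do_transfer(from_account, to_account, amount):
    # Stand-in for the real application call.
    return Result(status="OK")

# Before: business values buried in the script.
def transfer_test_hardcoded():
    result = do_transfer("12345", "67890", 100.00)
    assert result.status == "OK"

# After: the components the business cares about are lifted into
# parameters, categorised by nature (accounts) and function (amount,
# expected outcome), so one script covers many scenarios.
def transfer_test(from_account, to_account, amount, expected_status):
    result = do_transfer(from_account, to_account, amount)
    assert result.status == expected_status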
14. Parameter management
• Data created in previous steps
is managed.
• Customised spreadsheets are
created.
• Requirement exists that data in
these sheets can be maintained
by non-technical personnel.
• All scenarios are described in
both technical and non-
technical terms.
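A hedged sketch of the spreadsheet idea, assuming the sheets are exported as CSV; the file name and column names are illustrative only.

```python
import csv

def load_scenarios(path="parameters.csv"):
    # Each row pairs a plain-language description (for non-technical
    # maintainers) with the technical values the scripts consume.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

for scenario in load_scenarios():
    print(scenario["description"], scenario["from_account"],
          scenario["amount"])
```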
15. Scenario collection
• Spreadsheet populated with
scenarios provided by
stakeholders of the system.
• Manual test cases could provide
input.
• Actual production outage
experience used as input.
16. Validation
• Automated validation of results
important due to volume.
• Spreadsheet updated with
expected results.
• Validation done by script using
information from spreadsheets.
• Clear “Pass” or “Fail”
indicators provided.
• Summary of results provided.
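One possible shape of such a validation script, again assuming CSV exports; run_scenario is a hypothetical stand-in for driving the application.

```python
import csv

def run_scenario(row):
    # Stand-in for executing the automated script for one scenario.
    return "OK"

def validate(path="expected_results.csv"):
    passed = failed = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            actual = run_scenario(row)
            ok = actual == row["expected_result"]
            # Clear per-scenario verdict, as on the slide.
            print(f"{row['scenario_id']}: {'Pass' if ok else 'Fail'}")
            passed += ok
            failed += not ok
    print(f"Summary: {passed} passed, {failed} failed")
```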
17. Testing of scripts
• Like all new software, the
scripts must be tested.
• When defects are reported during the actual test
cycle, it is important that the tool itself is above
reproach.
• Scripts must be able to react to
anomalies in a predicted
fashion.
• Must still be able to report
effectively on operational
scenarios.
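A minimal sketch of “reacting to anomalies in a predicted fashion”: an unexpected failure is recorded and attributed rather than aborting the run; run_scenario is again a hypothetical stand-in.

```python
def run_scenario(scenario):
    return "OK"  # stand-in for the real automated step

def run_all(scenarios):
    results = []
    for s in scenarios:
        try:
            results.append((s["scenario_id"], run_scenario(s)))
        except Exception as exc:
            # An anomaly must not abort the run or masquerade as an
            # application defect: record it as a tool/environment issue.
            results.append((s["scenario_id"], f"Tool error: {exc}"))
    return results
```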
18. Script execution
• Scripts are run against target
application.
• Stability of system determines
support required.
• Strive to provide as much
meaningful data in the shortest
possible time.
19. Review of results
• Results are reviewed internally.
• Focus on abnormal failures due
to environmental or other
conditions.
• Checks are done to ensure data
integrity and scope coverage.
20. Result communication
• Test results are saved in a test
artefact repository.
• Screen dumps of all defects
encountered are kept.
• Benefit of automated approach
established and communicated.
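One hedged way to keep such “screen dumps”: capture the screen when a scenario fails and file it in the repository. This assumes the Pillow library on a desktop OS; the paths are illustrative.

```python
import os
from datetime import datetime
from PIL import ImageGrab  # Pillow; screen capture on Windows/macOS

def record_defect(scenario_id, repository="test_artefacts"):
    os.makedirs(repository, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    ImageGrab.grab().save(f"{repository}/{scenario_id}_{stamp}.png")
```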
22. Categorising our toolset
Why?
This is about utilising the right tool for the right job. The strength of
categorising tools lies in the ability to provide appropriately
customised, automated testing solutions which, when complemented
by manual testing, mitigate both the product and project risks
associated with solution deployment.
26. Front-end user interface (Functional)
• Simulate user interaction
with applications.
• Provide support for unit,
integration, system and
acceptance testing.
• Regression testing in
particular suits the
nature of these tools.
[Tool-classification diagram: Front-end, Middleware and Back-end layers mapped to four tool categories: Front-end user interface (Functional and Non-Functional) and Back-end network interaction (Functional and Non-Functional), plus Preparation utilities]
27. Back-end or network interaction (Functional)
• Ability to simulate user
interaction in the
absence of a front-end.
• Supports bottom-up
integration.
• Provides the ability to
find defects much
sooner in the SDLC.
• Valuable in testing SOA
implementations.
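A hedged sketch of exercising a service directly, with no front-end; the endpoint and payload are hypothetical, and the requests library is assumed.

```python
import requests

def test_balance_service():
    # Call the service itself, so defects surface before any
    # front-end exists (bottom-up integration).
    resp = requests.post(
        "https://test-env.example.com/services/balance",  # hypothetical
        json={"account": "12345"},
        timeout=10,
    )
    assert resp.status_code == 200
    assert "balance" in resp.json()
```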
28. Front-end user interface (Non-functional)
• Ability to generate the
required load for
performance, load and
stress testing.
• These types of tests can
typically not be performed
manually.
• Tool is dependent on the
functional quality of the
application under test.
• The test environment must
be comparable with
production for results to be
meaningful.
29. Back-end or network interaction (Non-functional)
• Due to the nature of
implementation, it is not
practical to use the front-end
to generate the required load
(ATM, point-of-sale devices).
• Typically these kinds of
tests cannot be performed
manually.
• The test environment must
be comparable with
production for results to be
meaningful.
• Valuable in testing the
performance of services
in SOA implementations.
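A minimal sketch of generating back-end load, e.g. simulating many ATM requests concurrently; the endpoint, payload and volumes are illustrative, not the tools the project actually used.

```python
import concurrent.futures
import requests

def one_request(i):
    resp = requests.post(
        "https://test-env.example.com/services/withdraw",  # hypothetical
        json={"terminal": f"ATM{i:04d}", "amount": 100},
        timeout=30,
    )
    return resp.elapsed.total_seconds()

# 500 requests, 50 at a time, then report the slowest response.
with concurrent.futures.ThreadPoolExecutor(max_workers=50) as pool:
    timings = list(pool.map(one_request, range(500)))
print(f"Slowest response: {max(timings):.2f}s")
```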
30. General automated testing utilities
• Provides general support for manual and automated
testing activities.
• Compares large amounts of data.
• Generates unique data sets from production data where
applicable.
• Can generate vast amounts of test data very quickly.
• Provides support to all previously-mentioned tool
categories.
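Two of these utility ideas sketched minimally, assuming flat-file extracts with one record per line; names and formats are illustrative.

```python
import itertools

def diff_extracts(path_a, path_b):
    # Compare large extracts as sets, so record order is irrelevant.
    with open(path_a) as a, open(path_b) as b:
        left, right = set(a), set(b)
    return left - right, right - left

def unique_accounts(prefix="9", start=1):
    # Endless stream of unique synthetic account numbers for test data.
    for n in itertools.count(start):
        yield f"{prefix}{n:09d}"
```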
31. Contemporary automation issues
Automation lag
• Refers to the time lost between the initial implementation of the
application, and the point at which automated functional scripts
can be used during testing.
• In future regression test cycles this ‘lag’ is significantly reduced and
may not exist at all.
33. Contemporary automation issues
(continued)
User acceptance testing and automation
• To perform successful user acceptance testing, the involvement of
the user is critical: in defining the test plans (i.e. the strategy), in
execution, and in validating results.
• Users responsible for acceptance are asked to complete the
scenario spreadsheet.
• The user is required to monitor a few initial script executions.
34. Automation benefit calculations
• Historically problematic to get accurate measures.
• Focus initially on quality.
• Difficult to quantify quality in monetary terms.
• Decision taken to focus on productivity improvement and
associated cost reduction.
35. Automation benefit calculation process
• Identify the smallest common element.
• In most cases it would be the parameters identified in the
scripting process.
• Compare effort (duration) associated with manual testing
with that of automated testing per parameter.
• Describe the improvement in monetary terms by using an
average resource cost.
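A worked example of this calculation, with illustrative figures only (not the institution's measured numbers):

```python
manual_min_per_param = 6.0       # measured manual effort per parameter
automated_min_per_param = 0.5    # measured automated effort per parameter
parameters_tested = 40_000       # parameters exercised in the period
hourly_resource_cost = 300.0     # average cost per tester hour (Rand)

saved_hours = ((manual_min_per_param - automated_min_per_param)
               * parameters_tested / 60)
saving = saved_hours * hourly_resource_cost
print(f"{saved_hours:,.0f} hours saved -> R{saving:,.2f}")
```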
36. Automation benefit
• Total cost saving for the period 2005 to end-2008:
R124 183 171.00 ($ 12 418 000 US)
• Benefit confirmed by system owners and users.
• Calculation method reviewed by the Finance department.
• A return of this nature justified the significant investment
made in both “Project Testing” and automation tools.
37. What have we learned?
• Having a test tool is not a strategy.
• Automation does not test in the same way that a manual tester does.
• Automated test scripts are software programs and must be treated as
such.
• The value lies in maintenance.
• Customise your toolset to suit your needs.
• Categorise your toolset, and design integrated solutions.