Test Automation Success
A case study
Case study
Who: Large financial institution
Where: Johannesburg, South Africa
When: 2006 to 2008
What: Adoption of test automation with the
aim of reducing costs, increasing
coverage and significantly
increasing productivity
Contents
• The challenges facing the organisation
• The approach to solving the problem
• How automation tools are classified, and why
• The automation process
• The automation framework
• Contemporary automation issues
• Automation benefit calculations
• Lessons learnt
Introduction
“ On the island of testers they were forever doomed to search for
what should not and could not exist, knowing to succeed would
bring misery to the gods.”
Lessons Learned in Software Testing
Cem Kaner
James Bach
Bret Pettichord
“ A fool with a tool is still a fool.”
Key Challenges
• No end-to-end defined automation process.
• Insufficient re-use of testing artefacts including the tools themselves.
• Testing was involved late in the process.
• No accurate documentation.
• Resistance to change.
• Difficult to prove automation benefits and cost justify the automation
tools.
• Dependency on key resources for specific application knowledge.
Solving the problem
• Formal project launched (“Project Testing”).
• Project had to be based on sound financials and commit to
significant savings.
• The project business case was approved at a high level in the
organisation.
• Failure was not an option.
• Staffed with skilled specialist resources.
• Driven by a person passionate about software quality and
productivity.
“Project testing”
• The project launched in May 2006.
• Project had an appropriate budget.
• Detailed business case was documented to facilitate return on
investment.
• Steering committee tracked progress on a monthly basis.
• Macro project broken down into specialist workgroups, one of which
was automation.
• Dedicated group responsible for communication and change
management.
“Project Testing” work-streams
• Test process work-stream.
• Test governance work-stream.
• Mechanisation and tools work-stream.
• Test repository work-stream.
• Test environment work-stream.
• Test change management, communication and training work-stream.
Mechanisation & tools work-stream.
• Automation process.
• Automation process integrated with SDLC.
• Tool selection, procurement and training were key.
• Automation framework.
• Tool classification.
• Quantification of automation benefits.
Automation process
The process comprises ten phases, each covered in turn below: Analysis & Design; Scripting & Configuration; Parameter definition; Parameter management; Scenario collection; Validation; Testing of scripts; Script execution; Review of results; Result communication.
Analysis and design
• Understand the solution's test requirements.
• Recommend an automated testing solution approach.
• Prototype the solution.
Scripting & Configuration
• Scripts configured with solution
requirements.
• Activities in this phase could
include test data discovery and
creation, capturing test
scenarios, scripting and where
necessary, tool configuration.
• Internal testing of automated
test solution.
Parameter identification
• Scripts assessed against user-defined scenarios.
• Components identified for
parameterisation.
• Components categorised based
on nature and function.
Parameter management
• Data created in previous steps
is managed.
• Customised spreadsheets are
created.
• Data in these sheets must be maintainable by non-technical
personnel.
• All scenarios are described in both technical and non-technical
terms.
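The parameter-management approach above amounts to data-driven testing: scenarios live in a sheet that non-technical staff can edit, and the scripts read them at run time. A minimal sketch in Python; the sheet columns and values are hypothetical, not the project's actual format.

```python
import csv
import io

# Hypothetical scenario sheet as non-technical staff might maintain it:
# one row per scenario, parameters as plain named columns.
SHEET = """scenario,account_type,amount,expected_result
Deposit to savings,savings,100.00,accepted
Overdraw cheque account,cheque,-5000.00,rejected
"""

def load_scenarios(sheet_text):
    """Parse the scenario spreadsheet into parameter dictionaries."""
    return list(csv.DictReader(io.StringIO(sheet_text)))

scenarios = load_scenarios(SHEET)
print(len(scenarios))          # 2
print(scenarios[0]["amount"])  # 100.00
```

Because the sheet is plain data, the same scripts can absorb new scenarios without any code change, which is where most of the re-use comes from.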
Scenario collection
• Spreadsheet populated with
scenarios provided by
stakeholders of the system.
• Manual test cases could provide
input.
• Actual production outage
experience used as input.
Validation
• Automated validation of results
important due to volume.
• Spreadsheet updated with
expected results.
• Validation done by script using
information from spreadsheets.
• Clear “Pass” or “Fail”
indicators provided.
• Summary of results provided.
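The validation step can be sketched as a loop that compares each scenario's actual outcome with the expected result recorded in the sheet, emitting per-scenario "Pass"/"Fail" verdicts and a summary. The names and the stubbed system under test are illustrative only.

```python
def validate(scenarios, run_test):
    """Compare each scenario's actual outcome with its expected result
    and return per-scenario verdicts plus a summary line."""
    results = []
    for s in scenarios:
        actual = run_test(s)  # execute the scenario against the system
        verdict = "Pass" if actual == s["expected"] else "Fail"
        results.append((s["name"], verdict))
    passed = sum(1 for _, v in results if v == "Pass")
    summary = f"{passed}/{len(results)} passed"
    return results, summary

# Usage with a stub standing in for the application under test:
scenarios = [
    {"name": "deposit", "expected": "accepted"},
    {"name": "overdraw", "expected": "rejected"},
]
results, summary = validate(scenarios, lambda s: "accepted")
print(summary)  # 1/2 passed
```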
Testing of scripts
• Like all new software, the
scripts must be tested.
• Important that when defects are reported during the actual test
cycle, the tool is above reproach.
• Scripts must be able to react to anomalies in a predictable
fashion.
• Must still be able to report
effectively on operational
scenarios.
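Treating scripts as software means giving them their own tests. A sketch of a script self-check, using a hypothetical helper that must map anomalous responses to a predictable verdict instead of crashing the run:

```python
def classify_response(raw):
    """Map a raw system response to a verdict. Anomalies must yield a
    predictable 'Error' rather than an exception (hypothetical helper)."""
    if raw is None or raw == "":
        return "Error"  # anomaly: no response at all
    if raw.strip().upper() == "OK":
        return "Pass"
    return "Fail"

# The script's own tests, run before any real test cycle so that
# defects reported later cannot be blamed on the tooling.
assert classify_response("OK") == "Pass"
assert classify_response("declined") == "Fail"
assert classify_response(None) == "Error"
print("script self-test passed")
```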
Script execution
• Scripts are run against the target application.
• Stability of system determines
support required.
• Strive to provide as much
meaningful data in the shortest
possible time.
Review of results
• Results are reviewed internally.
• Focus on abnormal failures due
to environmental or other
conditions.
• Checks are done to ensure data
integrity and scope coverage.
Result communication
• Test results are saved in a test
artefact repository.
• Screen dumps of all defects
encountered are kept.
• Benefit of automated approach
established and communicated.
Automated testing cost benefit example
[Chart: value in South African Rands (0 to 5 000 000) plotted against regression cycles (0 to 15), comparing the Investment curve with the Benefits curve.]
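The break-even point in a chart like this can be reproduced with a small calculation: a fixed initial investment plus a per-cycle automated cost, against a per-cycle manual cost. The figures below are illustrative, not the case study's actuals.

```python
def break_even_cycle(initial_investment, cost_per_cycle_auto,
                     cost_per_cycle_manual, max_cycles=50):
    """Return the first regression cycle at which the cumulative manual
    cost avoided covers the cumulative automation spend, or None."""
    for n in range(1, max_cycles + 1):
        spend = initial_investment + n * cost_per_cycle_auto
        benefit = n * cost_per_cycle_manual
        if benefit >= spend:
            return n
    return None

# Illustrative figures in Rands (not the project's actual numbers):
print(break_even_cycle(1_500_000, 50_000, 300_000))  # 6
```

The shape matches the chart: benefits overtake investment only after several regression cycles, which is why a one-off test effort rarely justifies automation.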
Categorising our toolset
Why?
This is about using the right tool for the right job. The value of
categorising tools lies in being able to provide appropriately
customised, automated testing solutions which, when complemented
by manual testing, mitigate both the product and project risks
associated with solution deployment.
Categorising our toolset
(continued)
[Diagram: the tool categories (Preparation; Front-end user interface, Functional and Non-Functional; Back-end network interaction, Functional and Non-Functional) positioned around the front-end, middleware and back-end layers of the solution.]
Categorised toolset
• Preparation
• Front-end user interface (Functional)
• Back-end or network interaction (Functional)
• Front-end user interface (Non-functional)
• Back-end or network interaction (Non-functional)
• General automated testing utilities
Preparation
• Configure appropriate
tools
• Plan automation strategy
• Verify identified scenarios
Front-end user interface (Functional)
• Simulate user interaction
with applications.
• Provide support for unit,
integration, system and
acceptance testing.
• Regression testing
particularly suits the
nature of these tools.
Back-end or network interaction
(Functional)
• Ability to simulate user
interaction in the
absence of a front-end.
• Supports bottom-up
integration.
• Provides the ability to
find defects much
sooner in the SDLC.
• Valuable in testing SOA
implementations.
Front-end user interface (Non-functional)
• Ability to generate the
required load for
performance, load and
stress testing.
• These types of tests typically
cannot be performed manually.
• Tool is dependent on the
functional quality of the
application under test.
• Test environment compared
with production.
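Load tools in this category drive the application with many concurrent virtual users. A minimal thread-based sketch against a stubbed transaction; a real tool adds ramp-up, pacing and far richer metrics.

```python
import threading
import time

def run_load(transaction, virtual_users, iterations):
    """Drive `transaction` from many concurrent virtual users and
    collect the observed response times."""
    timings = []
    lock = threading.Lock()

    def user():
        for _ in range(iterations):
            start = time.perf_counter()
            transaction()
            elapsed = time.perf_counter() - start
            with lock:
                timings.append(elapsed)

    threads = [threading.Thread(target=user) for _ in range(virtual_users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return timings

# Stubbed transaction standing in for the application under test:
timings = run_load(lambda: time.sleep(0.001), virtual_users=10, iterations=5)
print(len(timings))  # 50
```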
Back-end or network interaction
(Non-functional)
• Due to the nature of
implementation, it is not
practical to use the front-end
to generate the required load
(ATM, point-of-sale devices).
• Typically these kinds of
tests cannot be performed
manually.
• Test environment compared
with production.
• Valuable in testing the
performance of services
in SOA implementations.
General automated testing utilities
• Provides general support for manual and automated
testing activities.
• Compares large amounts of data.
• Generates unique data sets from production data where
applicable.
• Can generate vast amounts of test data very quickly.
• Provides support to all previously-mentioned tool
categories.
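A bulk data-comparison utility of this kind can be sketched as a keyed diff between an expected and an actual record set; the record shape and field names are hypothetical.

```python
def diff_datasets(expected, actual, key):
    """Compare two record sets keyed on `key`; report records missing
    from the actual set, unexpected records, and field mismatches."""
    exp = {r[key]: r for r in expected}
    act = {r[key]: r for r in actual}
    missing = sorted(exp.keys() - act.keys())
    unexpected = sorted(act.keys() - exp.keys())
    mismatched = sorted(k for k in exp.keys() & act.keys()
                        if exp[k] != act[k])
    return missing, unexpected, mismatched

# Illustrative records:
expected = [{"id": 1, "balance": 100}, {"id": 2, "balance": 250}]
actual = [{"id": 1, "balance": 100}, {"id": 3, "balance": 99}]
print(diff_datasets(expected, actual, "id"))  # ([2], [3], [])
```

Indexing both sides by key keeps the comparison linear in the number of records, which matters at the data volumes these utilities handle.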
Contemporary automation issues
Automation lag
• Refers to the time lost between the initial implementation of the
application, and the point at which automated functional scripts
can be used during testing.
• In future regression tests this ‘lag’ is significantly reduced and
may not even exist at all.
Automation lag
[Timeline diagram: automation activities (Analysis, Data Prep, Scripting, Parameters, Spreadsheet, Testing, Validation, Script Execution, Feedback) run from Design & Build onwards. Once the application is available for testing, manual test execution starts immediately, while automated testing begins only after the automation lag, ahead of Implementation.]
Contemporary automation issues
(continued)
User acceptance testing and automation
• To perform successful user acceptance testing, the involvement of
the user is critical – both in defining the test plans (i.e. the
strategy) and in executing and validating the results.
• Users responsible for acceptance are asked to complete the
scenario spreadsheet.
• The user is required to monitor a few initial script executions.
Automation benefit calculations
• Historically problematic to get accurate measures.
• Focus initially on quality.
• Difficult to quantify quality in monetary terms.
• Decision taken to focus on productivity improvement and
associated cost reduction.
Automation benefit calculation process
• Identify the smallest common element.
• In most cases it would be the parameters identified in the
scripting process.
• Compare effort (duration) associated with manual testing
with that of automated testing per parameter.
• Describe the improvement in monetary terms by using an
average resource cost.
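The per-parameter comparison described above can be written out directly; the parameter volume, effort figures and R450/hour rate below are illustrative, not the project's actuals.

```python
def automation_saving(parameters, manual_mins_per_param,
                      auto_mins_per_param, cost_per_hour):
    """Monetise the productivity gain: minutes saved per parameter,
    scaled by volume and an average resource cost per hour."""
    saved_minutes = parameters * (manual_mins_per_param - auto_mins_per_param)
    return saved_minutes * cost_per_hour / 60

# Illustrative: 10 000 parameters, 6 min manual vs 0.5 min automated,
# at an average resource cost of R450/hour.
print(automation_saving(10_000, 6.0, 0.5, 450))  # 412500.0
```

Summing this figure across applications and regression cycles is what produces the headline benefit number on the next slide.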
Automation benefit
• Total cost saving for the period 2005 to end-2008:
R124 183 171.00 (approximately US$12 418 000)
• Benefit confirmed by system owners and users.
• Calculation method reviewed by the Finance department.
• Return of this nature justified the significant investment
made in both project testing and automation tools.
What have we learned?
• Having a test tool is not a strategy.
• Automation does not test in the same way that a manual tester does.
• Automated test scripts are software programs and must be treated as
such.
• The value lies in maintenance.
• Customise your toolset to suit your needs.
• Categorise your toolset, and design integrated solutions.
Thank you