1. Testing Metrics
Process, Project, and Product
Sponsored by
Version 1.7: 30 June 2014
References not given on individual slides are listed at the end of the presentation
2. Agenda – Day 1
Why Metric ?
Process for Defining Metrics
– How should we develop metrics ?
Exercise – on basis of your understanding
– Create process, project and product metrics and set of goals for
those metrics
Process Metrics
– Use of Process Metrics
– Developing Good Process Metrics
– How Test Process Metrics can measure the software process
– Process Metrics in Test Dashboard and Assessment
3. Agenda – Day 2
Project Metrics
– Use of Project Metrics
– Best practices developing good Project Metrics
– Understanding Balance in Project Metrics
– Project Metrics for Test Management
Exercise
Product Metrics
– Use of Product Metrics
– Best Practices developing good Product Metrics
– Product Risk Metrics
Exercise
Dashboard
Other new Trends
4. Steering Committee Member, Agile Testing Alliance
Co-Author of a book on Selenium
Certified Trainer Agile Testing - ATA,Qualified Project
Management Professional (QPMP), Six Sigma Black Belt,
ISTQB certified foundation and advanced level Tester, Sun
Certified Java Programmer, Presenter in International
conferences on Project Management, Quality and Testing
19+ years of IT Industry experience with:
Larsen & Toubro Infotech Ltd, India, NSE.IT (National Stock Exchange, India)
Celox Networks, USA, Netscout Systems, USA
BE CSE, MBA Finance from UMASS Lowell.
Principal Consultant
Aditya Garg
Aditya has been an automation test architect and principal consultant excelling in designing, strategizing and
architecting manual and automated testing solutions. His primary focus has been establishing and leading
testing centers of excellence and practices, managing large IT projects, undertaking testing process studies using Six
Sigma, CMMi and TMM models, proposing QA solutions, performance engineering and architecture reviews, designing
automation frameworks, exploring open-source test automation tools/frameworks, and business development.
Aditya's current research area is the use of pairwise testing in agile projects, especially in optimizing test automation
regression packs built around BDT with tools like Cucumber, Selenium and Capybara.
5. Why Metric ?
Allow us to measure attributes and help us understand them
Help in decision making
Allow us to know whether our decisions are actually right –
by assessing consequences
They are rational in nature unless forged (prefer
automated metrics reports)
Can identify areas that have the highest rework rates
Some Myths
– Metrics are a waste of time when everything is going right
– I am already updating everyone in my team, so why track?
– Not everything that you measure matters, and not everything
that matters can be measured
– Let me be a little reasonable
Confidential | Copyright © QAAgility Technologies Pvt Ltd
6. Takeaways
Testing in isolation has no value; to have value it must be
effectively communicated
Make people aware of facts rather than assumptions (give
quantitative figures)
Talk about the impact of an attribute
Show breakdowns into sub-attributes – e.g. bugs by severity
and priority
As part of regular status reporting, use dashboards (focused
on process, project or product)
8. How to Develop a Metric?
Use a top-down rather than a bottom-up approach
Ask yourself: do we have well-defined,
realistic, documented, agreed-upon objectives
for our testing process?
It is not enough to just have a metric
Baselining and benchmarking
9. Process for deriving metrics
1. Define objectives/Goals – identify business goals; determine the purpose, focus, need and context of each goal
2. Ask Questions – clarify and refine the goal, identify sources; look for an effective, efficient and elegant way to realize the objective
3. Devise Measurable Metrics – meet stakeholders, remove ambiguities; use quantitative, direct and indirect measures
4. Implementation and Interpretation – preparation; collection; analysis and interpretation
5. Improve – identify areas of improvement; improve effectiveness, efficiency and elegance
11. Goal Driven Top down approach
12. Typical/Generic Goals and Objectives
• Find Bugs, especially important ones
• Build confidence in the product
• Reduce Risk of post-release failure
• Provide useful timely information
about testing and quality
13. E.g.
Objective: I want to know whether we have finished finding
new bugs – and, obviously, resolving known bugs.
- As a metric we can plot the trend of bug discovery over time
during test execution. Our goal is flattening of the cumulative
bugs-opened curve
14. e.g. 2
Question : Objective for testing: build confidence in the system!
How can we measure confidence directly or indirectly?
- The more thorough testing we have done, the more confident we
are!
So we are talking about coverage here. It's tricky!
- Code Coverage
- Design Coverage
- Configuration Coverage
- Test Design Technique Coverage
- Requirement Coverage and more …
15. e.g. contd..
So our metric has three elements
– How many requirements are completely tested without any failures?
– How many requirements have failures?
– How many requirements are untested?
Our GOAL is to help collect and analyse quantitative data for building
confidence
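The three elements can be computed from a simple mapping of requirements to their worst test outcome. A sketch with hypothetical requirement IDs and status labels (both are assumptions for illustration):

```python
from collections import Counter

def requirement_coverage(req_status):
    """req_status maps requirement id -> 'passed' | 'failed' | 'untested'.
    Returns counts for the three elements of the metric."""
    counts = Counter(req_status.values())
    return {
        "tested_without_failures": counts["passed"],
        "with_failures": counts["failed"],
        "untested": counts["untested"],
    }

status = {"REQ-1": "passed", "REQ-2": "failed",
          "REQ-3": "passed", "REQ-4": "untested"}
print(requirement_coverage(status))
# {'tested_without_failures': 2, 'with_failures': 1, 'untested': 1}
```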
21. Use of Process Metrics
Help us understand process capabilities and measure
process effectiveness & efficiency
Benchmarking your process metrics against industry norms
can help you see where your process stands in comparison
Don't link these metrics with a team or an individual – that is a wrong notion
22. Process for deriving metrics for Testing Process
1. Define objectives/Goals – identify business goals; keep the testing process in mind
2. Ask Questions – clarify and refine the goal; look for an effective, efficient and elegant way to realize the objectives of the testing process
3. Devise Measurable Metrics – find quantitative, direct and indirect measures satisfying the testing process objectives
4. Implementation and Interpretation – preparation; collection; analysis and interpretation
5. Improve – identify areas of improvement; improve effectiveness, efficiency and elegance
23. Developing Good Process Metrics - Exercise
Let us start developing a metric that evaluates the test process
effectiveness at accomplishing its bug finding
objective
(Think Testing)
24. Developing Good Process Metrics – Simple Metric
A good testing effectiveness question is:
"What % of the bugs present in a system during testing is found
by the testing process?"
Answer : Defect Detection Effectiveness (DDE), also called Defect Detection Percentage (% DDP)
DDE = Defects Detected / Defects Present
(Defects Present – counting all defects found in and subsequent to the testing activity)
DDE is measured at the final stage of the lifecycle before release
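The DDE ratio above can be sketched directly. The defect counts are hypothetical, for illustration only:

```python
def defect_detection_effectiveness(found_in_testing, found_after_testing):
    """DDE = defects detected by testing / defects present, where 'present'
    counts all defects found in and subsequent to the testing activity."""
    defects_present = found_in_testing + found_after_testing
    return found_in_testing / defects_present

# Hypothetical counts: 170 defects caught in testing, 30 escaped to the field.
print(f"DDE = {defect_detection_effectiveness(170, 30):.0%}")  # DDE = 85%
```

Note that the denominator is only fully known some time after release, which is why DDE is a lagging indicator of the test process.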
25. Developing Good Process Metrics – Simple Metric
With our metric in place, let's set some goals now
Based on our assessment of a number of clients, typical defect
detection effectiveness is 85% (it can never exceed 100%)
Obvious thinking: if it drops below 85% we should improve
effectiveness and develop a process improvement plan
(don't link it with individuals)
If you look at the example, this process metric shares a similar objective
with a project metric as well (it lets us see the improvement)
26. How Test Process Metrics can measure the Testing process
Defect Acceptance
This metric determines the number of valid defects that the testing team
has identified during execution.
The value of this metric can be compared with the previous release
to get a better picture
Defect Acceptance = (Number of Valid Defects / Total Number of Defects) * 100 %
27. How Test Process Metrics can measure the Testing process
Defect Acceptance
28. How Test Process Metrics can measure the Testing process
Defect Rejection
This metric determines the number of defects rejected during
execution.
It gives the percentage of invalid defects the testing team has
opened, which one can control whenever required
Defect Rejection = (Number of Defects Rejected / Total Number of Defects) * 100 %
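Defect Acceptance and Defect Rejection are complements over the same total, so they can be computed together. A sketch with hypothetical release figures:

```python
def defect_acceptance(valid_defects, total_defects):
    """% of reported defects accepted as valid."""
    return valid_defects / total_defects * 100

def defect_rejection(rejected_defects, total_defects):
    """% of reported defects rejected as invalid."""
    return rejected_defects / total_defects * 100

total, valid = 120, 102                        # hypothetical counts
print(defect_acceptance(valid, total))         # 85.0
print(defect_rejection(total - valid, total))  # 15.0
```

If every defect is either accepted or rejected, the two percentages sum to 100, which is a useful sanity check on the extracted tracker data.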
29. How Test Process Metrics can measure the Testing process
Defect Rejection
30. How Test Process Metrics can measure the software process
efficiency
A long defect detection-to-closure time delays software
delivery
A good metric for bug resolution is the Defect Closure Period
(DCP), tied to an SLA
DCP = date(resolution) – date(discovery)
It is also known as Age of Defects
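Since DCP is just a difference of two dates, it maps directly onto date arithmetic. The specific dates below are hypothetical:

```python
from datetime import date

def defect_closure_period(discovered, resolved):
    """Age of a defect: days between discovery and resolution."""
    return (resolved - discovered).days

# Hypothetical defect discovered June 2 and resolved June 9, 2014.
print(defect_closure_period(date(2014, 6, 2), date(2014, 6, 9)))  # 7
```

Averaging this over all closed defects in a period, and comparing against the SLA, gives the trend the slide refers to.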
32. Defect Age and Testing Process efficiency?
Average Defect Turnaround Time (verification by testers)
Once a defect is fixed – how soon does the testing team verify
it?
33. Defect Age and Testing Process efficiency?
Average Response Time (response from testers)
After a defect is analyzed and queries are raised by the development
team, how quickly does the testing team respond to those queries?
34. How Test Process Metrics can measure the software process
Another test process metric
– Bug Opened Count (tracks the number of times each bug is
opened)
– The count is set to 1 when the bug is first submitted and incremented each
time the bug is reopened
35. How Test Process Metrics can measure the software process
In the example above we can see that 17% of the bug reports
failed the confirmation test at least once and were reopened
Only 83% of the bug reports had an opened count of one (1)
If we assume that each additional confirmation test and
regression test associated with each bug fix required one
hour of effort:
Rework Inefficiency = 1*112 + 2*26 + 3*6 + 4*5 + 10*1 = 212 (hours)
About 9% of the planned test effort was consumed by this inefficiency, if we take
the original test effort as 10 people for 6 weeks.
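The rework calculation can be reproduced from the opened-count histogram. The histogram values are the slide's; the 40-hour work week used to turn "10 people for 6 weeks" into hours is an assumption:

```python
# Opened-count histogram from the slide: opened count -> number of bug reports.
opened_counts = {1: 112, 2: 26, 3: 6, 4: 5, 10: 1}

# One hour of confirmation/regression effort per opening (slide's assumption).
rework_hours = sum(count * bugs for count, bugs in opened_counts.items())
print(rework_hours)  # 212

# 10 testers for 6 weeks, assuming a 40-hour week.
planned_hours = 10 * 6 * 40
print(f"{rework_hours / planned_hours:.0%}")  # 9%
```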
36. Exercise
Revisit the Testing Process and think of any other way you
would like to increase
Effectiveness of the process
Efficiency of the process
Summarize findings and discussions.
39. Project Metrics & Use of it
Help us understand our status in terms of the progress of
testing
Understanding the current project status is a prerequisite to
rational, fact-driven project management decisions
41. Cost of testing phase wise
42. Cost of testing per component/area
44. Project Metrics: Measuring Test Progress curve
Reference: http://www.mindlance.com/
45. Project Metrics: Test Progress curve (Score)
Reference: http://www.mindlance.com/
46. Can we show progress in terms of planned
and actual tests completed?
48. Metric Issues
Metrics derived using: Test Case Execution
Mistake
– Tracking test progress only by the number of test cases executed over
a time frame
Mitigation
– This fails to represent the true project state because clustering of
functions is common across many applications. Around 75% of the
execution surrounds prerequisites and test environment setup.
Reaching the function cluster takes time, and many managers do not wish
to consider the setup time a milestone in testing. Testers and
reviewers can overcome the problem by spreading the
functionality evenly across test cases rather than concentrating it in the last
few test cases. Projects can also consider assigning proportionate
weights according to the complexity of the program.
49. What do you mean by productivity ?
This metric gives the test case execution productivity, which on further analysis can
give a conclusive result.
Test Execution Productivity = (Total No. of TCs executed (Te) / Execution Effort (hours)) * 8 Execution(s)/Day
Similarly we can calculate productivity for preparation too
50. Project Metric for Test Management
Tell us which metrics you use for Test Management
When you write them down, ask yourself: "Is it taking us more effort or less
effort to execute our test cases?"
51. The two most important metrics are:
(they ensure test execution is proceeding well)
– Planned Test Hours – shows the number of test case execution
hours planned for each day
• Calculated as the total number of test case hours planned per week
divided by 5
– Actual Test Hours – shows the number of test case execution
hours actually achieved each day (you can use a tool here to
get the value)
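The planned-versus-actual comparison above is simple arithmetic. A sketch; the weekly plan of 200 hours and the daily actuals are hypothetical numbers a tool might report:

```python
def planned_test_hours_per_day(weekly_planned_hours, workdays=5):
    """Planned daily execution hours: weekly planned hours / workdays."""
    return weekly_planned_hours / workdays

planned = planned_test_hours_per_day(200)   # hypothetical weekly plan
actual = [38, 41, 35, 44, 40]               # hypothetical daily actuals
print(planned)                              # 40.0
print(sum(actual) - planned * len(actual))  # -2.0 -> slightly behind plan
```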
53. Exercise
As a Test Manager, I want to find out whether the time has come to stop
testing.
How do I do that?
54. Bugs opened and closed on a project
55. What's the point ?
A single chart can provide a graphical view of a number of
metrics
All five metrics shown in the last diagram give us a sense of
the progress of bug management for this project
56. Understanding Balances in Project Metrics - Examples
If the total daily opened and average daily opened metrics
don't show a trend towards zero newly discovered bugs,
then the quality of the system under test is not improving.
If the total daily closed and average daily backlog metrics do
not show a trend towards zero unresolved bugs, then the
quality of the system is not sufficient for release
Ensure that all the test cases (especially the critical ones) have
been run and currently pass
60. Is there a way to show trends in effective test case
preparation ?
62. Product Metrics
Product Metrics are often forgotten during testing but help us
understand the quality status of the system under test
Exercise
Suppose we give you the following information:
- 95% of the tests have been run
- 90% of the tests have passed
- 5% of the tests have failed
- 4% of the tests are ready to run
- 1% of the tests are blocked
Assuming we are on track with test execution, does this tell us
good or bad news?
63. Product Metrics - Balancing the Project Metrics
Building confidence that the system will work properly
(critical use cases)
Achieving a sufficiently low residual level of quality risk
(critical quality risks)
64. Key Objectives for Product Metrics
Test Coverage related Objective
Quality related Objectives for the product
Any other ?
EXERCISE
How would you measure these ?
65. Best Practices developing good Product Metrics
Two of the realistic goals that we should consider are:
– Requirement Coverage
– Risks to Quality
Every requirement needs to be tested prior to the release
Requirement Coverage
67. Product Risk Metrics
In Risk-Based Testing the objective is typically to reduce
product quality risk to an acceptable level
Two questions:
– How effectively are we reducing quality risk overall?
– For each quality risk category, how effectively are we reducing
quality risk?
Have bidirectional traceability between tests and risk items
so that coverage can be measured and assured
During test execution, test runs and defects are reported.
Have bidirectional traceability from test results to risk items and
from defects to risk items
69. Measuring Risk
RISK = Likelihood * Impact
– High impact and high likelihood – needs to be addressed ASAP
– Low impact and low likelihood – needs no action
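The RISK = Likelihood * Impact rule is easy to sketch with an ordinal scale. The 1–3 level values are an illustrative assumption; many teams use larger scales:

```python
LEVELS = {"low": 1, "medium": 2, "high": 3}  # assumed ordinal scale

def risk_score(likelihood, impact):
    """RISK = Likelihood * Impact on a simple ordinal scale."""
    return LEVELS[likelihood] * LEVELS[impact]

print(risk_score("high", "high"))  # 9 -> address ASAP
print(risk_score("low", "low"))    # 1 -> needs no action
```

Sorting risk items by this score gives the test-priority order that the quadrant picture on the next slide expresses visually.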
70. Functional Risk Matrix - MoSCoW
[Figure: a 2x2 matrix of Impact vs. Likelihood with quadrants labelled A-D: "Must Test", "Should Test", "Could Test" and "Won't Test". Each quadrant is mapped to increasingly rigorous techniques, from informal test specification and error guessing at the low end, through EP/BVA, decision tables and pair inspection, up to formal test specification with branch coverage and 70-100% statement coverage at the high end.]
71. Quality Risk Status - End of Test Execution
The green region is rapidly
increasing
At this juncture the testers
focus on running confidence-
building tests (turning black
to green) and the developers
fix the bugs that were found
(turning red to green)
End of the Project
72. Group Exercise - 20 minutes
Using the sample representation below – design a quality risk coverage Dashboard
73. Defect Severity Index
DSI(Open) = Σ(Severity Index * No. of Open Valid Defect(s) for that severity) / Total Number of Open Valid Defects
Defect Severity Index Trend
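The DSI formula is a weighted average of open valid defects by severity index. A sketch; the 1–4 severity scale and the defect counts are illustrative assumptions:

```python
def defect_severity_index(open_valid_defects):
    """open_valid_defects maps a severity index (e.g. 4=showstopper .. 1=minor)
    to the number of open valid defects of that severity."""
    total = sum(open_valid_defects.values())
    weighted = sum(sev * n for sev, n in open_valid_defects.items())
    return weighted / total

# Hypothetical open-defect counts per severity index.
print(defect_severity_index({4: 2, 3: 5, 2: 10, 1: 3}))  # 2.3
```

Plotting this value per day or per build produces the trend chart the slide mentions: a falling DSI means the open backlog is shifting towards less severe defects.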
74. Another one – please analyse
79. Mostly tool or excel based..
[Sample TEST EXECUTION DASHBOARD for a core-banking project, as of Day 8: a module-wise test execution progress table (total cases, and planned / executed / on-hold / passed / failed counts for the day and cumulatively, plus % of test cases executed vs. planned so far and % of failed test cases; 6604 cases in total, 75% executed to plan, 6% failed); an incident report with status counts (assigned, closed by business, closed by testing, deferred, fixed, raised, re-raised, retest; 298 in total); incident severity counts (showstopper 17, critical 152, major 78, minor 51); and downtime in person-hours by cause (hardware, functionality, application non-availability, other environment issues; total 300 hours, 24%).]
80. Mostly tool or excel based..
Ref: http://www.inflectra.com/SpiraTest/
82. Exercise
Compare the two dashboards – suggest which one is better and why ?
85. Exercise
Prepare a table of contents for a Test Summary Report or Test
Closure Report
Time: 20 minutes
Discussion: 5 minutes
87. Metric Issues
Metrics derived using: Defects
Mistake
– Setting targets on the number of defects to be found per unit time and
linking personnel appraisals, awards and recognition to the number
of defects found
Mitigation
– Defect targets reduce the efficiency of testing and divert focus to
inessential aspects rather than to improving quality from the
customer's viewpoint. Worst of all is the build-up of low morale
within the team. Defect counts can indicate that product quality is
not up to the mark, but they never indicate the performance of the
team. A more positive approach is to use this base metric, "Defect",
to look for improvements in test techniques
88. Metric Issues
Metrics derived using: Defects
Mistake
– Wrong assignment of severity and priority to defects
Mitigation
– Managers are often misguided when setting the priority for solving a
problem. For example, a typo in a label on a GUI would probably be
classified as a "Cosmetic" defect and given low priority, yet in some
scenarios that defect may cause serious damage to the user. Giving
all team members an end-to-end functional view of the application
helps them assign defect-fixing priorities correctly and thereby
increases efficiency
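The severity-versus-priority distinction above can be illustrated with a small sketch; the defect records are hypothetical:

```python
# Severity describes technical impact; priority describes fix urgency.
# A "Cosmetic" typo can still deserve high priority, as in the GUI
# label example on the slide. All defect records are hypothetical.

defects = [
    {"id": 1, "summary": "Typo in 'Transfer' button label",
     "severity": "Cosmetic", "priority": "High"},   # misleads users badly
    {"id": 2, "summary": "Crash on empty search",
     "severity": "Critical", "priority": "High"},
    {"id": 3, "summary": "Misaligned footer logo",
     "severity": "Cosmetic", "priority": "Low"},
]

# Priority, not severity, drives the fix queue.
order = {"High": 0, "Medium": 1, "Low": 2}
fix_queue = sorted(defects, key=lambda d: order[d["priority"]])
print([d["id"] for d in fix_queue])
```

Note that a cosmetic-severity defect sits at the top of the queue because its priority is high.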
89. Metric Issues
Metrics derived using: Cost and Time
Mistake
– Not considering static processes such as reviews in testing cost and
time estimation
Mitigation
– Most managers realize that static processes play an important role
(nearly 25% of defects can be found there), yet remain reluctant to
include them. The difficulty magnifies when re-estimation is not
done even after this realization. Estimation mostly uses ratios
obtained from historical data, such as test cost to total cost and
actual cost to budgeted cost. Projects fare better when breathing
time for the recovery process is planned well ahead
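The estimation ratios described above can be sketched as follows; all project figures, and the 25% review overhead, are hypothetical illustrations:

```python
# Sketch of the historical ratios used in test estimation:
# test cost to total cost, with a reserve for static processes
# (reviews) and recovery. All figures are hypothetical.

# Historical projects: (test cost, total project cost)
history = [(120, 480), (90, 360), (150, 500)]
avg_ratio = sum(test / total for test, total in history) / len(history)

# Estimate test cost for a new project from its total budget,
# reserving extra "breathing time" for reviews and recovery.
new_total_budget = 600
review_overhead = 0.25          # reserve reflecting review effort
estimated_test_cost = new_total_budget * avg_ratio * (1 + review_overhead)
print(f"Estimated test cost: {estimated_test_cost:.1f}")
```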
90. Metric Issues
Metrics derived using: Cost and Time
Mistake
– Team members failing to report findings or issues to managers in
time, causing recalculation overhead
Mitigation
– Educating testers on the test process and on the timeframe for
reporting proves beneficial
97. Throughput – outcome in terms of delivered features / Value points
Reference: Agile Metrics that matter: Erik Weber
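Throughput as defined above can be computed per iteration; a minimal sketch with hypothetical value-point data:

```python
# Throughput: delivered outcome in value points per iteration.
# All value-point figures are hypothetical.

iterations = {
    "Sprint 1": [3, 5, 2],      # value points of features delivered
    "Sprint 2": [8, 1],
    "Sprint 3": [5, 5, 3],
}

throughput = {name: sum(points) for name, points in iterations.items()}
avg = sum(throughput.values()) / len(throughput)
print(throughput, f"average {avg:.1f} points/iteration")
```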
99. New metric trend – Code Analysis
Reference: http://www.slideshare.net/mgaewsj/agile-kpis-5853270
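As a flavor of the code-analysis trend, here is a deliberately simple metric, comment density, computed over Python source; it is an illustrative stand-in for the richer static-analysis metrics such tools report:

```python
# Sketch of a simple code-analysis metric: comment density
# (comment lines / non-blank lines) for Python source.

def comment_density(source: str) -> float:
    lines = [l.strip() for l in source.splitlines() if l.strip()]
    if not lines:
        return 0.0
    comments = sum(1 for l in lines if l.startswith("#"))
    return comments / len(lines)

sample = """\
# add two numbers
def add(a, b):
    # trivial body
    return a + b
"""
print(f"comment density: {comment_density(sample):.2f}")
```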
101. Good Metrics
• Simple and Effective
• Simple to gather, calculate and understand, and effective at
measuring what we want
• Efficient
• Produced without too much work
• Elegant
• Well presented:
• Snapshot in table format
• Trend over a period of time
• Analysis of causes and relations
• Sufficient, Diversified and Balanced
• Together the metrics give more complete and balanced information
106. References
http://www.inflectra.com/SpiraTest/
commontestsense.blogspot.com
http://www.slideshare.net/BimleshGundurao/agile-metrics-atpmi-bangalore
help.rallydev.com
http://www.slideshare.net/pporchuk/creating-qa-dashboard
"Realizing Efficiency and Effectiveness in Software Testing through a Comprehensive Metrics Model", article by Infosys
"Testing Metrics", article by Mindlance
"Landmines of Software Testing Metrics", article by HCL
http://www.developsense.com/blog/2012/02/braiding-the-stories/
http://www.developsense.com/blog/2012/02/delivering-the-news-test-reporting-part-3/
"12 Steps to Useful Software Metrics", Linda Westfall, The Westfall Team
"Software Test Metrics: Key Metrics and Measures for Use within the Test Function", discussion document by Mark Crowther, Empirical Pragmatic Tester
"Metrics for Software Testing: Managing with Facts, Parts 1-4: The Why and How of Metrics", Rex Black Consulting Services (www.rbcs-us.com)
108. We are a TESTING training company that
brings to you -
• Thought leadership in the testing area
• A book on Selenium published by Tata McGraw Hill
• Licensed Agile Testing trainers for ATA
• The people behind the ITB Mumbai chapter and the TeStride
Mumbai conference
• Experience training more than 3000 professionals in testing
across major IT companies in India and abroad
• Practical insight into every training assignment, drawn from
18-19 years of working in and managing testing for large
multinationals
AND WE ARE PASSIONATE ABOUT IT!
109. Offerings - Comprehensive training programs in QA/Testing area
• Agile Testing Alliance - Certifications
• CP-BAT, CP-MAT, CP-AAT, CP-AAST
• Specialized Workshops/Niche Trainings
• Practical Test Strategy Formulation
• Architecting Testing Solutions
• Risk Based Testing
• Certified Agile Tester and Agile Testing
• ISTQB Certification
• Foundation Level
• Advanced Level
• Test Automation
• QTP
• Foundation
• Advanced
• Framework driven
• Selenium
• Foundation
• Advanced
• Test Management
• Quality Center
• Test Link
• Performance Testing
• Load Runner
• Silk Performer
• OpenSTA, JMeter
• Testing and QA Processes
• Testing Metrics
• Extended programs and diploma
on Testing Talent Development
• Test Automation
• Manual Testing
• Testing Leaderships
• Customized Corporate Trainings
• Specific Testing topic
• Web based Application Testing
• SOA Testing
• DB Testing
• GHTester, SOAP UI
• Tailored to project and
organization needs
111. Certified Professional –
Master Agile Testing
The surest way to master agile testing:
Pick up newer ways of doing testing
Differentiate how old ways of testing may
not work in swift-moving agile projects
Learn optimized test design, essential to
reduce defect leakage in a project where
time is always a constraint:
Mind Map test design technique
Pairwise/combinatorial techniques
Exploratory test design technique
Increase agility in finding defects
Real agile project (multiple drops, multiple
sprints)
112. Certified Professional –
Automation Agile Testing
The surest way to learn cutting-edge
automation trends:
Understand the huge importance of
automation in today's testing world
Hands-on BDD, ATDD and TDD
Practice automation from concept through
regression and test-hardening iterations
using:
A real case study
Real tools such as Cucumber, FitNesse,
Selenium and Hudson/Jenkins