Software Engineering Practice - Software Quality Management
McGill University ECSE 428 © 2004 Radu Negulescu
About this course module
Quality assurance is:
• A necessary condition for success in a software project
• Complex and costly: explicit costs (testing, reviews) and hidden costs
(lumped in with development or planning)
SQA requires elaborate techniques and planning:
• In general, a vast topic
• Here we discuss the main points most likely to impact your practice
• A bit more depth than 321
Recommended reading:
• Jalote ch. 7 “Quality planning and defect estimation”, ch. 12 “Peer
review”
• McConnell Survival Guide, ch. 9 “Quality assurance”, ch. 15 “System
testing”
• McConnell Rapid Dev., ch. 18 “Daily build and smoke test”
Overview
Basic concepts
Test case design
• Engineering principles
• Advanced techniques
Formal reviews
QA planning
Test automation
QA notions
Recall the following notions
• Bugs: failure, fault, defect, error
• Major QA types: verification vs. validation
Quality may refer to:
• System crashes
• Stated specifications
• Unstated user expectations
• “Degree to which the software satisfies both stated and implied
requirements” [McC, survival guide]
• Delivered defect density: de facto standard
What is considered a defect?
Defect = cause of inconsistency with requirements or needs of the
customer
• In several artifacts (SRS, design document, code)
• Several types: Jalote p. 269
• Possible severity scheme [after Jalote p. 270]:
Critical = show stopper: the user cannot carry out a function, or the
whole project schedule is affected
Major = high impact, but not a show stopper: spans many modules, or
stops the user from proceeding (with a workaround available)
Minor = inconveniences the user, but does not stop them from proceeding
Cosmetic = in no way affects performance or function
Example: grammar mistakes or misaligned buttons
Purpose of QA
Enhance quality
• Detect/manifest defects (search for failures)
• Locate/identify defects (debugging)
• Eliminate defects (a development job, but supported by QA follow-up)
Measure quality
• E.g. estimate reliability
Communicate about quality
• Raise issues
• Provide baseline data
• Psychological effect
• Morale (developer and customer)
A business perspective
We pay our QA department to:
• Detect defects in-process (before our customers do!)
• Provide objective input for business decisions
• Keep stakeholders aware of concerns related to shipping a product
QA activities
[Survival guide]
Debugging, code tracing
• E.g. trace just before an integration
Defect tracking
• For each defect record dates, conditions, etc.
• Statistics on code, developers, QA activities
Unit testing: executed by developer of unit
Technical reviews: usually by peers
Integration testing: by the developer of the new code
System testing: by independent QA group
Acceptance testing: as specified; done for the customer
QA dynamics
Most organizations err on the side of self-defeating QA shortcuts
• Schedule-optimal point: ~95% of defects removed! [Rapid Dev., p. 69]
• Most organizations remove < 50%!
• Repairs cost more downstream
• Repeat defect detection is a waste
• Low quality costs more in customer support
• Users remember low quality rather than timely delivery
On the other hand, QA can theoretically continue forever
• Infinite or huge number of input combinations
• Hidden QA costs often exceed development costs
• Need to put a cap on it
Main question:
• Exactly how much QA should suffice? What kind? When? How?
Test cases
Elements of a test case:
• Input data
• Execution conditions
• Expected output
• Link to a specific test objective
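A minimal sketch of these elements as a Python data structure; the field names are illustrative, not from the course:

from dataclasses import dataclass

@dataclass
class TestCase:
    test_id: str      # link to a specific test objective, e.g. "sel.eq.1"
    description: str  # human-readable summary
    conditions: dict  # execution conditions (platform, application state, ...)
    inputs: dict      # input data applied to the system under test
    expected: dict    # expected output, compared against the actual output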
Test cases
Example: web-based reservation system
• Reserve by clicking on time slots at least 24 hours in advance
• Up to 100 time slots can be reserved
Test #    Description                   Input: studentID, hrsLeft, time-crt, time-slot, slot status → Output: hrsLeft, slot status

EQUIVALENCE TESTS - SELECT OPERATION
sel.eq.1  Select free slot              1234  38   3/day14  5/day22   free      → 37  res.1234
sel.eq.2  Select slot owned by other    1234  38   3/day14  5/day22   res.5678  → 38  res.5678
sel.eq.3  Select same day               1234  38   3/day14  3/day14   free      → 38  free
sel.eq.4  Select <24hrs next day        1234  38   3/day14  2/day15   free      → 38  free
sel.eq.5  Select with no hours left     1234  0    3/day14  5/day22   free      → 0   free

BOUNDARY TESTS - SELECT OPERATION
sel.bd.1  Select slot owned by oneself  1234  38   3/day14  5/day22   res.1234  → 38  res.1234
sel.bd.2  Select first day              1234  100  1/day0   3/day1    free      → 99  res.1234
sel.bd.3  Select last day               1234  12   3/day5   3/day100  free      → 11  res.1234
sel.bd.4  Select first slot             1234  12   3/day5   1/day22   free      → 11  res.1234
sel.bd.5  Select last slot              1234  12   3/day5   8/day22   free      → 11  res.1234
sel.bd.6  Select exactly 24h before     1234  12   3/day5   3/day6    free      → 11  res.1234
sel.bd.7  Select with one hour left     1234  1    3/day14  5/day22   free      → 0   res.1234
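Such a table can drive automated tests directly. A sketch using pytest's parametrize feature, assuming a hypothetical reserve(...) function with the semantics above (neither the function nor the module name comes from the course):

import pytest

from reservation import reserve  # hypothetical system under test

CASES = [
    # (test id, studentID, hrsLeft, time-crt, time-slot, slot status, expected hrsLeft, expected status)
    ("sel.eq.1", 1234, 38, "3/day14", "5/day22", "free",     37, "res.1234"),
    ("sel.eq.2", 1234, 38, "3/day14", "5/day22", "res.5678", 38, "res.5678"),
    ("sel.eq.3", 1234, 38, "3/day14", "3/day14", "free",     38, "free"),
]

@pytest.mark.parametrize("tid,sid,hrs,now,slot,status,exp_hrs,exp_status", CASES)
def test_select(tid, sid, hrs, now, slot, status, exp_hrs, exp_status):
    # Each row is one repeatable test case: apply inputs, check expected outputs.
    assert reserve(sid, hrs, now, slot, status) == (exp_hrs, exp_status), tid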
Test cases
Test cases may be interactive
Example:
Black-box test memory in Windows calculator
Entry:
Calculator window displayed
Steps:
Type in 26
Click MS
Type in 38
Click MR
Check output 26
Click MC
Click MR
Check output 0
Test procedures
Test procedure = detailed instructions for setup, execution and
evaluation of test results
• Same for a large group of test cases
• Often in form of checklists
• Example:
Platform Windows 2000, RAM 128 MB
Calculator only application running
Read test case from file CalcTest112.xls
Apply input data
Record actual output
Report differences from expected output
Test script = program that automates execution of a test procedure
• Often in a specialized testing language akin to Visual Basic or C++
Rational Suite: SQABasic
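A sketch of such a script in Python. The calculator() stand-in and the CSV layout (a CSV counterpart of the CalcTest112.xls file above) are assumptions for illustration; the structure follows the checklist: apply input data, record actual output, report differences:

import csv

def calculator(expression):
    # Stand-in for the application under test (illustrative only).
    return str(eval(expression))

def run_procedure(test_file="CalcTest112.csv"):
    # One procedure, reused for a large group of test cases.
    failures = []
    with open(test_file, newline="") as f:
        for case in csv.DictReader(f):         # columns: id, input, expected
            actual = calculator(case["input"])     # apply input data
            if actual != case["expected"]:         # compare to expected output
                failures.append((case["id"], case["expected"], actual))
    return failures                            # report differences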
Overview
Basic concepts
Test case design
• Engineering principles
• Advanced techniques
Formal reviews
QA planning
Test automation
Tests must be repeatable
Why? A number of reasons
• Avoid judgment calls after test execution
Pass or fail, bug or feature, validity of input data, execution conditions, …
• Inform the tester of what needs to be done
Testers shouldn't have to understand the inner workings of the programs
Different people understand different things…
• Enable automated test execution
• Report failure back to development
What do you mean “it doesn’t work” when I know it works!
How? Full details!
• Bad: save number, type number, restore number, see first number
What if numbers are the same? type in –0? over limit? truncation? “restore”
is interpreted as “store another number”?
• Good: type 12, click MS, type 33, click MR, see 12
• Keep the principle even if interaction or dynamic input is required!
Complete input coverage is not feasible
Example: purchase something online
How many possible inputs?
• Text fields
• Button clicks
• Hyperlink clicks
And, how many possible input combinations?
• One string: 26 x 26 x 26 x …
• Infinite sequences of mouse clicks
• Delays, ...
And, how many possible interleavings?
• Client events vs. server events
• Different threads on server
Testing cannot prove complete absence of bugs
Misconception: “my code is fully tested so it has no problems”
Reality: “testing can only prove presence of bugs” [Dijkstra]
Edsger Wybe Dijkstra 1930-2002
• A pioneer of structured and disciplined programming
• “Goto considered harmful”
• Key contributions to concurrency, system design, formal methods,
software engineering, ...
Coverage
Coverage = percentage of a class of IUT (implementation under test)
elements that have been exercised at least once by a set of tests
• Code coverage
• State coverage
• Input coverage
• Use case coverage
• ...
A compromise must be reached between coverage and effort
• What is a reasonable compromise?
Targeted tests are more efficient!
Example: typo-like faults
• Simplifying assumptions:
Uniformly distributed in the code
Perfect detection
• Assume random tests
Probability of detecting a fault at time t ≈ defect concentration ≈
remaining number of faults
Remaining faults decay exponentially: f(t) ≈ f0 · e^(−k·t)
The effort to detect a new fault ≈ 1/probability, i.e. grows exponentially in t
• Assume tests targeted to code coverage
Each test covers a few new statements
Equally likely to discover a fault
The effort per fault stabilizes to constant
All such faults are eventually discovered
• The simplifying assumptions do not hold perfectly, but well enough
• Similar ones hold well enough for other types of faults
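A small simulation of this argument. All numbers are invented for illustration: 1,000 statements, 50 uniformly seeded faults, each test exercising 10 statements and detecting every fault it touches:

import random

STATEMENTS, FAULTS, PER_TEST = 1000, 50, 10

def run(targeted):
    faults = set(random.sample(range(STATEMENTS), FAULTS))
    uncovered = list(range(STATEMENTS))
    random.shuffle(uncovered)
    found = tests = 0
    while found < FAULTS and tests < 100000:
        if targeted:   # coverage-targeted: each test covers a few new statements
            batch, uncovered = uncovered[:PER_TEST], uncovered[PER_TEST:]
            if not batch:
                break
        else:          # random testing: statements drawn at random, with repeats
            batch = random.choices(range(STATEMENTS), k=PER_TEST)
        tests += 1
        found += len(faults & set(batch))
        faults -= set(batch)
    return tests, found

random.seed(1)
print("targeted:", run(True))   # all faults found in about STATEMENTS/PER_TEST tests
print("random:  ", run(False))  # effort per new fault grows as the faults thin out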
Get a feel for coverage effects!
Naval combat game: shooting in a pattern is much more effective than
shooting at random
[Grid figure: shots placed in a regular spaced pattern vs. the same number
of shots scattered at random; the pattern is guaranteed to hit any ship
above a certain size.]
Typical test targets
Code coverage
• Statement
• Condition
Black-box coverage
• Code coverage won’t detect omission faults
• Use case coverage
• UI state transitions
• Activate / not activate each particular output
E.g. buy a book or not; change address or not
Invalid data: one invalid test vector for each constraint
• Preconditions
• Invalid formats (text in number fields)
Boundary conditions: one test for each boundary
Configurations of the deployment platform
Typical faults
Typical test targets
A frequently encountered pattern:
• For each use case
A “best test” that exercises most of the code with typical values
One test for each invalid or boundary condition
Example: testing “select” in web reservation system
• For each non-functional requirement
Prove that it is met
Use typical data
Typical test targets
What else to test
• Install, setup
• Help
• OS versions and locales
• Screen resolutions and depths
• Multiple instances
• Window minimize, drag/redraw, close
• Memory leaks, invasiveness
• User manual and help
Go through each keystroke in manual; check result
Multidimensional coverage
Same test case may count in several targets
• E.g. several memory functions in Windows calculator
• Keep error and invalid conditions separate
Example: platform coverage
• Platform = hardware and software environment
May include other concurrently running applications (Why?)
• Pattern approach
List all likely combinations
For each combination, assign
Projected usage value
Technical risk value
• Cycle through list changing platform periodically so that different
tests are run on different platforms
Use based: most tests should run on the highly used platform
Risk based: test high risk exposure first
Requires an organized lab with swap-ready machines
At the end we reasonably cover all platforms and all tests
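A sketch of the cycling idea described above; the platform names, weights, and test batches are invented for illustration:

# Invented data: (platform, projected usage value, technical risk value)
platforms = [("Win2000/IE5", 5, 2), ("Win98/IE5", 3, 1), ("WinNT/Netscape", 1, 4)]
batches = ["UI", "arithmetic", "statistics", "scenarios", "install"]

# Highest combined usage + risk gets tested first; shifting the assignment
# on every cycle runs different tests on different platforms, so that at
# the end all platforms and all tests are reasonably covered.
ranked = sorted(platforms, key=lambda p: p[1] + p[2], reverse=True)
for round_no in range(len(ranked)):
    for i, batch in enumerate(batches):
        name = ranked[(i + round_no) % len(ranked)][0]
        print(f"cycle {round_no}: run {batch} tests on {name}")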
And finally...
Test cases must be easy to maintain!
• Will evolve with the system
• Executed in an organized manner
• Coverage estimation
Not too simple, not too complex
• Depending on needs
Structured, linked to some test goal
• Grouped by function tested
• Grouped by type of coverage / type of testing
Scenario-based testing
Based on
• User experience: create account, purchase, access affiliate, etc.
• User type: novice user, power user, intermittent user
Elaboration
• In parallel with development
• Separate test procedure from test data
Procedure: instructions to follow to run the test; navigation instructions;
same for all tests
Test data: any data typed into a form or field by a user; variable part of a
test
Other data: any other field; any decision
Execution
• QA, contractors, students, support staff
• Test script
Example: purchase a book from Amazon.com
Procedure: can be applied to many test cases
• Go to URL www.amazon.com
• Wait for home page to load
• Type in ISBN number of book
• Add book to shopping cart
• Proceed to checkout
• Log in with user name and password
• Place order
Data: make sure the test case is repeatable
• Book ISBN
• Customer user name, password
• Exactly which buttons to click, in order
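A sketch of the procedure/data separation for this scenario. The Browser driver and its methods are hypothetical; the point is that one procedure, plus a table of data rows, yields many repeatable test cases:

from browser_driver import Browser  # hypothetical web-automation layer

def purchase_procedure(data):
    # Navigation instructions: identical for every test case.
    b = Browser()
    b.open("https://www.amazon.com")         # go to URL, wait for home page
    b.search(data["isbn"])                   # type in the book's ISBN
    b.click("Add to Shopping Cart")
    b.click("Proceed to checkout")
    b.login(data["user"], data["password"])  # log in with user name and password
    b.click("Place your order")

# Test data: the variable part of each test, kept out of the procedure.
test_data = [
    {"isbn": "<isbn-1>", "user": "tester1", "password": "secret1"},
    {"isbn": "<isbn-2>", "user": "tester2", "password": "secret2"},
]
for row in test_data:
    purchase_procedure(row)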
Exploratory testing
Example: explore statistical functions in
Windows calculator
• Press “Sta”
• Load statistics box
• This activates the blue buttons
• Cover all the activated functions
Exercise the function
Exercise all mouse events
Right-click brings up help!
• Use help
Help → Index → statistical calculations → using key sequences as functions
• Surprise: interoperability with Notepad
E.g. type 123:m for memory function
• High priority? Explore that!
Test cases defined and executed on the fly
Dynamically targeted!
Result of exploratory testing
During exploratory test we must capture
• Functions, options or sub-functions being explored
• Test cases attempted
• State of the application: comments, notes, images
• Hints, reminders and observations that may be useful to future
testers
• Date, platform, configuration under test
(Test must be repeatable!)
• Oracles, a “strategy to assess correctness”
A specified property
No OS crash, help still active
Or just a mapping from inputs to expected outputs
• Other relevant details
About the decisions taken, state of product, and result of test
Records: concise, tabular, chronological
Daily build and smoke test
Build daily
• Keep development in sync
• Ensure few open defects
Smoke test
• Exercise the system end-to-end
• Not exhaustive: detect major problems
• Evolve as the system grows
• Build only ready-to-build code, after private builds
Managing the daily builds
• Run by QA, not DEV
• Stop work for broken builds
• Release in the morning
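A minimal smoke-test driver sketch; the build and test commands are placeholders for whatever the project actually uses:

import datetime
import subprocess
import sys

def step(cmd):
    # Run one build or smoke-test step; a broken daily build stops all work.
    print(f"[{datetime.datetime.now():%H:%M}] {cmd}")
    if subprocess.call(cmd, shell=True) != 0:
        sys.exit(f"BUILD BROKEN at step: {cmd}")

step("make clean all")              # placeholder: full build of ready-to-build code
step("python smoke/end_to_end.py")  # placeholder: exercise the system end to end
print("Smoke test passed; release the build in the morning.")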
Overview
Basic concepts
Test case design
• Engineering principles
• Advanced techniques
Formal reviews
QA planning
Test automation
Reviews
Manually examine a software artifact according to specified criteria
• Inspections, reviews, walkthroughs, ...
Reviews are more effective than testing
• Number of defects found
• Effort per defect
• See Jalote p. 247: review capability baseline
• Point to cause, not just symptom
• Can be used for artifacts other than code, earlier in the process
• Training value for novices
• Good leverage for experts
Reviews cannot replace testing
• Subjective interpretation not completely avoidable
• Little or no automation
• Different types of defects
• Different expertise
Typical review process
[After Jalote]
Review meeting
• Moderated
• Focused on defects, not fixes
• Focused on artifact, not author
• Check preparedness
• Go through the artifact line by line
• Reconcile defect reports
[Process diagram, after Jalote:
Entry criteria → Planning (outputs: schedule, review team) →
Individual review (outputs: defect logs, time spent) →
Review meeting (outputs: defect list, issue list) →
Rework & follow-up (outputs: metrics)]
Variations
Different focus for different artifacts
• Jalote p. 242
Level of formality
• Inspections, walkthroughs, etc.
One-person reviews
• Reduce cost, but also effectiveness
• Keep psychological effects
Active design reviews
• Really, a type of inspection
• No meeting
• Looks like several one-person reviews
• Opportunity to compare defects found
Overview
Basic concepts
Test case design
• Engineering principles
• Advanced techniques
Formal reviews
QA planning
Test automation
Test task breakdown
Establish test goals
• Cohesive functionality
• Easy to communicate
Task granularity
• Aim for 90-minute chunks
• Time will drop to 45 and then 30 minutes on the second and third passes
Test breakdown should match design breakdown
• Test a piece of code as soon as it is created
• Test cases evolve with code
Risk dictates order of testing
• Important bugs can be fixed while lower-risk ones are being tested
• Technical risks based on work done and possible impacts
• Commercial risks based on known business opportunities or current customers
• Risks should be re-evaluated on every test cycle! (situations change)
Example
Windows calculator
• Test objectives / tasks:
UI components
Arithmetic and logic
Statistical functions
Typical usage scenarios
Platforms, install
...
QA stages
Every major activity should have sign-off criteria
• Sound motivation, visibility, control
Recommended QA activities from [Survival guide, p. 217]:
• UI prototype review
• User manual/requirements specification review *
• Architecture review
• Detailed design review
• Code review
• Unit testing **
• Integration testing *
• Daily smoke test
• System testing **
Widespread, but not necessarily best, practice:
• * = sometimes used, ** = always used
Varying levels of QA
If more stages are needed
• More detailed testing
More criteria to cover
Higher coverage of each criterion
• More artifacts under review
Test plan / test suite review
• More critical reviewing
Extended review checklist
Cross-reviewing by more experienced reviewers
If fewer stages are needed
• Merge several QA steps
• Be careful not to decrease development efficiency
QA allocation
How much QA effort is needed, and how much is too much?
#stages = log(#injected / #on-release) / log(1 / (1 − DRE))
(each stage with removal efficiency DRE leaves a fraction 1 − DRE in place)
Numbers depend on
• Effort and size metrics for DEV and QA activities
• Baseline metrics
• Definition and type of defects
• Organization capability
• Developers, their training and experience
• Technology used
Defect injection and removal
DIR = defect injection rate
• # defects introduced at each development stage
• Normalized, per function point or per person-hour
Fault potential = Sum((DIR per stage)*(FP or effort))
DRE = defect removal efficiency
• % of existing defects removed by each stage of testing or review
• Fraction left after all in-process removal = Product(1 − DRE per stage)
Example
A small project
• 200 FP
• Aim for fewer than 2 defects/KLOC on release
• Assume that black-box, structural, integration, and system testing will be
performed.
• How many inspection steps do we need to perform?
Use Jones’ rules of thumb
• Fault potential = FP^1.25 = 753
• Size = 100 * FP = 20 KLOC
• Injected faults/KLOC = 37.65
• 4 types of testing (black-box, structural, integration, system) leave in 0.7^4 ≈
24% of faults
• Inspections need to leave in less than 2 / (37.65 × 0.24) ≈ 22% of faults
• Two types of inspection leave 0.4^2 = 16% of faults
• Therefore, two inspection steps (e.g. SRS and design inspections) suffice
Coefficients should be adapted to project type / organization
• Also see Jalote, ch. 12 for calculations using baselines
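The same calculation as a short script; the coefficients (1.25 exponent, 100 LOC/FP, 30% removal per testing stage, 60% per inspection) are Jones' rules of thumb as quoted above, to be adapted to the project type and organization:

import math

FP = 200          # project size in function points
target = 2.0      # allowed defects/KLOC on release

fault_potential = FP ** 1.25                 # ≈ 753 injected defects
kloc = 100 * FP / 1000                       # ≈ 20 KLOC
injected_per_kloc = fault_potential / kloc   # ≈ 37.65 faults/KLOC

left_by_testing = 0.7 ** 4                   # 4 testing stages leave ≈ 24%
allowed = target / (injected_per_kloc * left_by_testing)   # ≈ 0.22

# Each inspection leaves ≈ 40% of faults: smallest n with 0.4^n <= allowed.
n_inspections = math.ceil(math.log(allowed) / math.log(0.4))
print(f"fault potential ≈ {fault_potential:.0f}; inspections needed: {n_inspections}")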
Quality management
Procedural: defined procedures but no measures
• Hard to tell whether the QA procedures were executed well or not
Quantitative: quality measures, procedure independent
• Setting a quality goal
Defect concentration
Reliability
• Monitoring and control
Set intermediate goals
Refer to intermediate goals
Take action on deviations
Setting a quality goal
Defect concentration on release
• Industry averages, past performance:
# defects per FP or KLOC
Estimated size → total # defects on release
Needs to be a tangible goal
• Convert to in-process DRE, based on estimated DIR
• Convert to # defects in acceptance testing, based on past data
Infosys: 5-10% of total defects! [Jalote p. 155]
Monitoring
Good quality management should send warning signs early
• Set intermediate goals and take action if not met
Intermediate project goals based on
• Rayleigh curve of defect discovery
• Baseline data on relative effectiveness [Jalote p.150; p.155]
• Baseline data per type of QA activity
If current data is out of the norm, determine the cause and take action
• Look into related factors: effort, experience, etc.
[Jalote table 12.3/p. 249]
Action:
• Additional QA steps, as seen
• Review QA procedure & redo
• Training, prototyping, ...
Release criteria
Should be based on objective, measurable detail
• E.g. “no open high-severity defects” and “fewer than 10 defects found in
acceptance testing”
• Avoid the need for a judgment call, and the associated pressures!
Decision lies with product management
• DEV may be oblivious to shortcomings of its brainchild
• QA can propose alternatives but not decide what testing to skip
• MGMT knows best the business goals and priorities
Read the trend of defect discovery to estimate remaining defects
• Use the failure data from the final stages of testing
In simplest form, remaining defects ≈ # defects still being found; extrapolate
A bit too simplistic
• Compare actual defect data to quality goals
Actual vs. planned defect ratio is probably the same for the discovered and
remaining defects
Corrections for actual development effort and number of QA stages
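A sketch of the ratio-based estimate in the last bullets, with invented numbers: if 500 total defects were planned, 450 were expected by now, and 430 have actually been found, scale the remainder by the same actual-to-planned ratio:

planned_total, planned_by_now, found_by_now = 500, 450, 430

ratio = found_by_now / planned_by_now   # actual vs. planned ≈ 0.956
est_total = planned_total * ratio       # assume the same ratio holds for the rest
est_remaining = est_total - found_by_now
print(f"estimated defects remaining: {est_remaining:.0f}")   # ≈ 48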
Overview
Basic concepts
Test case design
• Engineering principles
• Advanced techniques
Formal reviews
QA planning
Test automation
Applying regression tests
Needed to support refactoring and other processes
Run a test harness / test script
• Read tests from a user-edited test manifest
• Can be a spreadsheet file
• CSV (comma-separated values) is a handy format supported by many
databases and spreadsheets
• Open an ODBC or JDBC link to read test cases directly into the test
harness program
Rational Robot
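A sketch of a manifest-driven harness in Python; the file name and column layout are invented for illustration:

import csv
import importlib

def run_regression(manifest="tests/manifest.csv"):
    # Each row of the user-edited manifest names a test module and function.
    passed = failed = 0
    with open(manifest, newline="") as f:
        for row in csv.DictReader(f):        # columns: module, function
            test = getattr(importlib.import_module(row["module"]), row["function"])
            try:
                test()
                passed += 1
            except AssertionError as exc:
                failed += 1
                print(f"FAIL {row['module']}.{row['function']}: {exc}")
    print(f"{passed} passed, {failed} failed")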
Test case elaboration
Rational TestFactory
• Produce a “best test” using typical values
• Target tests to uncovered code
• Low coverage rates
UI mapping
• Based on component registry / OS API
Coverage analysis
• Based on automated code instrumentation
Static analysis
Automated tools to study code
• Compiler warnings
• Lint
• W3C tools to see if HTML conforms to standards
• POSIX compliance checking
• Win API checking
• Code complexity measures, including LOC, McCabe cyclomatic complexity, etc.
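As a crude sketch of the last item, McCabe cyclomatic complexity can be approximated as 1 + the number of decision points, here using Python's own ast module (real tools count more constructs than this):

import ast

DECISIONS = (ast.If, ast.For, ast.While, ast.BoolOp, ast.ExceptHandler)

def cyclomatic(source):
    # Approximate McCabe complexity: 1 + number of branching constructs.
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, DECISIONS) for node in ast.walk(tree))

print(cyclomatic("x = 1"))                           # 1: straight-line code
print(cyclomatic("if a:\n    b()\nelse:\n    c()"))  # 2: one decision point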
Discussion
Thank you!
Any questions?