OHT 10.1
1
• The testing process
• Determining the test methodology phase
• Planning the tests
• Test design
• Test implementation
• Test case design
• Test case data components
• Test case sources
• Automated testing
• The process of automated testing
• Types of automated testing
• Advantages and disadvantages of automated testing
• Alpha and beta site testing programs
Chapter 10 – Software Testing -
Implementation
OHT 10.2
2
Introduction
• Need to discuss
– testing effectiveness and
– testing efficiency.
• Equivalently, want to reduce number of undetected errors
and yet test with fewer resources over less time.
• Need to design testing procedures to be effective in
detecting errors and doing so as efficiently as possible.
• Also want to look at automated testing.
OHT 10.3
3
Introduction – Topics to Cover
• Much of this set of slides focuses on testing at
various levels:
– Unit tests
– Integration tests
– System tests.
OHT 10.4
4
Ultimate Desired Outcomes for the
Chapter
• Today:
– Describe the process of planning and designing tests
– Discuss the sources for test cases, with their advantages
and disadvantages
• Next:
– List the main types of automated software tests
– Discuss the advantages and disadvantages of automated
computerized testing as compared to manual testing
– Explain alpha and beta site test implementation and
discuss their advantages and disadvantages.
OHT 10.5
5
10.1 The Testing Process
• Testing is done throughout the development process.
• Testing is divided into phases beginning in the
design phase and ending at the customer’s site.
• Testing process is illustrated in the next slide.
• The two fundamental decisions that must be made
before planning for testing can occur are:
– What is the required software quality standard, and
– What is the software testing strategy.
OHT 10.6
6
Determining the test methodology → Planning the tests → Designing the tests → Performing the tests (implementation)
OHT 10.7
7
Determining the Appropriate
Software Quality Standard
• Different standards are required for different software applications.
• e.g. safety-critical software or aircraft instrumentation demands the highest standard.
• In other cases, a medium-level quality is sufficient.
• So, the expected damage resulting from failed software drives the required
standard of software quality.
• Samples of damage to customers and users as well as to
developers are shown on the next two slides:
OHT 10.8
8
1. Endangers the safety of human beings
2. Affects an essential organizational function with no system
replacement capability available
3. Affects functioning of firmware, causing malfunction of an entire
system
4. Affects an essential organizational function but a replacement is
available
5. Affects proper functioning of software packages for business
applications
6. Affects proper functioning of software packages for a private
customer
7. Affects functioning of a firmware application but without affecting the
entire system.
8. Inconveniences the user but does not prevent accomplishment of the
system’s capabilities
OHT 10.9
9
1. Financial losses
* Damages paid for physical injuries
Aircraft or auto instrumentation; health equipment…. Lawsuits!!
* Damages paid to organizations for malfunctioning of
software
Companies have many lawyers on staff!!!
* Purchase cost reimbursed to customers
* High maintenance expenses for repair of failed systems
2. Non-quantitative damages
* Expected to affect future sales
* Substantially reduced current sales
OHT 10.10
10
Determining Software Testing Strategy
• Big bang or incremental? Do we want the
testing strategy to be big bang or incremental?
– In the past, major testing was done at the end (big bang)….
– If incremental, top down or bottom up?
• Which parts of the testing plan should be done
using white box testing?
• Black box?
• Which parts of the test plan should be done using
an automated test model?
OHT 10.11
11
Planning the Tests
• We need to undertake:
– Unit tests
– Integration tests, and
– System Tests.
• Unit tests deal with small hunks – modules,
functions, objects, classes;
• Integration tests deal with units constituting a
subsystem or other major hunks of capability, and
• System tests refer to the entire software package
or system.
• These are often done by different constituencies!!
OHT 10.12
12
Lots of Questions
• So we first need to consider five basic issues:
– What to test
– Which sources do we use for test cases
– Who is to perform the tests
– Where to perform the tests, and
– When to terminate the tests.
• Questions with not so obvious answers!
OHT 10.13
13
What to Test
• We would like to test everything.
• Not very practical.
• Cannot undertake exhaustive testing…
– Number of paths is infinite….
• Consider:
– Do we totally test modules that are 98% reused?
– Do we really need to test things that have been
repeatedly tested with only slight changes?
– How about testing by newbies?
– Testing on sensitive modules that pose lots of risk?
OHT 10.14
14
• So, which modules need to be unit tested?
• Which integrations should be tested?
• Low-priority applications covered in unit
testing may not need to be included in the
system tests….
• Lots of planning is needed, as testing IS a very
expensive undertaking!
OHT 10.15
15
Rating Units, Integrations, and
Applications
• We need to rate these issues to determine their
priority in the testing plan.
• Rate based on two factors:
• 1. Damage severity level – severity of results if
module / application fails.
– How much damage is done??
– Will it destroy our business? Our reputation??
• 2. Software risk level – what is the probability
of failure?
• Factors affecting risk – next slide:
OHT 10.16
16
Module/Application Issues
1. Magnitude (size)
2. Complexity and difficulty
3. Percentage of original software (vs. percentage of
reused software)
Programmer Issues
4. Professional qualifications
5. Experience with the module's specific subject matter.
6. Availability of professional support (backup by
knowledgeable and experienced staff).
7. Acquaintance with the programmer and the ability to
evaluate his/her capabilities.
OHT 10.17
17
Computations
• Essential to calculate risk levels
– We must budget our testing due to high cost
– But we must temper our testing with cost/risk!
• Helps to determine what to test and to what extent.
• Use a ‘combined’ rating bringing together
damage severity (how serious would the damage
be?) and risk severity / probability.
• Sample calculations: A is damage; B is risk.
– C = A + B
– C = k*A + m * B
– C = A * B // most familiar with this one…
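The three sample formulas above can be sketched directly; the weights k = 7 and m = 2 follow the worked example on the next slide, but any weights chosen by the test planner would do:

```python
# Three candidate ways to combine damage severity (A) and
# software risk (B) into a single combined rating C.

def combined_additive(a, b):
    """C = A + B"""
    return a + b

def combined_weighted(a, b, k=7, m=2):
    """C = k*A + m*B (k and m chosen by the planner)"""
    return k * a + m * b

def combined_multiplicative(a, b):
    """C = A * B -- the most familiar variant"""
    return a * b
```

For A = 3, B = 2 these give 5, 25 and 6 respectively, matching the first row of the rating table on the next slide.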
OHT 10.18
18
• If we are including units, integrations, and applications in a
test plan, we need to know how many resources are needed.
– Thus, we need to prioritize.
• The higher the rating and priority, the more testing
resources need to be allocated.
• Consider:
– Some tests are based on high percentages of reused code.
– Some applications are developed by new employees
• We typically use a 5 point scale, where 5 is high.
• (Variations include a 1-5 severity level and a probability of
0.0 to 1.0 of their occurrence.)
• Can see results for Super Teacher application in next slide.
OHT 10.19
19
Combined rating method

Application                                        A    B    A+B      7×A+2×B   A×B
1. Input of test results                           3    2    5 (4)    25 (5)     6 (4)
2. Interface for input and output of pupils’
   data to and from other teachers                 4    4    8 (1)    36 (1)    16 (1)
3. Preparation of lists of low achievers           2    2    4 (5-6)  18 (7)     4 (5-6)
4. Printing letters to parents of low achievers    1    2    3 (7-8)  11 (8)     2 (8)
5. Preparation of reports for the school principal 3    3    6 (3)    27 (3)     9 (3)
6. Display of a pupil’s achievements profile       4    3    7 (2)    34 (2)    12 (2)
7. Printing of pupil’s term report card            3    1    3 (7-8)  23 (6)     3 (7)
8. Printing of pupil’s year-end report card        4    1    4 (5-6)  26 (4)     4 (5-6)

(A = damage severity factor; B = software risk factor; priority ranks in parentheses)
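The A×B column and its priority ranking can be reproduced with a few lines of code (application names shortened here for brevity):

```python
# Combined rating C = A * B for the Super Teacher applications,
# then a priority ordering: higher rating -> test first.

apps = {                                  # name: (damage A, risk B)
    "Input of test results": (3, 2),
    "Pupil data interface": (4, 4),
    "Low-achiever lists": (2, 2),
    "Letters to parents": (1, 2),
    "Principal reports": (3, 3),
    "Achievement profile display": (4, 3),
    "Term report card": (3, 1),
    "Year-end report card": (4, 1),
}

ratings = {name: a * b for name, (a, b) in apps.items()}
priority = sorted(ratings, key=ratings.get, reverse=True)
print(priority[0])   # the pupil-data interface (A*B = 16) ranks first
```

As in the table, the interface for pupils’ data gets the top priority and the letters to parents (A×B = 2) the lowest.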
OHT 10.20
20
1. Which Sources Should be Used for
Test Cases?
• Do we use live test cases or synthetic test cases?
• All three levels of tests should consider this question.
• Use live data or contrived (dummy) data??
• What do you think??
• Also need to consider single / combined tests and
the number of tests.
• How about if the testing is top down? Bottom up?
What sources do you think might be needed then??
OHT 10.21
21
Who Performs the Tests?
• Unit Testing – done by the programmer and/or
development team.
• Integration Testing – can be the development team or
a testing unit.
• System Testing – usually done by an independent
testing team (internal, or an external consultants’ team).
• In small companies, two development teams can swap roles and
test each other’s software.
• Can always outsource testing too.
OHT 10.22
22
Where to Perform the Tests?
• Typically performed at the software developer’s site.
• For system tests, test at developer’s or
customer’s site (target site).
• If outsourced, testing can be done at consultant’s
site.
OHT 10.23
23
When are Tests Terminated?
• This is always the $64,000 question!!
• Decision normally applies to system tests.
• Five typical alternatives
• 1. Completed Implementation Route
– Test until all is error free. (good luck)
– All testing, regression testing;
– Disregards budget and timetable constraints.
– Fits a perfection-oriented approach
OHT 10.24
24
• 2. Mathematical Models Application Route:
• Here modeling is used to estimate the percentage of
undetected errors based on the rate of error detection.
• When detection rate falls below a certain level, stop.
• Disadvantage: math model may not fully represent
the project’s characteristics.
– Thus testing may be cut short or extended too far.
• Advantage: Well-defined stopping point.
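A minimal sketch of such a stopping rule (the threshold, window size and daily counts here are illustrative assumptions, not values from the slides):

```python
# Hypothetical stopping rule: stop system testing once the average
# error detection rate over the last `window` days falls below a
# predefined threshold.

def should_stop(errors_per_day, threshold=2.0, window=3):
    """errors_per_day: list of errors detected on each testing day."""
    if len(errors_per_day) < window:
        return False                     # not enough data yet
    recent = errors_per_day[-window:]
    return sum(recent) / window < threshold

daily = [14, 11, 9, 6, 4, 2, 1]
print(should_stop(daily))      # (4+2+1)/3 ~ 2.33 >= 2.0 -> keep testing
```

After one more day with a single error found, the recent average drops to (2+1+1)/3 ≈ 1.33 and the rule says stop.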
OHT 10.25
25
• 3. Error Seeding Route
• Here, we seed errors prior to testing.
• Underlying assumption is that percentage of discovered
seeded errors will correspond to the percentage of real
errors detected.
• Stop once residual percentage of undetected seeded
errors reaches a predefined level considered acceptable for
‘passing’ the system.
• Disadvantages: extra workload for the testers; results depend
on the past experience of the testers who seed the errors.
• Also, the seeding method cannot accurately estimate the
residual rate of undetected errors in unfamiliar systems.
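The underlying assumption can be turned into a one-line estimate (this is the classic seeding formula; the numbers below are invented for illustration):

```python
# If testers find s of S seeded errors and n real errors, and the
# detection rates are assumed equal, the estimated total number of
# real errors is n * S / s.

def estimated_total_errors(seeded_total, seeded_found, real_found):
    if seeded_found == 0:
        raise ValueError("no seeded errors found yet - cannot estimate")
    return real_found * seeded_total / seeded_found

# e.g. 40 of 50 seeded errors found (80%), alongside 36 real errors:
est = estimated_total_errors(50, 40, 36)
print(est)        # 45.0 real errors estimated in total
print(est - 36)   # 9.0 estimated still undetected
```

Testing would terminate once this residual estimate falls to the predefined acceptable level.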
OHT 10.26
26
• 4. The dual independent testing teams route:
• Here two teams implement the testing process
independently.
• Compare lists of detected errors.
• Calculate the number of errors left undetected
• Lots of statistics here.
• High costs. Justified when??
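One common way to "calculate the number of errors left undetected" from two independent lists is a capture-recapture style estimate (an assumption here, since the slide does not name the statistic; the counts are invented):

```python
# Team 1 finds n1 errors, team 2 finds n2, and n12 errors appear
# on both lists. The estimated total is n1*n2/n12; subtracting the
# distinct errors actually found gives the estimated residue.

def estimated_undetected(n1, n2, n12):
    if n12 == 0:
        raise ValueError("no overlap between teams - estimate undefined")
    total_est = n1 * n2 / n12        # estimated total errors in the system
    found = n1 + n2 - n12            # distinct errors actually detected
    return total_est - found

# e.g. team 1 finds 30, team 2 finds 24, with 18 errors in common:
print(estimated_undetected(30, 24, 18))   # 40.0 estimated - 36 found = 4.0
```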
OHT 10.27
27
• 5. Termination after resources have petered out.
• This means stop when budgets or time for testing has run out.
• Very common in industry
OHT 10.28
28
Test Plan
• Lastly, system testing is documented in a software
test plan.
• Common formats are available.
OHT 10.29
29
Test Design and Software Test Plan
• Products of Test Design
– Detailed design and procedures for each test
– The input database / files for testing.
• There are standard software test plans (STP)
templates
OHT 10.30
30
1 Scope of the tests
1.1 The software package to be tested (name, version and revision)
1.2 The documents that provide the basis for the planned tests
2 Testing environment
2.1 Sites
2.2 Required hardware and firmware configuration
2.3 Participating organizations
2.4 Manpower requirements
2.5 Preparation and training required of the test team
OHT 10.31
31
3 Test details (for each test)
3.1 Test identification
3.2 Test objective
3.3 Cross-reference to the relevant design document and the requirement
document
3.4 Test class
3.5 Test level (unit, integration or system tests)
3.6 Test case requirements
3.7 Special requirements (e.g., measurements of response times, security
requirements)
3.8 Data to be recorded
4 Test schedule (for each test or test group) including time
estimates for:
4.1 Preparation
4.2 Testing
4.3 Error correction
4.4 Regression tests
OHT 10.32
32
1 Scope of the tests
1.1 The software package to be tested (name, version and
revision)
1.2 The documents providing the basis for the designed tests
(name and
version for each document)
2 Test environment (for each test)
2.1 Test identification (the test details are documented in the
STP)
2.2 Detailed description of the operating system and hardware
configuration
and the required switch settings for the tests
2.3 Instructions for software loading
OHT 10.33
33
3. Testing process
3.1 Instructions for input, detailing every step of the input
process
3.2 Data to be recorded during the tests
4. Test cases (for each case)
4.1 Test case identification details
4.2 Input data and system settings
4.3 Expected intermediate results (if applicable)
4.4 Expected results (numerical, message, activation of
equipment, etc.)
5. Actions to be taken in case of program failure/cessation
6. Procedures to be applied according to the test results
summary
OHT 10.34
34
Test Implementation
• Really, this is just running the tests, correcting detected
errors, and running regression tests.
• Testing is done when the outcomes satisfy the
developers.
• When are these tests run?? (time of day/ date??)
OHT 10.35
35
Regression Testing
• Need not test everything.
• Typically re-test only those artifacts directly changed
and those providing inputs and outputs to these
changed artifacts (modules).
• Very often new errors are introduced when changes
are made.
• There’s always risk in not testing everything… but
these decisions must be made.
• Results of testing are documented in a test report.
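The selection rule in the bullets above can be sketched as a small graph walk (module names and the dependency map are invented for illustration):

```python
# Re-test the changed modules plus their direct neighbours:
# modules that call them (consume their output) and modules
# they call (provide them input).

deps = {                       # module -> modules it calls
    "ui": ["logic"],
    "logic": ["db", "calc"],
    "calc": [],
    "db": [],
    "report": ["db"],
}

def regression_scope(changed):
    scope = set(changed)
    for mod, callees in deps.items():
        if any(c in changed for c in callees):
            scope.add(mod)             # caller of a changed module
        if mod in changed:
            scope.update(callees)      # callee of a changed module
    return scope

print(sorted(regression_scope({"db"})))   # ['db', 'logic', 'report']
```

Everything outside the returned scope is left untested, which is exactly the risk the slide warns about.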
OHT 10.36
36
OHT 10.37
37
1. Test identification, site, schedule and participation
1.1 The tested software identification (name, version and revision)
1.2 The documents providing the basis for the tests (name and
version for each document)
1.3 Test site
1.4 Initiation and concluding times for each testing session
1.5 Test team members
1.6 Other participants
1.7 Hours invested in performing the tests
2. Test environment
2.1 Hardware and firmware configurations
2.2 Preparations and training prior to testing
OHT 10.38
38
3. Test results
3.1 Test identification
3.2 Test case results (for each test case individually)
4. Summary tables for total number of errors, their
distribution and types
4.1 Summary of current tests
4.2 Comparison with previous results (for regression test summaries)
5. Special events and testers' proposals
5.1 Special events and unpredicted responses of the software during testing
5.2 Problems encountered during testing.
5.3 Proposals for changes in the test environment, including test preparations
5.4 Proposals for changes or corrections in test procedures and test case files

Más contenido relacionado

Similar a Testing software

ISTQBCH2.ppt
ISTQBCH2.pptISTQBCH2.ppt
ISTQBCH2.pptghkadous
 
ISTQB / ISEB Foundation Exam Practice - 2
ISTQB / ISEB Foundation Exam Practice - 2ISTQB / ISEB Foundation Exam Practice - 2
ISTQB / ISEB Foundation Exam Practice - 2Yogindernath Gupta
 
Testing strategies part -1
Testing strategies part -1Testing strategies part -1
Testing strategies part -1Divya Tiwari
 
Softwaretestingstrategies
SoftwaretestingstrategiesSoftwaretestingstrategies
Softwaretestingstrategiessaieswar19
 
John Fodeh - Adventures in Test Automation-Breaking the Boundaries of Regress...
John Fodeh - Adventures in Test Automation-Breaking the Boundaries of Regress...John Fodeh - Adventures in Test Automation-Breaking the Boundaries of Regress...
John Fodeh - Adventures in Test Automation-Breaking the Boundaries of Regress...TEST Huddle
 
John Fodeh Adventures in Test Automation - EuroSTAR 2013
John Fodeh Adventures in Test Automation - EuroSTAR 2013John Fodeh Adventures in Test Automation - EuroSTAR 2013
John Fodeh Adventures in Test Automation - EuroSTAR 2013TEST Huddle
 
Top Ten Tips for Tackling Test Automation Webinar Presentation.pptx
Top Ten Tips for Tackling Test Automation Webinar Presentation.pptxTop Ten Tips for Tackling Test Automation Webinar Presentation.pptx
Top Ten Tips for Tackling Test Automation Webinar Presentation.pptxInflectra
 
Fundamentals of testing
Fundamentals of testingFundamentals of testing
Fundamentals of testingBugRaptors
 
SOFTWARE TESTING W1_watermark.pdf
SOFTWARE TESTING W1_watermark.pdfSOFTWARE TESTING W1_watermark.pdf
SOFTWARE TESTING W1_watermark.pdfShubhamSingh606946
 
ISTQB Foundation - Chapter 2
ISTQB Foundation - Chapter 2ISTQB Foundation - Chapter 2
ISTQB Foundation - Chapter 2Chandukar
 
Object Oriented Testing(OOT) presentation slides
Object Oriented Testing(OOT) presentation slidesObject Oriented Testing(OOT) presentation slides
Object Oriented Testing(OOT) presentation slidesPunjab University
 
Module V - Software Testing Strategies.pdf
Module V - Software Testing Strategies.pdfModule V - Software Testing Strategies.pdf
Module V - Software Testing Strategies.pdfadhithanr
 
Software Engineering (Testing Overview)
Software Engineering (Testing Overview)Software Engineering (Testing Overview)
Software Engineering (Testing Overview)ShudipPal
 
1)Testing-Fundamentals_L_D.pptx
1)Testing-Fundamentals_L_D.pptx1)Testing-Fundamentals_L_D.pptx
1)Testing-Fundamentals_L_D.pptxgianggiang114
 
AiTi Education Software Testing Session 02 a
AiTi Education Software Testing Session 02 aAiTi Education Software Testing Session 02 a
AiTi Education Software Testing Session 02 aAiTi Education
 

Similar a Testing software (20)

ISTQBCH2.ppt
ISTQBCH2.pptISTQBCH2.ppt
ISTQBCH2.ppt
 
Fundamentals of Testing Section 1/6
Fundamentals of Testing   Section 1/6Fundamentals of Testing   Section 1/6
Fundamentals of Testing Section 1/6
 
6. oose testing
6. oose testing6. oose testing
6. oose testing
 
Unit 3 for st
Unit 3 for stUnit 3 for st
Unit 3 for st
 
ISTQB / ISEB Foundation Exam Practice - 2
ISTQB / ISEB Foundation Exam Practice - 2ISTQB / ISEB Foundation Exam Practice - 2
ISTQB / ISEB Foundation Exam Practice - 2
 
Testing strategies part -1
Testing strategies part -1Testing strategies part -1
Testing strategies part -1
 
Software Testing.pdf
Software Testing.pdfSoftware Testing.pdf
Software Testing.pdf
 
Softwaretestingstrategies
SoftwaretestingstrategiesSoftwaretestingstrategies
Softwaretestingstrategies
 
John Fodeh - Adventures in Test Automation-Breaking the Boundaries of Regress...
John Fodeh - Adventures in Test Automation-Breaking the Boundaries of Regress...John Fodeh - Adventures in Test Automation-Breaking the Boundaries of Regress...
John Fodeh - Adventures in Test Automation-Breaking the Boundaries of Regress...
 
John Fodeh Adventures in Test Automation - EuroSTAR 2013
John Fodeh Adventures in Test Automation - EuroSTAR 2013John Fodeh Adventures in Test Automation - EuroSTAR 2013
John Fodeh Adventures in Test Automation - EuroSTAR 2013
 
ST UNIT-1.pptx
ST UNIT-1.pptxST UNIT-1.pptx
ST UNIT-1.pptx
 
Top Ten Tips for Tackling Test Automation Webinar Presentation.pptx
Top Ten Tips for Tackling Test Automation Webinar Presentation.pptxTop Ten Tips for Tackling Test Automation Webinar Presentation.pptx
Top Ten Tips for Tackling Test Automation Webinar Presentation.pptx
 
Fundamentals of testing
Fundamentals of testingFundamentals of testing
Fundamentals of testing
 
SOFTWARE TESTING W1_watermark.pdf
SOFTWARE TESTING W1_watermark.pdfSOFTWARE TESTING W1_watermark.pdf
SOFTWARE TESTING W1_watermark.pdf
 
ISTQB Foundation - Chapter 2
ISTQB Foundation - Chapter 2ISTQB Foundation - Chapter 2
ISTQB Foundation - Chapter 2
 
Object Oriented Testing(OOT) presentation slides
Object Oriented Testing(OOT) presentation slidesObject Oriented Testing(OOT) presentation slides
Object Oriented Testing(OOT) presentation slides
 
Module V - Software Testing Strategies.pdf
Module V - Software Testing Strategies.pdfModule V - Software Testing Strategies.pdf
Module V - Software Testing Strategies.pdf
 
Software Engineering (Testing Overview)
Software Engineering (Testing Overview)Software Engineering (Testing Overview)
Software Engineering (Testing Overview)
 
1)Testing-Fundamentals_L_D.pptx
1)Testing-Fundamentals_L_D.pptx1)Testing-Fundamentals_L_D.pptx
1)Testing-Fundamentals_L_D.pptx
 
AiTi Education Software Testing Session 02 a
AiTi Education Software Testing Session 02 aAiTi Education Software Testing Session 02 a
AiTi Education Software Testing Session 02 a
 

Último

Enhancing Supply Chain Visibility with Cargo Cloud Solutions.pdf
Enhancing Supply Chain Visibility with Cargo Cloud Solutions.pdfEnhancing Supply Chain Visibility with Cargo Cloud Solutions.pdf
Enhancing Supply Chain Visibility with Cargo Cloud Solutions.pdfRTS corp
 
Introduction to Firebase Workshop Slides
Introduction to Firebase Workshop SlidesIntroduction to Firebase Workshop Slides
Introduction to Firebase Workshop Slidesvaideheekore1
 
eSoftTools IMAP Backup Software and migration tools
eSoftTools IMAP Backup Software and migration toolseSoftTools IMAP Backup Software and migration tools
eSoftTools IMAP Backup Software and migration toolsosttopstonverter
 
Comparing Linux OS Image Update Models - EOSS 2024.pdf
Comparing Linux OS Image Update Models - EOSS 2024.pdfComparing Linux OS Image Update Models - EOSS 2024.pdf
Comparing Linux OS Image Update Models - EOSS 2024.pdfDrew Moseley
 
Powering Real-Time Decisions with Continuous Data Streams
Powering Real-Time Decisions with Continuous Data StreamsPowering Real-Time Decisions with Continuous Data Streams
Powering Real-Time Decisions with Continuous Data StreamsSafe Software
 
Amazon Bedrock in Action - presentation of the Bedrock's capabilities
Amazon Bedrock in Action - presentation of the Bedrock's capabilitiesAmazon Bedrock in Action - presentation of the Bedrock's capabilities
Amazon Bedrock in Action - presentation of the Bedrock's capabilitiesKrzysztofKkol1
 
Zer0con 2024 final share short version.pdf
Zer0con 2024 final share short version.pdfZer0con 2024 final share short version.pdf
Zer0con 2024 final share short version.pdfmaor17
 
What’s New in VictoriaMetrics: Q1 2024 Updates
What’s New in VictoriaMetrics: Q1 2024 UpdatesWhat’s New in VictoriaMetrics: Q1 2024 Updates
What’s New in VictoriaMetrics: Q1 2024 UpdatesVictoriaMetrics
 
Patterns for automating API delivery. API conference
Patterns for automating API delivery. API conferencePatterns for automating API delivery. API conference
Patterns for automating API delivery. API conferencessuser9e7c64
 
Best Angular 17 Classroom & Online training - Naresh IT
Best Angular 17 Classroom & Online training - Naresh ITBest Angular 17 Classroom & Online training - Naresh IT
Best Angular 17 Classroom & Online training - Naresh ITmanoharjgpsolutions
 
Leveraging AI for Mobile App Testing on Real Devices | Applitools + Kobiton
Leveraging AI for Mobile App Testing on Real Devices | Applitools + KobitonLeveraging AI for Mobile App Testing on Real Devices | Applitools + Kobiton
Leveraging AI for Mobile App Testing on Real Devices | Applitools + KobitonApplitools
 
Osi security architecture in network.pptx
Osi security architecture in network.pptxOsi security architecture in network.pptx
Osi security architecture in network.pptxVinzoCenzo
 
Understanding Flamingo - DeepMind's VLM Architecture
Understanding Flamingo - DeepMind's VLM ArchitectureUnderstanding Flamingo - DeepMind's VLM Architecture
Understanding Flamingo - DeepMind's VLM Architecturerahul_net
 
2024 DevNexus Patterns for Resiliency: Shuffle shards
2024 DevNexus Patterns for Resiliency: Shuffle shards2024 DevNexus Patterns for Resiliency: Shuffle shards
2024 DevNexus Patterns for Resiliency: Shuffle shardsChristopher Curtin
 
Precise and Complete Requirements? An Elusive Goal
Precise and Complete Requirements? An Elusive GoalPrecise and Complete Requirements? An Elusive Goal
Precise and Complete Requirements? An Elusive GoalLionel Briand
 
2024-04-09 - From Complexity to Clarity - AWS Summit AMS.pdf
2024-04-09 - From Complexity to Clarity - AWS Summit AMS.pdf2024-04-09 - From Complexity to Clarity - AWS Summit AMS.pdf
2024-04-09 - From Complexity to Clarity - AWS Summit AMS.pdfAndrey Devyatkin
 
Machine Learning Software Engineering Patterns and Their Engineering
Machine Learning Software Engineering Patterns and Their EngineeringMachine Learning Software Engineering Patterns and Their Engineering
Machine Learning Software Engineering Patterns and Their EngineeringHironori Washizaki
 
Large Language Models for Test Case Evolution and Repair
Large Language Models for Test Case Evolution and RepairLarge Language Models for Test Case Evolution and Repair
Large Language Models for Test Case Evolution and RepairLionel Briand
 
Salesforce Implementation Services PPT By ABSYZ
Salesforce Implementation Services PPT By ABSYZSalesforce Implementation Services PPT By ABSYZ
Salesforce Implementation Services PPT By ABSYZABSYZ Inc
 
GraphSummit Madrid - Product Vision and Roadmap - Luis Salvador Neo4j
GraphSummit Madrid - Product Vision and Roadmap - Luis Salvador Neo4jGraphSummit Madrid - Product Vision and Roadmap - Luis Salvador Neo4j
GraphSummit Madrid - Product Vision and Roadmap - Luis Salvador Neo4jNeo4j
 

Último (20)

Enhancing Supply Chain Visibility with Cargo Cloud Solutions.pdf
Enhancing Supply Chain Visibility with Cargo Cloud Solutions.pdfEnhancing Supply Chain Visibility with Cargo Cloud Solutions.pdf
Enhancing Supply Chain Visibility with Cargo Cloud Solutions.pdf
 
Introduction to Firebase Workshop Slides
Introduction to Firebase Workshop SlidesIntroduction to Firebase Workshop Slides
Introduction to Firebase Workshop Slides
 
eSoftTools IMAP Backup Software and migration tools
eSoftTools IMAP Backup Software and migration toolseSoftTools IMAP Backup Software and migration tools
eSoftTools IMAP Backup Software and migration tools
 
Comparing Linux OS Image Update Models - EOSS 2024.pdf
Comparing Linux OS Image Update Models - EOSS 2024.pdfComparing Linux OS Image Update Models - EOSS 2024.pdf
Comparing Linux OS Image Update Models - EOSS 2024.pdf
 
Powering Real-Time Decisions with Continuous Data Streams
Powering Real-Time Decisions with Continuous Data StreamsPowering Real-Time Decisions with Continuous Data Streams
Powering Real-Time Decisions with Continuous Data Streams
 
Amazon Bedrock in Action - presentation of the Bedrock's capabilities
Amazon Bedrock in Action - presentation of the Bedrock's capabilitiesAmazon Bedrock in Action - presentation of the Bedrock's capabilities
Amazon Bedrock in Action - presentation of the Bedrock's capabilities
 
Zer0con 2024 final share short version.pdf
Zer0con 2024 final share short version.pdfZer0con 2024 final share short version.pdf
Zer0con 2024 final share short version.pdf
 
What’s New in VictoriaMetrics: Q1 2024 Updates
What’s New in VictoriaMetrics: Q1 2024 UpdatesWhat’s New in VictoriaMetrics: Q1 2024 Updates
What’s New in VictoriaMetrics: Q1 2024 Updates
 
Patterns for automating API delivery. API conference
Patterns for automating API delivery. API conferencePatterns for automating API delivery. API conference
Patterns for automating API delivery. API conference
 
Best Angular 17 Classroom & Online training - Naresh IT
Best Angular 17 Classroom & Online training - Naresh ITBest Angular 17 Classroom & Online training - Naresh IT
Best Angular 17 Classroom & Online training - Naresh IT
 
Leveraging AI for Mobile App Testing on Real Devices | Applitools + Kobiton
Leveraging AI for Mobile App Testing on Real Devices | Applitools + KobitonLeveraging AI for Mobile App Testing on Real Devices | Applitools + Kobiton
Leveraging AI for Mobile App Testing on Real Devices | Applitools + Kobiton
 
Osi security architecture in network.pptx
Osi security architecture in network.pptxOsi security architecture in network.pptx
Osi security architecture in network.pptx
 
Understanding Flamingo - DeepMind's VLM Architecture
Understanding Flamingo - DeepMind's VLM ArchitectureUnderstanding Flamingo - DeepMind's VLM Architecture
Understanding Flamingo - DeepMind's VLM Architecture
 
2024 DevNexus Patterns for Resiliency: Shuffle shards
2024 DevNexus Patterns for Resiliency: Shuffle shards2024 DevNexus Patterns for Resiliency: Shuffle shards
2024 DevNexus Patterns for Resiliency: Shuffle shards
 
Precise and Complete Requirements? An Elusive Goal
Precise and Complete Requirements? An Elusive GoalPrecise and Complete Requirements? An Elusive Goal
Precise and Complete Requirements? An Elusive Goal
 
2024-04-09 - From Complexity to Clarity - AWS Summit AMS.pdf
2024-04-09 - From Complexity to Clarity - AWS Summit AMS.pdf2024-04-09 - From Complexity to Clarity - AWS Summit AMS.pdf
2024-04-09 - From Complexity to Clarity - AWS Summit AMS.pdf
 
Machine Learning Software Engineering Patterns and Their Engineering
Machine Learning Software Engineering Patterns and Their EngineeringMachine Learning Software Engineering Patterns and Their Engineering
Machine Learning Software Engineering Patterns and Their Engineering
 
Large Language Models for Test Case Evolution and Repair
Large Language Models for Test Case Evolution and RepairLarge Language Models for Test Case Evolution and Repair
Large Language Models for Test Case Evolution and Repair
 
Salesforce Implementation Services PPT By ABSYZ
Salesforce Implementation Services PPT By ABSYZSalesforce Implementation Services PPT By ABSYZ
Salesforce Implementation Services PPT By ABSYZ
 
GraphSummit Madrid - Product Vision and Roadmap - Luis Salvador Neo4j
GraphSummit Madrid - Product Vision and Roadmap - Luis Salvador Neo4jGraphSummit Madrid - Product Vision and Roadmap - Luis Salvador Neo4j
GraphSummit Madrid - Product Vision and Roadmap - Luis Salvador Neo4j
 

Testing software

  • 1. OHT 10.1 1 • The testing process • Determining the test methodology phase • Planning the tests • Test design • Test implementation • Test case design • Test case data components • Test case sources • Automated testing • The process of automated testing • Types of automated testing • Advantages and disadvantages of automated testing • Alpha and beta site testing programs Chapter 10 – Software Testing - Implementation
  • 2. OHT 10.2 2 Introduction • Need to discuss – testing effectiveness and – testing efficiency. • Equivalently, want to reduce number of undetected errors and yet test with fewer resources over less time. • Need to design testing procedures to be effective in detecting errors and doing so as efficiently as possible. • Also want to look at automatic testing too.
  • 3. OHT 10.3 3 Introduction – Topics to Cover • Much of this set of slides focuses on testing at various levels: – Unit tests – Integration tests – System tests.
  • 4. OHT 10.4 4 Ultimate Desired Outcomes for the Chapter • Today: – Describe the process of planning and designing tests – Discuss the sources for test cases, with their advantages and disadvantages • Next: – List the main types of automated software tests – Discuss the advantages and disadvantages of automated computerized testing as compared to manual testing – Explain alpha and beta site test implementation and discuss their advantages and disadvantages.
  • 5. OHT 10.5 5 10.1 The Testing Process • Testing is done throughout the development process. • Testing is divided into phases beginning in the design phase and ending at the customer’s site. • Testing process is illustrated in the next slide. • The two fundamental decisions that must be made before planning for testing can occur are: – What is the required software quality standard, and – What is the software testing strategy.
  • 6. OHT 10.6 6 Determining the test methodology Planning the tests Designing the tests Performing the tests (implementation)
  • 7. OHT 10.7 7 Determining the Appropriate Software Quality Standard • Different standards required for different software applications. • e.g. safety-critical software or aircraft instrumentation - critical. • In other cases, a medium-level quality would be sufficient, and • So, the expected damage resulting from failed software impacts standard of software quality. • Samples of damage to customers and users as well as to developers are shown on the next two slides:
  • 8. OHT 10.8 8 1. Endangers the safety of human beings 2. Affects an essential organizational function with no system replacement capability available 3. Affects functioning of firmware, causing malfunction of an entire system 4. Affects an essential organizational function but a replacement is available 5. Affects proper functioning of software packages for business applications 6. Affects proper functioning of software packages for a private customer 7. Affects functioning of a firmware application but without affecting the entire system. 8. Inconveniences the user but does not prevent accomplishment of the system’s capabilities
OHT 10.9
9
1. Financial losses
* Damages paid for physical injuries
– Aircraft or auto instrumentation; health equipment…. Lawsuits!!
* Damages paid to organizations for malfunctioning of software
– Companies have many lawyers on staff!!!
* Purchase cost reimbursed to customers
* High maintenance expenses for repair of failed systems
2. Non-quantitative damages
* Expected to affect future sales
* Substantially reduced current sales
OHT 10.10
10
Determining Software Testing Strategy
• Do we want the testing strategy to be big bang or incremental?
– In the past, major testing was done at the end….
– If incremental, top down or bottom up?
• Which parts of the testing plan should be done using white box testing?
• Black box?
• Which parts of the test plan should be done using an automated test model?
OHT 10.11
11
Planning the Tests
• We need to undertake:
– Unit tests
– Integration tests, and
– System tests.
• Unit tests deal with small hunks – modules, functions, objects, classes;
• Integration tests deal with units constituting a subsystem or other major hunks of capability, and
• System tests refer to the entire software package or system.
• These are often done by different constituencies!!
OHT 10.12
12
Lots of Questions
• So we first need to consider five basic issues:
– What to test
– Which sources do we use for test cases
– Who is to perform the tests
– Where to perform the tests, and
– When to terminate the tests.
• Questions with not-so-obvious answers!
OHT 10.13
13
What to Test
• We would like to test everything.
• Not very practical.
• Cannot undertake exhaustive testing…
– The number of paths is infinite….
• Consider:
– Do we totally test modules that are 98% reused?
– Do we really need to test things that have been repeatedly tested with only slight changes?
– How about testing by newbies?
– Testing of sensitive modules that pose lots of risk?
OHT 10.14
14
• So, which modules need to be unit tested?
• Which integrations should be tested?
• Maybe low-priority applications tested in unit testing need not be included in the system tests….
• Lots of planning is needed, as testing IS a very expensive undertaking!
OHT 10.15
15
Rating Units, Integrations, and Applications
• We need to rate these issues to determine their priority in the testing plan.
• Rate based on two factors:
1. Damage severity level – severity of the results if the module / application fails.
– How much damage is done??
– Will it destroy our business? Our reputation??
2. Software risk level – what is the probability of failure?
• Factors affecting risk – next slide:
OHT 10.16
16
Module/Application Issues
1. Magnitude (size)
2. Complexity and difficulty
3. Percentage of original software (vs. percentage of reused software)
Programmer Issues
4. Professional qualifications
5. Experience with the module's specific subject matter
6. Availability of professional support (backup by knowledgeable and experienced staff)
7. Acquaintance with the programmer and the ability to evaluate his/her capabilities
OHT 10.17
17
Computations
• It is essential to calculate risk levels
– We must budget our testing due to its high cost
– But we must temper our testing with cost/risk!
• Helps to determine what to test and to what extent.
• Use a ‘combined’ rating bringing together damage severity (how serious would the damage be?) and risk severity / probability.
• Sample calculations, where A is damage and B is risk:
– C = A + B
– C = k×A + m×B
– C = A × B (most familiar with this one…)
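The three combined-rating formulas can be sketched in code. This is a minimal Python illustration: the function name is hypothetical, and the weights k = 7, m = 2 and the example factors A = 4, B = 3 are taken from the Super Teacher table on the next slide.

```python
# Combined-rating sketch: A = damage severity, B = software risk,
# each rated on a 1-5 scale (5 = highest).

def combined_rating(a, b, method="product", k=7, m=2):
    """Combine damage severity (a) and risk (b) into one priority score."""
    if method == "sum":          # C = A + B
        return a + b
    if method == "weighted":     # C = k*A + m*B
        return k * a + m * b
    return a * b                 # C = A * B (the most familiar form)

# Example: an application rated A=4 (high damage), B=3 (medium risk)
print(combined_rating(4, 3, "sum"))       # 7
print(combined_rating(4, 3, "weighted"))  # 34
print(combined_rating(4, 3))              # 12
```

Whichever formula is chosen, the scores are then ranked and testing resources allocated from the highest combined rating downward.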
OHT 10.18
18
• If we are including units, integrations, and applications in a test plan, we need to know how many resources are needed.
– Thus, we need to prioritize.
• The higher the rating and priority, the more testing resources should be allocated.
• Consider:
– Some tests are based on high percentages of reused code.
– Some applications are developed by new employees.
• We typically use a 5-point scale, where 5 is high.
• (Variations include a 1–5 severity level and a probability of 0.0 to 1.0 of its occurrence.)
• Results for the Super Teacher application appear on the next slide.
OHT 10.19
19
Combined rating method (A = damage severity, B = probability of failure; rank in parentheses)

Application | A | B | A + B | 7×A + 2×B | A × B
1. Input of test results | 3 | 2 | 5 (4) | 25 (5) | 6 (4)
2. Interface for input and output of pupils’ data to and from other teachers | 4 | 4 | 8 (1) | 36 (1) | 16 (1)
3. Preparation of lists of low achievers | 2 | 2 | 4 (5-6) | 18 (7) | 4 (5-6)
4. Printing letters to parents of low achievers | 1 | 2 | 3 (7-8) | 11 (8) | 2 (8)
5. Preparation of reports for the school principal | 3 | 3 | 6 (3) | 27 (3) | 9 (3)
6. Display of a pupil’s achievements profile | 4 | 3 | 7 (2) | 34 (2) | 12 (2)
7. Printing of pupil’s term report card | 3 | 1 | 3 (7-8) | 23 (6) | 3 (7)
8. Printing of pupil’s year-end report card | 4 | 1 | 4 (5-6) | 26 (4) | 4 (5-6)
OHT 10.20
20
1. Which Sources Should Be Used for Test Cases?
• Do we use live test cases or synthetic test cases?
• All three levels of tests (unit, integration, system) should consider these sources.
• Use live data or contrived (dummy) data??
• What do you think??
• Also need to consider single / combined tests and the number of tests.
• How about if the testing is top down? Bottom up? What sources do you think might be needed then??
OHT 10.21
21
Who Performs the Tests?
• Unit testing – done by the programmer and/or development team.
• Integration testing – can be done by the development team or a testing unit.
• System testing – usually done by an independent testing team (internal, or external consultants).
• For small companies, two development teams can swap and test each other's work.
• Can always outsource testing too.
OHT 10.22
22
Where to Perform the Tests?
• Typically at the software developer’s site.
• For system tests, test at the developer’s or the customer’s site (target site).
• If outsourced, testing can be done at the consultant’s site.
OHT 10.23
23
When Are Tests Terminated?
• This is always the $64,000 question!!
• The decision normally applies to system tests.
• Five typical alternatives:
1. The completed implementation route
– Test until all is error free (good luck!)
– All testing, including regression testing;
– Disregards budget and timetable constraints.
– Applies to the perfection approach.
OHT 10.24
24
2. The mathematical models application route
• Here, modeling is used to estimate the percentage of undetected errors based on the rate of error detection.
• When the detection rate falls below a certain level, stop.
• Disadvantage: the math model may not fully represent the project’s characteristics.
– Thus testing may be cut short or extended too far.
• Advantage: a well-defined stopping point.
OHT 10.25
25
3. The error seeding route
• Here, we seed errors prior to testing.
• The underlying assumption is that the percentage of discovered seeded errors will correspond to the percentage of real errors detected.
• Stop once the residual percentage of undetected seeded errors reaches a predefined level considered acceptable for ‘passing’ the system.
• Disadvantages: extra workload for testers; seeding is also based on the past experience of testers;
• Moreover, the seeding method cannot accurately estimate the residual rate of undetected errors in unfamiliar systems.
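Under the seeding assumption above, the residual-error estimate can be sketched as follows. This is a hypothetical illustration using a Mills-style estimator (one common form of the calculation; the slide does not prescribe a specific formula, and the numbers are invented):

```python
# Error-seeding sketch: if testers find s of S seeded errors and n real
# errors, the "same detection rate" assumption gives an estimate of the
# total real-error count N, and hence of the residual undetected errors.

def estimate_total_errors(seeded_total, seeded_found, real_found):
    """Mills-style estimate: N ~ n * S / s (requires seeded_found > 0)."""
    return real_found * seeded_total / seeded_found

S, s, n = 50, 40, 120           # 40 of 50 seeded errors found; 120 real found
N = estimate_total_errors(S, s, n)
residual = N - n                # estimated real errors still undetected
print(N, residual)              # 150.0 30.0
```

If the residual estimate (here 30) is below the predefined acceptance level, testing can stop; otherwise it continues.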
OHT 10.26
26
4. The dual independent testing teams route
• Here, two teams implement the testing process independently.
• Compare the lists of detected errors.
• Calculate the number of errors left undetected.
• Lots of statistics here.
• High costs. Justified when??
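The "lots of statistics" step can be sketched with a capture-recapture (Lincoln-Petersen) estimate, a standard statistical approach for two independent samples; the slide does not name a specific estimator, so this and the sample numbers are illustrative assumptions:

```python
# Dual-team sketch: from the overlap between the two teams' error lists,
# estimate the total error count and hence the errors neither team found.

def lincoln_petersen(n1, n2, n_both):
    """Estimate total errors: N ~ n1 * n2 / n_both (requires n_both > 0)."""
    return n1 * n2 / n_both

n1, n2, n_both = 60, 50, 30     # errors found by team 1, team 2, and both
N = lincoln_petersen(n1, n2, n_both)
undetected = N - (n1 + n2 - n_both)   # estimated errors neither team found
print(N, undetected)            # 100.0 20.0
```

The small overlap-driven estimate is what makes the duplicated effort statistically useful, which is why the high cost may be justified for critical systems.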
OHT 10.27
27
5. Termination after resources have petered out
• This means stop when the budget or time for testing has run out.
• Very common in industry!
OHT 10.28
28
Test Plan
• Lastly, system testing is documented in a software test plan.
• Common formats are available.
OHT 10.29
29
Test Design and Software Test Plan
• Products of test design:
– Detailed design and procedures for each test
– The input database / files for testing.
• There are standard software test plan (STP) templates.
OHT 10.30
30
1 Scope of the tests
1.1 The software package to be tested (name, version and revision)
1.2 The documents that provide the basis for the planned tests
2 Testing environment
2.1 Sites
2.2 Required hardware and firmware configuration
2.3 Participating organizations
2.4 Manpower requirements
2.5 Preparation and training required of the test team
OHT 10.31
31
3 Test details (for each test)
3.1 Test identification
3.2 Test objective
3.3 Cross-reference to the relevant design document and the requirement document
3.4 Test class
3.5 Test level (unit, integration or system tests)
3.6 Test case requirements
3.7 Special requirements (e.g., measurements of response times, security requirements)
3.8 Data to be recorded
4 Test schedule (for each test or test group) including time estimates for:
4.1 Preparation
4.2 Testing
4.3 Error correction
4.4 Regression tests
OHT 10.32
32
1 Scope of the tests
1.1 The software package to be tested (name, version and revision)
1.2 The documents providing the basis for the designed tests (name and version for each document)
2 Test environment (for each test)
2.1 Test identification (the test details are documented in the STP)
2.2 Detailed description of the operating system and hardware configuration and the required switch settings for the tests
2.3 Instructions for software loading
OHT 10.33
33
3 Testing process
3.1 Instructions for input, detailing every step of the input process
3.2 Data to be recorded during the tests
4 Test cases (for each case)
4.1 Test case identification details
4.2 Input data and system settings
4.3 Expected intermediate results (if applicable)
4.4 Expected results (numerical, message, activation of equipment, etc.)
5 Actions to be taken in case of program failure/cessation
6 Procedures to be applied according to the test results summary
OHT 10.34
34
Test Implementation
• Really, this is just running the tests, correcting detected errors, and running regression tests.
• Testing is done when the outcomes satisfy the developers.
• When are these tests run?? (time of day / date??)
OHT 10.35
35
Regression Testing
• Need not test everything.
• Typically re-test only those artifacts directly changed and those providing inputs to or receiving outputs from the changed artifacts (modules).
• Very often, new errors are introduced when changes are made.
• There’s always risk in not testing everything… but these decisions must be made.
• Results of testing are documented in a test report.
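The selection rule above — re-test the changed modules plus those directly feeding them input or consuming their output — can be sketched as follows. The module names and the dependency map are hypothetical:

```python
# Regression-selection sketch: re-test the changed modules plus every
# module that directly calls them or is directly called by them.

# Hypothetical dependency map: module -> modules it calls.
calls = {
    "report": ["format", "db"],
    "format": ["db"],
    "ui": ["report"],
}

def retest_set(changed):
    """Changed modules, their callees (outputs), and their callers (inputs)."""
    selected = set(changed)
    for mod in changed:
        selected.update(calls.get(mod, []))                          # callees
        selected.update(m for m, cs in calls.items() if mod in cs)   # callers
    return selected

# A change to "report" pulls in its callees (format, db) and caller (ui).
print(sorted(retest_set({"report"})))   # ['db', 'format', 'report', 'ui']
```

Stopping at direct neighbours, rather than transitively closing the graph, is exactly the cost/risk trade-off the slide describes.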
OHT 10.37
37
1 Test identification, site, schedule and participation
1.1 The tested software identification (name, version and revision)
1.2 The documents providing the basis for the tests (name and version for each document)
1.3 Test site
1.4 Initiation and concluding times for each testing session
1.5 Test team members
1.6 Other participants
1.7 Hours invested in performing the tests
2 Test environment
2.1 Hardware and firmware configurations
2.2 Preparations and training prior to testing
OHT 10.38
38
3 Test results
3.1 Test identification
3.2 Test case results (for each test case individually)
4 Summary tables for total number of errors, their distribution and types
4.1 Summary of current tests
4.2 Comparison with previous results (for regression test summaries)
5 Special events and testers' proposals
5.1 Special events and unpredicted responses of the software during testing
5.2 Problems encountered during testing
5.3 Proposals for changes in the test environment, including test preparations
5.4 Proposals for changes or corrections in test procedures and test case files