1. ROLE OF TESTING IN
CERTIFICATION PROCESS
Presented by
Rishu Seth (seth@stud.fh-frankfurt.de)
M.Sc. High Integrity Systems
University of Applied Sciences - Frankfurt am Main
2. TABLE OF CONTENTS
• Introduction
• Certification Process
• Role of Testing
• Principal Certification Problems
• Requirements for Testing based on Standards
• Conclusion
• References
3. INTRODUCTION
• TESTING:
According to the dictionary:
‘To ascertain (the worth, capability or endurance)
(of a person or thing) by subjection to certain
examinations.’
There is no universally accepted definition in software engineering.
• CERTIFICATION:
A written guarantee that a system or component
complies with its specified requirements and is
acceptable for operational use.
4. CERTIFICATION PROCESS
• Carried out by government agencies or other
organizations with a national standing.
• Can be applied to organizations, individuals,
tools, or methods.
• Does not necessarily mean that the system is
correct.
5. NEED FOR CERTIFICATION
• Required for legal reasons.
• Important for commercial reasons like having
a sales advantage.
• Demonstrates competence in specific areas.
6. PARTS OF CERTIFICATION PROCESS
• Verification & Validation (V&V) – parts of the
longer certification process.
• V&V have become increasingly important in the
software development process because of
growing complexity.
• V&V is essential from the very beginning of
the development life cycle.
7. VERIFICATION
• The process of evaluating a system or component to
determine whether the product of a given
development phase satisfies the conditions imposed
at the start of that phase, i.e.
‘Are we building the product right?’
• Two categories:
Dynamic Testing
Static Testing
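The two categories can be contrasted in a minimal Python sketch (the source string and the checks are hypothetical, chosen only to illustrate the distinction): static testing inspects the artifact without executing it, while dynamic testing runs it and observes the behaviour.

```python
import ast

# Artifact under test: a small function with an obvious defect.
src = "def f(x):\n    return x / 0\n"

# Static testing: examine the source code without running it.
tree = ast.parse(src)
uses_division = any(isinstance(node, ast.Div) for node in ast.walk(tree))

# Dynamic testing: execute the code and observe its behaviour.
namespace = {}
exec(src, namespace)
try:
    namespace["f"](1)
    raised = False
except ZeroDivisionError:
    raised = True

# Both approaches flag the defect, but in different ways:
# static analysis sees the risky construct, dynamic testing
# observes the actual failure at run time.
```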
8. VALIDATION
• The process of evaluating a system or component
during or at the end of the development process to
determine whether it satisfies specified
requirements, i.e.
‘Are we building the right product?’
• Validation Techniques:
Formal Methods
Fault injection
Hazard analysis
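Fault injection, for example, deliberately introduces failures to check that the system degrades safely. A minimal sketch, where the `FlakySensor` class and `safe_read` helper are hypothetical names invented for illustration:

```python
class FlakySensor:
    """Stand-in for a hardware dependency whose failure we can inject."""
    def __init__(self, fail=False):
        self.fail = fail

    def read(self):
        if self.fail:
            raise IOError("injected sensor fault")
        return 42

def safe_read(sensor, default=0):
    """The behaviour under validation: degrade gracefully on failure."""
    try:
        return sensor.read()
    except IOError:
        return default

# Nominal run, then a run with the fault injected.
assert safe_read(FlakySensor()) == 42
assert safe_read(FlakySensor(fail=True)) == 0
```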
9. ROLE OF TESTING
• Software Testing is a process of verifying and
validating that a software application or program:
Meets the business and technical requirements
that guided its design and development.
Works as expected.
• Testing provides insight into the risks that will be
incurred if lower quality is accepted, which is also the
principal objective of testing.
10. WHY DO TESTING?
• Software testing is focused on finding defects in the final product.
• Example of an important defect that better testing would have found:
In June 1996 the first flight of the European Space Agency's
Ariane 5 rocket failed shortly after launch, resulting in an
uninsured loss of $500,000,000. The disaster was traced to the
lack of exception handling for an operand error when a 64-bit
floating-point value was converted to a 16-bit signed integer.
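The failure mode can be reproduced in a few lines of Python (a sketch, not the original Ada code; the function names and the sample value are invented): truncating a value into a signed 16-bit integer silently wraps around unless the conversion is range-checked.

```python
def to_int16_unchecked(x: float) -> int:
    # Silently wraps, like the unprotected conversion on Ariane 5.
    n = int(x) & 0xFFFF                  # keep only the low 16 bits
    return n - 0x10000 if n >= 0x8000 else n

def to_int16_checked(x: float) -> int:
    # Defensive version: raise instead of wrapping.
    n = int(x)
    if not -32768 <= n <= 32767:
        raise OverflowError(f"{x} does not fit in a signed 16-bit integer")
    return n

velocity = 40000.0                       # a value outside the 16-bit range
wrapped = to_int16_unchecked(velocity)   # wraps to a nonsense negative value
# to_int16_checked(velocity) would raise OverflowError instead,
# giving the software a chance to handle the condition.
```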
11. WHAT DO WE TEST?
• First, test what is important, i.e. focus on the core
functionality – the parts that are critical or popular.
• The value of software testing is that it goes far
beyond testing the underlying code.
• A comprehensive testing regime examines all
components associated with the application.
12. WHAT DO WE TEST?
• Testing can involve some or all of the following factors:
Business requirements
Functional design requirements
Technical design requirements
Regulatory requirements
Programmer code
Systems administration standards and restrictions
Corporate standards
Professional or trade association best practices
Hardware configuration
Cultural issues and language differences
13. WHO DOES THE TESTING?
• It is not a one person job.
• It takes a team, the size of which depends on the
application.
• The developer of the application should play a
reduced role.
14. TEST SPECIFICATION TECHNIQUES
• White Box Testing:
Based on the program code or technical design.
Knowledge about the internal structure of the system plays an
important role.
Other terms for this kind of technique are Glass-Box
Testing and Structural Testing.
• Black Box Testing:
Based on functional specifications and quality requirements.
Knowledge about the structure of the system is not used, but
the judgement is made merely from a functional point of view
of the system.
Other terms for this kind of technique are Functional
Testing and Behavioural Testing.
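A hypothetical `classify` function makes the contrast concrete: the black-box tests are derived from the stated specification alone, while the white-box tests are chosen by looking at the branches in the code.

```python
def classify(age: int) -> str:
    """Spec: return 'minor' below 18, otherwise 'adult'."""
    if age < 0:
        raise ValueError("age cannot be negative")
    if age < 18:
        return "minor"
    return "adult"

# Black-box tests: derived from the specification only.
assert classify(10) == "minor"
assert classify(30) == "adult"

# White-box tests: derived from the code's internal structure -
# one test per branch, including boundaries and the error path.
assert classify(17) == "minor"   # just below the branch boundary
assert classify(18) == "adult"   # exactly at the branch boundary
try:
    classify(-1)
    assert False, "expected ValueError"
except ValueError:
    pass
```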
15. V-MODEL OF SOFTWARE TESTING
• V-Model of testing incorporates testing into the
entire software development life cycle.
• It illustrates that testing can and should start at the
very beginning of the project.
• It also illustrates how each subsequent phase should
verify and validate work done in the previous phase
and how work done during development is used to
guide the individual testing phases.
17. V-MODEL OF SOFTWARE TESTING
• Unit Testing - Tests the individual units of code that
comprise the application.
• System testing - Validates and verifies the functional
design specification and sees how all the modules
work together.
• Integration testing - Tests not only all the
components that are new or changed and are
needed to form a complete system, but it also
requires involvement of other systems.
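Of these phases, unit testing is the one most directly expressed in code. A minimal sketch using Python's built-in `unittest` framework, where the `add` function is a hypothetical unit under test:

```python
import unittest

def add(a, b):
    """The unit under test."""
    return a + b

class TestAdd(unittest.TestCase):
    def test_positive_numbers(self):
        self.assertEqual(add(2, 3), 5)

    def test_negative_numbers(self):
        self.assertEqual(add(-2, -3), -5)

# Run the suite programmatically and collect the result.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestAdd)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```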
18. V-MODEL OF SOFTWARE TESTING
• Integration Testing – Has sub-types:
Compatibility Testing
Load Testing
Stress Testing
• User Acceptance Testing (beta testing or end-user
testing) – Testing moves from the hands
of the IT department into those of the business
users, who then perform real-world testing.
• Production verification testing - It identifies
unexpected changes to existing processes introduced
by the new applications.
19. V-MODEL OF SOFTWARE TESTING
• The V-Model of testing identifies five software testing phases,
each with a certain type of test associated with it.
20. PRINCIPAL CERTIFICATION PROBLEMS
• Reliability Certification Problem:
Object-oriented technology is well suited to
producing reusable modules.
Components are often not reused if their
reliability cannot be guaranteed.
Object-oriented technology itself does not specify
any particular testing methods.
21. PRINCIPAL CERTIFICATION PROBLEMS
• Verification Methods Drawbacks:
Every existing standard prescribes verification
approaches for software.
But these are not advanced enough in relation to the
safety integrity levels required of the software.
They are not entirely practical:
only practical demonstrations can validate the
usability of some of the verification methods.
22. REQUIREMENTS FOR TESTING BASED
ON STANDARDS
• Aerospace: RTCA/DO-178B (EUROCAE ED-12B):
Software Verification: Verification is the largest
activity in DO-178B, accounting for over two
thirds of the total process. It defines different
criticality levels.
Level D: Software verification requires test
coverage of high-level requirements only.
Level C: Low-level requirement testing is
required.
23. REQUIREMENTS FOR TESTING BASED
ON STANDARDS
Level B: Decision coverage is required.
Level A: Code requires Modified Condition/
Decision Coverage (MC/DC).
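These coverage criteria can be illustrated on a hypothetical two-condition decision. Decision coverage only requires the decision to evaluate both true and false; MC/DC additionally requires showing that each condition independently affects the outcome.

```python
def decision(a: bool, b: bool) -> bool:
    # Hypothetical guard with two conditions.
    return a and b

# Decision coverage: the decision takes both outcomes.
decision_tests = [(True, True), (False, False)]
assert {decision(a, b) for a, b in decision_tests} == {True, False}

# MC/DC: each condition is shown to independently flip the outcome.
# (True, True) vs (False, True) isolates condition a;
# (True, True) vs (True, False) isolates condition b.
mcdc_tests = [(True, True), (False, True), (True, False)]
assert decision(True, True) != decision(False, True)   # a matters
assert decision(True, True) != decision(True, False)   # b matters
```

Note that the MC/DC set needs only three of the four possible input combinations, which is why MC/DC scales far better than exhaustive condition testing on decisions with many conditions.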
24. REQUIREMENTS FOR TESTING BASED
ON STANDARDS
• Biomedical Engineering: IEC 60601-1-4
Verification – A plan shall be produced to show
how the Safety requirements for each
Development Life-cycle phase will be verified.
The plan includes:
Documentation of verification strategies
Selection and utilization of tools
Coverage criteria for Verification.
25. REQUIREMENTS FOR TESTING BASED
ON STANDARDS
Validation – A Validation plan shall be produced to
show that correct safety requirements have been
implemented. The Validation shall be performed
according to the Validation plan. The results of
Validation activities shall be documented,
analyzed and assessed.
26. CONCLUSION
• Software Testing has the potential to save time and
money by identifying problems early and to improve
customer satisfaction and safety by delivering a
product with fewer errors.
• To make optimum use of testing, it should be
practiced throughout the development life cycle of
the product.
• The requirements for verification and validation
should be proactively modernized as new
technologies are introduced.
27. REFERENCES
• Verification/Validation/Certification, Carnegie Mellon University,
18-849b Dependable Embedded Systems, Spring 1999, Author:
Eushiuan Tran
• Software Testing Fundamentals—Concepts, Roles, and Terminology,
John E. Bentley, Wachovia Bank, Charlotte NC
• Software Testing, A guide to the TMAP Approach, Martin Pol, Ruud
Teunissen, Erik van Veenendaal
• Reliability Certification of Software Components, Claes Wohlin and
Björn Regnel, Department of Communication Systems, Lund
Institute of Technology, Lund University, Box 118, SE-221 00 Lund,
Sweden.
• Software Safety Certification: A Multi-domain Problem, Patricia
Rodriguez-Dapena, European Space Agency, IEEE.
28. REFERENCES
• Applying DO178B for IV & V of Safety critical
Software, White Paper, Sreekumar Panicker, Wipro
Technologies
• Testing Medical Devices, Gary Powalisz, GE Healthcare. Available:
http://www.evaluationengineering.com/index.php/solutions/instrumentation/testing-medical-devices.html,
last accessed on 11/01/2011
• IPL Testing Tools and IEC 61508, IPL Information
Processing Ltd., Eveleigh House, Grove Street, UK