Table of Contents

Introduction
Purpose
Market for Software Testing
Integration Testing and Its Relevance
    Integration Testing and SDLC
    Integration Testing and SOA
        Challenges of SOA Integration Testing
        Players in the SOA Integration Testing Space
Integration Testing Service from HCL
Value Propositions of xFIT to Clients
Introduction
Wherever there are two or more individuals, departments in an organization, or components in a machine, common rules of etiquette, communication, agreement, disagreement, and other norms of integration are required for the whole to function in a harmonious manner. Conceptually, it is no different in the world of software. In a given company, several applications and solutions need to integrate without problems in order to meet the demands of end users. To ensure that they do, integration testing has become an indispensable aspect of software development.
Purpose
This paper highlights the significance of software testing, especially integration testing, during development. It also identifies the key challenges of Service-Oriented Architecture (SOA) integration testing and illustrates how they can be overcome with comprehensive integration testing frameworks such as xFIT from HCL. The audience of this paper is business-savvy managers and senior executives interested in understanding how integration testing can act as a business differentiator.
Market for Software Testing
The software testing market has been growing steadily, and according to a study by Ovum, it is expected to reach $56.9 billion by 2013, of which $32.8 billion will be accounted for by the outsourced testing market.1 Although still at a nascent stage, Russia, Eastern Europe, China, and India are emerging as important hubs for offshore testing.2 According to Ovum, all the tier-1 IT service vendors with test services, including IBM, Capgemini (itself and through Sogeti), and HP-EDS, have offshore testing capacity in India.3 The significance of software testing can be gauged by the fact that a number of companies (e.g., Acutest, AppLabs, SDLC Solutions Ltd., Software Quality Systems (SQS), Tescom) have entered the "Pure Play Independent Software Testing Services" space.4 Not surprisingly, several companies in India (e.g., SNS Technologies, Verisoft, RelQ, AppLabs, Maveric, ReadyTestGo, Quexst Associates, Amperesoftware, and Pure Testing) have increased the scope of their software testing services.5
1. See Alexander Simkin, "Testing Services: An Opportunity When Budgets Are Tight?" (27 February 2009), p. 19, http://www.ovum.com/go/content/s,77365 [May 2010]. Bracketed dates indicate when a document was accessed.
2. See "2009 Top Software Testing and Quality Assurance Outsourcing Vendors," http://www.datamonitor.com/store/Product/2009_top_software_testing_and_quality_assurance_outsourcing_vendors?productid=3348BB7D-458F-4771-9D66-E7CA262310A7 [May 2010]. According to Datamonitor's study, the software testing market was estimated at $14 billion in 2009. Of that, the outsourced/offshored market was estimated at $8.5 billion, with India alone accounting for $3.3 billion. Datamonitor's estimates are lower than Ovum's, but both point to the fact that the testing market has been growing year on year over the last few years.
3. Alexander Simkin, op. cit., pp. 9-10.
4. Companies in this space offer only testing services, without product development or system integration services. See "Pure Play Independent Software Testing Services Market in Europe 2008-2012" (9 December 2009), http://www.technavio.com/content/pure-play-independent-software-testing-services-market-europe-2008-2012 [May 2010]. See also "Capgemini and Sogeti Integrate Software Testing Businesses," http://www.globalservicesmedia.com/News/Home/Capgemini-and-Sogeti-integrate-software-testing-businesses/21/27/9381/GS100226268109 [May 2010].
Furthermore, the WITCH companies (i.e., Wipro,6 Infosys,7 TCS,8 Cognizant,9 and HCL10) have responded to the growing demand for software testing by giving a boost to their testing services divisions.
It would not be far-fetched to assume that the steady growth in the software testing market is driven by the heavy costs incurred by businesses when software fails to perform as expected.
In 2002, the National Institute of Standards and Technology (NIST) commissioned a study to investigate the economic impact of inadequate infrastructure for software testing in the U.S. Based on extensive surveys of software developers and users conducted by the Research Triangle Institute (RTI), the investigation team reported that "the national costs of an inadequate infrastructure for software testing is estimated to range from $22.2 to $59.5 billion."11 The investigation team learned that users of software design tools in the automotive and aerospace industries, as well as the financial services industry,12 spent significant resources dealing with software errors. In these sectors alone, the total cost of inadequate software testing was estimated at $1.8 billion and $3.3 billion, respectively.13
Software failure due to insufficient testing or quality checks and inadequate backup plans is a serious matter, given that it leads to financial losses and hurts the reputation and brand identity of the companies or organizations involved. High-profile cases of software failure include, but are not limited to, the ill-fated Mars mission, the faulty Denver airport baggage-handling system, the failure of the automated transaction settlement system for the London Stock Exchange, the overexposure of cancer patients to radiation by Canada's Therac-25 radiation therapy machine, the crash of Ariane 5 Flight 501, and the Toys "R" Us order fulfillment failure,14 among many others.15
5. For a good discussion on this, see Nanda Kasabe, "Software Testing: After Thought or Necessity?" (25 January 2005), http://dqindia.ciol.com/content/special/2005/105012501.asp [May 2010].
6. See "Testing for Deployment Readiness," http://www.wipro.com/services/testing-services/index.htm [June 2010]; "Wipro Testing Services," http://www.wipro.com/services/testing-services/pdf/WipCafe.pdf [June 2010]; Kathleen Goolsby, "Greening Test Labs: Leveraging WTAS for Green Benefits and Managing Your Test Lab Environment Challenges," http://www.wipro.com/resource-center/library/pdf/Wipro_Greening_Test_Labs_Whitepaper.pdf [June 2010].
7. See "Independent Validation and Testing Services," http://www.infosys.com/offerings/it-services/independent-validation-testing-services/service-offerings/pages/index.aspx [June 2010].
8. The testing segment at TCS has seen a dramatic rise in the last couple of years. "In FY07, TCS…registered [a] growth of 122% in testing business, which is more than 3 times the growth in IT service segment as a whole." Asit C. Mehta Investment Intermediates Ltd., "Tata Consultancy Services Ltd. - Company Report" (28 February 2008), p. 14, http://smartinvestor.in/BSCMS/PDF/tata%20consultancy%20services%20ltd.pdf [June 2010].
9. See http://www.cognizant.com/html/solutions/services/testing/domain-aligned-services.asp [June 2010].
10. See "Testing Services," http://www.hcltech.com/automotive/testing-services/ [June 2010].
11. "The Economic Impacts of Inadequate Infrastructure for Software Testing" (May 2002), prepared by RTI for NIST, p. ES-3, http://www.nist.gov/director/planning/upload/report02-3.pdf [May 2010].
12. Both these sectors were studied extensively in order to gain insight into the extent of the problems of poor software testing. Ibid.
13. Ibid., pp. ES-8 and ES-10.
14. For a list of famous failures of supply chain software, see "The 11 Greatest Supply Chain Disasters," http://www.scdigest.com/assets/reps/SCDigest_Top-11-SupplyChainDisasters.pdf
15. The following sources give an idea of the extent of (well-known) software failures and their impact: Charles T. Carroll, "The Cost of Poor Testing: A U.S. Government Study (Part 1)," EDPACS, Volume 31, Issue 1 (July 2003), pp. 1-17, http://www.informaworld.com/smpp/content~content=a768433146&db=all [May 2010]; "Top 10 Worst Internet Outages," http://leatherheadblog.com/2009/02/28/top-10-worst-internet-outages/ [May 2010]; "Google Corporate User 'Demanding Serious Answers' Over Mail Outage" (25 February 2009), http://www.computing.co.uk/computing/news/2237182/google-corporate-user-demanding [May 2010]; "History's Most (In)Famous Software Failures" (26 November 2007), http://bugsniffer.blogspot.com/2007/11/infamous-software-failures.html [May 2010]; Michael Krigsman, "40 IT Failures Caused by Software Bugs" (1 October 2007), http://www.zdnet.com/blog/projectfailures/40-it-failures-caused-by-software-bugs/427 [May 2010]; Tim Clark, "eBay Online Again After 14-hour Outage," http://news.cnet.com/eBay-online-again-after-14-hour-outage/2100-1017_3-229518.html [May 2010]; "BlackBerry Network Failure Due To Insufficient Testing Of Software Upgrade" (Internet Business News, 20 April 2007), http://findarticles.com/p/articles/mi_m0BNG/is_2007_April_20/ai_n19021503/ [May 2010]. See also "The Economic Impacts of Inadequate Infrastructure for Software Testing," op. cit., pp. 1-11 to 1-13.
Integration Testing and Its Relevance

In today's competitive environment, where the shortest possible time-to-market and reductions in operational cost can act as business differentiators, the call for automated integration testing is gaining currency. It addresses both of these business priorities by:

• Decreasing the number of skilled resources required for each testing task
• Minimizing or eliminating errors and reducing the time required for troubleshooting
• Ensuring the superior quality of software and regulatory compliance
• Speeding up the testing process and software development
• Generating comprehensive reports on the testing process
Integration Testing and SDLC

The purpose of integration testing is to verify that the solution being developed meets the functional, performance, and reliability requirements. In the context of the Software Development Life Cycle (SDLC), integration testing is done following unit testing and prior to system testing. The idea is to catch errors that were missed during unit testing and to eliminate or minimize the possibility of bugs surfacing during system testing.16
Integration testing can be performed in a non-incremental or an incremental manner. The former is also referred to as the "big bang" approach, where integration testing is done only after all the modules or components are completed. This approach, while common, is not recommended because it makes it almost impossible to determine the cause of errors or problems. Incremental testing, by contrast, integrates small segments of the system step by step, making it relatively easy to isolate problems. This approach is highly encouraged because it enforces a more systematic approach to software development and improves reliability.
Incremental testing may be done using any of the following methods (a brief stub-and-driver sketch follows the list):

• TOP-DOWN: Higher-level control modules are developed and tested first, with stubs standing in for the lower-level modules.
• BOTTOM-UP: Lower-level modules are developed and tested first, with drivers standing in for the higher-level modules.
• SANDWICH: Combines the top-down and bottom-up approaches.
• RISK-DRIVEN: Critical modules are developed and tested first, with stubs for the less-critical modules.17
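To make the stub-and-driver vocabulary concrete, here is a minimal top-down sketch in Python. All names in it (OrderProcessor, PaymentGateway-style stub, charge) are hypothetical and invented for illustration; they do not come from any product discussed in this paper.

```python
# A minimal top-down example: the higher-level module (OrderProcessor) is
# tested while its lower-level dependency is replaced by a stub.
# All names here are hypothetical and purely illustrative.
import unittest
from unittest import mock


class OrderProcessor:
    """Higher-level control module under test."""

    def __init__(self, payment_gateway):
        self.payment_gateway = payment_gateway

    def place_order(self, amount):
        # Delegates to a lower-level module that may not be built yet.
        return "CONFIRMED" if self.payment_gateway.charge(amount) else "DECLINED"


class TopDownIntegrationTest(unittest.TestCase):
    def test_order_confirmed_when_payment_succeeds(self):
        # The stub stands in for the unfinished lower-level module and
        # returns a canned answer, so the higher-level logic can be tested.
        payment_stub = mock.Mock()
        payment_stub.charge.return_value = True
        self.assertEqual(OrderProcessor(payment_stub).place_order(100), "CONFIRMED")


if __name__ == "__main__":
    unittest.main()
```

In the bottom-up variant the roles reverse: the real lower-level module is exercised by a small driver script while the higher-level module does not yet exist.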
Integration Testing and SOA
Where solutions are developed using the SOA approach, integration testing is undertaken to ensure that all Web services or asynchronous messages in a given solution work as anticipated. This is, of course, easier said than done, because "[e]ven if [Web services] didn't go wrong when tested individually it's quite possible for them to go wrong when they're holding hands,"18 or, as Chris Benedetto put it, "[w]ith so many moving parts in the SOA environment, and a multitude of heterogeneous systems to expose as services, how can businesses ensure all those crucial parts are working properly?"19 End-to-end SOA and integration testing is therefore needed to see what is happening inside the SOA environment and to validate that services, messages, interfaces, and business processes are performing as expected. Integration testing of a service means testing the functionality of a Web service and its relationship with all immediate (directly connected) services, as the sketch below illustrates.
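As a purely illustrative sketch, the following Python test exercises one hypothetical service together with its directly connected neighbor. The host, paths, and field names are assumptions invented for this example, not part of any real system.

```python
# Hypothetical sketch: a cart service is tested together with the catalogue
# service it depends on. All URLs and field names are invented for
# illustration; a real test would target the actual service contracts.
import requests

BASE = "http://test-env.example.com"  # assumed test-environment host


def test_cart_agrees_with_catalogue():
    # Fetch an item from the catalogue service.
    item = requests.get(f"{BASE}/catalogue/items/42", timeout=10).json()

    # Add the same item through the directly connected cart service.
    cart = requests.post(
        f"{BASE}/carts/session-1/items",
        json={"item_id": 42, "quantity": 1},
        timeout=10,
    ).json()

    # The integration point under test: both services must agree on price.
    assert cart["lines"][0]["unit_price"] == item["price"]
```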
Challenges of SOA Integration Testing
The aspects of SOA-based solutions that pose the greatest challenges to testers are as follows:

• An SOA environment comprises many components that need to be integrated: Web services, enterprise service buses (ESBs), legacy assets, databases, files, and numerous transport protocols that move messages and orchestrate services. The SOA landscape is "always on," i.e., it is continuously changing; it is loosely coupled and comprises multiple providers (in-house, business partners, and commercial services). Moreover, since SOA applications are reusable and highly distributed, multiple teams need to be involved in the testing process.20 Consequently, testing teams need to have a good understanding of the SOA environment.
• Where Web services are used, functional and integration testing assumes that testing teams understand the relationships among services (e.g., a cart service and a catalogue service) within a specific business context or business event (e.g., an existing customer making a purchase). It also assumes that testing teams know how to validate those services (i.e., determine whether or not the services deliver the business functionality required by the SOA solution).21 Similarly, performance or stress/load testing assumes that the testing team has an in-depth understanding of the systems, including hardware, software, firmware, protocols, transactions, and the business.22
16. For a fairly straightforward explanation, see "Integration Testing," http://msdn.microsoft.com/en-us/library/aa292128(VS.71).aspx [May 2010]; Tom Mochal, "Integration Testing Will Show You How Well Your Modules Get Along" (10 September 2001), http://articles.techrepublic.com.com/5100-10878_11-1061716.html [May 2010]; "Integration Testing," http://www.testinggeek.com/index.php/testing-types/life-cycle/54-integration-testing [May 2010]; "Integration Testing," http://en.wikipedia.org/wiki/Integration_testing [May 2010].
17. There are many references on this subject. For a brief summary, see "Integration Testing," http://www.site.uottawa.ca/~awilliam/seg3203/Integration.ppt [May 2010]; "Integration Testing in Small" (9 November 2008), http://www.testingexcellence.com/integration-testing-in-small/ [May 2010].
18. For a to-the-point view on this, see Robin Bloor, "SOA and Software Testing" (19 June 2006), http://www.it-director.com/technology/infrastructure/content.php?cid=8568 [May 2010].
19. Chris Benedetto, "SOA and Integration Testing: The End-to-End View" (27 September 2006), http://soa.sys-con.com/node/275057 [May 2010].
20. See Colleen Frye, "SOA Applications Bring Testing Challenges" (29 October 2008), http://searchsoftwarequality.techtarget.com/news/article/0,289142,sid92_gci1336973_mem1,00.html [May 2010].
21. For a good summary of the challenges, see David W. Johnson, "Unit, Integration Testing First Steps Toward SOA Quality" (27 August 2008), http://searchsoftwarequality.techtarget.com/tip/0,289483,sid92_gci1327168_mem1,00.html [June 2010], and "Use Functional and Regression Testing to Validate SOA Solutions" (6 October 2008), http://searchsoftwarequality.techtarget.com/tip/0,289483,sid92_gci1333546_mem1,00.html [June 2010].
• The testing approach needs to break down end-to-end transactions to detect the point of failure. This means that testing teams need to capture and analyze all the Simple Object Access Protocol (SOAP) messages that are passed from one component to another (see the sketch after this list). In non-Web-services scenarios, testing teams need to understand the error messages exchanged between native adaptors.
• Testing teams need to adopt agile practices and tools, using automation and test-driven development techniques. This is in contrast to the 'first build, then test' approach.
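The sketch below illustrates the SOAP-message analysis mentioned in the list above: a small Python helper that inspects a captured SOAP 1.1 envelope for a Fault element to help locate the failing component. The namespace is the standard SOAP 1.1 envelope namespace; the sample payload is invented for illustration.

```python
# Minimal sketch of inspecting a captured SOAP message for a Fault element,
# assuming SOAP 1.1 envelopes; the sample payload below is illustrative.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"


def find_fault(captured_xml: str):
    """Return (faultcode, faultstring) if the message carries a SOAP Fault."""
    body = ET.fromstring(captured_xml).find(f"{{{SOAP_NS}}}Body")
    fault = body.find(f"{{{SOAP_NS}}}Fault") if body is not None else None
    if fault is None:
        return None
    # In SOAP 1.1 the Fault children are unqualified elements.
    return (fault.findtext("faultcode"), fault.findtext("faultstring"))


# Example: a message captured between two components in the transaction chain.
message = """<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <soap:Fault>
      <faultcode>soap:Server</faultcode>
      <faultstring>Inventory service unavailable</faultstring>
    </soap:Fault>
  </soap:Body>
</soap:Envelope>"""

print(find_fault(message))  # ('soap:Server', 'Inventory service unavailable')
```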
Players in the SOA Integration Testing Space
Companies that offer integration testing for SOA-based solutions also tend to offer other types of testing, including unit testing, regression testing, load and functional testing, application testing, and monitoring. Well-known companies in this space include, but are not limited to, the following:
• INTERACTIVE ITKO (ITKO) developed LISA Test, an automated testing solution designed for cloud applications and other distributed application architectures that leverage SOA, BPM, integration suites, and ESBs.23
• PARASOFT offers a number of testing products, including SOAtest.24
• GREEN HAT developed GH Tester.25
• INFOSYS has ACCORD and the Infosys Test Automation Accelerators (ITAA).26
• TESTREE has proprietary testing frameworks for SOA applications.27
• IBM launched the Federated Integration Testing (FIT) initiative for internal use by solution teams and service teams.28
22. For possible ways to overcome the complexity of performance and load testing of SOA applications, see David W. Johnson, "Performance Testing: Ensure Your SOA Applications Perform" (20 October 2008), http://searchsoftwarequality.techtarget.com/tip/0,289483,sid92_gci1335370,00.html [June 2010].
23. See "LISA Test," http://www.itko.com/products/index.jsp and http://www.itko.com/products/lisatest.jsp [June 2010].
24. Go to http://www.parasoft.com/jsp/products.jsp [June 2010].
25. Go to http://www.greenhat.com/ghtester/?gclid=CLGY9LCPgaICFRFB6wodkE8DGA [June 2010].
26. Go to http://www.infosys.com/offerings/IT-services/independent-validation-testing-services/service-offerings/Pages/SOA-middleware-testingand-services.aspx [June 2010].
27. Go to http://www.testree.com/about_us_home.html [June 2010].
28. The idea behind this initiative is that after each product has gone through scenario testing, it should be tested by the FIT team in a real-world context so that IBM can understand what customers typically try to do with its products. "When doing major BPM, collaboration or SOA projects, for instance, customers do not want to have to work out for themselves -- or pay IBM services to tell them -- how best to combine products from IBM's Rational, Lotus, FileNet, Tivoli, Telelogic or other brands. Today the FIT team is helping those services and pre-sales consultants to do just that." See "IBM Gets FIT in Integration Testing" (10 August 2007), http://www.cbronline.com/news/ibm_gets_fit_in_integration_testing [June 2010].
Integration Testing Service from HCL
The xFIT framework has been built using open source tools to address the challenges that arise when applications are tested across distributed locations and platforms.30 Those tools include, but are not limited to, the Tool Command Language (Tcl) with Expect and Bash31 and Tcl Distributed Programming (Tcl-DP).32 xFIT is portable, extensible, and flexible.
xFIT offers:

• WEB SUPPORT for reporting, performing dashboard operations, and scheduling test runs.
• RUNTIME SUPPORT for validating outputs and collecting responses.
• TEST SCRIPT DEVELOPMENT (with built-in templates, inheritance, and the ability to group test cases) for language and test case dependencies.
• TEST SCRIPT MANAGEMENT for testware management and data management.
• TEST BED MANAGEMENT for verification, cleanup, and query.
Since xFIT supports its own synchronization and communication mechanisms, it does not affect the actual performance of the application being tested. It is lightweight and non-intrusive, and it does not require executables or DLLs. With xFIT, teams can develop, run, and maintain automated regression test libraries for complex EAI scenarios, and a single test case can be run on multiple systems under test (SUTs) with minimal effort, as the conceptual sketch below suggests.
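Since xFIT itself is built with Tcl, Expect, and related tools, the following Python fragment is not xFIT code. It is only a conceptual sketch of the general pattern just described: dispatching one test case to several systems under test and collecting the results. The host names and the validation rule are invented.

```python
# Conceptual sketch only (not xFIT code): run one test case against several
# systems under test (SUTs) in parallel and collect the results.
from concurrent.futures import ThreadPoolExecutor

SUTS = ["sut-a.example.com", "sut-b.example.com"]  # hypothetical hosts


def run_test_case(host: str) -> dict:
    # A real framework would drive the SUT remotely (e.g., via Expect over
    # SSH) and validate the captured output against expected results.
    response = f"simulated response from {host}"
    return {"host": host, "passed": response.endswith(host)}


with ThreadPoolExecutor() as pool:
    for result in pool.map(run_test_case, SUTS):
        print(result)
```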
Value Propositions of xFIT to Clients
Clients who use xFIT can expect to reduce operational costs in measurable terms:

• By decreasing the number of resources required to perform specific tasks. If, prior to xFIT, it took 10 resources to perform 100 tests, it will take 20-50% fewer resources to perform the same number of tests with xFIT.
• By reducing the amount of time required for troubleshooting or fixing errors. If, using a manual process, it took an average of "X" minutes to diagnose and fix a problem, with xFIT the time required will be reduced by 50% or more.
Table 1 is representative of the benefits that xFIT has already yielded for clients. In this example, we compare a manual testing scenario comprising 20 services (of medium complexity) with one where xFIT is used to perform the same tasks. The difference is significant: almost a 70% reduction in man-hours and a saving of almost $600,000 for 20 services with xFIT.
30. xFIT may be used on any platform, including UNIX, Linux, AIX, and Windows.
31. Tcl is widely adopted by testing communities worldwide. It is well suited to interactive automation of testing across platforms in a distributed environment.
32. Tcl-DP is used for handling distributed processing.
Releases                  Parameters Compared                           Manual Hrs    xFIT Hrs
First Major Release       Requirements, Test Plan, Test Bed,                   320         320
                          Data & Test Case Preparation
                          Test Client & Script Preparation                      20         440
                          Test Execution & Reporting                           880          40
                          Retesting for bug fixes                              220          40
                          Total                                               1,440         840
Subsequent Enhancement    Requirements, Test Plan, Test Bed,                    180         160
Releases                  Data & Test Case Preparation
                          Test Client & Script Preparation                       0         160
                          Test Execution & Reporting                           540          20
                          Retesting for bug fixes                              120          20
                          Regression Test Execution                            800          40
                          Total                                               1,640         400
Consolidated Results      First Release                                       1,440         840
                          Subsequent Releases (assuming 8                    13,120       3,200
                          releases in 2 years)
                          Total Hours                                        14,560       4,040
                          Total Cost (assuming $55/hr),                    $800,800    $222,200
                          rounded to the nearest $1,000

Table 1: Manual Testing vs. Testing with xFIT
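The headline figures quoted above follow directly from the line items in Table 1. The short computation below reproduces them, using the table's assumed rate of $55 per hour and 8 subsequent releases.

```python
# Reproducing the Table 1 totals that yield the headline figures.
HOURLY_RATE = 55   # $/hr, as assumed in Table 1
RELEASES = 8       # subsequent releases assumed over 2 years

manual_hours = 1_440 + RELEASES * 1_640  # 14,560 hours
xfit_hours = 840 + RELEASES * 400        # 4,040 hours

saving = (manual_hours - xfit_hours) * HOURLY_RATE  # $578,600 (~$600,000)
reduction = 1 - xfit_hours / manual_hours           # ~0.72 (~70%)

print(f"Manual: {manual_hours:,} hrs -> ${manual_hours * HOURLY_RATE:,}")
print(f"xFIT:   {xfit_hours:,} hrs -> ${xfit_hours * HOURLY_RATE:,}")
print(f"Saving: ${saving:,}; effort reduction: {reduction:.0%}")
```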
The results shown in Table 1 for xFIT are the direct benefits of:
• AUTOMATION, which speeds up the testing process and the generation of comprehensive analysis reports. These reports provide insights into the testing process.
• REUSABILITY AND REPEATABILITY of test cases and test scenarios, achieved by separating the test code into procedural and declarative pieces.
• IMPROVED COVERAGE OF TEST SCENARIOS, which enhances the quality of the solution or system being tested.