Testing is an often misunderstood aspect of quality assurance, and as a result many enterprises do not get the full benefit of their testing efforts. Use this storyboard to gain insight from Info-Tech client interviews and surveys, then develop a testing strategy and a plan for implementation.
•Assess your testing needs to determine a testing strategy
•Improve testing practices to make the most of your testing resources
•Develop a testing strategy, based on your resource and readiness assessments, by putting together a test plan
•Understand testing well enough to communicate testing principles effectively to testers, developers, and management
The storyboard includes links to several tools and templates that will get you started on implementing a testing process regardless of your scenario.
1. Develop & Improve Your Software Testing Strategy: Conveying and Inspiring Confidence. Info-Tech Research Group. "Software implementation is a cozy bonfire: warm, bright, a bustle of comforting concrete activity. But beyond the flames is an immense zone of darkness. Testing is the exploration of this darkness." - extracted from the 1992 Software Maintenance Technology Reference Guide
3. Executive Summary. Info-Tech Research Group

Your success depends on how well you test your applications. Whether you developed them yourself or simply integrated or configured them, this solution will help you understand how and why certain testing should be completed.

Software Quality Assurance (SQA) is an umbrella term for the processes involved in ensuring quality in your deliverables. It encompasses both Software Quality Control (SQC) and software testing.

Business leaders will all eagerly agree that quality is important, but deciding how to get there, what processes to implement, and which people to engage is often difficult. This solution will help you sift through the information and turn it into usable knowledge to increase your success.

As any development organization matures, priorities jockey for position. For startups, the elusive quality is not the major concern; delivery is. But as the organization matures, quality ultimately ends up at the top.

Attempting to turn a screw with a knife blade can work, but it works much better, faster, and more accurately with the proper tool. Having trained quality professionals will help the organization achieve the level of quality your customers expect, and deserve.

Develop and improve your overall quality by learning and implementing a proper process that fits your organizational needs. Whether you develop for the web or integrate off-the-shelf software, your customers deserve your best, and you deserve their trust. This solution will guide you through creating and maintaining a process that works for you.

Assess your readiness … Improve your success … Develop the means to achieve … Understand SQA, SQC, and Testing
4. Info-Tech Research Group
Develop & Improve Your Software Testing Strategy: Assess where you stand with software quality assurance
Success is determined by how well your customers feel toward you, your service, and your product.
Assessing your place in the world:
•Assess: The General Situation; Your Environment
•Improve: Your Testing Focus; Your Testing Strategy; Your Testing Effectiveness; Best Practices
•Develop: Your Testers & Test Coverage; Your Ability to Measure Success
•Understand: If you only test a little, this is what you need to know; How testing fits in the bigger SQA picture
6. Most businesses with development groups will be system builders or integrators and will mature at a predictable pace. Info-Tech Research Group

•Inexperienced (top priority: delivery): 1-5 Dev, no Dev Mgr, no BA, no PM; no dev standards; projects run by dev; no real planning; requirements are extremely loose; no change control; deadlines are all ASAP. Testing: no testers; ad-hoc developer testing only.
•Mildly Experienced: 5-15 Dev, 1-2 Dev Mgr, no BA, no PM; unofficial dev standards; projects run by dev; no formalized planning; requirements are loose; no change control; deadlines are all ASAP. Testing: 0-1 testers; very little testing, some developer testing, some ad-hoc exploratory-type testing.
•Experienced: 15-30 Dev, 2-3 Dev Mgr, 1 Sr Dev Mgr, 0-1 BA, 0-1 PM; some formalized dev standards; most projects run by dev leaders; some early formalized planning; requirements mostly known; little change control; deadlines set on rough estimates and business/client need, still largely ASAP. Testing: 1 SQA/Test Leader, 5-10 testers; some developer testing, some standardized testing, mostly regression, smoke, exploratory, and acceptance.
•Very Experienced: 30-50 Dev, 5-8 Dev Mgr, 1-2 Sr Dev Mgr, 1-2 BA, 3-5 PM; formalized development standards being followed; some projects still run by dev leaders, most run by PM; formalized planning; requirements are set; some change control is attempted; deadlines set via an established estimation process with business/client needs factoring heavily. Testing: 1-2 SQA/Test Leaders, 8-20 testers; some developer testing, standardized SQA-run testing, separate test environment.
•Seasoned Veteran (top priority: quality): 50+ Dev, 8+ Dev Mgr, 3+ Sr Dev Mgr, 2-3+ BA, 8+ PM; formalized dev standards are followed; all projects controlled by PM; standardized, formalized planning; requirements are known; change control is in place; deadlines based on well-thought-out estimates and business/client needs. Testing: 1-2 SQA/Test Leaders, 20+ testers; developer unit testing, automated build testing, standardized SQA-run testing, automated regression and other automated testing; multiple separate test environments.

Individual business focus may differ and the approach to testing may differ, but the goal of quality will be constant.
7.
8.
9.
10.
11.
12.
13.
14.
15.
16.
17. If you're testing an enterprise app, how much testing has been done matters little if the app cannot be integrated. Info-Tech Research Group

Make sure your testers are aware of these enterprise testing fundamentals:
•Testing is a critical requirement for making good enterprise deployment decisions.
•Approach your enterprise testing in phases across component, feature, and system testing.
•The test process for enterprise solutions is best viewed as a reverse engineering of the product development. Typically this is a three-step process of module verification, feature verification, and usage.
•Product testing can begin only once one or more of the interrelated functions have been unit tested sufficiently to allow a normal progression through each major area of functionality.

Info-Tech Insight: Start testing early to give your supplier, vendor, or internal development group enough time to respond with fixes, and to let your test team verify those fixes while still preserving your timeline.
20. Info-Tech Research Group
Develop & Improve Your Software Testing Strategy: Improve your overall approach to software quality testing
It is impossible to "test" quality into a product.
Improving your testing effectiveness:
•Assess: The General Situation; Your Environment
•Improve: Your Testing Focus; Your Testing Strategy; Your Testing Effectiveness; Best Practices
•Develop: Your Testers & Test Coverage; Your Ability to Measure Success
•Understand: If you only test a little, this is what you need to know; How testing fits in the bigger SQA picture
21. Incomplete requirements, poor specifications, and unclear objectives are clearly the biggest project problems. Info-Tech Research Group

In a recent unofficial, independent poll of Software Quality Assurance professionals (LinkedIn), respondents were asked: "What do you think the top problems with development projects are?" The results showed that poor requirements, not inadequate testing, were the leading cause of project failure (approx. N=19: Poor Requirements 28%, Unrealistic Schedule 18%, Inadequate Testing 13%, Feature Creep 8%, Miscommunication 7%).

Make sure your project team fully understands the requirements, and make sure those requirements are testable. If your team can't test a requirement, you have a problem! Involving testers early in projects can help prevent the number-one cause of project failure. Visionary requirements cannot be tested, but detailed, well-thought-out requirements can!
22. Improve your testing process by first understanding the five most common problems and how to address them. Info-Tech Research Group

1) Solid requirements: clear, complete, detailed, attainable, testable requirements that are agreed to by all stakeholders.
2) Realistic schedules: allow adequate time for planning, design, testing, bug fixing, re-testing, changes, and documentation; project resources should be able to complete the project without burning out.
3) Adequate testing: start testing early, re-test after fixes or changes, and plan adequate time for testing and bug fixing. "Early" testing could include static code analysis, test-first development, unit testing by developers, automated post-build testing, etc.
4) Stick to initial requirements where feasible: be prepared to defend against excessive changes and additions once development has begun, and be prepared to explain the consequences. If changes are necessary, they should be adequately reflected in related schedule changes. If possible, work closely with customers/end users to manage expectations.
5) Communication: require walkthroughs and inspections when appropriate; make extensive use of group communication tools (groupware, wikis, bug-tracking and change management tools, intranet capabilities); ensure that information and documentation is available and up to date, preferably electronic rather than paper; promote teamwork and cooperation; use prototypes and/or continuous communication with end users where possible to clarify expectations.
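The "early testing" idea in point 3 can be made concrete with a small test-first sketch. Everything below (the order_total function and its 10% discount rule) is an invented illustration, not from this storyboard: the tests are written against the requirement first, and the implementation is then written to make them pass.

```python
# Hedged sketch of test-first development. The requirement is
# hypothetical: order totals get a 10% discount at $100 or more.
# In test-first style, the two test functions below exist before
# order_total() does; the implementation is written to satisfy them.

def order_total(subtotal):
    """Return the payable total, applying a 10% discount at $100 or more."""
    if subtotal >= 100:
        return round(subtotal * 0.9, 2)
    return round(subtotal, 2)

def test_no_discount_below_threshold():
    assert order_total(99.99) == 99.99

def test_discount_applied_at_threshold():
    assert order_total(100) == 90.0

# Run the tests directly; a runner such as pytest would normally
# discover and run functions named test_*.
test_no_discount_below_threshold()
test_discount_applied_at_threshold()
print("all tests passed")
```

Because the tests encode the requirement, they double as a check that the requirement itself is testable, which is exactly the property point 1 asks for.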
24. Do not limit your testing resources to just testers; include resources from other areas to improve your success. Info-Tech Research Group

"Just like development, to be effective SQA needs to have good processes. For good testing results those processes need to include other people as necessary." - SQA Analyst, IT Professional Services (Financial Industry)

Organizations that included resources from outside of testing throughout their process proved more successful in every case (N=72; Source: Info-Tech Research Group). Improvement in success for clients that used additional resources versus those that did not: Business Analysts +38%, Business Users +33%, Developers +24%, Other IT +83%.
31. Improve your effectiveness by following this simple list of best practices for software quality testers. Info-Tech Research Group

1) Learn to analyze your test results thoroughly.
2) Learn to maximize your test coverage.
3) Break your application into smaller functional modules.
4) Write test cases for intended functionality first.
5) Start testing the application with the intent of finding bugs.
6) Write your test cases in the requirement analysis and design phases.
7) Make your test cases available to developers prior to coding.
8) If possible, identify and group your test cases for later regression testing.
9) Thoroughly test performance for applications requiring critical response time.
10) Developers should not test their own code.
11) Go beyond requirement testing.
12) Use bug data while doing regression testing.
13) Note the new terms and concepts you learn while testing.
14) Note all code changes made for testing.
15) Keep developers away from the test environment.
16) It's a good practice to involve testers right from the software requirement and design phases.
17) Testing teams should share best testing practices.
18) Increase your conversations with the developers.
19) Don't run out of time for high-priority testing tasks.
20) Write clear, descriptive, unambiguous bug reports.

Read the full "Testing Best Practices" in Appendix IV of this document. Don't forget: testing is a creative and challenging task. How you handle that challenge depends on your skill and experience.
32. Info-Tech Research Group
Develop & Improve Your Software Testing Strategy: Develop your testers, your process, and the means to measure
Involve SQA early, but not too early … wait until the ambiguity starts to settle.
Developing your testing strategies:
•Assess: The General Situation; Your Environment
•Improve: Your Testing Focus; Your Testing Strategy; Your Testing Effectiveness; Best Practices
•Develop: Your Testers & Test Coverage; Your Ability to Measure Success
•Understand: If you only test a little, this is what you need to know; How testing fits in the bigger SQA picture
34. Having the right mix of experience and training in the right roles makes a difference to your chance of success. Info-Tech Research Group

Our survey shows (N=72; Source: Info-Tech Research Group):
•Clients that showed greater success had 36% more experienced staff (relevant work experience) on their team.
•Organizations that showed greater success had 45% more staff with development backgrounds.
•Clients that showed greater success had 41% more staff with formal QA training.

In a related unofficial poll of SQA professionals asked "What is a good ratio of developers to testers?", the overwhelming and unanimous response was "no": there are simply too many variables to give a reasonably accurate answer. The poll did show that while the minimum ratio was 0:1 and the maximum was 1:30, the most common ratio was 1:3 (testers to developers).

"Having the right resources spread between Dev, SQA, and the BAs is like three legs of a stool … you need them all." - IT Manager, Financial Industry
35. Trained SQA testers should know what type of test is best, but these should be your minimum coverage. Info-Tech Research Group

•Unit Testing: the most micro scale of testing; tests particular functions or code modules. Typically done by the developer, not by testers, as it requires detailed knowledge of the internal program design and code. Not always easily done unless the application has a well-designed architecture with tight code; may require developing test driver modules or test harnesses.
•Functional Testing: black-box testing geared to the functional requirements of an application; this type of testing should be done by testers. This doesn't mean programmers shouldn't check that their code works before releasing it (which applies to any stage of testing).
•Integration Testing: testing combined parts of an application to determine whether they function together correctly. The "parts" can be code modules, individual applications, client and server applications on a network, etc. Especially relevant to client/server and distributed systems.
•Exploratory & Ad-Hoc Testing: often taken to mean creative, informal software testing that is not based on formal test plans or test cases; testers may be learning the software as they test it.
•Regression Testing: re-testing after fixes or modifications of the software or its environment. It can be difficult to determine how much re-testing is needed, especially near the end of the development cycle. Automated testing approaches can be especially useful here.
•Usability Testing & User Acceptance: testing for user-friendliness. This is the most subjective type, and depends on the end user or customer. User interviews, surveys, video recording of user sessions, and other techniques can be used. Developers and testers are usually not appropriate as usability testers.

"Automated testing is not the silver bullet that everyone believes it to be, but yet another tool in the QA toolbox. An automation framework is great when it is used to enhance testing." - SQA-trained professional with over 16 years' experience

Read the complete list of testing types in Appendix V.
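The division of labor between unit and integration testing above can be sketched in a few lines. The functions and tax rates below are hypothetical examples, not from the storyboard: the unit test stubs out the dependency so only the calculation is exercised, while the integration test runs the real lookup and the calculation together.

```python
# Illustrative sketch only; fetch_tax_rate and price_with_tax are
# invented names standing in for a real dependency and the code
# under test.

def fetch_tax_rate(region):
    """Pretend lookup that would normally hit a service or database."""
    rates = {"CA": 0.13, "US": 0.07}
    return rates[region]

def price_with_tax(amount, region, rate_lookup=fetch_tax_rate):
    """Compute the price including tax, via an injectable rate lookup."""
    return round(amount * (1 + rate_lookup(region)), 2)

# Unit test: stub out the dependency so only price_with_tax is tested,
# even for a region the real lookup doesn't know about.
assert price_with_tax(100, "XX", rate_lookup=lambda r: 0.10) == 110.0

# Integration test: the real lookup and the calculation run together.
assert price_with_tax(100, "CA") == 113.0
```

Making the dependency injectable (the rate_lookup parameter) is one common way to keep code unit-testable without a separate test harness.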
41. Info-Tech Research Group
Develop and Improve Your Software Testing Strategy: If you only do a little, you need to understand the basics
Understanding is the first step toward useful knowledge.
Understand how it all fits together:
•Assess: The General Situation; Your Environment
•Improve: Your Testing Focus; Your Testing Strategy; Your Testing Effectiveness; Best Practices
•Develop: Your Testers & Test Coverage; Your Ability to Measure Success
•Understand: If you only test a little, this is what you need to know; How testing fits in the bigger SQA picture
49. Info-Tech Research Group

Software Quality Assurance is not all about testing; in fact, SQA encompasses all aspects of the Software Development Life Cycle (SDLC), from planning through support. An organization that focuses on quality management looks at all aspects from beginning to end, with a goal of total satisfaction. Software testers typically get involved during the design and development phases, but you really shouldn't wait, or stop there.

"Getting them engaged early creates more successful projects because they have the context to plan and prepare." - Manager of QA, Government (Health Sector)

"If you relegate QA responsibilities to just testing the end product, you overlook an opportunity to integrate quality into the entire software development life cycle." - SQA-trained professional with over 16 years' experience

"To truly be effective at building quality, you can't just look at testing alone. Making sure the requirements are even testable is a crucial step toward quality." - SQA-trained professional with over 16 years' experience
53. Appendix I Case Study: Testing is limited; the business doesn't care. Info-Tech Research Group

The Situation: Finance industry. Applications developed include a website and internal applications. There are approximately 30 people in IT, including 10 web developers (no testers), spread across an average of 10 applications and/or projects. Some project managers are involved in some of the projects and sometimes produce a plan. Development staff typically draft a test plan (of sorts). No standard methodology is followed; everything is very ad hoc. Attempts are made to collaborate with the business when possible, but project managers are too distant and operational managers are not equipped with the knowledge to provide feedback.

The Challenge: One of the biggest struggles facing the group is trying to get the business included and involved. The business has no understanding of development or testing, and does not support what it doesn't know. IT has no idea what they are supposed to test, and because this has not caused a major problem yet, the business does not feel enough incentive to get involved or to support the need for testing. The only time the business gets involved in projects is when they need to sign off for compliance reasons. Generally, the business takes the attitude of "we just trust you."

Comments: The business remains uninvolved because they trust the development group. In the manager's estimation there is not enough regression testing, and as a result there are post-production issues. Since the business doesn't see it as a concern, they may never change their testing practices. "We have monitoring software for servers. Monitor but don't test. We put it out there and hope that it doesn't crash. It will go down and we're reactive about it. The IT budget is limited, so we can't invest in testing."

Recommendations: This business needs to take a step back and look at the risk associated with not testing. Trusting their developers is admirable, but it won't help the business. A serious look at the cost justifications for testing should be done. Weighed against the cost of reacting to problems and the risk they are exposing themselves to, especially considering their industry, it should be enough to point them in the right direction. Many businesses fall into this pit: they trust their developers, so testing is seen as an unjustifiable cost. Before you write it off, as with any good business decision, do your due diligence. Quality is worth it!

Go Back
54. Appendix II Case Study: Even small shops need a testing strategy. Info-Tech Research Group

The Situation: Small-shop profile: CIO, application manager, and QA manager at a manufacturing-industry IT shop of four. Development for their iSeries-based commercial ERP platform is done by an external partner with whom they have years of experience. They work with a VAR to specify and test ERP enhancements, and sometimes just don't have time to test.

The Challenge: Big projects are tested in conjunction with the developers. The alternative: for minor projects, and when there just isn't time, the development partner tests the functionality. "No, I think that for us, because we're a smaller shop, it comes down to time. We just don't (test). I'm gonna say from a manpower perspective, we definitely take some shortcuts right now. In some cases, I have my VAR, my key contact, he will test it. He'll show me what, we'll quickly go over it and I'll say 'Yeah, okay, that looks good' and I just let it go. I don't test it; I don't necessarily test them myself just because I don't have the time to do it."

Comments: For vendor patch updates, the partner installs and tests the patches. But the shop doesn't put all its faith in the hands of the development partner: before any patches are applied, a copy is made of the previous release. If something goes wrong once the development partner applies the patches, they go back to the original. The manager in this case recognizes the need to test; he uses the third-party VAR, whom he trusts, to take on much of the testing responsibility, and trusts that if something goes wrong, they can back out of it.

Recommendations: As in the first case study, this business needs to take a step back and look at the risk associated with not testing. Trusting the VAR in this case is risky, and it won't help the business: the VAR does not have the same stake if something goes wrong. A serious look at the cost of testing should be done. All may be well today, but what about tomorrow? As a small shop of only four, it is easy to understand, and easy to justify, not having dedicated testing resources. However, improved quality can be achieved by having dedicated resources that know how to plan, strategize, and test the integration of the systems. A dedicated resource can also free up the development resources to work on more relevant tasks. Development resources in small shops can be amazing, but they can also be quick to move on. Keep them productive and happy doing what they love to do: develop. Focus the developers' attention on the solutions, and bring in co-op students to handle the testing. It's your quality; don't leave it to someone else!

Go Back
55. Appendix III Case Study: Test data, the critical component. Info-Tech Research Group

The Situation: A 100-person IT shop with 10 QA staff and lots of custom and commercial testing. Roles include software architects, developers, e-business analysts, testers, quality-analyst-type roles, project managers, and application desk-side support analysts.

The Challenge: For security/compliance reasons, the enterprise is limited in the real data it can use to test applications, and incomplete test data can limit testing. The result is that testing does not actually meet the targets in the test script. People need to create custom data, and sometimes they just don't create enough.

Comments: "So I've seen one of my testers was testing maintenance for something, and there literally was not enough data to sort a column. And that project, that application, had made it through to production." Another scenario is when data is not unique enough: "We've had this one happen. The entire test, people in there had the same birth date, so you're trying to validate reports that generate different things based on date of birth. You don't really know if it's working or not because you're always seeing the same result."

Recommendations: This organization has a well-established and well-formed environment. Its resource makeup is good, its processes are good, its testing strategy is good, and there is adequate planning and support for its testing efforts. The shortfall is with the infrastructure for the test environments, in this case data. Time needs to be spent ensuring that the test team has sufficient data to test for conditions that properly reflect the real world. The recommendation would be to duplicate one of the production databases, move it to an isolated test environment, and take the time to change the data from real to test, for example changing names, titles, and all other facets of the data to test (fake), and obviously fake, data. This is a tedious and time-consuming task, but once completed it will increase the effectiveness of the testing team. An alternative recommendation would be to create a mirror within the production system that writes (changed) data to a separate database for use in the test environment.

Go Back
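The data-masking step recommended above might look roughly like this in miniature. This is only a sketch under invented assumptions (two in-memory rows standing in for a production table): names are replaced with obviously fake values, and birth dates are jittered so date-driven reports still see varied data rather than identical values.

```python
# Hedged sketch of masking copied production data for a test
# environment. The rows, field names, and masking rules below are
# illustrative assumptions, not from the case study.
import random
from datetime import date, timedelta

random.seed(42)  # reproducible fake data

production_rows = [
    {"name": "Alice Smith", "birth_date": date(1980, 5, 1)},
    {"name": "Bob Jones",   "birth_date": date(1975, 9, 12)},
]

def mask_row(row, i):
    # Replace the real name with a clearly fake one, and jitter the
    # birth date so values vary without exposing the real data.
    return {
        "name": f"TEST_USER_{i:04d}",
        "birth_date": row["birth_date"] + timedelta(days=random.randint(1, 365)),
    }

test_rows = [mask_row(r, i) for i, r in enumerate(production_rows)]

assert all(r["name"].startswith("TEST_USER_") for r in test_rows)
assert all(r["birth_date"] != p["birth_date"]
           for r, p in zip(test_rows, production_rows))
```

In a real migration the same transformation would run against every identifying column of the copied database, ideally as a repeatable script so refreshed test data stays masked.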
56. Appendix IV Testing Best Practices. Info-Tech Research Group

1) Learn to analyze your test results thoroughly. Do not ignore a test result. The final result may be "pass" or "fail," but troubleshooting the root cause of a "fail" will lead you to the solution of the problem. Testers are respected when they not only log bugs but also provide solutions.
2) Learn to maximize test coverage every time you test an application. Though 100 percent test coverage might not be possible, you can always try to get close to it.
3) To ensure maximum test coverage, break your application into smaller functional modules and write test cases for the individual modules. If possible, break these modules into smaller parts. For example, if "accepting user information" is one module of your website application, you can break the "user information" screen into smaller parts for writing test cases: UI testing, security testing, functional testing of the form, and so on. Apply all form-field type and size tests, plus negative and validation tests, on input fields, and write all the test cases for maximum coverage.
4) While writing test cases, write test cases for the intended functionality first, i.e. for valid conditions according to the requirements, then write test cases for invalid conditions. This covers both the expected and the unexpected behavior of the application.
5) Think positive. Start testing the application with the intent of finding bugs/errors. Don't assume beforehand that there will not be any bugs in the application. If you test the application with the intention of finding bugs, you will definitely succeed.
6) Write your test cases in the requirement analysis and design phases. This way you can ensure all the requirements are testable.
7) Make your test cases available to developers prior to coding. Don't keep your test cases to yourself, waiting for the final application release so you can log more bugs. Letting developers analyze your test cases thoroughly helps them develop a quality application and saves rework time.

Go Back
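Practice 4 (valid conditions first, then invalid) can be illustrated with a small example. The parse_age function and its 0-150 range are hypothetical assumptions for the sketch, not from the storyboard.

```python
# Sketch of practice 4: assert the intended behavior first, then
# probe invalid input. parse_age is an invented example function.

def parse_age(text):
    """Parse a non-negative integer age from a string; raise on bad input."""
    value = int(text)  # raises ValueError for non-numeric text
    if value < 0 or value > 150:
        raise ValueError(f"age out of range: {value}")
    return value

# Valid conditions (expected behavior) first:
assert parse_age("0") == 0
assert parse_age("42") == 42

# Then invalid conditions (unexpected input):
for bad in ("-1", "200", "abc"):
    try:
        parse_age(bad)
    except ValueError:
        pass  # rejecting bad input is the expected behavior
    else:
        raise AssertionError(f"parse_age accepted invalid input {bad!r}")
```

Writing the valid cases first keeps the test suite anchored to the requirement; the invalid cases then document exactly which inputs the application is supposed to reject.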
57. Appendix IV (continued) Testing Best Practices. Info-Tech Research Group

8) If possible, identify and group your test cases for regression testing. This will ensure quick and effective manual regression testing.
9) Applications requiring critical response time should be thoroughly tested for performance. Performance testing is a critical part of many applications, and in manual testing it is often ignored by testers. Find ways to test your application for performance. If it is not possible to create test data manually, write some basic scripts to create test data for performance testing, or ask the developers to write them for you.
10) Programmers should not test their own code. Basic unit testing of the developed application should be enough for developers to release the application to the testers, but testers should not force developers to release the product for testing; let them take the time they need. Everyone from lead to manager will know when the module/update is released for testing and can estimate the testing time accordingly. This is a typical situation in an agile project environment.
11) Go beyond requirement testing. Test the application for what it is not supposed to do.
12) While doing regression testing, use previous bug information. It can help predict the most probable bug-prone parts of the application.
13) Keep a text file open while testing an application and write down the new terms and concepts you learn. Use these notes while preparing the final test release report. This good habit will help you provide a complete, unambiguous test report and release details.
14) Testers and developers often make changes to the code base of the application under test. This is a required step in development or testing environments, for example to avoid executing live transaction processing in banking projects. Record all such code changes made for testing purposes, and at the time of final release make sure you have removed all of them from the final client-side deployment files.

Go Back
58. Appendix IV (continued 2) Testing Best Practices. Info-Tech Research Group

15) Keep developers away from the test environment. This is a required step to detect any configuration changes missing from a release or deployment document. Sometimes developers make system or application configuration changes but forget to mention them in the deployment steps. If developers don't have access to the testing environment, they will not make any of these changes accidentally, and the changes will be captured in the right place.
16) It's a good practice to involve testers right from the software requirement and design phases. This way testers gain knowledge of the application's dependencies, resulting in detailed test coverage. If you are not being asked to be part of the development cycle, ask your lead or manager to involve your testing team in all decision-making processes and meetings.
17) Testing teams should share best testing practices and experience with other teams in their organization.
18) Increase your conversations with developers to learn more about the product. Whenever possible, use face-to-face communication to resolve disputes quickly and avoid misunderstandings. But when you reach an understanding or resolve a dispute, make sure to confirm it in writing; do not leave anything strictly verbal.
19) Don't run out of time for high-priority testing tasks. Prioritize your testing work from high to low priority and plan accordingly. Analyze all associated risks to prioritize your work.
20) Write clear, descriptive, unambiguous bug reports. Do not provide only the bug symptoms; also describe the effect of the bug and all possible solutions.

Go Back
59. Appendix V Understand the various testing types Info-Tech Research Group Black box testing - not based on any knowledge of internal design or code. Tests are based on requirements and functionality White box testing - based on knowledge of the internal logic of an application's code. Tests are based on coverage of code statements, branches, paths, conditions Unit Testing - the most micro scale of testing; to test particular functions or code modules. Typically done by the developer and not by testers, as it requires detailed knowledge of the internal program design and code. Not always easily done unless the application has a well-designed architecture with tight code; may require developing test driver modules or test harnesses Incremental Integration Testing - continuous testing of an application as new functionality is added; requires that various aspects of an application's functionality be independent enough to work separately before all parts of the program are completed, or that test drivers be developed as needed; done by programmers or by testers Integration Testing - testing of combined parts of an application to determine if they function together correctly. The 'parts' can be code modules, individual applications, client and server applications on a network, etc. This type of testing is especially relevant to client/server and distributed systems Functional Testing - black-box type testing geared to functional requirements of an application; this type of testing should be done by testers. 
This doesn't mean that programmers shouldn't check that their own code works before releasing it (which of course applies to any stage of testing).
System Testing - black-box testing based on the overall requirements specification; covers all combined parts of a system.
End-to-End Testing - similar to system testing; the macro end of the test scale. Involves testing a complete application environment in a situation that mimics real-world use, such as interacting with a database, using network communications, or interacting with other hardware, applications, or systems where appropriate.
Smoke Testing - typically an initial testing effort to determine whether a new software version is performing well enough to accept it for a major testing effort. For example, if the new build is crashing systems every five minutes, bogging them down to a crawl, or corrupting databases, the software may not be in a condition to warrant further testing.
Regression Testing - re-testing after fixes or modifications to the software or its environment. It can be difficult to determine how much re-testing is needed, especially near the end of the development cycle; automated testing approaches can be especially useful for this type of testing.
Acceptance Testing - final testing based on the specifications of the end user or customer, or on use by end users/customers over a specified period of time.
Load Testing - testing an application under heavy load, such as testing a web site under a range of loads to determine at what point the system's response time degrades or fails.
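Several of the types above (unit, smoke, regression) are routinely automated. The following is a minimal sketch using Python's standard unittest framework; the `discount` function under test is a hypothetical stand-in, not something from this storyboard.

```python
import unittest

def discount(price: float, percent: float) -> float:
    """Function under test (hypothetical): apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class TestDiscount(unittest.TestCase):
    """Unit tests: exercise one function in isolation, at the micro scale."""

    def test_typical_case(self):
        self.assertEqual(discount(200.0, 25), 150.0)

    def test_boundary_values(self):
        self.assertEqual(discount(100.0, 0), 100.0)    # no discount
        self.assertEqual(discount(100.0, 100), 0.0)    # full discount

    def test_invalid_input_rejected(self):
        with self.assertRaises(ValueError):
            discount(100.0, 150)
```

Run with `python -m unittest` from the directory containing the file. Re-running the same suite after every fix or modification is regression testing in miniature, which is why the appendix notes that automation pays off most there.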
60. Appendix V – continued Understand the various testing types
Stress Testing - often used interchangeably with load and performance testing. Also used to describe tests such as system functional testing under unusually heavy loads, heavy repetition of certain actions or inputs, input of large numerical values, large complex queries to a database system, etc.
Performance Testing - often used interchangeably with stress and load testing. Ideally, performance testing is defined in the requirements documentation or in QA or test plans, using performance criteria specified by the customer.
Usability Testing - testing for user-friendliness. This is the most subjective type of testing and depends on the end user or customer. User interviews, surveys, video recording of user sessions, and other techniques can be used. Developers and testers are usually not appropriate as usability testers.
Deployment Testing - testing of full, partial, or upgrade install/uninstall processes.
Recovery & Failover Testing - testing how well a system recovers from crashes, hardware failures, or other catastrophic problems.
Security Testing - testing how well the system protects against unauthorized internal or external access, willful damage, etc.; may require sophisticated testing techniques.
Compatibility Testing - testing how well software performs in a particular hardware/software/operating system/network environment.
Exploratory Testing - often taken to mean creative, informal testing that is not based on formal test plans or test cases; testers may be learning the software as they test it.
Ad Hoc Testing - similar to exploratory testing, but often taken to mean that the testers have significant understanding of the software before testing it.
User Acceptance Testing - determining whether the software is satisfactory to an end user or customer.
Comparison Testing - comparing the software's strengths and weaknesses to those of competing products.
Alpha/Beta Testing - testing of an application as development nears completion. Minor design changes may still be made as a result of alpha testing; beta testing occurs when development and testing are essentially complete and final bugs and problems need to be found before release. Alpha and beta testing is typically done by end users or others, not by developers or testers.
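The load and stress testing described in this appendix can be approximated even without dedicated tooling. The following is a minimal sketch, assuming a hypothetical `handle_request` function standing in for the system under test, that ramps up concurrent requests and reports how total response time grows; pushing far beyond expected volume turns the same harness into a stress test.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request(payload: int) -> int:
    """Stand-in for the system under test (hypothetical)."""
    time.sleep(0.001)          # simulate ~1 ms of work per request
    return payload * 2

def measure_load(num_requests: int, workers: int) -> float:
    """Fire num_requests concurrently; return total wall-clock seconds."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(handle_request, range(num_requests)))
    assert len(results) == num_requests    # every request must complete
    return time.perf_counter() - start

# Ramp up the load and watch where response time begins to degrade.
for load in (10, 100, 500):
    elapsed = measure_load(load, workers=20)
    print(f"{load:4d} requests -> {elapsed:.3f}s total")
```

Real load testing would target the deployed system over the network and record per-request latency distributions, but the shape is the same: increase load stepwise and find the point where response time degrades or requests fail.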
61. Appendix VI Available Tools & Templates
Use Info-Tech's "Test Plan Template" to quickly lay out and streamline your project test plan (strategy).
Use Info-Tech's "Defect Reporting Template" to quickly record and document your project test cases.
Use Info-Tech's "Software Testing Strategy & Risk Assessment" tool to determine risk and testing strategies by examining the influencing factors.