The document discusses automation testing in agile environments. It covers agile values and principles and the objectives of agile automation testing, including a strategy for what to automate and what not to automate. It also discusses challenges, the characteristics of an ideal automation testing framework, and includes a game of designing scenes to illustrate the scaling of test automation. The presentation aims to show how automation testing can enable agile development.
•The key to successful test automation is knowing what and when to automate
Business-critical functionality
Functions that are used frequently by many users
Test cases which will be run several times with different test data or conditions
Tests that involve inputting large volumes of data, such as filling in very long forms.
Configuration testing where tests will be run with different configurations
Tests that take a long time to perform and may need to be run during breaks or overnight.
Test cases where it is easy to see the expected result.
Tests that can be used for performance testing, like stress and load tests.
Tests that need to be run against every build/release of the application, such as smoke test, sanity test and regression test.
Conversely, some tests should not be automated:
Tests that require ad hoc/random testing based on domain knowledge/expertise – exploratory testing.
User experience tests for usability.
Different objectives require different test automation strategies; some features, such as CAPTCHA, are deliberately designed so that they cannot be automated.
Tests that you will run only once. The only exception to this rule is a test that must be executed with a very large set of data.
Tests that need to be run as soon as possible. A newly developed feature usually needs quick feedback, so test it manually first.
Tests that cannot be 100% automated should not be automated at all, unless doing so will save a considerable amount of time.
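As an illustration of running only a subset on every build: assuming the suite uses TestNG, a hypothetical testng.xml could tag the fast checks as a "smoke" group and include only that group in the per-build run (the class and group names here are made up for the example):

```xml
<!-- Hypothetical testng.xml: runs only tests tagged "smoke" on every build -->
<suite name="PerBuildSuite">
  <test name="SmokeTests">
    <groups>
      <run>
        <include name="smoke"/>
      </run>
    </groups>
    <classes>
      <class name="tests.LoginTests"/>
    </classes>
  </test>
</suite>
```

The full regression group would then be reserved for the nightly or CI suite instead of every build.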
•Core functionality and navigation flow are approved and accepted by the end client.
•All regression tests: no major enhancements to the functionality are planned for at least the next 3 regression rounds.
•Because Agile welcomes changing requirements, the choice of tools applied in automation testing is very important.
•The toolset has to be stable enough to adapt to all the changes that occur during implementation.
•Functional regression pack: meant to check the functionality of the application in more detail.
Readability: When you look at the test case, you can read it through and understand what the test is for. You can see what the expected behavior is, and what aspects of it are covered by the test. When the test fails, you can quickly see what is broken.
If your test case is not readable, it will not be useful, neither for understanding what the system does, nor for identifying regression errors. When it fails, you will have to dig through sources outside of the test case to find out what is wrong. You may not understand what is wrong, and you will rewrite the test to check for something else, or simply delete it.
Speed: As an Agile developer you run your test suite frequently: (a) every time you build the system, (b) before you check in changes, and (c) after check-in, in an automated Continuous Integration environment. I recommend time limits of 2 minutes for (a), 10 minutes for (b), and 60 minutes for (c). This fast feedback gives you the best chance of actually being willing to run the tests, and of finding defects when they are cheapest to fix – soon after insertion.
If your test suite is slow, it will not be used. When you’re feeling stressed, you’ll skip running them, and problem code will enter the system. In the worst case, the test suite will never become green. You’ll fix the one or two problems in a given run and kick off a new test run, but in the meantime you’ll continue developing and making other changes. The diagnose-and-fix loop gets longer and the tests become less likely to ever all pass at the same time.
Updatability: When the needs of the users change and the system is updated, your tests also need to be updated in tandem. It should be straightforward to identify which tests are affected by a given change, and quick to update them all.

If your tests are not easy to update, they will likely get left behind as the system moves on. Faced with a small change that causes thousands of failures and hours of work to update them all, you'll likely delete most of the tests.
Test automation backlog: Maintain a test automation backlog for your project that contains all needed automation tasks and identified improvements. If you then target a few items from the backlog every sprint, in no time you will start to see the new regression test suite taking shape. Occasionally, stories from the test automation backlog may require dedicated developer time to implement and consequently some buy-in from the product owner in order to proceed. However, it should not be difficult to convince the product owner of the value of such stories if everyone on the team is committed to quality.
A test automation backlog could contain a prioritized list of items such as:
Parameterize the test environment for test execution.
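A minimal sketch of what "parameterize the test environment" could look like in Java. The `test.env` property name and the URLs are assumptions for illustration only: the suite reads the property at runtime (e.g. `-Dtest.env=staging`) instead of hard-coding a base URL.

```java
// Sketch: choose the environment under test from a system property.
// Property name ("test.env") and URLs are hypothetical.
class TestEnvironment {
    static String baseUrl() {
        // falls back to the local dev environment when nothing is set
        String env = System.getProperty("test.env", "dev");
        switch (env) {
            case "staging": return "https://staging.example.com";
            case "prod":    return "https://www.example.com";
            default:        return "http://localhost:8080";
        }
    }
}
```

Every test then targets `TestEnvironment.baseUrl()`, so the same suite runs unchanged against dev, staging, or production.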
Round 1: 2 actors, scene: garden: 4 trees
Round 2: 10 actors, scene: park: 20 trees
Round 3: 20 actors, scene: forest: 50 trees
The initial cost is very high, so you have to make sure that you will get a corresponding return, to avoid wasting effort and money.
A tester on an Agile team should ideally know a programming language. Agile is hard to practice fully without automation testing, so programming knowledge makes the work much easier, and it also helps to optimize the framework.
The scripts should be flexible enough to run on every environment, OS, browser, and device. This is very difficult to achieve, but it has to be done, because the project must run according to the customer's requirements.
There are several framework types (Keyword-Driven, Data-Driven, Modular, Model-Based, Hybrid, …). Which one to use depends on your context; there is no single standardized choice.
Page Object Model (POM) is a design pattern for creating an Object Repository for web UI elements.
For each web page in the application, there should be a corresponding page class.
This page class finds the WebElements of that page and contains the page methods that perform operations on those WebElements.
The Page Object pattern says that operations and flows in the UI should be separated from verification. This concept makes our code cleaner and easier to understand.
The second benefit is that the object repository is independent of the test cases, so we can use the same object repository for different purposes with different tools. For example, we can integrate POM with TestNG/JUnit for functional testing and at the same time with JBehave/Cucumber for acceptance testing.
Code becomes shorter and more optimized because of the reusable page methods in the POM classes.
Easy to use: methods get more realistic names that can easily be mapped to the operations they perform.
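To make the pattern concrete, here is a minimal, self-contained POM sketch in Java. The `UiDriver` interface and `FakeUiDriver` are stand-ins invented for this example so that it runs without Selenium; in the real framework the page classes would hold a Selenium WebDriver and locate WebElements instead.

```java
import java.util.HashMap;
import java.util.Map;

// Stand-in for Selenium's WebDriver, so the sketch is self-contained.
interface UiDriver {
    void type(String locator, String text);
    void click(String locator);
    String read(String locator);
}

// One page class per web page: it owns the locators and exposes page
// methods that perform operations; it does NO verification itself.
class LoginPage {
    static final String USER = "id=username";   // this page's object repository
    static final String PASS = "id=password";
    static final String SUBMIT = "id=login";
    private final UiDriver driver;

    LoginPage(UiDriver driver) { this.driver = driver; }

    // A flow method: logging in navigates to the home page.
    HomePage loginAs(String user, String pass) {
        driver.type(USER, user);
        driver.type(PASS, pass);
        driver.click(SUBMIT);
        return new HomePage(driver);
    }
}

class HomePage {
    static final String GREETING = "id=greeting";
    private final UiDriver driver;
    HomePage(UiDriver driver) { this.driver = driver; }
    String greeting() { return driver.read(GREETING); }
}

// Tiny in-memory fake, only to make the sketch runnable without a browser.
class FakeUiDriver implements UiDriver {
    private final Map<String, String> fields = new HashMap<>();
    public void type(String locator, String text) { fields.put(locator, text); }
    public void click(String locator) {
        if (locator.equals(LoginPage.SUBMIT))
            fields.put(HomePage.GREETING, "Welcome, " + fields.get(LoginPage.USER));
    }
    public String read(String locator) { return fields.get(locator); }
}
```

A test would then call `new LoginPage(driver).loginAs("alice", "secret").greeting()` and make its assertion on the returned value, keeping verification out of the page classes.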
====> Demo on Ts01DemoPOM.java
====> Demo on Ts02DemoMultipleBrowsers.java - set property browser to IE or FF
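The browser switch in that demo can be sketched as a small factory that reads a `browser` system property. The property name and the mapping below are assumptions for illustration; a real framework would instantiate the matching WebDriver here rather than return its class name.

```java
// Sketch: pick the target browser at runtime, e.g. -Dbrowser=IE or -Dbrowser=FF.
class BrowserFactory {
    static String driverClassFor(String browser) {
        switch (browser.toUpperCase()) {
            case "IE": return "org.openqa.selenium.ie.InternetExplorerDriver";
            case "FF": return "org.openqa.selenium.firefox.FirefoxDriver";
            default:   return "org.openqa.selenium.chrome.ChromeDriver";
        }
    }

    // Reads the system property; defaults to Chrome when nothing is set.
    static String configuredDriverClass() {
        return driverClassFor(System.getProperty("browser", "CHROME"));
    }
}
```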
====> Demo on Ts03DemoReportAndMetrics
Talk about soft assertions for failed verification points (VPs).
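For reference: a soft assertion records failed verification points instead of aborting the test at the first one, then reports them all together at the end. TestNG ships this as org.testng.asserts.SoftAssert; the sketch below is a self-contained stand-in showing the idea.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal soft-assertion collector (stand-in for TestNG's SoftAssert).
class SoftCheck {
    private final List<String> failures = new ArrayList<>();

    // Records a failure instead of throwing, so the test keeps running.
    void verifyEquals(String what, Object expected, Object actual) {
        boolean equal = (expected == null) ? actual == null : expected.equals(actual);
        if (!equal)
            failures.add(what + ": expected <" + expected + "> but was <" + actual + ">");
    }

    List<String> failures() { return failures; }

    // Called once at the end of the test; fails only if any VP failed.
    void assertAll() {
        if (!failures.isEmpty())
            throw new AssertionError(failures.size() + " verification(s) failed: " + failures);
    }
}
```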
When observing the report, talk about the metrics, the logs, and the report details (failed steps with captured screenshots).
Explain 3 files in D:\projects\aavnDemoFramework\conf\RemoteWebDriver
Show the recorded video https://drive.google.com/open?id=13XM3ir3yKBZDXg8Q6SFO54VgZVVU2jzN
Intro - bugs everywhere
Demo on Ts06DemoCodeQuality.java class - Run the test, analyze the report
Show how to fix verifyElementTextDemoBadQuality()
====> Demo on Ts04DemoDataDrivenTesting
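The idea behind data-driven testing can be sketched without any framework: a single test body is executed once per data row. The login rule and the data table below are invented for illustration; with TestNG the table would come from a @DataProvider method instead.

```java
// Data-driven testing sketch: one test body, many data rows.
class DataDrivenDemo {
    // Hypothetical system under test: a trivial login rule.
    static boolean login(String user, String pass) {
        return "admin".equals(user) && "secret".equals(pass);
    }

    // Mirrors what a TestNG @DataProvider would return: {user, pass, expected}.
    static Object[][] loginData() {
        return new Object[][] {
            { "admin", "secret", true  },
            { "admin", "wrong",  false },
            { "guest", "secret", false },
        };
    }

    // Runs the same check for every row; returns the number of failing rows.
    static int run() {
        int failures = 0;
        for (Object[] row : loginData()) {
            boolean expected = (Boolean) row[2];
            if (login((String) row[0], (String) row[1]) != expected) failures++;
        }
        return failures;
    }
}
```

Adding a new test case is then just adding a row, which is exactly why this style suits tests run many times with different data.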
====> Demo on cucumberTests
====> Demo on Ts07DemoAPITesting.java class - demo of the weather REST service from http://restapi.demoqa.com/utilities/weather/city/Saigon
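A dependency-free sketch of the kind of checks such an API test performs, using the weather URL above. The response field names and the naive string-based JSON extraction are illustrative only; a real test would issue the HTTP request and parse the body with a JSON library or a tool such as REST-assured.

```java
// REST API testing sketch (no network call; parsing is a naive string scan).
class WeatherApiTest {
    static String urlFor(String city) {
        return "http://restapi.demoqa.com/utilities/weather/city/" + city;
    }

    // Extracts the value of a "key": "value" pair from a flat JSON string.
    // Good enough for a sketch; a real test would use a JSON parser.
    static String field(String json, String key) {
        String needle = "\"" + key + "\":";
        int i = json.indexOf(needle);
        if (i < 0) return null;
        int start = json.indexOf('"', i + needle.length()) + 1;
        return json.substring(start, json.indexOf('"', start));
    }
}
```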
For SOAP services this is possible too: we can compare the XML, or use XPath to assert on individual nodes.
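A small sketch of the XPath-based node assertion, using only the standard javax.xml.xpath API. The envelope structure below is a made-up stand-in for a real SOAP response (which would also carry namespaces).

```java
import java.io.StringReader;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.xml.sax.InputSource;

// Evaluates an XPath expression against an XML string and returns the text.
class SoapAssertDemo {
    static String node(String xml, String xpath) {
        try {
            XPath xp = XPathFactory.newInstance().newXPath();
            return xp.evaluate(xpath, new InputSource(new StringReader(xml)));
        } catch (Exception e) {
            throw new RuntimeException("XPath evaluation failed: " + xpath, e);
        }
    }
}
```

A test can then assert directly on the node of interest instead of diffing the whole document.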