1. A CONFUSED TESTER IN AN AGILE
WORLD …
QA: A LIABILITY OR AN ASSET?
THIS IS A WORK OF FACTS & FINDINGS BASED ON THE TRUE STORIES OF ONE & MANY
TESTERS!!
Presented By
Ashish Kumar
2. WHAT’S AHEAD
• A STORY OF TESTING.
• FROM THE MIND OF A CONFUSED TESTER.
• FEW CASE STUDIES.
• CHALLENGES IDENTIFIED.
• SURVEY STUDIES.
• GLOBAL RESPONSES.
• SOLUTION APPROACH.
• PRINCIPLES AND PRACTICES.
• CONCLUSION & RECAP.
• Q & A.
4. HAVE YOU HEARD ANY OF THESE??
• YOU DON’T NEED A DEDICATED SOFTWARE TESTING TEAM ON YOUR AGILE
TEAMS
• IF WE HAVE BDD, ATDD, TDD, UI AUTOMATION AND UNIT TESTS, WHAT IS THE NEED
FOR MANUAL TESTING?
• WE WANT 100% AUTOMATION IN THIS PROJECT
• TESTING IS BECOMING A BOTTLENECK AND A REASON FOR SPRINT FAILURE
• REPEATING REGRESSION IS A BIG TASK AND AN OVERHEAD
• MICROSOFT HAS NO TESTERS, AND NEITHER DO GOOGLE, FACEBOOK AND CISCO
5. • IN A “MOBILE-FIRST AND CLOUD-FIRST WORLD.”
• THE EFFORT, KNOWN AS AGILE SOFTWARE
DEVELOPMENT, IS DESIGNED TO LOWER COSTS
AND HONE OPERATIONS AS THE COMPANY
FOCUSES ON BUILDING CLOUD AND MOBILE
SOFTWARE, SAY ANALYSTS.
• MR. NADELLA TOLD BLOOMBERG THAT IT MAKES
MORE SENSE TO HAVE DEVELOPERS TEST & FIX
BUGS, INSTEAD OF A SEPARATE TEAM OF TESTERS,
TO BUILD CLOUD SOFTWARE.
• SUCH AN APPROACH, A DEPARTURE FROM THE
COMPANY’S TRADITIONAL PRACTICE OF DIVIDING
ENGINEERING TEAMS, WOULD MAKE MICROSOFT
MORE EFFICIENT, ENABLING IT TO CUT COSTS
WHILE BUILDING SOFTWARE FASTER, EXPERTS SAY.
• 15K+ DEVELOPERS / 4K+ PROJECTS UNDER
ACTIVE DEVELOPMENT / 50% OF CODE CHANGES
PER MONTH.
• 5500+ SUBMISSIONS PER DAY ON AVERAGE.
• 20+ SUSTAINED CODE CHANGES/MIN, WITH
PEAKS OF 60+.
• 75+ MILLION TEST CASES RUN PER DAY.
• DEVELOPERS OWN TESTING AND DEVELOPERS
OWN QUALITY.
• GOOGLE HAS PEOPLE WHO COULD CODE AND
WANTED TO APPLY THAT SKILL TO THE
DEVELOPMENT OF TOOLS, INFRASTRUCTURE,
AND TEST AUTOMATION.
• “DEVELOPER SKILLS AND A TESTER MINDSET.”
• GOOGLE PERFORMS A GREAT DEAL OF MANUAL
TESTING, BOTH SCRIPTED AND EXPLORATORY.
Source:
Wall Street Journal: http://blogs.wsj.com/cio/2014/07/15/microsoft-plots-agile-development-course-as-talk-on-job-cuts-loom/
John Micco, Tools for Continuous Integration at Google Scale: https://www.youtube.com/watch?v=KH2_sB1A6lA&feature=youtu.be
How Google Tests Software: James Whittaker, Jason Arbon, Jeff Carollo
7. • IS QA PART OF THE DEVELOPMENT TEAM?
• CAN WE FIT QA IN THE SAME ITERATION AS DEVELOPMENT?
• SHOULD I FOCUS ON MANUAL OR AUTOMATION?
• HOW CAN WE SCALE AGILE QA?
• WHO DOES QA?
• DOES QA COST MORE IN AGILE, AS THE PRODUCT SEEMS TO CHANGE FROM
SPRINT TO SPRINT?
• DO WE NEED A “TEST PLAN”?
• ARE STORY ACCEPTANCE TESTS ENOUGH?
• WHEN DO WE KNOW TESTING IS DONE?
• WHO DEFINES TEST CASES?
• DO WE NEED TO TRACK BUGS?
9. QA AND AGILE ARE INEXTRICABLY
INTERTWINED…
• BUT QUITE OFTEN IN AGILE ORGANIZATIONS, THE ART OF QA IS NOT WELL
UNDERSTOOD.
• THE VERY ESSENCE OF AGILE DEVELOPMENT IS DELIVERING QUALITY
WORKING SOFTWARE FREQUENTLY.
IN AGILE PROJECTS, QA SHOULD BE EMBEDDED IN THE SCRUM TEAMS, BECAUSE
TESTING AND QUALITY ARE NOT AN AFTERTHOUGHT.
QUALITY SHOULD BE BAKED IN RIGHT FROM THE START.
11. Case 1
Project Description Type: Enhancement and Maintenance Project; Domain: Core Banking
Team Size: 40; With Agile: < 5 years
QA Roles No testers on the team
QA Approach 1. Whole-team approach over testing departments and independent testing
2. Developers perform automation and cross-developer verification.
3. TDD
4. Developers develop unit test cases > Story development > Functional automation test
cases > Exploratory testing > Done
Challenges 1. Hiring testers who can code features is difficult; finding feature developers who can test is
even more difficult.
2. Maintenance is a BIG challenge
3. Non-functional testing during the sprint is a challenge
One Query Why should we pay more for manual testing?
The 'whole team' approach has helped in instilling a sense of
‘inclusiveness’ within the team. It has also helped in reducing delays and
improved the overall team efficiency. It has been a paradigm shift for
many.
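The TDD step in the flow above (unit test written before the story is implemented) can be sketched as follows. This is a minimal illustration, not code from the case study: the `apply_transfer` function and its overdraft rules are hypothetical stand-ins for a core-banking feature.

```python
import unittest

# Hypothetical core-banking helper; the name and rules are illustrative.
def apply_transfer(balance, amount):
    """Debit `amount` from `balance`; reject non-positive amounts and overdrafts."""
    if amount <= 0:
        raise ValueError("transfer amount must be positive")
    if amount > balance:
        raise ValueError("insufficient funds")
    return balance - amount

class TransferTest(unittest.TestCase):
    # Under TDD these tests are written first and fail ("red") until
    # apply_transfer is implemented to make them pass ("green").
    def test_debits_balance(self):
        self.assertEqual(apply_transfer(100, 30), 70)

    def test_rejects_overdraft(self):
        with self.assertRaises(ValueError):
            apply_transfer(100, 150)
```

Run with `python -m unittest`; in the case's flow, the functional automation and exploratory testing would then follow on top of this unit-level safety net.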
12. Case 2
Project Description Type: Development Project; Domain: Finance
With Agile: 3+ years
QA Roles 1. Cross-functional team
2. Functional testers performing both manual validation and automation
QA Approach 1. ATDD.
2. The team works on test scenarios together > Dev develops the stories || QA develops test
cases > QA automates the test scenarios || Developers pitch in to help > BA validates > Done
3. Whole-team approach; developers also support QA in performing automation and
verification
Challenges 1. Shortened time for testing
2. Sub-standard delivery of a few stories towards the end of the sprint
3. Spill-over
4. Testing backlog creation
One Query Why should we duplicate effort by having separate manual and automation
tester roles?
We believe in the spirit of agile; it was difficult to break the shackles
of mindset and create an effective whole-team approach. But it
works wonders for us, although we have a lot of scope for
improvement.
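In ATDD, as in Case 2's flow, the team agrees on acceptance scenarios before development, and QA then automates them. A minimal sketch of one such scenario, written in Given/When/Then style, might look like this; the `calculate_fee` function and the fee rule are hypothetical, invented for illustration and not taken from the case study.

```python
# Hypothetical finance-domain rule agreed by the whole team up front:
# a flat 2% processing fee, waived for orders of 1000 or more.
def calculate_fee(order_total):
    """Return the processing fee for an order."""
    return 0.0 if order_total >= 1000 else round(order_total * 0.02, 2)

def test_fee_is_waived_for_large_orders():
    # Given an order worth 1000
    order_total = 1000
    # When the fee is calculated
    fee = calculate_fee(order_total)
    # Then no fee is charged
    assert fee == 0.0

def test_small_orders_pay_two_percent():
    # Given a 500 order, When the fee is calculated, Then it is 2% of the total
    assert calculate_fee(500) == 10.0
```

Because the scenario is written in the team's shared vocabulary before coding starts, the same artifact serves the BA's validation step and QA's automated regression suite, which is what lets the manual/automation role split collapse into one.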
13. Case 3
Project Description Type: Mission-Critical Products
With Agile: 5+ years. More in line with the DAD approach
QA Roles • Manual testers as part of the scrum team (work as product experts)
• Automation testers distributed among the different teams
• Field engineers, along with the PO, do UAT, regulatory testing, etc.
QA Approach • Components to verify: hardware, firmware, application software
• Application: automate; Interface: automate; Portions of firmware and H/W: automate
• Unit and integration testing by developers; system integration and system testing by
QA
• Because of complex integration and system dependencies, dedicated hardening sprints
at the end
Challenges • Sometimes there is a lag in automation.
• Risk-based testing, as all configurations can’t be tested before release
• Workload is uneven for the manual test team.
One Query Can we make non-functional tests also a part of the sprint, and if yes, how?
Quality is everyone's responsibility. Agile has made it true. It's not
developers or QA alone who own it; right from the customer onwards,
everyone is building quality into the product.
14. Case 4
Project Description Type: Development Project
Team Size: 45; With Agile: 1–2 years
QA Roles 1. Separate testing team / vendor for QA
QA Approach 1. Development sprints and QA sprints are separate.
2. The two sprints have different sprint goals and deliverables.
3. The QA sprint always lags the dev sprint by one.
4. The team works on current-sprint test scenarios while verifying previous-sprint deliverables.
5. All QA activities, functional and non-functional, are taken care of in the QA sprint.
Challenges 1. Teams working in silos
2. The approach is very much waterfall
3. Defects and issues found in the QA sprint become part of the product backlog.
One Query Why do we need to release substandard builds in every sprint?
We effectively synchronized QA activity on a distributed
development model with dedicated QA-dev pairing, without
reducing QA's role to merely unit-testing the devs' substandard builds.
15. A Sneak Peek into the Past…
https://www.scrumalliance.org/community/articles/2015/june/a-confused-tester-in-agile-world
17. CHALLENGES IDENTIFIED
• CHANGING REQUIREMENTS / LAST-MINUTE CHANGES
• NOT ENOUGH INFORMATION ON THE STORY
• CONTINUOUS TESTING
• TECHNICAL SKILLS / TEST AUTOMATION
• MULTIPLE BROWSERS / MULTIPLE DEVICES
• COMMUNICATION :: “TO PRODUCE AND COMMUNICATE RELEVANT INFORMATION
PROMPTLY”
• FEAR OF LOSING IDENTITY
• COLLABORATION :: “TO MAKE TESTING, DEVELOPMENT AND BUSINESS COLLABORATE”
• HOW TO KEEP UP WITH THE PACE OF THE DEVELOPMENT?
• HOW TO TEST EARLY, BUT NOT DO ANTICIPATORY TEST DESIGN?
18. STUDY OF AGILE PRACTICES IMPLEMENTATION IN
DISTRIBUTED SOFTWARE DEVELOPMENT – A
REFERENCE
This is research conducted by Manjunath M S Rao, Vijay Wade and M M Jha.
This paper presents the results of a systematic study of the implementation of agile practices, covering a
summary of the most effectively implemented practices, the most widely recommended practices and the least
implemented practices in Global Software Engineering (GSE).
The findings are based on survey data collated from 22 agile practitioners from 14 different software
organizations spread across the globe.
(Chart of respondents' agile experience: 2 to 5 years, 57%; 1 to 2 years, 14%)
19. A FEW SAMPLE SURVEY QUESTIONS
1. ARE RELEASE BACKLOGS BUILT WITH THE INVOLVEMENT OF THE RELEVANT
STAKEHOLDERS (PRODUCT OWNER, SALES/MKT, PRODUCT MANAGER, ARCHITECTS,
BUSINESS ANALYSTS, SYSTEM TESTING, ETC.)?
2. IS THE TEAM CROSS-FUNCTIONAL AND INDEPENDENT ENOUGH TO DELIVER
FUNCTIONAL SOFTWARE (A STORY) WITHIN A SPRINT?
3. ARE RISKS AND ISSUES BEING TRACKED WITHIN SPRINTS?
4. HAVE YOU IMPLEMENTED XP PRACTICES LIKE TEST-DRIVEN DEVELOPMENT, PAIR
PROGRAMMING, ETC.? PLEASE PROVIDE DETAILS IN THE REMARKS
5. DO YOU HAVE A SETUP TO HANDLE CONTINUOUS INTEGRATION AND DELIVERY TO
MAINTAIN THE PACE OF DELIVERY?
6. ARE YOU USING AUTOMATION TO OPTIMIZE EFFORT AND TO IMPROVE PRODUCT
QUALITY?
7. IS THE DELIVERABLE AT THE END OF THE SPRINT / ITERATION READY TO BE SHIPPED?
(IS THERE A SEPARATE TESTING PHASE, OR IS THE DELIVERABLE FROM EACH SPRINT READY TO SHIP?)
23. IS QA AN ASSET ON YOUR TEAM??
Response scale:
1. Always
2. Sometimes
3. Not done
4. N/A
Recommendation scale:
1. Strongly recommended
2. Recommended
3. Not recommended
4. Fine-tune
28. PRINCIPLES AND PRACTICES
• TESTING MOVES THE PROJECT FORWARD
• TESTING IS NOT A PHASE… ON AGILE TEAMS, TESTING IS A WAY OF LIFE.
CONTINUOUS TESTING IS THE ONLY WAY TO ENSURE CONTINUOUS PROGRESS.
• EVERYONE TESTS – WHOLE-TEAM APPROACH – COLLABORATION
• SHORTENING FEEDBACK LOOPS
• KEEP THE CODE CLEAN
• LIGHTWEIGHT DOCUMENTATION
• TEST-LAST VS. TEST-DRIVEN
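The test-last vs. test-driven contrast in the last bullet can be made concrete with the classic red-green cycle. A minimal sketch, using the well-known FizzBuzz kata as a stand-in problem (not an example from the talk): the tests are written first and fail ("red"), then the smallest implementation that passes is added ("green"), and the code is refactored with the tests as a safety net.

```python
import unittest

# RED: in TDD these tests come first; they fail until fizzbuzz exists.
class FizzBuzzTest(unittest.TestCase):
    def test_multiples_of_three(self):
        self.assertEqual(fizzbuzz(9), "Fizz")

    def test_multiples_of_five(self):
        self.assertEqual(fizzbuzz(10), "Buzz")

    def test_multiples_of_both(self):
        self.assertEqual(fizzbuzz(15), "FizzBuzz")

    def test_plain_numbers(self):
        self.assertEqual(fizzbuzz(7), "7")

# GREEN: the smallest implementation that makes all the tests pass.
def fizzbuzz(n):
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)
```

Test-last reverses this order: the implementation is written first and tests are retrofitted, which tends to test what the code does rather than what the story requires, and is why the deck lists the distinction as a principle rather than a detail.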
34. Q2: Contrary to the synchronous activities of a traditional waterfall project, agile expects development
actions to be performed in the order they are needed; more asynchronous, we could say.
Now, what are the different ways for a testing professional to engage EFFECTIVELY during a sprint before
any feature has been built?
Source: https://www.linkedin.com/groups/55636/55636-6146759619357265922?trk=hb_ntf_LIKED_GROUP_DISCUSSION_YOU_CREATED
35. Q3: How many of you think that S/W quality is compromised due to the shortened period allotted for testing,
with the major emphasis on development?
Also, if you can, please add the reasons for this & corrective measures.
Source: https://www.linkedin.com/groups/55636/55636-6146757659057020932?trk=hb_ntf_LIKED_GROUP_DISCUSSION_YOU_CREATED
36. Q4: During my pursuit of identifying the approaches different teams are following for testing in an agile
environment, I came across the following cases (the details given are brief):