Meetup TestingUy | Montevideo, Uruguay | 1st August 2017
Abstract
This presentation describes a testing strategy for the Viewer, one of the core features of the product under test: a multi-platform UI prototyping tool aimed (mainly) at Interaction Designers, built by a cross-functional team fully dedicated to a single product. Claudia will also share what the team has learnt from the testing perspective and how the tester's role in the team has changed along the way.
Duration: 45 minutes
2. ABOUT ME ☺
WORK EXPERIENCE
• Senior Quality Engineer, Indigo Studio Team, Infragistics, 2009
• TestingUy (www.testing.uy)
STUDIES
• Computer Engineer
• Rapid Software Testing course with Michael Bolton
• Association for Software Testing courses (Foundations & Bug Advocacy)
• Scrum Master
• ISTQB Foundation
PAST WORK EXPERIENCE
• Test Manager, Tester and Business Analyst
• Teacher in the Computing Science Department of the School of Engineering at Universidad de la República
5. CONTEXT
Team
• Fully dedicated to building a product
• Ten members
• Working together for 5 years
6. CONTEXT
Product under Test
• Multiplatform UI prototyping and IxD tool
• In the market since 2012
• Nine major releases and several intermediate releases
• Two big components:
– Prototype Designer
– Prototype Viewer
• Stakeholders can experience an application and validate ideas
http://www.infragistics.com/products/indigo-studio
11. TESTING STRATEGY
Feature under Test: Prototype Viewer
For a single control, style properties, behavior actions, interactions and animations can be added, edited and removed:
– Style Properties: 1 to 20
– Behavior Actions: 4 to 14
– Interactions: 4 to 14
– Animations: Basic / Composite
13. TESTING STRATEGY
Feature under Test: Prototype Viewer
Many Scenarios + Platforms + Browsers + Devices
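The combinatorial pressure behind this slide (scenarios × platforms × browsers × devices) can be made concrete with a short sketch. The dimension values below are hypothetical, not taken from the talk; only the multiplication itself is the point:

```python
from itertools import product

# Hypothetical dimension values -- illustrative only, not from the slides.
scenarios = ["basic-state", "state-change", "reverse", "back"]
platforms = ["Windows", "macOS"]
browsers = ["Chrome", "Firefox", "Edge"]
devices = ["desktop", "tablet", "phone"]

# Exhaustive coverage is the cartesian product of all dimensions:
# even these few values already require 4 * 2 * 3 * 3 = 72 runs.
all_combinations = list(product(scenarios, platforms, browsers, devices))
print(len(all_combinations))
```

Each new platform or browser multiplies, rather than adds to, the manual testing effort, which is what motivates reusable testbeds in the following slides.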
14. TESTING STRATEGY
Goals
Elaborate testing artifacts to support manual testing that can be:
– reused for the different environments and releases
– extended according to changes in the product
– created and understood by the whole team
15. TESTING STRATEGY
Definition
• We designed and created testbeds.
• Each testbed covers a testing goal.
• A testing goal is covered by a set of testbeds.
• A testbed is created with the product under test.
17. TESTING STRATEGY
Definition
We also defined:
– a checklist of general considerations for elaborating testbeds
– new types of activities to be included in the sprint
– a testbeds backlog
– a centralized repository where testbeds are stored according to defined criteria and nomenclature
18. TESTING STRATEGY
Definition: Testbeds Design
We would like to select the test cases that:
– cover the feature even if they don't fail
– identify interesting bugs with the least effort
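One way to read this selection principle is as a coverage-per-effort trade-off. The sketch below is a hypothetical greedy selection (not the team's actual method): each candidate test case declares what it covers and how much effort it costs, and we repeatedly pick the candidate with the best coverage gained per unit of effort. All names and effort figures are invented for illustration:

```python
def select_tests(candidates, targets):
    """Greedy pick of test cases by coverage gained per unit of effort.

    candidates: {name: (set_of_covered_items, effort)}
    targets: set of items that must be covered
    """
    chosen, remaining = [], set(targets)
    while remaining:
        best = max(
            (name for name in candidates if name not in chosen),
            key=lambda n: len(candidates[n][0] & remaining) / candidates[n][1],
            default=None,
        )
        # Stop when nothing left can cover any remaining item.
        if best is None or not candidates[best][0] & remaining:
            break
        chosen.append(best)
        remaining -= candidates[best][0]
    return chosen

# Hypothetical candidates: (covered items, effort in arbitrary units).
cases = {
    "smoke": ({"styles", "navigation"}, 2),
    "styles-only": ({"styles"}, 2),
    "anim": ({"animations"}, 1),
}
selected = select_tests(cases, {"styles", "navigation", "animations"})
print(selected)
```

Here the greedy pass skips "styles-only" entirely: "smoke" covers the same item plus navigation for the same effort, matching the slide's "least effort" intent.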
19. TESTING STRATEGY
Definition: Testbeds Design
How do we design the template?
20. TESTING STRATEGY
Definition: Testbeds Design
Testing goal
Test Case Design Techniques:
– a coverage criterion
– an error theory (types of defects)
– a strategy to select the test cases
– a model of the reality under test
High Level Test Case:
– Preconditions (context)
– Inputs
– Sequence of actions (steps)
– Expected Result (oracle)
Template: scenarios structure + checklist; the template indicates how to fill test cases with data sets.
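The "High Level Test Case" structure on this slide can be sketched as a small data type. The field names mirror the slide; the types and the example values are assumptions for illustration only:

```python
from dataclasses import dataclass, field

# Minimal sketch of the slide's High Level Test Case structure.
# Field names follow the slide; types and sample values are assumed.
@dataclass
class HighLevelTestCase:
    preconditions: str                          # context the prototype must be in
    inputs: dict                                # data set that instantiates the case
    steps: list = field(default_factory=list)   # sequence of actions
    expected_result: str = ""                   # the oracle

case = HighLevelTestCase(
    preconditions="Button control placed on screen 1",
    inputs={"style_property": "background-color", "value": "#ff0000"},
    steps=["open the viewer", "tap the button", "observe the style change"],
    expected_result="Background color changes to red",
)
print(case.expected_result)
```

A template in the slide's sense would then prescribe which data sets to plug into `inputs` for each scenario.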
21. TESTING STRATEGY
Example: Testbeds Design
Testing goal:
– verify all style properties for all controls in the Prototype Viewer
In which context?
– in the state in which the control is created
– in a state different from the one in which it was created
– reverse interaction: the style property is reverted
– back interaction
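This testing goal reads naturally as a parametrized matrix of control × style property × context. The sketch below enumerates that matrix with hypothetical control and property names (the real testbeds cover all controls, with up to 20 style properties each):

```python
from itertools import product

# Hypothetical values -- illustrative only; the actual testbeds cover
# every control and every style property in the Prototype Viewer.
controls = ["Button", "Label"]
style_properties = ["background-color", "font-size"]
contexts = [
    "state in which the control is created",
    "a state different from the creation state",
    "reverse interaction (property is reverted)",
    "back interaction",
]

# One high-level test case per (control, property, context) combination:
# 2 * 2 * 4 = 16 cases for these short lists.
test_cases = [
    {"control": c, "property": p, "context": ctx}
    for c, p, ctx in product(controls, style_properties, contexts)
]
print(len(test_cases))
```

Enumerating the matrix this way is also what makes the testbeds usable as an input for automation later on.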
22. TESTING STRATEGY
Example: Testbeds Design
(Screenshot of a testbed showing: the testbed's name, the control under test, the actions under test, and the navigation between testbeds with the same testing goal.)
33. What we’ve learnt with this approach
• This approach allows us to:
– support developers when fixing bugs
– use testbeds as an input for automation
– document coverage and test design
– test the Prototype Viewer ☺
– partially cover the Prototype Designer
– identify usability improvements
34. What we’ve learnt with this approach
• This approach was extended to test other functionality, not necessarily related to the Prototype Viewer.
35. What we’ve learnt as a team
• We gained understanding of the testing effort required for a release and of the impact a fix has from this perspective.
• All team members improved their knowledge of the features beyond their specific activities.
• We’ve learnt to be flexible enough to adapt and wear other hats according to the needs of the product and the team.
36. Conclusions
• The role of the tester in the team has become that of a facilitator.
• Each member brings a different perspective on the product under test and knowledge of their own discipline, which enriches testing when we swap hats.
• It is important to define testing strategies for the mid/long term when the product under test has a long life.