When there are no best practices and you’re looking for the right way to test, what do you do? You come up with ideas of what you could try and experiment with them. This talk sums up my experience of replacing a test-case-driven style with a learning-tester-driven style in two organizations. To improve, we take what we’re given and can’t change, and make choices that help us get the best out of what we have. Finding the appropriate stretch for the context at hand taught me that there’s no better way of keeping a team awake than changing the way we test regularly, through continuous experiments. Join me in learning what my teams experimented with and what worked for us, and take away ideas of what you could try in your organization to enhance your practice of testing appropriately in your context.
3. What Testing Gives Us
[Diagram: two styles of testing side by side]
• Unit testing – testing as artifact creation: SPEC, FEEDBACK, REGRESSION, GRANULARITY.
• Exploratory testing – testing as performance: GUIDANCE, UNDERSTANDING, MODELS, SERENDIPITY.
8. Givens. Things I could not change.
• Waterfall process.
• Contractual distance between acceptance testers and subcontractor.
• Test-case metric based reporting.
• I manage, I don’t test.
• Business end users as testers.
9. Experiments. Things I changed.
• Acceptance tester degree of freedom.
• Test cases from step-by-step scripts to test data with a process outline for notes.
• Making “change requests” acceptable.
• Reporting ~20% of testing to the 3rd party.
• Unofficial tips-sharing sessions with the subcontractor.
12. Givens. Things I could not change.
• Roadmapping creating a disconnect from current priorities.
• Tendency for remote work.
• Developers doing the majority of testing.
• Requirements / specifications format as UI spec.
13. Experiments. Things I changed.
• No test cases or wasteful documentation.
• Tester with developer tools.
• Removing “acceptance testing” by moving testing to the team.
• Continuous delivery (without test automation).
• Holding space for testing to happen.
• True teamwork with mob programming.
14. Framework of Management
[Diagram: “a day’s work” as a cycle between Tester and Test Manager]
• Vision (“Sandbox”) → Current Charter; Other Charters; Details.
• Debriefing between tester and test manager: Past, Results, Obstacles, Outlook, Feelings.
• Outputs: Bug Reports, Quality Report, perception of quality and coverage.
• Charter backlog of the future testing: next in importance, out of budget; charters marked #, ?, x, +.
• Session sheets of the past testing: idea of exploration, metrics summary (20:20:60).
• Coaching. Playbooks.
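The metrics summary in the diagram (the 20:20:60 split) can be pictured as a per-session time breakdown debriefed against a charter. A minimal sketch, assuming a test/bug/setup split in the style of session-based test management; the class and field names are illustrative, not from the talk:

```python
from dataclasses import dataclass


@dataclass
class SessionSheet:
    """One exploratory testing session, debriefed against its charter."""
    charter: str
    test_design_min: int  # minutes spent on test design and execution
    bug_invest_min: int   # minutes spent investigating and reporting bugs
    setup_min: int        # minutes spent on setup and interruptions

    def split(self) -> tuple[int, int, int]:
        """Return the test/bug/setup breakdown as whole percentages."""
        total = self.test_design_min + self.bug_invest_min + self.setup_min
        return tuple(round(100 * part / total)
                     for part in (self.test_design_min,
                                  self.bug_invest_min,
                                  self.setup_min))


# Hypothetical session: 72 min testing, 24 min on bugs, 24 min setup.
sheet = SessionSheet("Explore invoice import with malformed data", 72, 24, 24)
print(sheet.split())  # → (60, 20, 20)
```

The point of the summary is not precision but a conversation starter in the debrief: a session dominated by setup or bug investigation tells the test manager something a pass/fail count never would.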
All testing may be exploratory, but some of it is focused on creating artifacts. Saying “scripting is just an approach” is belittling when scripting can be the main approach through which the powers of testing are applied.
“there’s a process of knowing” – learning
Does not give us regression; serendipity (safety against things happening randomly) / unwanted serendipity events.
This is what it is and what it could be. There’s a direction to it, not just statement of what it is.
Coaching is not just feedback, it’s pointing them to the right way.
Safety.
EXPERIENCE (the verb) rather than facts; emotions over facts. REACTIONS.
HISTORY, Lessons learned, checklists. Modeling.
UNDERSTANDING – where you start (knowing the thing (code & environment), knowing the user, knowing the problems, knowing the developers (how to help them and what they do so that you can efficiently test), knowing the hackers (weird use cases outside common ‘have you tried reading it upside down’) , knowing all stakeholders, knowing the business priorities)
Uncovering things I cannot know, giving the application a chance to reveal information for me.
This allows you to know things.
Example area: 53 test cases (P1 – visible) and 184 (P2 – not visible)
Minimal energy principle in reviews: ”thanks for feedback, we decide if we change or not”
Disobeying: ”add test case before writing a bug report”
Our written test cases were bad – just like anyone else’s, except for one.