4. Feedback & Iterative Development
Feedback on all steps toward shipment
• Empirical management
• Learning by review and test
• Quick, many times
• Inspect, adapt, improve
• Learning
• Effective goal reaching
5. Why Definition of Done?
Inspect, Adapt, and Improve
• All steps of the software life cycle (development to deployment) get feedback
• Product feedback: test performance, demos, etc.
• Process feedback: coding quality, peer review, deployment, etc.
6. Why Definition of Done?
Almost done is not done at all
• PO and Dev in discussion:
"Is it done?" "Yes, almost."
"Can we go to production?" "No, not yet."
"Why not?" "Some bugs, some tests, not sure it works on prod, the web service is not reachable in the business domain, the manual still has to be written, etc."
"When can we go to production?" "I am not sure….."
7. Why Definition of Done?
Better release planning
• No need for hardening iterations: iterations where bugs are solved, tests are done, and deployment is prepared
• Estimate and plan on iterations
17. Why Definition of Done?
Minimize the delay of risk
• Undone work reveals itself in production!
18. Why Definition of Done?
Defines the agility/quality/maturity of the team
• A team should be able to complete a (new) feature in one iteration and release it immediately to production, with all steps defined in the DoD necessary to guarantee the best quality.
19. Definition of Done
Two definitions of done:
- Competence -> Automation ("can't")
- Maturity -> ("won't")
In Use / Current
• Transparent for the Product Owner
• Represents the capability of the team
• Shows what to improve
Optimal / Ideal
• Where you want to go
20. Definition of Done
Two definitions of done
Ideal
• Code checked in
• Code build green on build server
• Coding Quality Check green(er) (Sonar)
• Unit tests OK on build server
• Unit tests OK on build server (code coverage 80%)
• Peer reviewed
• (Automated) Deployed on CI server
• One-click deployment on Demo server
• (All deployments include automated database deployment on all mentioned servers)
• (Automated) Integration tests run on CI
• (Automated) Acceptance tests run on CI
• (Automated) Performance tests run on CI
• (Automated) Deployed on ST server
• (Automated) Deployed on UAT server
• Exploratory testing done on ST server
• Integration (chain) testing done on UAT server
• Demoed and approved by Product Owner
• All sprint-related bugs solved
• Deployment Guide up to date
• Interface documentation up to date
• Use Cases up to date
• RMS up to date
• Release Notes up to date
• User Manual up to date
• SRS updated
• Iteration Test Report up to date
• Technical Design updated (when absolutely necessary)
• Product Backlog up to date
Current
• Code checked in
• Code build green on build server
• Coding Quality Check green(er) (Sonar)
• Unit tests OK on build server
• Peer reviewed
• (Automated) Deployed on CI server
• (All deployments include automated database deployment on all mentioned servers)
• (Automated) Integration tests run on CI
• (Automated) Acceptance tests run on CI
• (Automated) Deployed on ST server
• Exploratory testing done on ST server
• Demoed and approved by Product Owner
• All sprint-related bugs solved
• Deployment Guide up to date
• Interface documentation up to date
• Use Cases up to date
• RMS up to date
• Product Backlog up to date
Delay of risk: the undone work manifests in production
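The difference between the two lists is the team's improvement backlog: every item on the ideal list that is missing from the current one is undone work, and thus delayed risk. A minimal sketch of that comparison in Python (the shortened item names and the set-based check are illustrative assumptions, not from the deck):

# Sketch: ideal DoD minus current DoD = undone work = delayed risk.
ideal = {"code checked in", "unit tests ok", "unit tests 80% coverage",
         "performance tests on CI", "deployed on UAT server",
         "release notes up to date", "user manual up to date"}
current = {"code checked in", "unit tests ok"}

undone = sorted(ideal - current)   # what to improve next
print(undone)                      # the work that otherwise surfaces in production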
22. Definition of Done
Conclusion
Definition of Done helps you with:
• Improving team quality/agility/maturity
• Transparency to stakeholders
• Giving burndown charts meaning
• Better release planning
• Minimizing the delay of risk
23. Product Backlog
• List of whatever needs to be done in order to successfully deliver a working software system
• Features, functionality, technology, issues, emergent items
• Prioritized and estimated
• Product Owner responsible for priority
• More detail on higher-priority items
• Anyone can contribute
• Posted visibly and maintained
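A minimal sketch of how such a backlog could be modeled; the field names and example items are illustrative assumptions, not from the deck (Python):

from dataclasses import dataclass

@dataclass
class BacklogItem:
    title: str
    priority: int                  # lower = more important; set by the Product Owner
    estimate: int | None = None    # story points; None until estimated
    details: str = ""              # higher-priority items carry more detail

backlog = [
    BacklogItem("Reporting module", priority=7),   # coarse item, detail comes later
    BacklogItem("Login via webservice", priority=1, estimate=5,
                details="Acceptance criteria agreed with the PO"),
]

# Anyone can contribute items, but the Product Owner owns the ordering.
backlog.sort(key=lambda item: item.priority)
print([item.title for item in backlog])   # ['Login via webservice', 'Reporting module']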
24. Product Backlog Refinement
• Time-boxed meeting, roughly 1.5 hours every week, with the whole team
• The Product Owner should attend
• Split, clarify, and estimate work items, user stories, RFCs
• Share new insights with the team
• Re-estimate when necessary
• Priority determined by the Product Owner
• The goal is to have a "ready" Product Backlog for the next planning
• Prevents discussions in the planning session
• Visualize release planning
• (Also known as Backlog Refactoring, Backlog Maintenance, or Backlog Grooming)
25. Product Backlog Refinement
The backlog runs from clear-fine items at the top to vague-coarse items at the bottom:
• Clear-fine items: detailed and small enough to be picked up by development for implementation
• Vague-coarse items: need more details, more discussion, more acceptance criteria, to be split smaller, etc.
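As an illustration of that split, a "ready" check could look like the sketch below; the size threshold and field names are assumptions made for the example:

# Sketch: clear-fine items are ready for planning; vague-coarse ones are not.
def is_ready(story):
    return (story.get("estimate") is not None            # estimated in refinement
            and story["estimate"] <= 8                   # small enough (assumed limit)
            and bool(story.get("acceptance_criteria")))  # discussed and written down

fine = {"title": "Export to CSV", "estimate": 5,
        "acceptance_criteria": ["UTF-8 output", "header row"]}
coarse = {"title": "Reporting module"}                   # still vague: no estimate yet

print(is_ready(fine), is_ready(coarse))                  # True False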
28. Planning
Planning session 1:
• Determine the capacity of the team
• Pick user stories based on "feeling", with velocity in mind
• Time: 5-10 minutes
Planning session 2:
• Define tasks and hours
• Time: 2 hours
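A rough sketch of session 1: fill the sprint from the top of the ready backlog until the team's historical velocity is used up (the numbers and story names are assumed for the example):

# Sketch: pick stories in Product Owner priority order until velocity runs out.
velocity = 20                                   # points per sprint, from past sprints
ready = [("Login", 5), ("Search", 8), ("Export", 5), ("Reports", 13)]

sprint, remaining = [], velocity
for title, points in ready:                     # list is already ordered by priority
    if points <= remaining:
        sprint.append(title)
        remaining -= points

print(sprint, f"{velocity - remaining} of {velocity} points planned")
# ['Login', 'Search', 'Export'] 18 of 20 points planned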