2. Topics
- My Pig Pen
- The Pigs
- Developing the Rating System
- Implementation
- 6 Month Survey
- Adaptations
3. My Pig Pen: Reorganization – August 2008
- 10–12 developers from 3 organizations merged
- Product space included several technologies (REXX, WinBatch, VBScript, C#, C++, VB6, PL/SQL, T-SQL)
- Limited use of source control
- No automated testing
4. My Pig Pen: Product Space
- Approximately 30 “tools,” including web sites, GUI apps, and command-line apps
- Significant certification and accreditation hurdles
- Scale: 350,000 user seats across 12 time zones
5. My Pig Pen: Culture
- No team development
- High interrupt rate; extreme lack of organizational focus
- Certification process induces wait states
- “Copy/paste” development
- HDD (Hope-Driven Development)
6. The Pigs
- Senior technical people
  - 10+ years of experience
  - Non-coding “architects”
  - No familiarity with Agile/Scrum
- Mid-level developers
  - 3–10 years of experience
  - Legacy (C++, VBScript, VB6) skills
  - No team experience
- Junior developers
  - 1–3 years of experience
  - Scripting languages
  - No team experience
7. Pig Rating System
- Leadership outlined the system:
  - Anonymous
  - Conducted each iteration
  - A RATING, not a RANKING
  - A component of the performance review
- 6 categories
  - Selected by the Pigs over 2 one-hour meetings; the initial list had 4 categories, the final list had 6
- Each category has several elements
  - Elements standardize what counts toward a category
  - Identified by the Pigs over 2 more one-hour sessions
  - Posted on SharePoint as discussion topics so the Pigs could clarify what each element means
8. Requirements Contribution
- Identified use cases
- Conducted interviews with selected customer/user representatives
- Identified scoping/bounding conditions on user requirements
9. Design Contribution
- Created appropriate and useful Agile design artifacts and models
- Implemented best practices
  - Environmental
  - Security
  - Requirements
  - Software industry
- Implemented patterns
- Designed based on re-use of existing code/services
- Designed for run / zero outage
10. Technical Contribution
- Contributed code according to ability / REQUIRED documents
- Wrote good code / REQUIRED documents
- Solved technical problems
- Provided technical help to other team members
- Improved abilities throughout the sprint
11. Quality Contribution
- Identified data to use in testing during the Sprint Planning week
- Designed automated tests
- Implemented “mistake proofing” practices
- Executed and recorded tests for code they developed
- Executed integration tests
- Designed end-user tests
12. Scrum Contribution
- Participation during Sprint Planning week
  - Added clarification to features
- Participation in the Sprint Kickoff
- Participation in daily Scrums
  - Daily “since,” “before,” and obstacles recorded
- Participation in the Sprint Closeout
  - Performed a named role in the closeout
- Participation in Lessons Learned
  - Contributed written individual lessons learned
13. Teamwork Contribution
- On time for meetings
- Schedules and leads appropriate meetings
- Readily available on phone / LCS: logs on to LCS religiously and is reachable via all communication channels
- Returns phone calls/e-mails promptly
- Creates material for meetings (taking notes, creating UML drawings, etc.)
- Actively identifies tasks that need to be completed
- Takes the initiative to volunteer for tasks
- Helps fellow team members with non-technical issues (e.g., forwarding e-mails)
- Makes the work fun
- Accepts and provides constructive opinions and options
- Solves relevant interpersonal problems
- Maintains Sprint velocity
14. Iteration Input
- One spreadsheet per team per iteration
- Numeric rating from 1 (high) to 4 (low)
- Everyone rates everyone else
- Anonymous
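The aggregation described above can be sketched in a few lines. This is a minimal illustration with invented names and scores (the real input was a spreadsheet per team per iteration): each Pig rates every other Pig from 1 (high) to 4 (low), and we average each Pig's received ratings while excluding any self-ratings.

```python
from statistics import mean

# Hypothetical ratings[rater][ratee] = score from 1 (high) to 4 (low).
# Names and values are made up for illustration.
ratings = {
    "alice": {"bob": 2, "carol": 1},
    "bob":   {"alice": 1, "carol": 3},
    "carol": {"alice": 2, "bob": 2},
}

def average_ratings(ratings):
    """Average each Pig's received ratings across all anonymous raters."""
    received = {}
    for rater, scores in ratings.items():
        for ratee, score in scores.items():
            if ratee != rater:  # ignore self-ratings if any slip in
                received.setdefault(ratee, []).append(score)
    return {pig: round(mean(scores), 2) for pig, scores in received.items()}

print(average_ratings(ratings))  # → {'bob': 2.0, 'carol': 2.0, 'alice': 1.5}
```

Because the raters are anonymous, only the averaged result per Pig would ever be shared, never the per-rater scores.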
16. Trends
- Consistency: are Pigs generally rated the same across various rater combinations?
- Normalization: does the standard deviation of ratings decrease over time?
- Improvements: do the ratings move as we emphasize specific Agile areas (requirements, design, coding)?
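The normalization question above is just a standard-deviation trend check. A minimal sketch, using invented per-iteration averages (the real data came from the iteration spreadsheets): compute the spread of the team's ratings for each iteration and see whether it shrinks.

```python
from statistics import pstdev

# Hypothetical data: each Pig's average received rating, per iteration.
iteration_averages = [
    [1.5, 3.5, 2.0, 3.0],  # iteration 1: ratings spread widely
    [1.8, 2.9, 2.1, 2.6],  # iteration 2
    [2.0, 2.5, 2.2, 2.4],  # iteration 3: ratings converging
]

# Population standard deviation of ratings for each iteration.
spreads = [round(pstdev(scores), 3) for scores in iteration_averages]

# "Normalization" holds if the spread strictly decreases over time.
normalizing = all(a > b for a, b in zip(spreads, spreads[1:]))
print(spreads, normalizing)  # → [0.791, 0.427, 0.192] True
```

The same per-iteration series, split by category, would answer the "Improvements" question: watch whether ratings in an emphasized area (e.g., Design) move after the emphasis begins.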
18. Summary
- Scrum team members must have a mechanism to hold each other accountable
- Ratings, NOT rankings
- Leadership establishes the broad guidelines
- Team members establish the specifics and standards
- Tied to performance reviews
Editor's notes
I use the word “Pig” very affectionately. We define a Pig as meeting all three criteria: dedicated to a product sprint 100% of the time (we count any non-sprint work as an obstacle to the sprint), gets and submits peer ratings, and sprints on any product. We did three Sprints before we started rating people, and we have a rule that a Pig has to complete two Sprints before being rated. When we did our survey, the most interesting results came from the text responses.
Enterprise applications were written by one developer, with no version control and no automated testing. One-hit wonders caused tools to be rewritten over and over.