Some organizations spend months or even years attempting to create a workable portfolio management solution – often with little success. So, how do organizations determine if their software portfolio is working? Can your portfolio answer basic questions related to productivity or performance? With Agile Portfolio Management and Jira, organizations can apply simple techniques to create a software portfolio that satisfies organizational needs.
Anthony Crain, Delivery Manager at Blue Agility, presents basic and more advanced agile practices, such as innovation accounting, alignment, and portfolio ROI, to achieve what more complex portfolio practices often do not. Crain demonstrates how to harness the power of Jira to standardize and centralize these practices, resulting in more dynamic portfolio management and reporting.
Join us to learn how software portfolio management along with Jira allows organizations to:
*Maximize the ROI of their technology and IT work
*Identify what work to staff and what work to leave in the backlog until a later date
*Determine ROI of individual efforts or entire initiatives when going agile
*Identify which innovations are working, such as adopting agile or using offshore resources
Transcript:
Optimize Portfolio Performance with Simple Agile Techniques and Jira - Part 1 & 2
1. Optimize Portfolio Performance
with Simple Agile Techniques
and Jira
Anthony Crain, Blue Agility (a cPrime Company)
acrain@blue-agility.com
With Simone Chen, cPrime
Simone.chen@cprime.com
www.cprime.com
2. Agenda
1. Purpose of portfolio management
2. Create your historical baseline
3. Estimate your future state portfolio
4. Validate your current state portfolio
5. Use innovations to improve your portfolio
6. Align initiatives with business objectives
7. Report on Portfolio ROI over time
8. Streamline resource allocation
3. 1. Purpose of Portfolio Management
• Maximize ROI of IT Resources
• What to work on
• Who will do the work
• When to abandon bad work
• Anthony’s QPPE “kewpee” measures for all things
• Quality: value, defects, compliance, customer satisfaction
• Predictability: accuracy, time to accuracy
• Productivity: time, cost, scope
• Engagement: skill growth, satisfaction, retention, overtime, successes, innovation effectiveness
• QPPE for Portfolio Management
• Quality: Business alignment, production defects
• Predictability: Estimation accuracy, cost of estimation, initiative success profiling
• Productivity: initiative time, cost, scope, throughput; cost of development
• Engagement: innovation effectiveness
4. 2. Create Your Historical Baseline 1 of 2
• Productivity: Initiative Throughput
• Question: Can you show us a report on productivity over the last five years?
• Make it happen
• Create a list of historical initiatives
• Sort from smallest to largest
• Assign Fibonacci numbers to historical initiatives
• 1, 3, 5, 8, 13, 20, 40, 60, 100
• Results: You can now determine:
• If your productivity is getting better or worse over time
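The sizing steps above can be sketched in a few lines of Python. This is a minimal illustration, assuming initiatives arrive as (name, rough effort) pairs; the names, efforts, and the even spreading of ranks across buckets are my assumptions, not from the talk, since relative sizing is ultimately a team judgment call.

```python
# Fibonacci-style sizing of historical initiatives by relative rank.
# Hypothetical data shapes; real sizing is a comparative team exercise.
FIB = [1, 3, 5, 8, 13, 20, 40, 60, 100]  # scale from the slide

def assign_sizes(initiatives):
    """initiatives: list of (name, rough_effort) pairs, any unit.
    Returns {name: fibonacci_size}; smallest effort gets smallest size."""
    ranked = sorted(initiatives, key=lambda x: x[1])
    sizes = {}
    for i, (name, _) in enumerate(ranked):
        # Spread the ranked list evenly across the Fibonacci buckets.
        bucket = min(i * len(FIB) // len(ranked), len(FIB) - 1)
        sizes[name] = FIB[bucket]
    return sizes

sizes = assign_sizes([("Portal", 12), ("Migration", 90), ("Checkout", 30)])
```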
5. 2. Create Your Historical Baseline 2 of 2
• Predictability: Estimation Accuracy, Cost of Estimation
• Question: Do you use historical actuals to come up with future estimates?
• Question: How much does it cost you to estimate new initiatives?
• Make it happen:
• Add cost and calendar time columns to your list of historical initiatives
• Results: You can now determine:
• Cost and time ranges for every new initiative
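Once cost and calendar columns exist, deriving per-size ranges is simple aggregation. A sketch, assuming each historical record is a dict with hypothetical "size", "cost", and "months" keys (the figures are illustrative):

```python
# Derive cost and calendar-time (lo, hi) bands per Fibonacci size
# from historical actuals. Field names and numbers are illustrative.
from collections import defaultdict

def ranges_by_size(history):
    """history: list of dicts with 'size', 'cost', 'months' keys.
    Returns {size: {'cost': (lo, hi), 'months': (lo, hi)}}."""
    buckets = defaultdict(list)
    for h in history:
        buckets[h["size"]].append(h)
    return {
        size: {
            "cost": (min(h["cost"] for h in items), max(h["cost"] for h in items)),
            "months": (min(h["months"] for h in items), max(h["months"] for h in items)),
        }
        for size, items in buckets.items()
    }

history = [
    {"size": 8, "cost": 200_000, "months": 4},
    {"size": 8, "cost": 260_000, "months": 6},
    {"size": 20, "cost": 700_000, "months": 10},
]
bands = ranges_by_size(history)
```

Every new initiative sized at 8 now inherits the band of all past size-8 initiatives, which is what makes "level 1" estimates nearly free.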
6. 3. Estimate Your Future State Portfolio
• Predictability: Estimation Accuracy
• Question: Are your current proposal estimates reasonable?
• Make it happen
• Create a list of your proposals.
• Give each a Fibonacci number by comparing them to historical initiatives.
• Compare their current estimates to cost & calendar ranges for their Fibonacci number.
• Results: You can now determine:
• If any proposals are out of the expected bounds for time or cost
• Proposals that are lower than the low: DOUBTFUL.
• Proposals that are higher than the high: WHY?
• Quickly create estimates for any proposals that aren’t estimated yet
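The bounds check above is a one-liner once the bands exist. A minimal sketch using the slide's own labels (DOUBTFUL below the band, WHY? above it); the band tuple format is my assumption:

```python
# Flag a proposal's estimate against the historical band for its size.
# Labels follow the slide: cheaper than every comparable initiative is
# DOUBTFUL; costlier than all of them demands an explanation.
def check_proposal(estimate, band):
    lo, hi = band
    if estimate < lo:
        return "DOUBTFUL"
    if estimate > hi:
        return "WHY?"
    return "OK"

# A size-8 initiative historically cost $200k-$260k:
check_proposal(150_000, (200_000, 260_000))  # "DOUBTFUL"
```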
7. 4. Validate Your Current State Portfolio
• Predictability: Estimation Accuracy, Time to Accuracy
• Question: Are your current initiative estimates reasonable?
• Make it happen
• Do the same for your current initiatives.
• Also keep track of changes to estimates.
• Results: You can now determine:
• The red/yellow/green status for inflight initiatives
• Set them to red if their estimates are outside the bands
• Yellow if they are within 20%
• Green otherwise.
• Your time to accuracy. When are your estimates within 80% of actuals?
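The red/yellow/green rule can be coded directly. The slide leaves "within 20%" slightly open; this sketch reads it as "inside the band but within 20% of the band width of either edge", which is one plausible interpretation, not the talk's definitive one:

```python
# RAG status for an inflight initiative vs. its historical band.
# Assumption: "within 20%" means inside the band but within 20% of
# the band width from either edge.
def rag_status(estimate, band):
    lo, hi = band
    if not lo <= estimate <= hi:
        return "red"      # outside the historical band entirely
    margin = 0.2 * (hi - lo)
    if estimate < lo + margin or estimate > hi - margin:
        return "yellow"   # inside, but hugging an edge
    return "green"

rag_status(230_000, (200_000, 260_000))  # mid-band: "green"
```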
8. Real Results
• Cost of Estimation for a large international bank:
• Decreased time for level 1 estimates by 99.8% (from 5 days to 5 min)
• Decreased cost for level 1 estimates by 98.0% (from $388 to $8, $76k/year)
• assuming 200 initiatives/year
9. 5. Use Innovations to Improve Your Portfolio 1 of 3
• Innovation: Impact of Innovations
• Question: Can you show us a report on which innovations are yielding the highest value?
• Make it happen
• Create a list of innovations (for example: agile)
• Add an innovation column to your historical, proposal and current initiative lists
• Results: You can now determine:
• Which innovations lead to the best cost and/or time actuals
• Which innovations lead to the worst actuals
• Failed innovations that have no predictable effect
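With an innovation column in place, the comparison is a grouped average. A sketch with hypothetical record shapes (a "cost" field and a set of innovation tags); the talk would drive this from the Jira column instead:

```python
# Compare average actual cost for initiatives tagged with an innovation
# vs. those without it. Record shapes are illustrative assumptions.
def innovation_impact(initiatives, innovation):
    """Returns (avg cost with innovation, avg cost without),
    or None if either group is empty."""
    with_ = [i["cost"] for i in initiatives if innovation in i["innovations"]]
    without = [i["cost"] for i in initiatives if innovation not in i["innovations"]]
    if not with_ or not without:
        return None
    return sum(with_) / len(with_), sum(without) / len(without)

data = [
    {"cost": 180_000, "innovations": {"agile"}},
    {"cost": 220_000, "innovations": {"agile"}},
    {"cost": 300_000, "innovations": set()},
]
innovation_impact(data, "agile")  # (200000.0, 300000.0)
```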
10. 5. Use Innovations to Improve Your Portfolio 2 of 3
• Innovation: ROI on Innovation
• Question: Can you tell which innovations are yielding the highest ROI?
• Make it happen
• Treat all innovations as initiatives in the portfolio
• Add cost and calendar time to your list of innovations.
• You can now compare cost of innovation to average actual initiative cost gains
• You can now set an innovation budget & have great ideas compete for that bucket.
• Results: You can now determine:
• The measurable ROI on innovations
11. 5. Use Innovations to Improve Your Portfolio 3 of 3
• Innovation: Poor Adoption
• Question: Can you tell why your failed innovations are failing?
• Make it happen
• Have teams self-assess: did we do this innovation well?
• Have external teams assess: are we doing this innovation well?
• Use a skill tracking system that objectively measures innovation skill growth.
• Results: You can now determine:
• Which innovations are failing vs. which innovations are being performed poorly
12. 6. Align Initiatives with Business Objectives 1 of 4
• Quality: Business Alignment
• Question: Does the technology team understand the current business objectives?
• Question: Can you show which technology initiatives support which business objectives?
• Question: Do your initiative categories support your business objectives?
• Make it happen
• Create business aligned initiative types
• Socialize with technology staff
• Add an initiative type column to historical, proposal and current initiatives
• Results: You can now determine:
• Business alignment
13. Example: Align Initiatives with Business Objectives
• Typical business objectives
• Growth into new markets
• Improve perceived quality
• Grow current capabilities
• Modernize internal technologies
• Initiative Types
• Greenfield
• Maintenance
• Correction
• Enhancement
• Adaptation
• Innovation
• Results: You can now determine:
• % technology budget spent per objective, throughput, backlog
14. 6. Align Initiatives with Business Objectives 2 of 4
• Quality: Production Defects, Cost of Poor Quality
• Question: Is quality good enough to support additional business objectives?
• Question: Can you show us the cost of poor quality for the last five years?
• Make it happen
• Ensure correction initiatives are one of your initiative types
• Add up the cost of correction initiatives
• Results: You can now determine:
• The cost of poor quality over time
• However, this must be compared to the defect backlog! Why?
15. 6. Align Initiatives with Business Objectives 3 of 4
• Quality: Defect Backlog
• Question: Can you show us a report for the defect backlog per product over the last five years?
• Make it happen
• Track CRUD, WAI and CNRs per product over time
• Add product to your list of historical, proposal and current initiatives
• Results: You can now determine:
• If your % budget spent on correction initiatives is enough
• Which initiatives lead to the best/worst shifts in production quality
• Which innovations are leading to changes in quality
16. 6. Align Initiatives with Business Objectives 4 of 4
• Quality: Pre-Production Defects
• Question: Can you show us which initiatives had the best or worst quality prior to product release?
• Make it happen
• Track the number of defects found by the test team
• Track the number of tests run by the test team
• Results: You can now determine:
• If pre-release defects can predict post release quality levels
• How much testing is enough, too little or too much
• Test density: # tests run vs Fibonacci size
• Defect density: # dev defects vs Fibonacci size
• Test effectiveness: # dev defects / # tests run
• Goal: high test density, low defect density
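The three ratios from this slide are direct divisions. A minimal sketch with illustrative counts (the goal stated above: high test density, low defect density):

```python
# The slide's three pre-production quality metrics, normalized by
# the initiative's Fibonacci size. Input numbers are illustrative.
def quality_metrics(tests_run, dev_defects, fib_size):
    return {
        "test_density": tests_run / fib_size,          # tests per size point
        "defect_density": dev_defects / fib_size,      # defects per size point
        "test_effectiveness": dev_defects / tests_run, # defects found per test
    }

m = quality_metrics(tests_run=400, dev_defects=20, fib_size=8)
```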
17. Defect Backlog vs Releases vs Cost of Poor Quality
[Chart: Cost of Poor Quality per year, 2014-2017]
[Chart: Defects Over Time vs. Releases Over Time, # open defects per month (Jan-Dec) with release markers]
18. 7. Report on Portfolio ROI over Time 1 of 4
• Quality: ROI
• Question: Can you articulate the ROI of your portfolio?
• Make it happen
• You already have a cost column; now add a value column
• Revenue initiatives: record actual revenue
• Cost reduction initiatives: record actual cost reduction
• Others, pick a technique from next slide
• Results: You can now determine:
• The return on investment in technology for revenue and cost reduction initiatives
19. 7. Report on Portfolio ROI over Time 2 of 4
• Quality: Value
• Question: How do you measure non revenue / cost reducing initiatives?
• Make it happen
• Have “other” initiatives record non-revenue measures
• What will increase or decrease if they succeed? Or:
• What new capability will be created if they succeed?
• Examples
• Decrease usability defects found in first quarter after release by 40% (20 defects, from 50 to 30)
• Create the capability to measure the ROI of any initiative in the company
• Monetize any that you can (e.g., what is the cost of a defect?)
• Use value points for “other” initiatives
• Sort initiatives from least valuable to most valuable to the customer
• Assign Fibonacci numbers to represent value in the historical, proposal and current portfolios
• Results: You can now determine:
• Value for non revenue / cost reducing initiatives
20. 7. Report on Portfolio ROI over Time 3 of 4
• Predictability: Initiative Success
• Question: Can you report on what % of initiatives succeed per year?
• Question: Can you report on the cost of failed initiatives?
• Make it happen
• Define success, partial success, failure
• Epic Success: achieved more than 100% of value on time, on cost, on quality, on happiness
• Success: achieved 100% of value on time, on cost, on quality, on happiness
• Partial success: achieved more than 80% of value at no more than 120% of time, 120% of cost, and 80% happiness
• Impeded: ROI still greater than 0%
• Failure: achieved 0% value and more than 20% of planned cost
• Results: You can now determine:
• Patterns for success and failure
• Cost of failure
• Best innovations, etc.
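The success ladder above can be expressed as a cascading classifier. This sketch simplifies "on quality" and "on happiness" away and keys only on value %, time %, and cost % of plan plus ROI; that narrowing is my assumption, not the talk's full definition:

```python
# A simplified sketch of the slide's success ladder, evaluated
# top-down. Quality and happiness criteria are deliberately omitted.
def classify(value_pct, time_pct, cost_pct, roi):
    if value_pct > 100 and time_pct <= 100 and cost_pct <= 100:
        return "epic success"     # beat the value target on time, on cost
    if value_pct >= 100 and time_pct <= 100 and cost_pct <= 100:
        return "success"
    if value_pct > 80 and time_pct <= 120 and cost_pct <= 120:
        return "partial success"
    if roi > 0:
        return "impeded"          # troubled, but still returning value
    return "failure"

classify(value_pct=90, time_pct=115, cost_pct=110, roi=0.4)  # "partial success"
```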
21. 7. Report on Portfolio ROI over Time 4 of 4
• Predictability: Portfolio Position
• Question: When initiatives open risks that then age, does it really make a difference?
• Make it happen
• Sort all initiatives by ROI and Fibonacci value
• Give each a portfolio position where PP1 is the most important
• Ensure every team knows their position (PP3 of 90)
• When risks to cost or value age > 1 iteration or 20% of waterfall, adjust the ROI and thus the PP for the team
• Ensure all stakeholders see the PP every iteration / month
• If an initiative dips “below the bar”, it is paused and its team reallocated.
• Results: You can now determine:
• Which initiatives need stakeholder attention.
• Stakeholders will now see their initiatives plunge in the portfolio and be motivated to take action to fix the situation or cancel their initiative
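Assigning portfolio positions is a sort. A sketch that ranks by ROI, breaking ties with Fibonacci value points; the field names and sample initiatives are hypothetical:

```python
# Rank initiatives into portfolio positions, PP1 being the most
# important. Tie-break on value points. Names/fields are illustrative.
def portfolio_positions(initiatives):
    ranked = sorted(initiatives, key=lambda i: (-i["roi"], -i["value"]))
    return {i["name"]: f"PP{pos}" for pos, i in enumerate(ranked, start=1)}

pp = portfolio_positions([
    {"name": "Trade Pro", "roi": 1.3, "value": 40},
    {"name": "Karnak", "roi": 1.3, "value": 60},
    {"name": "Legacy Sync", "roi": 0.9, "value": 20},
])
# Karnak takes PP1 on the tie-break; an aging risk that cuts an
# initiative's ROI would drop its PP on the next re-sort.
```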
22. Example: ROI Report
• Revenue generators: 1.25 (spent $1M, generated $1.25M) [30% of total spend]
• Cost reduction: 1.15 (spent $1M, estimated reduction $1.15M) [40%]
• Others (spent x) [30%]
• Value points predictability: .85 (planned 1000VP, actual 850VP)
• Non dollar value delivered: 2.5 (spent $1M, estimated value of $2.5M)
• Decrease first quarter post release usability defects by 40% (2000 defects, from 5000 to 3000)
• Increased the number of users by 100% (from 1M to 2M users)
• Note: often these CAN be monetized.
• Portfolio Spend Per Initiative Type
• Cost of failed initiatives: ($x) 11%
• Cost of poor quality: ($y) 12%
• Invested in new markets: ($z) 9%
• Invested in current capabilities: ($a) 49%
• Invested in modernization: ($b) 10%
• Invested in innovations: ($c) 9%
[Pie chart: % IT Budget 2015: Revenue Generation 30%, Cost Reduction 40%, Others 30%]
[Bar chart: ROI Trend Last Three Years: Average ROI, Revenue Generation, Cost Reduction]
[Pie chart: % IT Budget: Cost of Failed Projects 11%, Cost of Poor Quality 12%, Invested in Markets 9%, Invested in Current Capabilities 49%, Invested in Modernization 10%, Invested in Innovations 9%]
23. 8. Streamline Resource Allocation 1 of 2
• Standard Resource Allocation Approach
• Set an IT Budget
• Draw the line at sum of estimated costs = technology budget
• Above the line: funded! Go!
• Below the line: not-funded. Maybe next year.
• IT DOESN'T WORK LIKE THAT!
• You will run out of human resources before you run out of IT budget.
• Need a funding approach based on resources, not dollars
• Once out of resources, use remaining funds to improve existing resources or bring in new ones
24. 8. Streamline Resource Allocation 2 of 2
• Product Aligned Teams
• One Initiative Per Person
• Squads and Floaters
• Skill Based Allocation
• Pick Your Adventure
• Allocation Improvement Requests (AIRs)
• Baseball Cards and Chess Boards
25. Example “Baseball Card”: Pat Jones (Location: Omaha North)
Technologies: SAP Config L3, .Net L5, PeopleSoft L2
Roles: Arch L3, Analyst L4, Coach L5
Domains: Milling L5, Transportation L1, Costing L1
Core initiative: Trade Pro – Ranked #12
Extended on: Karnak – Ranked #5
Skill annotations: Guess: L3, Goal: L4, Role: L1, Partial: L3
• Partial: highest level skill for a role
• Role: all skills for this role at this level or higher
• Goal: shows if this person wishes to be staffed in this role
• Guess: allows staffing before real skill evidence data has been submitted
26. Example “Chess Board”: Trade Pro (Location: Omaha North)
Skills Needed:
Technologies: SAP Config L3, Java L4, OTM L3
Roles: Arch L3, Analyst L3, Dev L3, Test L3, Coach L3
Domains: Transportation L3
Team Needed: 5 core, 3 extended
Core: Pat, Jo, Billie, Cruise, Shree
Extended: Misty, Ty, Enrique
27. Thank You! Any Final Questions?
1. Purpose of portfolio management
2. Create your historical baseline
3. Estimate your future state portfolio
4. Validate your current state portfolio
5. Use innovations to improve your portfolio
6. Align initiatives with business objectives
7. Report on portfolio ROI over time
8. Streamline resource allocation