2. Effectiveness and Efficiency
Efficient Delivery
• Less waiting and bottlenecks
• Less unproductive overhead
• Fewer defects and rework
Effective Steering
• Stakeholders
• Marketplace
• Users
[Diagram: continuous feedback from stakeholders, the marketplace, and users; tighter feedback cycles improve efficiency and minimize waste]
3. How much time do your teams spend in non-value added work?
[Chart: split of team time between overhead and productive work, before and after a DevOps transformation (e.g. an 80%/20% vs. a 50%/50% split)]
Are we Lean?
4. Improve Efficiencies Through DevOps Adoption
Efficiency (Productive : Waste) improves across three stages of maturity:
• Inefficient → Leaner → Leaner and Smarter
• Silo-ed → Collaborative → More Continuous
• Process-heavy → Agile → More Predictable
• Manual → Automated → More Transparent
• Process-based → Product-based → Optimizing
The improvements span the whole delivery life cycle:
• Steer: plan, decide, specify, architect, sense and respond
• Develop/Test: design, code, build, release internal, test and verify
• Deploy: build, deliver external and validate
• Operate: monitor, tune and validate
5. The 3 Big Sources of Wasted Efforts
All delivery effort is either non-value-added waste or value-added production work. A DevOps transformation targets three types of waste:
1. Unnecessary Overhead
– Status quo problems: process-based measures; too much time spent on supporting artifacts
– Transformation needed: product-based pipeline measures; lean efficiencies
2. Unnecessary & Late Re-work
– Status quo problems: doing the easy things first to show early progress; over-specifying
– Transformation needed: "shift-left" steering: tackle the highest-value and highest-uncertainty items first
3. Building the Wrong Things
– Status quo problems: late scope changes; ineffective usability
– Transformation needed: accelerated feedback cycles; continuous delivery of product increments
6. Measure the Product, NOT the Process
Delivering software-based features requires two types of artifacts:
• Primary Artifacts: Product deliverables
– Design, Code, Test
– Working on primary artifacts is predominantly VALUE-ADDED work
• Supporting Artifacts: Artifacts in support of the deliverables
– Plans, specifications, models, documentation, training, test stubs/drivers, progress
reports, measurements, tradeoff studies, change requests, problem reports, compliance
analyses, certifications.
– Working on supporting artifacts is predominantly OVERHEAD work
• Lean efficiencies stem from minimizing the resources expended on
supporting artifacts and maximizing the effort devoted to evolving the
value-added product.
Measure Product flow through the Supply Chain
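The primary/supporting split above can be made directly measurable. A minimal sketch of a product-based efficiency measure: the artifact categories come from this slide, while the function name and the hour figures are illustrative assumptions.

```python
# Classify logged effort as value-added (primary artifacts) or
# overhead (supporting artifacts) and compute a lean-efficiency ratio.
# Categories follow the slide; the hours below are illustrative only.

PRIMARY = {"design", "code", "test"}                       # product deliverables
SUPPORTING = {"plans", "specifications", "documentation",  # support artifacts
              "progress_reports", "change_requests"}

def lean_efficiency(effort_hours: dict) -> float:
    """Fraction of total effort spent on value-added (primary) work."""
    value_added = sum(h for a, h in effort_hours.items() if a in PRIMARY)
    total = sum(effort_hours.values())
    return value_added / total if total else 0.0

week = {"design": 10, "code": 25, "test": 15,
        "progress_reports": 6, "specifications": 4}
print(f"value-added share: {lean_efficiency(week):.0%}")  # 50/60 ≈ 83%
```

Tracking this ratio over time measures the product flow itself rather than process compliance, which is the point of the slide.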
7. What is Overhead?
Fat efforts to minimize
Waiting
Training
Reporting
Traceability
Late rework
Duplicate efforts
Metrics collection
Regression testing
Change propagation
Document generation
Meetings/Checkpoints
System administration
Resource accounting
Human inspections
Streamline or automate
More Valuable efforts to improve
Scoping
Learning
Feedback
Refactoring
Designing
Teaming
Coding
Testing
Planning
Engineering
Empowering
Prediction
Deciding
Steering
Facilitate or smarten
8. Map your Delivery Pipeline
[Pipeline diagram: an Idea/Feature/Bug Fix/Enhancement flows through Development → Build → QA → SIT → UAT → Prod(uction)]
• Roles along the pipeline: PPM, Requirements/Analyst, Developer, Build Engineer, QA Team, Integration Tester, User/Tester, Operations, and Deployment Engineer; Customers/LOB originate ideas and provide feedback
• Shared infrastructure: Code Repository, Artifact Repository, Release Management, Infrastructure as Code/Cloud Patterns, and Reporting/Dashboarding
• Tasks and artifacts flow left to right; each deploy reaches the customer or a customer surrogate, and feedback flows back
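Mapping the pipeline can start as a simple data exercise: list each stage with its observed cycle time, then locate the constraint. A sketch with hypothetical timings (the stage names follow the map; the day figures are invented for illustration):

```python
# Model the mapped pipeline stages and locate the bottleneck
# (the stage with the longest cycle time). Timings are hypothetical.

stages = [
    ("Development", 2.0),   # days per change, illustrative only
    ("Build",       0.5),
    ("QA",          4.0),
    ("SIT",         1.5),
    ("UAT",         1.0),
    ("Prod deploy", 0.5),
]

total = sum(t for _, t in stages)
bottleneck = max(stages, key=lambda s: s[1])
print(f"end-to-end lead time: {total} days")          # 9.5 days
print(f"bottleneck: {bottleneck[0]} ({bottleneck[1]} days)")
```

Even this crude model makes the conversation concrete: improvement effort goes to the bottleneck stage first, since speeding up any other stage leaves lead time unchanged.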
9. Map your Delivery Pipeline: Large Bank
[Pipeline diagram: the same delivery pipeline at a large bank: Idea/Feature/Bug Fix/Enhancement flows through Development → Build → QA → SIT → UAT → Prod(uction), with PMO, Requirements/Analyst, Developer, Build Engineer, QA Team, Integration Tester, User/Tester, Operations, Deployment Engineer, and Customers/Line of Business; shared Code Repository, Artifact Repository, Release Management, Infrastructure as Code/Cloud Patterns, and Metrics/Reporting/Dashboarding; tasks and artifacts flow left to right, with Deploy and Get Feedback loops to the customer or a customer surrogate]
Four bottlenecks stood out:
• Bottleneck: rigid 'one-size-fits-all' development process. Solution: Agile transformation with 'risk-value'-based process variants.
• Bottleneck: ticket-based environment provisioning. Solution: cloud-hosted developer 'self-service'.
• Bottleneck: weekend-long deployments that often fail. Solution: frequent deployment of small batches of change.
• Bottleneck: late discovery of architectural fragility. Solution: Agile 'shift-left' integration testing early in the life cycle.
10. Delivery Pipeline Optimization: Large Bank
• Challenge:
– Developers were creating daily builds
– The QA team had a 3–5 day cycle time
• Bottlenecks identified:
– Lack of deployment automation
– Ticket-based manual environment provisioning
– Lack of a reliable source of test data
• Three-step solution:
1. Deployment automation with IBM UrbanCode Deploy
2. Cloud-hosted 'on-demand' environments with IBM UrbanCode Deploy with Patterns
3. Test data management with IBM Optim Test Data Management
12. Priorities of Global System Integrators
Fat efforts to minimize
Late rework
Waiting
Regression testing
Duplicate efforts
Reporting
Document generation
Training
Metrics collection
Change propagation
Traceability
Human inspections
Meetings/Checkpoints
System administration
Resource accounting
Streamline or automate
More Valuable efforts to improve
Scoping
Designing
Planning
Testing
Reusing
Deciding
Steering
Feedback
Coding
Prediction
Engineering
Learning
Teaming
Refactoring
Facilitate or smarten
13. The 3 Big Sources of Wasted Efforts
All delivery effort is either non-value-added waste or value-added production work. A DevOps transformation targets three types of waste:
1. Unnecessary Overhead
– Status quo problems: process-based measures; too much time spent on supporting artifacts
– Transformation needed: product-based pipeline measures; lean efficiencies
2. Unnecessary & Late Re-work
– Status quo problems: doing the easy things first to show early progress; over-specifying
– Transformation needed: "shift-left" steering: tackle the highest-value and highest-uncertainty items first
3. Building the Wrong Things
– Status quo problems: late scope changes; ineffective usability
– Transformation needed: accelerated feedback cycles; continuous delivery of product increments
14. Why is there so much late rework?
1. Design verification done after coding and unit test
• Is the architecture change-resilient?
• Will the system elements work together?
• Will the system behave correctly under peak loads?
→ These are the issues that make or break success
2. Overly precise requirements, designs, and code early in the life cycle
Requirements numbered like 1.1.2.1.1, 1.1.3.1.2, 1.1.3.2.1, and 2.1.2.1.3 carry
five digits of (false) precision in the artifacts, combined with perhaps one
digit of precision in the understanding of user need. The result: excessive
downstream scrap and rework.
15. Unleashing the Power of Shift Left Testing
What shifts left? Design verification → integration testing
• Shift left: integration test priorities
• Shift right: unit test completion/coverage
Why? Unit tests uncover code defects that cause benign breakage in a single
unit; integration testing uncovers design and architectural defects that cause
malignant breakage across multiple units.
16. Integration Testing Precedes Unit Testing
Project Steering.
• Prioritize integration tests as the primary steering mechanism
• Elaborate usage scenarios for 1) the highest value and 2) the most uncertainty.
• Earlier feedback on: performance, integrity, security, usability, and reliability.
Agile Design/development.
• Develop units, services, and components that are always executable and testable
• Initial versions permit execution sufficient to satisfy their system interfaces
• Unit completeness evolves through continuous test releases.
Testing.
• Early testing infrastructure, data sets, sequences, harnesses, drivers, and test
cases that evolve into automated regression testing
• Define system behavioral tests, usage tests and performance tests first.
• Then build system coverage tests.
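The point that units are "always executable and testable" from their first versions can be illustrated with a small sketch. An integration test is written first, and the initial unit versions are stubs that only satisfy their interfaces; all class and function names here are hypothetical, not from the source.

```python
# Shift-left sketch: the integration test exists before the units are
# complete. Initial unit versions are stubs that honor their interfaces,
# so the cross-unit path is executable from day one.

class InventoryStub:
    def reserve(self, sku: str, qty: int) -> bool:
        return True  # stub: always succeeds; real logic comes later

class PaymentStub:
    def charge(self, account: str, amount: float) -> str:
        return "AUTH-0001"  # stub: fixed authorization code

def place_order(inventory, payment, sku, qty, account, price):
    """The cross-unit behavior we steer by: reserve stock, then charge."""
    if not inventory.reserve(sku, qty):
        raise RuntimeError("out of stock")
    return payment.charge(account, qty * price)

def test_order_integration():
    # Exercises the inter-unit flow (the "malignant breakage" surface)
    # even though both units are still stubs.
    auth = place_order(InventoryStub(), PaymentStub(), "SKU-9", 2, "acct-1", 9.99)
    assert auth.startswith("AUTH-")

test_order_integration()
print("integration path executable")
```

As each stub grows into a real unit through continuous test releases, the same integration test keeps verifying that the units still work together.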
18. Notices and Disclaimers (cont'd)
Information concerning non-IBM products was obtained from the suppliers of those products, their published
announcements or other publicly available sources. IBM has not tested those products in connection with this
publication and cannot confirm the accuracy of performance, compatibility or any other claims related to non-IBM
products. Questions on the capabilities of non-IBM products should be addressed to the suppliers of those products.
IBM does not warrant the quality of any third-party products, or the ability of any such third-party products to
interoperate with IBM’s products. IBM EXPRESSLY DISCLAIMS ALL WARRANTIES, EXPRESSED OR IMPLIED,
INCLUDING BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A
PARTICULAR PURPOSE.
The provision of the information contained herein is not intended to, and does not, grant any right or license under any
IBM patents, copyrights, trademarks or other intellectual property right.
• IBM, the IBM logo, ibm.com, Bluemix, Blueworks Live, CICS, Clearcase, DOORS®, Enterprise Document
Management System™, Global Business Services ®, Global Technology Services ®, Information on Demand,
ILOG, Maximo®, MQIntegrator®, MQSeries®, Netcool®, OMEGAMON, OpenPower, PureAnalytics™,
PureApplication®, pureCluster™, PureCoverage®, PureData®, PureExperience®, PureFlex®, pureQuery®,
pureScale®, PureSystems®, QRadar®, Rational®, Rhapsody®, SoDA, SPSS, StoredIQ, Tivoli®, Trusteer®,
urban{code}®, Watson, WebSphere®, Worklight®, X-Force® and System z® Z/OS, are trademarks of
International Business Machines Corporation, registered in many jurisdictions worldwide. Other product and
service names might be trademarks of IBM or other companies. A current list of IBM trademarks is available on
the Web at "Copyright and trademark information" at: www.ibm.com/legal/copytrade.shtml.
19. Thank You
Your Feedback is
Important!
Access the InterConnect 2015
Conference CONNECT Attendee
Portal to complete your session
surveys from your smartphone,
laptop or conference kiosk.