2. About Maersk Line
• World's largest container fleet
• Truly global business
• 14.5% world market share [1]
• 570 container vessels
• Turnover $24 billion [2]
[1] Source: Alphaliner Jan 2011
[2] Source: Annual Report 2010
3. Fragmented IT Landscape
• Thin outsourcing model
• Tier 1 vendors only
• 2,500 applications
• Core applications are tightly
coupled
• 23,000 bookings/day
4. How we started the lean-agile journey
Lean Product Development:
• New (project, platform, team): revolutionary
• Existing (project, platform, team): evolutionary
5. X-leap: The goal
Under Maersk Line's umbrella strategy, streamLINE, a series of initiatives has
been launched to ensure the carrier becomes even more competitive through the
industry's best delivery reliability, continued CO2-reducing initiatives and, last
but not least, by putting the customer in focus, Eivind Kolding tells the Klaus
Lund & Partnere newsletter.
X-Leap is the largest and most important of these programmes at Maersk Line.
The aim is to make it just as simple to book a container with us
as a book on Amazon.com
Maersk Line CEO
Source: http://epn.dk/brancher/transport/skib/article2069838.ece
6. X-leap: How we sold agile to our stakeholders
Maersk is complex:
▪ 100's of backend systems
▪ Convoluted and unstable application architecture
▪ Inconsistent master data
▪ High product complexity
– More than 20,000 lines in some contracts
– More than 500 commodity types
Two delivery approaches are common:
1. Waterfall: no customer-facing functionality for the first 18-24 months
2. Prototyping: lots of functionality early, but no connection to the backend systems
Our approach is fundamentally different:
Agile SOA: a minimal set of customer-facing functionality, delivered over a service
bus with true backend connections as early as possible (in our case 9–10 months)
[Diagram: the existing application landscape, hundreds of interconnected systems
such as SAF marine, GCSS, SCV (MLIS), MEPC, NGP3 and the maersk.com portal]
7. X-leap: What we got right from the outset
• Strong customer focus
• Clear customer experience vision created
• Co-location
• Shared Key Performance Indicators for whole team
• Onboard experienced people
• Willingness to experiment with new approaches
• Great senior leadership support
8. X-leap: 22 practices we (now) know we need to master
• Visualised Flow and Process
• Continuous Delivery
• Continuous Integration
• Test Driven Development
• Automated Developer (Unit) Tests
• Release Often
• Evolutionary Design
• Simple Design
• Automated Acceptance (Functional) Tests
• Refactoring
• Collective Code Ownership
• Definition of Done
• End2End Iterations
• Single Prioritised Backlog
• Limit Work-in-Progress
• Test Driven Requirements
• Feature Teams
• Customer (proxy) Part Of The Team
• Stand Up Meetings
• Demo
• Pair Programming (To Drive Standards)
• Pair Programming (To Ease Platform Constraints)
11. X-leap: Learnings within team
Manage requirements
• Prioritise effectively between functional & non-functional
requirements
• Break down requirements and agree on what size is appropriate
• Need a process vision to support a customer experience vision
Iteration 0 is surprisingly large
• e.g. Reducing hardening phase took forever
12. X-leap: Value stream analysis for a feature
[Diagram] Root cause analysis for why the hardening phase takes so long
13. X-leap: Learnings within team
Manage the change
• Engage advisors who focus on optimising the whole
• Own and manage practice adoption progress
Minimise thrashing
• E.g. We still struggle to measure velocity due to constant changes
14. X-leap: Learnings outside team
Stakeholders need careful management
• Reluctant to exchange predictability for speed
• Difficult to explain refactoring & technical debt
• High expectations of delivering fast
Dependencies external to the development team are a
headache
• Feature teams help but are no silver bullet
• There’s no replacement for good project management to identify
and manage external dependencies
• Others have to change their working practice (architects,
infrastructure, other applications)
15. How we are completing the lean-agile journey.
Lean Product Development:
• New (project, platform, team): revolutionary
• Existing (project, platform, team): evolutionary
16. Cycle Time Analysis
[Histogram of # requirements vs days; medians shown: 150 days overall, GCSS over
the last 24 months = 280 days, GCSS over the last 12 months = 373 days]
Source: Focal Point – requirements that have been put into production over the last 2yrs,
measured from date of creation to when set to working-in-production
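The measure behind this chart, days from a requirement's creation to working-in-production, reduced to a median, is easy to script. A minimal sketch, assuming each requirement is just a (created, in_production) date pair; the data below is illustrative, not the Focal Point export.

```python
from datetime import date
from statistics import median

def median_cycle_time(requirements):
    """Median days from requirement creation to working-in-production."""
    return median((done - created).days for created, done in requirements)

# Illustrative records only: (date created, date set to working-in-production)
reqs = [
    (date(2010, 1, 10), date(2010, 11, 1)),
    (date(2010, 3, 5), date(2011, 2, 20)),
    (date(2010, 6, 1), date(2010, 10, 15)),
]
print(median_cycle_time(reqs), "days")
```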
17. GCSS: Framing the methodologies
• Lean Product Development: enterprise practices
• Agile: project practices
• SCRUM: team practices
• XP*: engineering practices
* Extreme Programming
18. Enabling Agility
Business agility: fast cycle time, smooth flow, fast feedback, value maximised
Discovery mindset: the customer doesn't really know what they want; the developer
doesn't really know how to build it; things change
19. GCSS: 8 selected practices
1. Get to initial prioritisation faster
2. Improve prioritisation
3. Pull Requirements from Dynamic Priority List
4. Reduce size of requirements
5. Get to the point of writing code quickly
6. Actively manage Work-In-Progress (WIP)
7. Enable Faster Feedback
8. Enable more frequent releases
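Practices 3 and 6 combine into one mechanism: work is pulled from a single, re-sortable priority list, and pulls are refused once the WIP limit is hit. A minimal sketch under assumed names; nothing here is the actual GCSS tooling:

```python
class DynamicPriorityList:
    """Pull-based backlog with an explicit WIP limit (illustrative only)."""

    def __init__(self, wip_limit):
        self.wip_limit = wip_limit
        self.backlog = []       # (priority, requirement); lower = more urgent
        self.in_progress = set()

    def add(self, priority, requirement):
        self.backlog.append((priority, requirement))
        self.backlog.sort(key=lambda item: item[0])  # dynamic re-prioritisation

    def pull(self):
        """Return the top requirement, or None if the WIP limit blocks the pull."""
        if not self.backlog or len(self.in_progress) >= self.wip_limit:
            return None
        _, req = self.backlog.pop(0)
        self.in_progress.add(req)
        return req

    def finish(self, requirement):
        self.in_progress.discard(requirement)

dpl = DynamicPriorityList(wip_limit=2)
dpl.add(3, "RQ-low")
dpl.add(1, "RQ-urgent")
dpl.add(2, "RQ-medium")
print(dpl.pull())   # "RQ-urgent" comes out first
print(dpl.pull())   # "RQ-medium"
print(dpl.pull())   # None: WIP limit of 2 reached
```

The point of the sketch is that prioritisation and the WIP limit live in one place; teams cannot start new work simply because it is important, only because capacity has been freed.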
20. GCSS: Release Frequency
The effect of creating large release batches upstream
[Diagram: requirements for releases R22, R23, R24 and R25 each move through the
S, Des, Dev and T phases in large sequential batches between Jan 2011 and Jan 2012;
from the development perspective, the team sits idle between batches]
Estimated ~10,000 hours of idle time in 2010
21. GCSS: More Frequent Releases
Enable the smooth flow of requirements
[Diagram: requirements flow through S, Des, Dev and T in small batches, feeding
frequent releases]
22. Faster Feedback
Eight Standard Measures
[Diagram: checkpoints from requirement captured, requirement validated, started
coding, integrated & built, completed coding, decided for launch, launched,
through to in production; with measures labelled Feasible and Accepted
(requirements), Code complete and Demonstrated (code), Feature complete
(feature), Launchable and Launched (release candidate)]
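Each measure is the elapsed time between two adjacent checkpoints, so the whole set can be derived from one timestamp record per requirement. A sketch with made-up dates and assumed checkpoint names:

```python
from datetime import date

# Illustrative checkpoint dates for a single requirement (names assumed)
checkpoints = [
    ("requirement captured", date(2011, 1, 3)),
    ("requirement validated", date(2011, 1, 10)),
    ("started coding", date(2011, 2, 1)),
    ("completed coding", date(2011, 2, 18)),
    ("integrated & built", date(2011, 2, 21)),
    ("decided for launch", date(2011, 3, 1)),
    ("launched", date(2011, 3, 7)),
    ("in production", date(2011, 3, 8)),
]

def feedback_intervals(stamps):
    """Days elapsed between each pair of consecutive checkpoints."""
    return [(b_name, (b - a).days)
            for (_, a), (b_name, b) in zip(stamps, stamps[1:])]

for name, days in feedback_intervals(checkpoints):
    print(f"to '{name}': {days} days")
```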
23. Faster Feedback
Comparing GCSS with X-leap on the Eight Measures
[Table; all times are in days]
24. GCSS: Actively Manage Work-in-Progress
WIP limit of 8 on the bottleneck
25. GCSS: Work-in-Progress reduced
…whilst at least maintaining throughput
[Chart: requirements in progress (from "Authorized" to "Launched") fell 76%,
from 190 in Oct 2010 to 46 in Jan 2012, while guesstimate points/week per release
stayed in the 5.2 to 8.8 range from Rel 19-22 through R28]
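Little's Law explains why this matters: average cycle time = WIP / throughput, so cutting WIP by 76% at constant throughput cuts cycle time by the same factor. A quick check with an assumed steady throughput of 7 requirements per week (the deck reports points, so the figure is purely illustrative):

```python
def avg_cycle_time_weeks(wip, throughput_per_week):
    """Little's Law: average cycle time = WIP / throughput."""
    return wip / throughput_per_week

before = avg_cycle_time_weeks(190, 7.0)   # WIP as of Oct 2010
after = avg_cycle_time_weeks(46, 7.0)     # WIP as of Jan 2012
print(f"before: {before:.1f} weeks, after: {after:.1f} weeks")
```

Under that assumption the implied cycle time drops roughly fourfold, which is the mechanism behind the cycle-time halving shown a few slides later.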
26. GCSS: Requirement size variability
[Histograms of # requirements vs guesstimate points, before and after: afterwards,
requirements are capped at a maximum size of < 2 weeks]
27. GCSS: Standardized Upstream Process
Get to initial prioritisation faster; get to the point of writing code quickly
[Flow: Triage (< 1 week) → Dynamic Priority List → Refine (< 2 weeks) →
Dev Buffer (max 5) → pull to coding…]
• Quickly identify the ideas that will be the most profitable
• Expect > 10% attrition, otherwise the upstream process is too heavy
28. GCSS: Quality improvements
Releases 2010-2011

Measure                          Avg Rel18-Rel22      Avg Rel23-Rel28     Change
                                 (up to June 2011)    (since July 2011)
E1+E2 defects raised in HOAT     8.2                  1.0                 -88%
Production slippage (in days)    11.2                 2.2                 -80%
Patches 2 wks after production   2.0                  0.3                 -85%
29. GCSS: Cycle time
Average time elapsed from starting work to released (Refine + Realise + Release)
[Chart: Releases 11 to 22*: 208 days; Rel 23 to Rel 28: 104 days, half the time]
*No data for R18, R19
30. Rolling out!
[Timeline with milestones at Feb 2011, May 2011, Sept 2011, Jan 2012 and Aug 2012:
GCSS pilot first, then rollout of the Starter Pack to all delivery streams
(Hermes, SOA, SAP Masterdata, London EDI), while addressing systemic issues]
32. Engineering Quality Checklist
New delivery teams need to adopt these as soon as possible in order to build
quality in and establish a foundation for sustainable delivery of value.

Development
1. All assets are checked into a single repository (code, config., test scripts,
schemas, migration scripts etc)
2. Developers check-in code to the repository at least daily
3. Developers have collective code ownership & responsibility
4. Non-functional requirements are identified and prioritised alongside other
requirements
5. User interface tests & unit tests are run by the developer before code check-in
6. Source control branches are frequently merged (every 2 weeks or less)

Technical debt
7. The team regularly takes time to identify and record technical debt
8. Repaying technical debt is prioritised alongside other requirements

Build & test
9. The build runs all unit tests, regression tests and all non-manual acceptance
tests
10. Broken builds are fixed (or the check-in is reverted) before more code is
checked-in
11. Test stubs ensure all automated tests are independent of other systems
(excl. network & integration tests)
12. A build is completed within 20 mins of code check-in and is then deployed to
a non-production environment
13. Test coverage and code quality metrics are monitored
14. Testing is prioritised using a risk-based approach
15. Some performance tests are run at least daily
16. The load-to-failure threshold is identified
17. All programmatic interfaces are permanently available to other systems for
integration testing

Deployment
18. All batch testing of requirements and the subsequent deployment to production
takes 7 days or less
19. All environments can be recreated using the same automated process
20. Updates are deployed to production without customer downtime
21. All deployments are automated (including schemas, migrations &
platform/application configuration)
22. A developer's environment & tools are built from a standard configuration
within 2 hours

Environment provisioning
23. Any new environments (excluding production) required are provisioned within
a week
24. Any standard production environments required are provisioned within a month

Monitoring & improvement
25. Build, test & deployment process performance is measured and continually
improved upon
26. How to monitor production health is an integral part of the design

v1.0, 12-2-2012
33. Learning from rollout so far
• Practices seem to work everywhere
• Mature teams are generally more receptive than newer ones
• They know their process and that it needs improvement
• As with all change programmes, a couple of key individuals in the
team can make a huge difference
• Personnel turnover makes changes hard to stick
• There are systemic issues which need addressing
34. What next for Maersk Line?
• Legacy: Roll out the 8 starter pack practices for all legacy applications
(max 90 days cycle time)
• New: Additional practices for our new Service Oriented "vision platform"
(max 30 days cycle time)
37. Key Performance Measures for IT

Time
• Typical measure: delivering on a predicted date
• Usual outcome: incentivises hidden time buffers and slower delivery
• Alternative measure: maximise speed in getting to the point where value starts
to be realised

Scope
• Typical measure: delivering all of the originally predicted scope
• Usual outcome: incentivises gold plating and discourages exploitation of learning
• Alternative measure: minimise the size of work packages to maximise both
learning and early release of value

Cost
• Typical measure: delivering at or below a predicted development cost
• Usual outcome: incentivises hidden cost contingencies, pushing costs up
• Alternative measure: maximise value delivered (trade development cost against
the opportunity cost of delay)

Quality
• Typical measure: delivering changes with zero downtime and no errors
• Usual outcome: resistance to making any changes; overinvestment in testing &
documentation
• Alternative measure: shorten feedback cycles at many levels (coding, defects…)
38. KPIs not strictly defined within the control
[Diagram: flow from Dynamic Prioritised List → authorised for development →
launch authorised → launched, across the Triage → Refine → Realise → Release
stages]
• BPO: on cost, on scope, on time
• ML-IT: cycle-time (new)
• IBM: quality (new), availability
• Targets: Triage average < 1 week; Refine average < 2 weeks; Realise 90% < 90
days; overall 70% < 90 days
39. Questions?
Chris Berridge
Programme Manager
Lean Product Development
Maersk Line IT
+45 3363 8165
chris.berridge@maersk.com
Agile Project/Programme Manager of the Year 2011
41. Quick Win Opportunity: Increments of a Process Vision
Lightbulb 2 Live: from idea to working-in-production
[Diagram: Triage → Refine → Realise → Release]
• Idea generation produces problem statements
• Triage: small changes with an obvious solution go straight to a feature team's
filter of the Dynamic Priority List (DPL); ideas with no obvious solution and
big changes are pulled for refinement
• Refine: solution theories that need breaking down are split into small
increments or iterations, timeboxed at < 2-4 weeks
• Realise: each feature team (Application, GCSS, SCV) pulls requirements such as
RQ5620, RQ5827, RQ5864, RQ4659 and RQ6843 from its filter of the DPL
• Release: the parts to be delivered are coordinated across teams, with a release
team pulling the final work