BUILDING A CREDIBLE
PERFORMANCE
MEASUREMENT BASELINE IN
TWO DAYS
Starting with DID 81650, assemble a credible PMB to
increase the Probability of Program Success (PoPS)
August 23rd and 24th, 2011
Learning Objectives
2
Overview of the Integrated Baseline Review
LO 1
Understand the motivations for the Performance
Measurement Baseline (PMB), starting with DID 81650
LO 2
Gain the skills in the 6 processes needed to build a
credible PMB, using Risk+ to address DID 81650
LO 3
Develop the framework for schedule, cost, and technical
performance risk categorizations.
LO 4
Gain the skills of executing the PMB, with an integrated
Risk Register to maintain the credibility of the PMB
LO 5
Establish the processes needed to sustain this credibility,
including Risk+ operations, Risk Register functions, and
performance assessment processes
Our Two Day Agenda
3
Day 1 Overview of building a credible Performance Measurement Baseline
08:00 – 08:50 1: Steps to building a credible Performance Measurement Baseline
09:00 – 10:50 2: Individual elements of the Integrated Master Schedule (IMS)
11:00 – 11:50 3: Connecting the dots to an actual IMS
12:00 – 12:50 4: Lunch Break
13:00 – 13:50 5: Example of an Integrated Master Schedule ready for DID 81650
14:00 – 15:50 6: Demonstration of Risk+ integrated with the IMS, and understanding the outcomes
16:00 – 16:50 7: Wrap up for day 1 – feedback from students, corrective actions for Day 2
Day 2 Hands on development of the DRS–MES PMB using DID 81650
08:00 – 08:50 8: DRS–MES IMS structural assessment, gap closure, ready for workshop
09:00 – 10:50 9: Building the risk category values for each Work Package, and updating the risk register
11:00 – 11:50 10: First run of Risk+ and Management Report of confidence of completing on or before the planned date
12:00 – 12:50 11: Lunch Break
13:00 – 13:50 12: Adjusting the IMS with this new information
14:00 – 15:50 13: Building the “baseline–able” IMS compliant with DIDS 81650
16:00 – 16:50 14: Final questions, plans for “phone support,” and any remaining closure plans
4
But First A Warning
We’re going to cover a lot of material in two days
5
Day 1
Identify Needed
Capabilities
Establish a
Performance
Measurement
Baseline
Execute the
Performance
Measurement
Baseline
Capabilities
Based Plan
Operational
Needs
Earned Value
Performance
0% /100%
Technical
Performance
Measures
System Value
Stream
Technical
Requirements
Identify
Requirements
Baseline Technical
Performance
Measures
PMB
Changes to
Needed Capabilities
Changes to
Requirements Baseline
Changes to
Performance Baseline
Deliverables Based Planning® is a registered trademark of Lewis & Fowler. Copyright © Lewis & Fowler, 2011
Building the Performance Measurement
Baseline (PMB) from Cost and Schedule
Copyright © 2010, Lewis & Fowler, Use of any or all of this material is prohibited without written permission
7
Integrated Master Schedule
(Figure: IMS rows of SOW sub-elements aligned to the Cost and Materiel Baseline, spread by quarter.)
§ CAP contains Budget, Hours, Staff, deliverables,
spread by month, quarter.
§ PMB contains Work Packages, BCWS, EV methods,
sequenced in the proper order.
Integrating the IMS and the Cost Baseline provides two of the three elements of the PMB. The cost spreads by quarter currently in place are allocated to the Work Packages in the IMS, and the BCWS is baselined for performance measurement.
1.0 Overview
Build a time–phased network of activities describing the work to be performed, the budgeted cost for
this work, the organizational elements that produce the deliverables from this work, and the
performance measures showing this work is proceeding according to plan.
3.1 Decompose the program Scope into a product based Work Breakdown Structure (WBS), then further into Work Packages describing the production of the deliverables traceable to the requirements and to the needed capabilities.
3.2 Assign responsibility for the Work Packages (the groupings of deliverables) to a named owner accountable for the management of the resource allocations, the cost and schedule baseline, and technical delivery.
3.3 Arrange the Work Packages in a logical network with defined deliverables, milestones, internal and external dependencies, and credible schedule, cost, and technical performance margins.
3.4 Develop the Time–Phased Budgeted Cost for Work Scheduled (BCWS) for the labor and material costs in each Work Package and the Project as a whole. Assure proper resource allocations can be met and budget profiles match the expectations of the program sponsor (a minimal time-phasing sketch follows below).
3.5 Assign objective Measures of Performance (MoP) and Measures of Effectiveness (MoE) for each Work Package and summarize these for the Project as a whole.
3.6 Establish a Performance Measurement Baseline (PMB) used to forecast the Work Package and Project ongoing and completion cost and schedule performance metrics.
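To make step 3.4 concrete, here is a minimal time-phasing sketch in Python, assuming a level monthly spread of each Work Package budget. The Work Package names, budgets, and dates are illustrative only, not taken from the DRS–MES program.

```python
# Minimal sketch of time-phasing BCWS (step 3.4), assuming a level monthly spread.
# Work Package names, budgets (hours), start months, and durations are illustrative.
from collections import defaultdict

work_packages = [
    ("WP-010 Requirements baseline", 480, 0, 2),
    ("WP-020 Preliminary design",    960, 2, 3),
    ("WP-030 Integration test env",  320, 4, 2),
]

def time_phase(wps):
    """Spread each WP budget evenly across its months and return the
    monthly and cumulative BCWS profiles."""
    monthly = defaultdict(float)
    for name, budget, start, months in wps:
        per_month = budget / months          # level spread; real BOEs may front/back load
        for m in range(start, start + months):
            monthly[m] += per_month
    horizon = max(monthly) + 1
    profile, cumulative = [], 0.0
    for m in range(horizon):
        cumulative += monthly[m]
        profile.append((m, monthly[m], cumulative))
    return profile

for month, incremental, cum in time_phase(work_packages):
    print(f"Month {month + 1}: BCWS = {incremental:7.1f} h, cumulative PMB = {cum:7.1f} h")
```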
The Road To Project Success Depends On …
Where are we going?
How do we get there?
Are there enough resources?
What are impediments to progress?
How do we measure progress?
Deliverables Based Planning® is a registered trademark of Lewis & Fowler. Copyright © Lewis & Fowler, 2011
The PLAN is the strategy for the successful completion of the project. The
SCHEDULE is the sequence of work, the assigned resources, and the measures of
progress that implement the Plan.
Both are needed to increase the Probability of Project Success (PoPS)
Day 1, Session 1: Steps in building a credible PMB (1 Hour)
(Figure: the PMB element wheel – Risk, SOW, Cost, WBS, IMP/IMS, TPM, PMB)
Framework for Increasing the
Probability of Program Success (PoPS)
11
Program Enablers
Program Process Capabilities
Business Enablers
Just a reminder of the project elements
we have control over
12
¨ Cost Basis of Estimate (BOE)
built bottom up and
validated top down.
¨ Statement of Work (SOW)
traceable to the Work
Breakdown Structure and all
BOEs
¨ Work Breakdown Structure (WBS) built using MIL-STD-881C
guidance. Products and services only, no functional
departments.
¨ IMP/IMS built using DoD and other guidance to measure
increasing maturity of deliverables.
¨ Technical Performance Measures (TPM) for each major
deliverable in units of measure meaningful to the decision
maker.
Want Some Motivation for the WBS?
¨ Forces the creation of detailed steps by delineating the products and services that produce them.
¨ Lays the groundwork for
schedule and budget by
creating “buckets” to assign
resources and costs.
14
¨ Creates accountability by defining explicit connections between
the work to be performed and those performing the work.
¨ Creates commitment by making visible to all project participants
the previous three activities.
What does a good WBS NOT look
like?
¨ It’s not a laundry list of work to be done.
¨ It’s not a functional decomposition.
¨ It’s not a direct map of the requirements.
¨ It’s not a reflection of the underlying software
partitioning.
¨ It’s not the first structure you might think of…
15
Connect the WBS to Work Packages and
define the Tasks to produce Deliverables
Business Need: Process Invoices for Top Tier Suppliers
1st Level: Electronic Invoice Submittal
1st Level: Routing to Payables Department
2nd Level: Payables Account Verification
2nd Level: Payment Scheduling
2nd Level: Material receipt verification
2nd Level: “On hand” balance updates
Deliverables defined in the Work Package (WP)
16
Establishing the Three Elements of the
Performance Measurement Baseline
(Flowchart: activities feeding the Technical, Schedule, and Cost Baselines)
Technical Baseline: Perform Functional Analysis → Determine Scope and Approach → Develop Technical Logic → Develop Technical Baseline → Develop WBS
Schedule Baseline: Define Activities → Estimate Time Durations → Sequence Activities → Finalize Schedule → Identify Apportioned Milestones
Cost Baseline: Determine Resource Requirements → Prepare Cost Estimate → Resource Load Schedule → Finalize Apportioned Milestones → Determine Funding Constraints → Approve PMB
17
What does a good schedule look
like?
¨ A good schedule is predictive – it shows what is
going to happen in the future and what the
alternatives are if that doesn't actually happen
¨ A good schedule is reflective – it shows where the project stands in relation to the planned position against the actual work that has been accomplished
¨ A good schedule is dynamic – it can be adjusted
when the reality of the project changes
18
Improving the credibility of the
schedule
¨ Build the requirements in a tool
¨ Build the PLAN before building the SCHEDULE
¨ Manage the project with a Project Management
Tool
¨ Make every task duration fit a predefined guide
¨ Use a RACI and RAM to assign accountability
¨ Every task has a deliverable
¨ Have a plan B and a plan C
¨ All cost and durations are random variables
¨ In the end, it’s always about the people
19
A threadbare and corny phrase that still is the best approach to success
20
What does a PLAN Look Like?
Mapping the steps to the process of building the
Performance Measurement Baseline
The six steps of physically assembling the Performance Measurement Baseline cover all the processes of Establishing the PMB.
Each step in the sequence advances the PMB to its
final maturity – ready for baselining
The six steps: Decompose Scope, Assign Responsibility, Arrange Work Packages, Develop BCWS, Assign Performance Measures, Set Performance Baseline.
The PMB-establishment activities mapped into these steps:
Perform functional analysis
Determine scope and approach
Develop Work Breakdown Structure
Develop technical logic
Develop technical baseline
Approve performance measurement baseline
Define activities
Estimate time durations
Sequence activities
Identify apportioned milestones
Finalize schedule
Finalize apportioned milestones
Determine resource requirements
Prepare cost estimates
Resource load schedule
Determine funding constraints
25
A credible IMS is more than the work, durations, and relationships. It’s an
executable set of activities that implements the program’s strategy – the PLAN. The
IMS buys down risk, provides visibility to project performance, indicates alternative
approaches, and provides actionable information to the decision makers.
Day 1, Session 2: Individual Elements of an Integrated Master Schedule (2 Hours)
Critical Success Factors for the Performance
Measurement Baseline
¨ Deliverables represent the required business capabilities and their value as defined by the business and shared by the development team.
¨ When all deliverables and their Work Packages are
completed, they are not revisited or reopened.
¤ They are 100% done.
¨ The progression of Work Packages defines the
increasing maturity of the project.
¤ The business value of the deliverables to the customer
increases as Work Packages are completed.
¨ Completion of Work Packages is represented by the
Physical Percent Completion of the project.
¤ Either 0%/100% or Apportioned Milestones are used to
state the completion of each Work Package.
Business
Requirements
Technical
Capabilities
Work Packages
Deliverables
27
The Critical Few
1. Estimated durations developed to known
confidence levels.
2. Probability Distributions for categories of work.
3. Risk parameters for each category of work.
4. Credible sequences of work dependencies.
5. Alternative paths through the network to deal with uncertainty.
6. Measures of performance in units meaningful to
the decision makers.
28
Let’s Build the Performance
Measurement Baseline Using The
Eight Steps
Image: http://www.softwaretechnews.com/images/STN_April_09_lores_Page_29_Image_0001.jpg
This approach is called Product Development Kaizen and is used by Lean Six Sigma firms to ferret out
the system capabilities before any technical or operational requirements are defined.
Use this to reverse engineer or validate the WBS and connect WHAT with WHY before proceeding to
build the CWBS or confirm the WBS.
(IMP/IMS structure: Program Events – Significant Accomplishments – Accomplishment Criteria – CDRLs and Deliverables – Tasks Contained in Work Packages, aligned to the Statement of Work and the CWBS.)
Progress to plan is measured using Physical Percent Complete at the Accomplishment Criteria (AC) and CWBS level, starting with the following connections:
§ The Statement of Work defines the CWBS; the work structure is aligned to the SOW.
§ Work necessary to mature the products is grouped by CWBS.
§ Completed Significant Accomplishments (SAs) are entry criteria for Program Events.
§ Completed Work Packages are exit criteria for Tasks.
§ Accomplishment Criteria describe increasing product maturity as 0/100 or per EVMS SD guidance.
§ CDRLs and Deliverables document the product maturity aligned with the SOW and CWBS.
31
(Example IMS network. Tasks: Update Contractor System Spec; Update Program Development; Allocate Functional Reqmts; Update Functional System Design; Develop HWCI Specifications; Develop SIL Specifications; Build Astp1; F-18 IRR; SIL Baseline 1.0; Update SIL Test Cases; Develop Prelim SIL CSCI; Critical Components AstP 1,2; SSpS 1,2,3; Update AS Test; I&T on CVN; I&T on LHA. Milestones: Contract Award + 15 days; Systems Requirements Review (SRR); System Functional Review (SFR); HW Preliminary Design Review (PDR); System PDR; EDM 1.0 Baseline; EDM 2.0 Baseline; Mfg Docs Available; TBD; TRR 1.0; EDM 7-8 TRR.)
32
§ Each collection point provides an
assessment of incremental
business or mission value.
§ Defining these points before the
project starts is the basis of
measuring progress to plan.
§ Because then you know what
done looks like before it arrives.
Deliverables – WBS – Tasks and Schedule
Business Need: Process Invoices for Top Tier Suppliers
1st Level: Electronic Invoice Submittal
1st Level: Routing to Payables Department
2nd Level: Payables Account Verification
2nd Level: Payment Scheduling
2nd Level: Material receipt verification
2nd Level: “On hand” balance updates
Deliverables defined in the Work Package (WP):
§ The terminal node of the WBS defines the products or services that produce the products of the project.
§ The terminal node of the WBS is defined by a Work Package; the Tasks within the Work Package produce the Deliverables.
§ 100% completion of the deliverables is the measure of performance for the Work Package.
§ Management of the Work Package Tasks is the responsibility of the WP Manager.
§ The WBS is a decomposition of the work needed to fulfill the business requirements.
33
34
(Naming convention: Product [Noun] + Maturity [Adjective] – e.g., “Design Complete”; Action [Verb] + Product State – e.g., “Model/Sim Preliminary”)
Program Events
Define the availability
of a Capability at a point in
time.
Accomplishments
Represent requirements
that enable Capabilities.
Criteria
Represent Work Packages that
fulfill Requirements.
§ The increasing maturity of a product or service is described through Events or Milestones, Accomplishments, Criteria, and Work Packages.
§ The presence of these capabilities is measured by the Accomplishments and their
Criteria.
§ Accomplishments are the pre–conditions for the maturity assessment of the
product or service at each Event or Milestone.
§ Performance of the work activities, Work Packages, Criteria, Accomplishments, and
Events or Milestones is measured in units of “physical percent complete” by
connecting Earned Value with Technical Performance Measures.
35
36
37
(Diagram: AC:005 and its Tasks linked at the AC level to AC:023 and its Tasks.)
§ The 100% completed work in AC:005 is needed to start the work in AC:023
§ In the IMP/IMS paradigm, there is no Task-to-Task connection across Accomplishment
Criteria (AC) boundaries, only within an AC
§ The AC-to-AC linking states “…all work in the predecessor AC must be complete before
starting the successor work, assuring the minimum of rework due to partially defined
requirements or partially completed products”
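The AC-boundary rule above lends itself to a mechanical check. Below is a minimal sketch, assuming the IMS can be exported as a task-to-AC mapping and a list of predecessor/successor links; the identifiers and data structures are hypothetical, not the Risk+ or Microsoft Project API.

```python
# Minimal sketch of the AC-boundary rule: task-to-task links are allowed only
# inside an Accomplishment Criterion (AC); cross-AC sequencing must be AC-to-AC.
# Task and AC identifiers here are hypothetical.
tasks = {
    "T1": "AC:005", "T2": "AC:005", "T3": "AC:005",
    "T4": "AC:023", "T5": "AC:023",
}
# (predecessor, successor) pairs as they might be exported from the IMS
links = [("T1", "T2"), ("T2", "T3"), ("T3", "T4"), ("T4", "T5")]

def cross_ac_links(task_to_ac, link_pairs):
    """Return every task-to-task link whose endpoints sit in different ACs."""
    return [(p, s) for p, s in link_pairs if task_to_ac[p] != task_to_ac[s]]

for pred, succ in cross_ac_links(tasks, links):
    print(f"Violation: {pred} ({tasks[pred]}) -> {succ} ({tasks[succ]}) crosses an AC boundary")
```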
38
(Diagram: Program Event A, with Significant Accomplishments SA:001–SA:004 and the Tasks of AC:006, linked to Program Event B.)
§ The best arrangement has the completion of Event A start the first task in Event B.
§ All work performed beyond the date of Event A is done at risk.
§ At PDR (Event A), approval to proceed to Event B (CDR) is given
§ Only long lead items should cross Program Event boundaries
§ All other work terminates on the Program Event where a formal review of the planned
maturity is conducted – SRR, SFR, PDR, CDR, …
§ This topology assures a complete assessment of “progress to plan” is available at each Program Event
39
40
(Risk waterfall chart – Risk CEV-037, “Loss of Critical Functions During Descent”: planned risk score declining from 24 toward 0 between 31 Mar 05 and 1 Jul 11 as risk response activities complete – force and moment wind tunnel tests, analytical model development and correlation, focus splinter review, block wind tunnel testing, flight application of spacecraft aero data, in-flight development tests, and a damaged TPS flight test. Solid = linked, hollow = unlinked, filled = complete. The risk response activities and Risk ID appear in the IMS, and the milestone dates are traceable between the Risk Management tool and the IMS.)
41
An estimate must contain a confidence interval
and an error band on that confidence interval to
be credible.
Otherwise it’s just a guess.
1. Estimating Duration of WPs
Steps in Building the Work Packages
¨ Step 1 – define what is going to be delivered to
produce business value
¤ One or more Deliverables produced within a Work
Package.
¨ Step 2 – define the effort and duration along with
the confidence levels
¤ Only effort and total duration.
¤ Level of confidence for effort and duration.
43
Define what’s going to be produced to
deliver business value
¨ Step 1 – Define the deliverables and their
apportioned value
Description | Deliverable(s) | Apportioned Milestones
Transaction processing integration test complete | Test plan complete and approved | Author 50%; Approval 50%
Define integration testing environment | Integration Test Plan complete; Test platform equipment defined; Test environment defined | Test Plan 25%; Equipment List 50%; Environment 25%
Business processes defined and approved | Business process flow diagram | 100%
User acceptance testing defined | User Acceptance Plan developed | 100%
User Acceptance Testing conducted | Test environment operational; User Acceptance Testing performed with 90% success; UAT errors documented and allocated for repair in next release | Environment 20%; UAT Conducted 70%; Errors documented 10%
44
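Apportioned milestones like those above turn directly into Physical Percent Complete (and BCWP) for a Work Package. A minimal sketch follows, assuming the weights from the “Define integration testing environment” row; the budget and the completion flags are illustrative.

```python
# Minimal sketch: Physical Percent Complete from apportioned milestone weights.
# The BCWS value and the completion flags are illustrative.
def physical_percent_complete(milestones):
    """milestones: list of (name, weight as a fraction, done?) – weights sum to 1.0."""
    return sum(weight for _, weight, done in milestones if done)

wp_budget_hours = 480.0  # hypothetical BCWS for the Work Package
milestones = [
    ("Integration Test Plan complete",  0.25, True),
    ("Test platform equipment defined", 0.50, True),
    ("Test environment defined",        0.25, False),
]

pct = physical_percent_complete(milestones)
bcwp = pct * wp_budget_hours
print(f"Physical % complete: {pct:.0%}, BCWP = {bcwp:.0f} h of {wp_budget_hours:.0f} h")
```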
Project Deliverables | Notional Percentage Allocation | Actual Allocation on past projects
Requirements / Analysis | 20% |
Product or Service Design | 10% |
Product or Service Production | 25% |
System Integration | 10% |
System Test Processes | 15% |
User Acceptance Testing Processes | 10% |
Define the effort and duration along with
the confidence levels
¨ Step 2 – construct the estimates within confidence
levels
Description | Duration | Duration Confidence | Effort | Effort Confidence
Transaction processing integration test complete | 10w | 1 | 2680h | 2
Define integration testing environment | 4w | 1 | 480h | 1
Business processes defined and approved | 6w | 2 | 1200h | 1
User acceptance testing defined | 3w | 2 | 800h | 2
User acceptance testing conducted | 4w | 1 | 200h | 1
46
Questions to the Group | Answers from the Group
Can we do this in one (1) year? | Sure, no problem
How about one (1) week? | Not hardly – it can’t be done in a week
How about six (6) months? | Yeah, that might be possible
How about four (4) months? | That’s cutting it really close; I’m not sure about 4 months
How about five (5) months? | Yeah, that’d be about as short as I’d go
To put this into practice requires more discipline of course. But the principle of a Wide Band Delphi
estimating process is well tested in the field and well documented in the literature.
Using the 20 questions game is an easy way to get to an estimate for duration and effort.
Given a software project element, how long will it take and how much effort will be expended over that period? This effort over the duration provides the cost.
¤ We have this requirement for a customer service interface. The functions can be enumerated and the core
technology is known
¤ Ask the following series of questions
So with 5 questions asked of a group of subject matter experts, we can get an estimate of 5 months with a variance of 1
month or so on either side. That’s a 20% accuracy on a simple problem in about 30 seconds.
Scale that to larger or more complex problems and more questions – or better questions – and a bit more
thoughtfulness for the questions and you can get within 20%.
Getting to an estimate without having to
understand all the detailed requirements
47
Conditions for a discrete Work Package used for Performance Measurement:
1. Outcome of the WP is a technical work product (e.g., requirements, designs, or test procedures needed as a set for a downstream task). Discrete: Y, Combined: N. Rationale: if the WP constrains the start or completion of a subsequent WP, analyze schedule variances to determine the impact on downstream activities.
2. Outcome of the WP is a set of technical work products; an individual work product is a component of the end work product and may be an input to a subsequent WP before completion of the set, but is not itself a constraint (e.g., an individual requirement, design, or test within a WP that is an input to a downstream task but is not itself a constraint). Discrete: Y, Combined: Y. Rationale: since an individual work product is not a constraint to a downstream task, there is no need to monitor its progress at the WP level; it may be combined with similar work products in a WP, and only the WP completion must be linked with the successor activity.
3. Outcome is a scheduled process required to meet a project objective (e.g., a process that must be implemented to achieve planned cost, performance, or schedule – standing up a development environment). Discrete: Y, Combined: N.
4. Outcome is a recurring work product that does not constrain the start or completion of another recurring WP (e.g., status reporting or documentation of a recurring meeting). Discrete: N, Combined: Y. Rationale: recurring work products, although scheduled, rarely constrain another task; there is no significant schedule impact to downstream tasks.
5. Work scope is general or supportive (e.g., project management, administrative support). Discrete: N, Combined: Y. Rationale: multiple Level of Effort tasks may be combined into one WP; supporting detail of the time-phased budget at the task level should be maintained.
Derived from Performance-Based Earned Value®, Paul Solomon and Ralph Young, John Wiley & Sons, 2010
There are two types of Uncertainty
Technical: uncertainty about the functional and performance aspects of the program’s technology that impacts the producibility of the product or creates delays in the schedule.
Programmatic: uncertainty about the duration and cost of the activities that deliver the functional and performance elements of the program, independent of the technical risk.
49
All elements of a project, its cost, schedule, and technical performance, are random variables.
Knowing the underlying probability distribution
of these random variables is a Critical Success
Factor for the application of Monte Carlo
Simulation.
2. Probability Distributions
Risk
Probability Distribution Function is the
Lifeblood of good planning
¨ Probability of
occurrence as a
function of the
number of
samples
¨ “The number of
times a task
duration appears
in a Monte Carlo
simulation”
51
Risk
Task “Most Likely” ≠ Project “Most Likely,”
Must be Understood by Every Planner
¨ PERT assumes
probability
distribution of the
project times is the
same as the tasks
on the critical
path.
¨ Because other
paths can become
critical paths, PERT
consistently
underestimates the
project completion
time.
1 + 1 = 3
52
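A small simulation makes the merge-bias point concrete. The sketch below assumes two independent parallel paths with triangular three-point durations; the numbers are illustrative, and the comparison is against the classic single-path PERT mean.

```python
# Minimal sketch of merge bias: two parallel, independent paths must both finish
# before the project completes. Durations are triangular (O, ML, P) in days;
# all values are illustrative.
import random

random.seed(1)

def path_duration():
    return random.triangular(18, 30, 22)   # low, high, mode

trials = 20_000
merged = sorted(max(path_duration(), path_duration()) for _ in range(trials))

pert_single_path = (18 + 4 * 22 + 30) / 6   # classic PERT mean of one path
simulated_mean = sum(merged) / trials
p80 = merged[int(0.80 * trials)]

print(f"PERT mean of one path:         {pert_single_path:5.1f} days")
print(f"Simulated mean at merge point: {simulated_mean:5.1f} days")
print(f"80% confidence completion:     {p80:5.1f} days")
```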
Risk
Inputs
Outputs
The Program is a System, just like any other System with complex interactive parts
53
¨ The programmatic and planning dynamics act as a system.
¨ The “system response” is the transfer function between input and output.
¨ Understanding this transfer
function may appear beyond
our interest.
¤ But it is part of the stochastic
dynamic response to
disruptions in our plans.
¤ “What if” really means “what
if” at this point in the
response curve of the system.
Risk management is
how adults manage
projects.
‒ Tim Lister (IBM
Fellow)
3. Risk Parameters for Planned Work
55
Risk is measured as any deviation
from the original baseline.
Risk is anything that results in a
variance.
Variance at Completion (VAC) is
the basic measure of risk
encountered by the end of the
contract effort, whether the risk is
rooted in issues related to
planning of scope, estimating,
scheduling, or technical criteria
that are identified during the
normal course of the program
execution
Risk
Why Probabilistic Risk Analysis is Often
Opposed by Management
Many people do not understand the
underlying statistics
¤ Education, practice, guidance
Many planners lack the formal
probability and statistics training
¤ Education, practice, guidance
Most planners perform deterministic
analysis of schedules and cost
¤ Risk is hard work
The fact that probabilistic risk analysis is built on uncertainty is seen as a weakness in the planning process, not a strength
¤ Why can’t you know how long it will take or how much it costs?
People tend to think that the “lack of data” is a reason not to perform
probabilistic schedule risk analysis
¤ The exact opposite is true
56
Level Likelihood
E Near Certainty
D Highly Likely
C Likely
B Low Likelihood
A Not Likely
Level | Technical Performance | Schedule | Cost
A | Minimal or no consequence to technical performance | Minimal or no impact | Minimal or no impact
B | Minor reduction in technical performance or supportability | Able to meet key dates | Budget increase or unit production cost increase < **(1% of Budget)
C | Moderate reduction in technical performance or supportability with limited impact on program objectives | Minor schedule slip; able to meet key milestones with no schedule float | Budget increase or unit production cost increase < **(5% of Budget)
D | Significant degradation in technical performance or major shortfall in supportability | Program critical path affected | Budget increase or unit production cost increase < **(10% of Budget)
E | Severe degradation in technical performance | Cannot meet key program milestones; slip > X months | Exceeds budget increase or unit production cost threshold
This matrix must be built for
each category of risk.
The decision for each dimension
comes from Subject Matter
Experts and the Risk
Management team.
(5×5 risk matrix: Likelihood levels A–E versus Consequence levels A–E)
Putting planned work in the right order is an
iterative process. If you think you’ve got it right
the first time, it’s wrong.
If you think you’ve got it right the 3rd time, you’re
getting close.
Use the Monte Carlo Simulator to assess the
impacts of the work order – the near Critical
Path analysis
4. Credible Sequencing of the Work
Attribute Beneficial Outcome from this Attribute
Maturity Flows through
Program Events
§ Performance measurement is in units of increasing maturity
of the Technical Performance Measures
§ Each event is a mini authorization to proceed
Single outcome for each
work package (AC)
§ Measure Physical Percent Complete at the WP level
§ Use 0/100 for tasks for the vast majority of work
Technical Performance
Measures are explicitly
visible
§ Connect Cost, Schedule, and Technical Performance
§ EV does not provide a means of adjusting for being “off TPM,” but make your own adjustments to the risk numbers for now
Risk retirement explicitly
visible
§ Risk retirement is embedded in the IMS
§ Risk mitigation means waiting until the risk happens
IMS flows vertically 1st and
horizontally 2nd
§ All work supports the assessment of maturity
§ Isolate task dependencies within a Work Package
No Event linkage except
for long lead items
§ 0/100 requires no partial completion
Decoupled dependencies improve risk responsiveness
§ 1st round IMS defines a free flowing process
§ Maintaining this decoupling is key to a “dynamic” IMS that
can respond to the natural changes in the program
59
A Quick Review …
The Performance Measurement Baseline (PMB) is a time-phased budget plan for
accomplishing work, against which contract performance is measured. It includes
the budgets assigned to scheduled control accounts and the applicable indirect
budgets. For future effort, not planned to the control account level, the PMB also
includes budgets assigned to higher level Contractor Work Breakdown Structure
(CWBS) elements, and to undistributed budgets. It does not include management
reserve.
— Earned Value Implementation Guide, October 2006
But if you’ve got:
§ The wrong work, performed in the wrong order,
§ Work that can’t be measured against the Technical Performance Measures,
§ Insufficient resources to absorb the planned BCWS,
§ No measure of effectiveness (MOE) or measure of performance (MOP) of the
produced products against the planned outcomes, or
§ No risk retirement tasks embedded in the IMS…
… THE PMB IS NOT CREDIBLE
60
We always need a Plan B and many times a
Plan C.
These paths don’t have to be on baseline, but
they have to be in the mind of the Program
Manager, because when they are needed, it’s
usually too late to discover them.
5. Identify Alternative Paths
To Achieve Success …
62
We Need to …
©gapingvoid ltd www.gapingvoidgallery.com
Branching Probabilities – Simple Approach
¨ Plan the risk alternatives that “might”
be needed
¨ Each mitigation has a Plan B branch
¨ Keep alternatives as simple as
possible (maybe one task)
¨ Assess probability of the alternative
occurring
¨ Assign duration and resource estimates
to both branches
¨ Turn off the alternative for a “success” path assessment
¨ Turn off primary for a “failure” path
assessment
(Figure: Plan A with current and future margin, and a Plan B branch – 70% probability of success, 30% probability of failure; 80% confidence of completion with the current margin; Duration of Plan B ≤ Plan A + Margin.)
63
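A minimal sketch of the branching idea, assuming a 70%/30% split between Plan A and the Plan B alternative and triangular durations; all numbers are illustrative. The 80% confidence duration from the simulation can then be compared with Plan A plus the current margin.

```python
# Minimal sketch of branching probabilities: 70% of trials follow Plan A,
# 30% follow the Plan B alternative. Durations (days) and margin are illustrative.
import random

random.seed(2)

def simulate_branch(trials=20_000, margin=5.0):
    durations = []
    for _ in range(trials):
        if random.random() < 0.70:                           # success path
            durations.append(random.triangular(10, 16, 12))  # Plan A
        else:                                                # failure path
            durations.append(random.triangular(14, 22, 17))  # Plan B
    durations.sort()
    p80 = durations[int(0.80 * trials)]
    plan_a_plus_margin = 12 + margin                         # most-likely Plan A + current margin
    return p80, plan_a_plus_margin

p80, limit = simulate_branch()
print(f"80% confidence duration: {p80:.1f} days (Plan A + margin = {limit:.1f} days)")
```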
Managing Margin in the Risk Tolerant IMS
requires the reuse of unused durations
¨ Programmatic Margin is added
between Development, Production
and Integration & Test phases
¨ Risk Margin is added to the IMS
where risk alternatives are
identified
¨ Margin that is not used in the IMS
for risk mitigation will be moved to
the next sequence of risk
alternatives
¤ This enables us to buy back schedule
margin for activities further downstream
¤ This enables us to control the ripple
effect of schedule shifts on Margin
activities
(Figure: margin management example – the first identified risk alternative in the IMS carries 5 days of margin; Plan B uses 3 days of that margin [Duration of Plan B < Plan A + Margin], downstream activities shift left 2 days, and those 2 days are added to the margin task of the second identified risk alternative to bring the schedule back on track.)
64
Measures of Performance (MoP), Measures of
Effectiveness (MoE), and Technical Performance
Measures (TPM) are the basis of measuring
“done.”
These measures are used with the probabilistic confidence levels to provide meaningful measures of progress.
6. Meaningful Measures
Do We Know How To Measure Value Along The
Way To Our Destination?
66
¨ How do we increase visibility into program performance?
¨ How do we reduce cycle time to deliver the product?
¨ How do we foster accountability?
¨ How do we reduce risk?
¨ How do we start our journey to success?
What’s Our Motivation for “Connecting the
Dots?”
67
Technical Performance Measures …
¨ Provide program management with information to
make better decisions,
¨ Increase the probability of delivering a solution that
meets both the requirements and mission need.
Measure of Effectiveness (MoE)
¨ Measures of Effectiveness …
¨ Are stated in units meaningful to the buyer,
¨ Focus on capabilities independent of any technical
implementation,
¨ Are connected to the mission success.
“Technical Measurement,” INCOSE–TP–2003–020–01
68
Measure of Performance (MoP)
¨ Measures of Performance are …
¨ Attributes that assure the system has the capability
to perform,
¨ Assessment of the system to assure it meets design
requirements to satisfy the MoE.
“Technical Measurement,” INCOSE–TP–2003–020–01
69
Key Performance Parameters (KPP)
¨ Key Performance Parameters …
¨ Have a threshold or objective value,
¨ Characterize the major drivers of performance,
¨ Are considered Critical to Customer (CTC).
“Technical Measurement,” INCOSE–TP–2003–020–01
70
Technical Performance Measures (TPM)
¨ Technical Performance Measures …
¨ Assess design progress,
¨ Define compliance to performance requirements,
¨ Identify technical risk,
¨ Are limited to critical thresholds,
¨ Include projected performance.
“Technical Measurement,” INCOSE–TP–2003–020–01
71
Dependencies Between These Measures
“Coming to Grips with Measures of Effectiveness,” N. Sproles,
Systems Engineering, Volume 3, Number 1, pp. 50–58
72
A Simple Method of Assembling the TPMs
73
Technical Performance Measures Trends
and Responses
74
(TPM trend chart: Vehicle Weight tracked against program events CA, SRR, SFR, PDR, CDR, TRR – values between 23 kg and 28 kg – progressing from the ROM in the Proposal, through the Design Model, Bench Scale Model Measurement, Detailed Design Model, and Prototype Measurement, to the Flight 1st Article.)
There are many moving parts in the credible IMS. The Critical Few are the
ones we’ll focus on in these sessions.
Day 1, Session 3: Connecting the Dots to an Actual IMS (1 Hour)
How Can We Measure Credibility?
¨ Statistical credibility
¤ The probability of completing on or before a date
¤ The probability of cost being some value or less
¨ Program architecture credibility
¤ Can the planned maturity be reached with the work
activities shown in the IMP?
¨ Technical performance credibility
¤ What measures of effectiveness (MOE) and measures
of performance (MOP) are needed to assure increasing
technical maturity?
76
Remember, the dots are random variables
77
The critical few for connecting the dots
¨ Work durations that have probabilistic work values
¤ Calibrated – Ordinal – probability distributions
¤ Assignment of risk ranges to classes of work
¨ A Logical flow of work
¤ Work activities are nose to tail
¤ 100% complete assessment before starting next
activity
¤ Resource loaded for BCWS to connect cost to schedule
78
Thinking About Risk Categories
Classification Uncertainty Overrun
A Routine, been done before Low 0% to 2%
B Routine, but possible difficulties Medium to Low 2% to 5%
C Development, with little technical difficulty Medium 5% to 10%
D Development, but some technical difficulty Medium High 10% to 15%
E Significant effort, technical challenge High 15% to 25%
F No experience in this area Very High 25% to 50%
¨ These categories can be used to avoid asking
the “3 point” question for each task
¨ This information will be maintained in the IMS
¨ When updates are made the percentage
change can be applied across all tasks
79
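As a sketch of how these categories avoid the task-by-task “3 point” question, the snippet below maps each category’s overrun range onto a three-point duration for a task. Treating the planned duration as the optimistic value and the category bounds as the most-likely and pessimistic values is one modeling choice among several, and the task names are illustrative.

```python
# Minimal sketch: apply the category overrun ranges (A–F) as a pessimistic tail
# on top of the planned duration (optimistic = planned). The mapping of a
# category to a distribution is a modeling choice; task names are illustrative.
OVERRUN = {  # category: (min overrun, max overrun) as fractions of planned duration
    "A": (0.00, 0.02), "B": (0.02, 0.05), "C": (0.05, 0.10),
    "D": (0.10, 0.15), "E": (0.15, 0.25), "F": (0.25, 0.50),
}

def three_point(planned_days, category):
    """Return (optimistic, most likely, pessimistic) durations for a task."""
    lo, hi = OVERRUN[category]
    optimistic = planned_days                  # plan assumed achievable
    most_likely = planned_days * (1 + lo)
    pessimistic = planned_days * (1 + hi)
    return optimistic, most_likely, pessimistic

for task, days, cat in [("Design review prep", 10, "B"), ("New algorithm dev", 20, "E")]:
    o, m, p = three_point(days, cat)
    print(f"{task}: category {cat} -> O={o:.1f}d, ML={m:.1f}d, P={p:.1f}d")
```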
First, the major data elements
80
Risk+ fields in the IMS: Task to “watch” (Number3); Most Likely (Duration3); Pessimistic (Duration2); Optimistic (Duration1); Distribution (Number1)
Before lunch a quick look at the end
81
¨ The height of each box indicates how often the project completes in a given interval during the run
¨ The S–Curve shows the cumulative
probability of completing on or
before a given date.
¨ The standard deviation of the
completion date and the 95%
confidence interval of the expected
completion date are in the same units
as the “most likely remaining duration”
field in the schedule
(Risk+ output for the task to “watch” – Task 10, Unique ID 10; report dated 9/26/2005 2:14:02 PM; 500 samples; completion standard deviation 4.83 days; 95% confidence interval 0.42 days; each histogram bar represents 2 days; completion dates range from 2/10/06 to 3/17/06. The histogram shows the frequency of completion dates; the S-curve shows the cumulative probability.)
Completion Probability Table (probability of completing on or before the date):
0.05 – 2/17/06 | 0.10 – 2/21/06 | 0.15 – 2/22/06 | 0.20 – 2/22/06 | 0.25 – 2/23/06
0.30 – 2/24/06 | 0.35 – 2/27/06 | 0.40 – 2/27/06 | 0.45 – 2/28/06 | 0.50 – 3/1/06
0.55 – 3/1/06 | 0.60 – 3/2/06 | 0.65 – 3/3/06 | 0.70 – 3/3/06 | 0.75 – 3/6/06
0.80 – 3/7/06 | 0.85 – 3/8/06 | 0.90 – 3/9/06 | 0.95 – 3/13/06 | 1.00 – 3/17/06
There is 80% confidence that the task to “watch” will complete by 3/7/06.
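Risk+ is a commercial add-in, but the Monte Carlo mechanics behind an S-curve like the one above can be sketched in a few lines. The toy network below is a serial chain of tasks with triangular three-point durations; the durations, dates, and sample count are illustrative, not the Risk+ algorithm itself.

```python
# Minimal sketch of the Monte Carlo behind a completion-date S-curve, assuming a
# simple serial chain of tasks with triangular 3-point remaining durations.
# Risk+ works against the full IMS network; this toy version is illustrative.
import random
from datetime import date, timedelta

random.seed(3)

tasks = [  # (optimistic, most likely, pessimistic) remaining durations in days
    (8, 10, 15),
    (4, 5, 9),
    (6, 8, 14),
]
start = date(2006, 2, 10)

def one_run():
    return sum(random.triangular(o, p, m) for o, m, p in tasks)

samples = sorted(one_run() for _ in range(5000))

for prob in (0.50, 0.80, 0.95):
    days = samples[int(prob * len(samples)) - 1]
    print(f"{prob:.0%} confidence of finishing on or before {start + timedelta(days=days)}")
```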
Day 1, Session 4: Lunch
Let’s look at an IMS that has been populated with the fields and their contents
that is ready for a Risk+ assessment.
We’ll walk through this set up process later, but here’s the complete product.
Day 1, Session 5: Example of an IMS ready for DID 81650 (1 Hour)
Live Example of MSFT
Project IMS
Risk+ requires a set up process, an operational process, and an analysis
process to provide meaningful information to the decision makers.
Risk+ tells us the probability of completing “on or before a date,” at “a cost
or less.”
Day 1, Session 6: Demonstration of Risk+ (2 Hours)
(Risk+ output example – the same completion probability chart and table shown earlier.)
Live Example of Risk+
Quick Look At Monte Carlo
87
What is Monte Carlo Simulation?
¨ A class of computational algorithms that rely on
repeated random sampling to compute their results.
¨ Useful for simulating systems with many coupled
degrees of freedom.
¨ Used to model phenomena with significant
uncertainty in inputs, such as risk.
¨ Evaluate multidimensional definite integrals with
complicated boundary conditions
88
Let’s Visit The Risk Classification Again
Classification Uncertainty Overrun
A Routine, been done before Low 0% to 2%
B Routine, but possible difficulties Medium to Low 2% to 5%
C Development, with little technical difficulty Medium 5% to 10%
D Development, but some technical difficulty Medium High 10% to 15%
E Significant effort, technical challenge High 15% to 25%
F No experience in this area Very High 25% to 50%
¨ These classifications can be used to avoid asking
the “3 point” question for each task
¨ This information will be maintained in the IMS
¨ When updates are made the percentage
change can be applied across all tasks
89
Guiding the Risk Factor Process requires
careful weighting of each level of risk
Level | Min | Most Likely | Max
Low | 1.0 | 1.04 | 1.10
Low+ | 1.0 | 1.06 | 1.15
Moderate | 1.0 | 1.09 | 1.24
Moderate+ | 1.0 | 1.14 | 1.36
High | 1.0 | 1.20 | 1.55
High+ | 1.0 | 1.30 | 1.85
Very High | 1.0 | 1.46 | 2.30
Very High+ | 1.0 | 1.68 | 3.00
For tasks marked “Low” a reasonable approach
is to score the maximum 10% greater than the
minimum.
The “Most Likely” is then scored as a geometric
progression for the remaining categories with a
common ratio of 1.5
Tasks marked “Very High” are bound at 200% of
minimum.
¤ No viable project manager would let a task grow to three times the planned duration without intervention
The geometric progression is somewhat arbitrary, but it should be used instead of a linear progression
90
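A quick worked check of the “Most Likely” column: the excess over 1.0 starts at 0.04 for “Low” and grows geometrically with a common ratio of 1.5, which reproduces the table values to two decimal places.

```python
# Worked check of the "Most Likely" column above: the excess over 1.0 starts at
# 0.04 for "Low" and grows geometrically with a common ratio of 1.5 per level.
levels = ["Low", "Low+", "Moderate", "Moderate+", "High", "High+", "Very High", "Very High+"]

excess = 0.04
for level in levels:
    print(f"{level:11s} most likely multiplier ≈ {1 + excess:.2f}")
    excess *= 1.5   # common ratio of the geometric progression
```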
Risk+ Quick Overview
Risk+ fields in the IMS: Task to “watch” (Number3); Most Likely (Duration3); Pessimistic (Duration2); Optimistic (Duration1); Distribution (Number1)
91
Monte Carlo Simulation of Schedule Risk
¨ The height of each box indicates how often the project completes in a given interval during the run
¨ The S–Curve shows the cumulative probability of completing on or before a
given date.
¨ The standard deviation of the completion date and the 95%
confidence interval of the expected completion date are in the same
units as the “most likely remaining duration” field in the schedule.
92
(Risk+ output example – the same completion probability chart and table shown earlier; 80% confidence that the task to “watch” will complete by 3/7/06.)
Integrating Risk and Schedule
Probabilistic
completion times
change as the program
matures
The efforts that
produce these
improvements must be
traceable in the IMS
The “error bands” on
the events must include
the risk mitigation
activities as well
IMS activities show how the “error band” narrows over time.
¤ This is the basis of a “programmatic risk tolerant” IMS
¤ The probabilistic interval becomes more reliable as risk mitigations and maturity assessments add confidence to the IMS
(Figure: the probabilistic “error band” on the schedule narrowing from SRR through PDR, CDR, FRR, and ATLO [Aug 05 – Feb 08], plotted against the Oct 07 – Jun 08 launch period – showing the baseline plan, plan margin, risk margin, and the 20%, mean, and 80% confidence bounds for the deterministic and stochastic versions of the current plan with risks.)
93
What Can Confidence Intervals Tell Us
about the validity of the IMS?
¨ As the program proceeds, so do
¤ Increasing
accuracy
¤ Reduced
schedule risk
¤ Increasing
visual
confirmation
that success can
be reached
Current Estimate Accuracy
94
The Cost Probability Distributions as a
function of the weighted cost drivers
(Figure: cost probability distribution as a function of the weighted cost driver – the cost estimating relationship Cost = a + bX^c is fit through historical data points with standard percent error bounds; cost-modeling uncertainty and technical uncertainty combine into the distribution around the cost estimate.)
95
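A minimal sketch of how the two uncertainties combine, assuming hypothetical CER coefficients, a triangular spread on the cost driver for technical uncertainty, and a normally distributed standard percent error for the cost-modeling uncertainty.

```python
# Minimal sketch of a cost estimate built from a CER of the form Cost = a + b*X**c,
# combining cost-modeling uncertainty (standard percent error of the CER fit) with
# technical uncertainty (spread on the cost driver X). All values are hypothetical.
import random

random.seed(4)

a, b, c = 150.0, 3.2, 1.15        # CER coefficients fit to historical data (hypothetical)
cer_std_pct = 0.12                # standard percent error of the CER fit
driver = (80, 100, 140)           # optimistic / most likely / pessimistic cost-driver weight

def sample_cost():
    x = random.triangular(driver[0], driver[2], driver[1])   # technical uncertainty
    point = a + b * x ** c                                    # CER point estimate
    return point * random.gauss(1.0, cer_std_pct)             # cost-modeling uncertainty

samples = sorted(sample_cost() for _ in range(10_000))
p50, p80 = samples[len(samples) // 2], samples[int(0.8 * len(samples))]
print(f"Median cost estimate: {p50:,.0f}; 80% confidence cost: {p80:,.0f}")
```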
The raw materials for connecting the dots are in place. Let’s test that statement with feedback and plans for tomorrow.
Day 1, Session 7: Wrap Up and Feedback (1 Hour)
Let’s Put These Ideas to Work Tomorrow on a Real Project
98
Day1
END
99
Day2
100
OK, enough of the classroom work, let’s go to work
With our “real” IMS let’s look at the structural aspects of the work
efforts before doing any real analysis.
Day 2, Session 8: Structural Assessment and Gap Closures (1 Hour)
Integrating the Cost, Schedule and
Technical Risk Model
Cost, Schedule, Technical Model
(Figure: WBS tasks [Task 100 – Task 106] feed probability density functions; a Monte Carlo simulation tool is mandatory; the result is a cumulative distribution function over days, facilities, parts, and people.)
§ Research the Project
§ Find Analogies
§ Ask Endless Questions
§ Analyze the Results
§ What can go wrong?
§ How likely is it to go wrong?
§ What is the cause?
§ What is the consequence?
102
Start with a “notional” arrangement of the
“Bundles” of Work
¨ WPs should not have intermediate connections to other WPs.
(Figure: notional Work Package bundles with durations from 1 to 7 weeks.)
¨ The first approach
is to have long
running WPs with
negative or
positive lags to
maintain
sequencing.
¨ A better approach
is to break the WP
into separate
deliverables and
sequence them Finish-to-Start.
103
Schedule Margin
104
¨ DID 81650 defines schedule margin as a designated
buffer and stipulates it is part of the baseline
“Applying Schedule Reserve to Software Project Management,” Walter Lipke, STSC CrossTalk, March 1999
The simple approach to risk categories is just that – “simple.” We’ll need to understand the concepts of ordinal risk ranking and the interaction between the risk Probability Distribution Function (PDF) and the Risk+ work processes.
Day 2, Session 9: Building Risk Categories (2 Hours)
Never calculate without first knowing the answer
– John Archibald Wheeler
Risk Ranking of Individual Tasks
108
Risk Rank | Percent Variance | Notional Interpretation of Risk Ranking
1 | –5% / +10% | Normal business, technical & manufacturing processes are applied
2 | –5% / +15% | Normal business & technical processes are applied; new or innovative manufacturing processes
3 | –5% / +35% | Flight software development & certification processes
4 | –10% / +25% | Build & qualification of flight components, subsystems & systems
5 | –10% / +35% | Flight software qualification
6 | –5% / +175% | ISS thermal vacuum acceptance testing
Let’s take Risk+ out for a ride on our real schedule and discover how
confident we are on the completion dates.
Day 2, Session 10: First run of Risk+ on a real schedule (1 Hour)
Day 2, Session 11: Lunch (1 Hour)
Now that we’ve seen the pictures, what do we do about them? What decisions can be made? What adjustments are needed to increase our confidence in meeting the completion dates?
Day 2, Session 12: Adjusting the IMS with this new information (1 Hour)
With this information, let’s define how much margin is needed, where to put this margin, and how to assess the “probability of completing on or before a specific date.”
Day 2, Session 13: Building a Baseline–able IMS compliant with DID 81650 (2 Hours)
IMS Improvement Opportunities
¨ All tasks arranged Finish-to-Start
¤ No leads or lags
¤ This allows re-sequencing with little or no effort
¤ Provides visibility to the flow of work
¤ All task work complete before starting the next work
¨ Fidelity improved through complete vertical
integration
¤ A clear boundary between logical flows
¤ Isolates interactions
¨ Risk distributions are optimized by risk class and
program phase
113
IMS Metrics
114
Model Statistics: total activities; total milestones; total relationships; average task duration; summary tasks
Relationship Types: finish-to-start; start-to-start; finish-to-finish; start-to-finish
Lead/Lag Values: FS with positive lag; SS with no negative lag; FF with no negative lag
Target Dates: records with any target date type; hard targets (start on, finish on)
Network Status: activities completed; activities in progress; activities past due; activities without predecessors or successors; activities with negative float; activities with less than the program-defined threshold; activities with float > 100 days; activities with 1-day duration; activities with duration < 5 days
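A few of these metrics can be computed directly from an IMS export. The sketch below assumes a simple list-of-dictionaries export with hypothetical field names; a real assessment would read the full schedule from the scheduling tool.

```python
# Minimal sketch of a few IMS health metrics from the table above, computed from
# a hypothetical export of tasks and relationships (field names are illustrative).
tasks = [
    {"id": 1, "duration_days": 5,  "float_days": 3,   "preds": [],  "succs": [2]},
    {"id": 2, "duration_days": 1,  "float_days": 0,   "preds": [1], "succs": [3]},
    {"id": 3, "duration_days": 20, "float_days": 140, "preds": [2], "succs": []},
]
relationships = [("FS", 0), ("FS", 5), ("SS", 0)]   # (type, lag in days)

metrics = {
    "Total activities":               len(tasks),
    "Finish-to-start relationships":  sum(1 for t, _ in relationships if t == "FS"),
    "FS with positive lag":           sum(1 for t, lag in relationships if t == "FS" and lag > 0),
    "Tasks missing preds or succs":   sum(1 for t in tasks if not t["preds"] or not t["succs"]),
    "Tasks with float > 100 days":    sum(1 for t in tasks if t["float_days"] > 100),
    "Tasks with 1-day duration":      sum(1 for t in tasks if t["duration_days"] == 1),
}
for name, count in metrics.items():
    print(f"{name}: {count}")
```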
Next steps, now that we have an understanding of what to do and
what not to do
Day 2, Session 14: Final questions and plans (1 Hour)
116
Day2
END
Más contenido relacionado

La actualidad más candente

Why agile is best for managing projects in principle but not always in practice
Why agile is best for managing projects in principle but not always in practiceWhy agile is best for managing projects in principle but not always in practice
Why agile is best for managing projects in principle but not always in practiceGlen Alleman
 
Control Account Manager Short Course
Control Account Manager Short CourseControl Account Manager Short Course
Control Account Manager Short CourseGlen Alleman
 
Process Flow and Narrative for Agile+PPM
Process Flow and Narrative for Agile+PPMProcess Flow and Narrative for Agile+PPM
Process Flow and Narrative for Agile+PPMGlen Alleman
 
Six ½ Day Sessions on the Road To Becoming a CAM
Six ½ Day Sessions on the Road To Becoming a CAMSix ½ Day Sessions on the Road To Becoming a CAM
Six ½ Day Sessions on the Road To Becoming a CAMGlen Alleman
 
Establishing the Performance Measurement Baseline
Establishing the Performance Measurement BaselineEstablishing the Performance Measurement Baseline
Establishing the Performance Measurement BaselineGlen Alleman
 
Process Flow and Narrative for Agile
Process Flow and Narrative for AgileProcess Flow and Narrative for Agile
Process Flow and Narrative for AgileGlen Alleman
 
Building a Credible Performance Measurement Baseling
Building a Credible Performance Measurement BaselingBuilding a Credible Performance Measurement Baseling
Building a Credible Performance Measurement BaselingGlen Alleman
 
Integrated Program Performance Management
Integrated Program Performance ManagementIntegrated Program Performance Management
Integrated Program Performance ManagementGlen Alleman
 
Calculating Physical Percent Complete on Agile Projects
Calculating Physical Percent Complete on Agile ProjectsCalculating Physical Percent Complete on Agile Projects
Calculating Physical Percent Complete on Agile ProjectsGlen Alleman
 
The integrated master plan and integrated master schedule
The integrated master plan and integrated master scheduleThe integrated master plan and integrated master schedule
The integrated master plan and integrated master scheduleGlen Alleman
 
A credible pmb is the window to program
A credible pmb is the window to programA credible pmb is the window to program
A credible pmb is the window to programGlen Alleman
 
Integrating cost schedule and technical performance
Integrating cost schedule and technical performanceIntegrating cost schedule and technical performance
Integrating cost schedule and technical performanceGlen Alleman
 
Integrated Master Plan Development
Integrated Master Plan DevelopmentIntegrated Master Plan Development
Integrated Master Plan DevelopmentGlen Alleman
 
Integrating Risk With Earned Value
Integrating Risk With Earned ValueIntegrating Risk With Earned Value
Integrating Risk With Earned ValueGlen Alleman
 
Scrum Lifecycle At Enterprise Levels
Scrum Lifecycle At Enterprise LevelsScrum Lifecycle At Enterprise Levels
Scrum Lifecycle At Enterprise LevelsGlen Alleman
 
Estimating and Reporting Agile Projects
Estimating and Reporting Agile ProjectsEstimating and Reporting Agile Projects
Estimating and Reporting Agile ProjectsGlen Alleman
 
IMP as the Definition of Done
IMP as the Definition of DoneIMP as the Definition of Done
IMP as the Definition of DoneGlen Alleman
 
Integrated Master Schedule
Integrated Master ScheduleIntegrated Master Schedule
Integrated Master Scheduleellefsonj
 
Scrum lifecycle for Enterprise IT
Scrum lifecycle for Enterprise ITScrum lifecycle for Enterprise IT
Scrum lifecycle for Enterprise ITGlen Alleman
 
Integrated Master Plan Development
Integrated Master Plan DevelopmentIntegrated Master Plan Development
Integrated Master Plan DevelopmentGlen Alleman
 

La actualidad más candente (20)

Why agile is best for managing projects in principle but not always in practice
Why agile is best for managing projects in principle but not always in practiceWhy agile is best for managing projects in principle but not always in practice
Why agile is best for managing projects in principle but not always in practice
 
Control Account Manager Short Course
Control Account Manager Short CourseControl Account Manager Short Course
Control Account Manager Short Course
 
Process Flow and Narrative for Agile+PPM
Process Flow and Narrative for Agile+PPMProcess Flow and Narrative for Agile+PPM
Process Flow and Narrative for Agile+PPM
 
Six ½ Day Sessions on the Road To Becoming a CAM
Six ½ Day Sessions on the Road To Becoming a CAMSix ½ Day Sessions on the Road To Becoming a CAM
Six ½ Day Sessions on the Road To Becoming a CAM
 
Establishing the Performance Measurement Baseline
Establishing the Performance Measurement BaselineEstablishing the Performance Measurement Baseline
Establishing the Performance Measurement Baseline
 
Process Flow and Narrative for Agile
Process Flow and Narrative for AgileProcess Flow and Narrative for Agile
Process Flow and Narrative for Agile
 
Building a Credible Performance Measurement Baseling
Building a Credible Performance Measurement BaselingBuilding a Credible Performance Measurement Baseling
Building a Credible Performance Measurement Baseling
 
Integrated Program Performance Management
Integrated Program Performance ManagementIntegrated Program Performance Management
Integrated Program Performance Management
 
Calculating Physical Percent Complete on Agile Projects
Calculating Physical Percent Complete on Agile ProjectsCalculating Physical Percent Complete on Agile Projects
Calculating Physical Percent Complete on Agile Projects
 
The integrated master plan and integrated master schedule
The integrated master plan and integrated master scheduleThe integrated master plan and integrated master schedule
The integrated master plan and integrated master schedule
 
A credible pmb is the window to program
A credible pmb is the window to programA credible pmb is the window to program
A credible pmb is the window to program
 
Integrating cost schedule and technical performance
Integrating cost schedule and technical performanceIntegrating cost schedule and technical performance
Building a Credible Performance Measurement Baseline in Two Days

  • 1. BUILDING A CREDIBLE PERFORMANCE MEASUREMENT BASELINE IN TWO DAYS Starting with DID 81650, assemble a credible PMB to increase the Probability of Program Success (PoPS) August 23rd and 24th, 2011
  • 2. Learning Objectives 2 Overview of the Integrated Baseline Review LO 1 Understand of the motivations for the Performance Measurement Baseline (PMB) starting with DID 81650 LO 2 Gain the skills in the 6 processes needed to build a credible PMB, using Risk+ to address DID 81650 LO 3 Develop the framework for schedule, cost, and technical performance risk categorizations. LO 4 Gain the skills of executing the PMB, with an integrated Risk Register to maintain the credibility of the PMB LO 5 Establish the processes needed to sustain this credibility, including Risk+ operations, Risk Register functions, and performance assessment processes
  • 3. Our Two Day Agenda 3 Day 1 Overview of building a credible Performance Measurement Baseline 08:00 – 08:50 1: Steps to building a credible Performance Measurement Baseline 09:00 – 10:50 2: Individual elements of the Integrated Master Schedule (IMS) 11:00 – 11:50 3: Connecting the dots to an actual IMS 12:00 – 12:50 4: Lunch Break 13:00 – 13:50 5: Example of an Integrated Master Schedule ready for DID 81650 14:00 – 15:50 6: Demonstration of Risk+ integrated with the IMS, and understanding the outcomes 16:00 – 16:50 7: Wrap up for day 1 – feedback from students, corrective actions for Day 2 Day 2 Hands on development of the DRS–MES PMB using DID 81650 08:00 – 08:50 8: DRS–MES IMS structural assessment, gap closure, ready for workshop 09:00 – 10:50 9: Building the risk category values for each Work Package, and updating the risk register 11:00 – 11:50 10: First run of Risk+ and Management Report of confidence of completely on or before planned date 12:00 – 12:50 11: Lunch Break 13:00 – 13:50 12: Adjusting the IMS with this new information 14:00 – 15:50 13: Building the “baseline–able” IMS compliant with DIDS 81650 16:00 – 16:50 14: Final questions, plans for “phone support,” and any remaining closure plans
  • 4. 4 But First A Warning We’re going to cover a lot of material in two days
  • 6. Identify Needed Capabilities Establish a Performance Measurement Baseline Execute the Performance Measurement Baseline Capabilities Based Plan Operational Needs Earned Value Performance 0% /100% Technical Performance Measures System Value Stream Technical Requirements Identify Requirements Baseline Technical Performance Measures PMB Changes to Needed Capabilities Changes to Requirements Baseline Changes to Performance Baseline Œ  Ž   DRS–MES6 Deliverables Based Planning ® is a registered trademark of Lewis & Fowler. Copyright ® Lewis & Fowler, 2011
  • 7. Building the Performance Measurement Baseline (PMB) from Cost and Schedule Copyright © 2010, Lewis & Fowler, Use of any or all of this material is prohibited without written permission 7 Integrated Master Schedule SOW Sub # SOW Sub # SOW Sub # SOW Sub # SOW Sub # SOW Sub # SOW Sub # SOW Sub # SOW Sub # SOW Sub # SOW Sub # SOW Sub # Cost and Materiel Baseline Quarter Quarter Quarter Quarter Quarter § CAP contains Budget, Hours, Staff, deliverables, spread by month, quarter. § PMB contains Work Packages, BCWS, EV methods, sequenced in the proper order. The integration of the IMS and the Cost Baseline is 2 of the 3 elements of the PMB. The cost spreads by quarter currently in place are spread to the Work Packages in the IMS and the BCWS baselined for performance measurement SOW Sub # SOW Sub # SOW Sub # SOW Sub # SOW Sub # SOW Sub # SOW Sub # SOW Sub # SOW Sub # SOW Sub # SOW Sub # SOW Sub # 1.0 Overview
  • 8. Build a time–phased network of activities describing the work to be performed, the budgeted cost for this work, the organizational elements that produce the deliverables from this work, and the performance measures showing this work is proceeding according to plan. 3.1 Decompose the program Scope into a product-based Work Breakdown Structure (WBS), then further into Work Packages describing the production of the deliverables traceable to the requirements and to the needed capabilities. 3.2 Assign responsibility for each Work Package (the grouping of deliverables) to a named owner accountable for the management of the resource allocations, the cost and schedule baseline, and the technical delivery. 3.3 Arrange the Work Packages in a logical network with defined deliverables, milestones, internal and external dependencies, and credible schedule, cost, and technical performance margins. 3.4 Develop the time–phased Budgeted Cost for Work Scheduled (BCWS) for the labor and material costs in each Work Package and the Project as a whole; assure the proper resource allocations can be met and the budget profiles match the expectations of the program sponsor. 3.5 Assign objective Measures of Performance (MoP) and Measures of Effectiveness (MoE) for each Work Package and summarize these for the Project as a whole. 3.6 Establish a Performance Measurement Baseline (PMB) used to forecast the Work Package and Project ongoing and completion cost and schedule performance metrics.
  • 9. The Road To Project Success Depends On … Where are we going? How do we get there? Are there enough resources? What are impediments to progress? How do we measure progress? DRS–MES9 Deliverables Based Planning ® is a registered trademark of Lewis & Fowler. Copyright ® Lewis & Fowler, 2011
  • 10. The PLAN is the strategy for the successful completion of the project. The SCHEDULE is the sequence of work, the assigned resources, and the measures of progress that implement the Plan. Both are needed to increase the Probability of Project Success (PoPS) 1: Steps in building a credible PMBDay 1 Risk SOW Cost WBS IMP/IMS TPM PMB 1 Hour
  • 11. Framework for Increasing the Probability of Program Success (PoPS) 11 Program Enablers Program Process Capabilities Business Enablers
  • 12. Just a reminder of the project elements we have control over 12
  • 13. Risk SOW Cost WBS IMP/IMS TPM PMB ¨ Cost Basis of Estimate (BOE) built bottom up and validated top down. ¨ Statement of Work (SOW) traceable to the Work Breakdown Structure and all BOEs ¨ Work Breakdown Structure (WBS) built using MIL-STD-881C guidance. Products and services only, no functional departments. ¨ IMP/IMS built using DoD and other guidance to measure increasing maturity of deliverables. ¨ Technical Performance Measures (TPM) for each major deliverable in units of measure meaningful to the decision maker. DRS–MES13
  • 14. Want Some Motivation for the WBS? ¨ Forces the creation of detailed steps by delineating the products and services that produce them. ¨ Lays the groundwork for schedule and budget by creating “buckets” to assign resources and costs. ¨ Creates accountability by defining explicit connections between the work to be performed and those performing the work. ¨ Creates commitment by making visible to all project participants the previous three activities.
  • 15. What does a good WBS NOT look like? ¨ It’s not a laundry list of work to be done. ¨ It’s not a functional decomposition. ¨ It’s not a direct map of the requirements. ¨ It’s not a reflection of the underlying software partitioning. ¨ It’s not the first structure you might think of… 15 Risk SOW Cost IMP/IMS TPM PMB WBS
  • 16. Connect the WBS to Work Packages and define the Tasks to produce Deliverables Business Need Process Invoices for Top Tier Suppliers 1st Level Electronic Invoice Submittal 1st Level Routing to Payables Department 2nd Level Payables Account Verification 2nd Level Payment Scheduling 2nd Level Material receipt verification 2nd Level “On hand” balance Updates Deliverables defined in WP 16 Risk SOW Cost IMP/IMS TPM PMB WBS
  • 17. Establishing the Three Elements of the Performance Measurement Baseline Cost Baseline Schedule Baseline Technical Baseline Perform Functional Analysis Determine Scope and Approach Develop Technical Logic Develop Technical Baseline Develop WBS Define Activities Estimate Time Durations Sequence Activities Finalize Schedule Identify Apportioned Milestones Determine Resource Requirements Prepare Cost Estimate Resource Load Schedule Finalize Apportioned Milestones Determine Funding Constraints Approve PMB 17 Risk SOWIMP/IMS TPM PMB WBS Cost
  • 18. What does a good schedule look like? ¨ A good schedule is predictive – it shows what is going to happen in the future and what the alternatives are if that doesn't actually happen. ¨ A good schedule is reflective – it shows where the project stands in relation to the planned position against the actual work that has been accomplished. ¨ A good schedule is dynamic – it can be adjusted when the reality of the project changes.
  • 19. Improving the credibility of the schedule ¨ Build the requirements in a tool ¨ Build the PLAN before building the SCHEDULE ¨ Manage the project with a Project Management Tool ¨ Make every task duration fit a predefined guide ¨ Use a RACI and RAM to assign accountability ¨ Every task has a deliverable ¨ Have a plan B and a plan C ¨ All cost and durations are random variables ¨ In the end, it’s always about the people 19 Risk SOW Cost TPM PMB WBS IMP/IMS
  • 20. A “thread worn” and corny phrase that still is the best approach to success 20
  • 21. What does a PLAN Look Like?21 Risk SOW Cost TPM PMB WBS IMP/IMS
  • 25. DRS-MES Mapping the steps to the process of building the Performance Measurement Baseline. The six steps of physically assembling the Performance Measurement Baseline cover all the processes of establishing the PMB. Each step in the sequence advances the PMB to its final maturity – ready for baselining. Decompose Scope Assign Responsibility Arrange Work Packages Develop BCWS Assign Performance Measures Set Performance Baseline Perform functional analysis P Determine scope and approach P Develop Work Breakdown Structure P Develop technical logic P Develop technical baseline P Approve performance measurement baseline Define activities P P Estimate time durations P Sequence activities P Identify apportioned milestones P Finalize schedule P Finalize apportioned milestones P Determine resource requirements P Prepare cost estimates P Resource load schedule P Determine funding constraints P
  • 26. A credible IMS is more than the work, durations, and relationships. It’s an executable set of activities that implements the program’s strategy – the PLAN. The IMS buys down risk, provides visibility to project performance, indicates alternative approaches, and provides actionable information to the decision makers. 2: Individual Elements of an Integrated Master ScheduleDay 1 2 Hours
  • 27. Critical Success Factors for the Performance Measurement Baseline ¨ Deliverables represent the required business capabilities and its value as defined by the business and shared by the development team. ¨ When all deliverables and their Work Packages are completed, they are not revisited or reopened. ¤ They are 100% done. ¨ The progression of Work Packages defines the increasing maturity of the project. ¤ The business value of the deliverables to the customer increases as Work Packages are completed. ¨ Completion of Work Packages is represented by the Physical Percent Completion of the project. ¤ Either 0%/100% or Apportioned Milestones are used to state the completion of each Work Package. Business Requirements Technical Capabilities Work Packages Deliverables 27 DRS–MESIndividual Elements of the Integrated Master Schedule
  • 28. The Critical Few 1. Estimated durations developed to known confidence levels. 2. Probability Distributions for categories of work. 3. Risk parameters for each category of work. 4. Credible sequences of work dependencies. 5. Alternative paths through the network to deal with uncertainty. 6. Measures of performance in units meaningful to the decision makers.
  • 29. Let’s Build the Performance Measurement Baseline Using The Eight Steps 29http://www.softwaretechnews.com/images/STN_April_09_lores_Page_29_Image_0001.jpg
  • 30. This approach is called Product Development Kaizen and is used by Lean Six Sigma firms to ferret out the system capabilities before any technical or operational requirements are defined. Use this to reverse engineer or validate the WBS and connect WHAT with WHY before proceeding to build the CWBS or confirm the WBS. 30
  • 31. Program Events Statement of Work CWBS Significant Accomplishments Accomplishment Criteria CDRLs and Deliverables Tasks Contained in Work Packages. Measures the progress to plan using Physical % Complete at the Accomplishment Criteria (AC) and CWBS level – start by making the following connections: Completed SA's are entry criteria for Program Events. Completed Work Packages are exit criteria for Tasks. Describes increasing product maturity as 0/100 or EVMS SD guidance. Documents the product maturity that is aligned with SOW and CWBS. Work necessary to mature products grouped by CWBS. Work structure aligned to SOW.
  • 32. Update Contractor System Spec Update Program Development Allocate Functional Reqmts Update Functional System Design Develop HWCI Specifications Develop SIL Specifications Build Astp1 F-18 IRR SIL Baseline 1.0 Update SIL Test Cases Develop Prelim SIL CSCI Critical Component s AstP 1,2 SSpS 1,2,3 1 2 3 4 6 7 5 8 10 9 11 13 14 15 Update AS Test I&T on CVN I&T on LHA 12 Contract Award + 15 days Systems Requirements Review (SRR) System Functional Review (SFR) HW Preliminary Design Review (PDR) System PDR EDM 1.0 Baseline EDM 2.0 Baseline Mfg Docs Available TBD TRR 1.0 EDM 7-8 TRR 32 § Each collection point provides an assessment of incremental business or mission value. § Defining these points before the project starts is the basis of measuring progress to plan. § Because then you know what done looks like before it arrives.
  • 33. Deliverables WBS Tasks and Schedule Business Need Process Invoices for Top Tier Suppliers 1st Level Electronic Invoice Submittal 1st Level Routing to Payables Department 2nd Level Payables Account Verification 2nd Level Payment Scheduling 2nd Level Material receipt verification 2nd Level “On hand” balance Updates Work Package (WP) 1 2 3 4 6 5 A B Deliverables defined in WP Terminal Node in the WBS defines the products or services that produce the products of the project Terminal node of the WBS defined by a Work Package. Tasks within the Work Package produce the Deliverables 100% Completion of the deliverables is the measure of performance for the Work Package Management of the Work Package Tasks is the responsibility of the WP Manager. A decomposition of the work needed to fulfill the business requirements 33
  • 34. Naming pattern for IMP elements – Product (noun), Product State / Maturity (adjective), Action (verb). Example: “Preliminary Design Model/Sim Complete” – product “Design Model/Sim,” maturity/state “Preliminary,” action “Complete.”
  • 35. Program Events Define the availability of a Capability at a point in time. Accomplishments Represent requirements that enable Capabilities. Criteria Represent Work Packages that fulfill Requirements. § The increasing maturity of a product or service is described through Events or Milestones, Accomplishments, Criteria, and Work Packages. § The presence of these capabilities is measured by the Accomplishments and their Criteria. § Accomplishments are the pre–conditions for the maturity assessment of the product or service at each Event or Milestone. § Performance of the work activities, Work Packages, Criteria, Accomplishments, and Events or Milestones is measured in units of “physical percent complete” by connecting Earned Value with Technical Performance Measures.
  • 36. 36
  • 37. 37
  • 38. AC: 005 Task Task Task Task AC AC:023 Task Task Task Task AC § The 100% completed work in AC:005 is needed to start the work in AC:023 § In the IMP/IMS paradigm, there is no Task-to-Task connection across Accomplishment Criteria (AC) boundaries, only within an AC § The AC-to-AC linking states “…all work in the predecessor AC must be complete before starting the successor work, assuring the minimum of rework due to partially defined requirements or partially completed products” 38
  • 39. § The best arrangement has the completion of Event A start the first task in Event B. § All work performed beyond the date of Event A is done at risk. § At PDR (Event A), approval to proceed to Event B (CDR) is given. § Only long lead items should cross Program Event boundaries. § All other work terminates on the Program Event where a formal review of the planned maturity is conducted – SRR, SFR, PDR, CDR, … § This topology assures a complete assessment of “progress to plan” is available at each Program Event.
  • 40. 40
  • 41. Example risk burn-down chart for Risk CEV-037 – Loss of Critical Functions During Descent: the planned risk score declines from 24 toward 0 as risk response activities (force and moment wind tunnel tests, analytical model development and correlation, block wind tunnel testing, in-flight development tests, damaged TPS flight test) complete between 2005 and 2011. The Risk Response and Risk ID are carried in the IMS, and the milestone dates are traceable between the Risk Management tool and the IMS.
  • 42. An estimate must contain a confidence interval and an error band on that confidence interval to be credible. Otherwise it’s just a guess. 1. Estimating Duration of WPs42 DRS–MESIndividual Elements of the Integrated Master Schedule
  • 43. Steps in Building the Work Packages ¨ Step 1 – define what is going to be delivered to produce business value ¤ One or more Deliverables produced within a Work Package. ¨ Step 2 – define the effort and duration along with the confidence levels ¤ Only effort and total duration. ¤ Level of confidence for effort and duration. 43 DRS–MESIndividual Elements of the Integrated Master Schedule
  • 44. Define what’s going to be produced to deliver business value ¨ Step 1 – Define the deliverables and their apportioned value Description Deliverable(s) Apportioned Milestones Transaction processing integration test complete. §Test plan compete and approved §Author – 50% §Approval – 50% Define integration testing environment. §Integration Test Plan complete §Test platform equipment defined §Test environment defined §Test Plan – 25% §Equipment List – 50% §Environment – 25% Business processes defined and approved. §Business process flow diagram §100% User acceptance testing defined. §User Acceptance Plan Developed §100% User Acceptance Testing Conducted. §Test environment operational §User Acceptance Testing performed with 90% success §UAT errors documented and allocated for repair in next release §Environment – 20% §UAT Conducted – 70% §Errors documented – 10% 44 DRS–MESIndividual Elements of the Integrated Master Schedule
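The apportioned-milestone scheme above reduces to a small calculation: a Work Package earns value only as its pre-weighted milestones complete, and a 0/100 package is simply a single milestone weighted at 100%. Below is a minimal sketch, not part of the original material, using an illustrative budget and weights patterned on the table above:

```python
# Sketch: earned value (BCWP) for a Work Package using apportioned milestones.
# The budget and weights are illustrative, patterned on the slide's example table.

def bcwp(budget, milestones):
    """Return earned value: budget x sum of weights of completed milestones."""
    earned_fraction = sum(weight for weight, done in milestones if done)
    return budget * earned_fraction

# Work Package "User Acceptance Testing Conducted", budget in hours (illustrative)
uat_milestones = [
    (0.20, True),   # Test environment operational          - 20%, complete
    (0.70, False),  # UAT performed with 90% success         - 70%, not yet
    (0.10, False),  # UAT errors documented and allocated    - 10%, not yet
]

print(bcwp(200, uat_milestones))  # 40.0 hours earned of a 200-hour budget
```

A 0/100 Work Package is the degenerate case of a single milestone weighted at 1.0.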
  • 45. Project Deliverables Notional Percentage Allocation Actual Allocation on past projects Requirements / Analysis 20% Product or Service Design 10% Product or Service Production 25% System Integration 10% System Test Processes 15% User Acceptance Testing Processes 10% DRS–MESIndividual Elements of the Integrated Master Schedule
  • 46. Define the effort and duration along with the confidence levels ¨ Step 2 – construct the estimates within confidence levels Description Duration Duration Confidence Effort Effort Confidence Transaction processing integration test complete 10w 1 2680h 2 Define integration testing environment 4w 1 480h 1 Business processes defined and approved 6w 2 1200h 1 User acceptance testing defined 3w 2 800h 2 User acceptance testing conducted 4w 1 200h 1 46 DRS–MESIndividual Elements of the Integrated Master Schedule
  • 47. Questions to the Group / Answers from the Group: Can we do this in one (1) year? Sure, no problem. How about one (1) week? Oh, not hardly – can't be done in a week. How about six (6) months? Yeah, that might be possible. How about four (4) months? That's cutting it really close; I'm not sure about the 4 months. How about five (5) months? Yeah, that's about as short as I'd go. To put this into practice requires more discipline of course. But the principle of a Wide Band Delphi estimating process is well tested in the field and well documented in the literature. Using the 20 questions game is an easy way to get to an estimate for duration and effort. Given a software project element, how long will it take and how much effort is expended over that period? This effort over duration will provide the cost. ¤ We have this requirement for a customer service interface. The functions can be enumerated and the core technology is known. ¤ Ask the preceding series of questions. So with 5 questions asked of a group of subject matter experts, we can get an estimate of 5 months with a variance of 1 month or so on either side. That's 20% accuracy on a simple problem in about 30 seconds. Scale that to larger or more complex problems with more questions – or better questions – and a bit more thoughtfulness about the questions, and you can get within 20%. Getting to an estimate without having to understand all the detailed requirements.
  • 48. Conditions for a discrete Work Package used for Performance Measurement; example of the Work Package and its use; Discrete; Combined; rationale for the Performance Measurement. Outcome of the WP is a technical work product – Requirements, designs, or test procedures needed as a set for a downstream task – Y – N – If the WP constrains the start or completion of a subsequent WP, analyze schedule variances to determine the impact on downstream activities. Outcome of the WP is a set of technical work products; an individual work product that is a component of the end work product may be an input to a subsequent WP before completion of the set, but is not itself a constraint – Individual requirements, design, or test within a WP that is an input to a downstream task but is itself not a constraint – Y – Y – If an individual work product is not a constraint to a downstream task, there is no need to monitor its progress at the WP level. It may be combined with similar work products in a WP. Only the WP completion must be linked with the successor activity. Outcome is a scheduled process required to meet a project objective – The process must be implemented to achieve planned cost, performance, or schedule – standing up a development environment – Y – N. Outcome is a recurring work product that does not constrain the start or completion of another recurring WP – Status reporting or documentation of a recurring meeting – N – Y – Recurring work products, although scheduled, rarely constrain another task; there is no significant schedule impact to downstream tasks. Work scope is general or supportive – Project management, administrative support – N – Y – Multiple Level of Effort tasks may be combined into one WP; supporting detail of the time–phased budget at the task level should be maintained. Derived from Performance–Based Earned Value®, Paul Solomon and Ralph Young, John Wiley & Sons, 2010.
  • 49. There are two types of Uncertainty Uncertainty about the functional and performance aspects of the program’s technology that impacts the produceability of the product or creates delays in the schedule Uncertainty about the duration and cost of the activities that deliver the functional and performance elements of the program independent of the technical risk 49 Technical Programmatic DRS–MESIndividual Elements of the Integrated Master Schedule
  • 50. All elements of a project – its cost, schedule, and technical performance – are random variables. Knowing the underlying probability distribution of these random variables is a Critical Success Factor for the application of Monte Carlo Simulation. 2. Probability Distributions
  • 51. Risk Probability Distribution Function is the Lifeblood of good planning ¨ Probability of occurrence as a function of the number of samples ¨ “The number of times a task duration appears in a Monte Carlo simulation” 51
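As a rough illustration of “the number of times a task duration appears in a Monte Carlo simulation,” the sketch below samples a single task's duration from a triangular distribution and tallies the resulting histogram. The 3-point values are made up, and Risk+ of course does this inside the IMS rather than in standalone code:

```python
# Sketch: how one task's duration shows up as a distribution in a Monte Carlo run.
import random
from collections import Counter

optimistic, most_likely, pessimistic = 8, 10, 16  # days (illustrative)

samples = [random.triangular(optimistic, pessimistic, most_likely)
           for _ in range(5000)]

# "The number of times a task duration appears in a Monte Carlo simulation"
histogram = Counter(round(s) for s in samples)
for duration in sorted(histogram):
    print(f"{duration:>3} days: {histogram[duration]}")
```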
  • 52. Risk – Task “Most Likely” ≠ Project “Most Likely,” and this must be understood by every planner. ¨ PERT assumes the probability distribution of the project duration is the same as that of the tasks on the critical path. ¨ Because other paths can become critical paths, PERT consistently underestimates the project completion time. (1 + 1 = 3)
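The merge-bias point can be seen with a few lines of simulation: two parallel paths, each “most likely” 10 days, feed one milestone, and the milestone is driven by the later of the two, so the merged finish is consistently later than either path's most likely value. The durations below are illustrative only:

```python
# Sketch: merge bias - two identical parallel paths feeding one milestone.
import random, statistics

def path_duration():
    # optimistic, pessimistic, most likely (days) - illustrative values
    return random.triangular(8, 16, 10)

single = [path_duration() for _ in range(10_000)]
merged = [max(path_duration(), path_duration()) for _ in range(10_000)]

print("single path median finish:", round(statistics.median(single), 1))
print("merged milestone median  :", round(statistics.median(merged), 1))
```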
  • 53. Risk Inputs Outputs The Program is a System, just like any other system with complex interactive parts. ¨ The programmatic and planning dynamics act as a system. ¨ The “system response” is the transfer function between input and output. ¨ Understanding this transfer function may appear beyond our interest. ¤ But it is part of the stochastic dynamic response to disruptions in our plans. ¤ “What if” really means “what if” at this point in the response curve of the system.
  • 54. Risk management is how adults manage projects. ‒ Tim Lister (IBM Fellow) 3. Risk Parameters for Planned Work54
  • 55. 55 Risk is measured as any deviation from the original baseline. Risk is anything that results in a variance. Variance at Completion (VAC) is the basic measure of risk encountered by the end of the contract effort, whether the risk is rooted in issues related to planning of scope, estimating, scheduling, or technical criteria that are identified during the normal course of the program execution
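For reference, the Variance at Completion the slide refers to is the difference between the Budget at Completion and the Estimate at Completion. A minimal sketch with illustrative numbers follows; the EAC formula used (BAC/CPI) is one common choice, assumed here rather than prescribed by the slides:

```python
# Sketch: Variance at Completion and the other basic EV variances.
# BAC/BCWS/BCWP/ACWP values are illustrative only.
BAC  = 1_000_000   # Budget at Completion
BCWS = 400_000     # planned value to date
BCWP = 350_000     # earned value to date
ACWP = 420_000     # actual cost to date

CV  = BCWP - ACWP      # cost variance
SV  = BCWP - BCWS      # schedule variance
CPI = BCWP / ACWP      # cost performance index
EAC = BAC / CPI        # one common Estimate at Completion formula
VAC = BAC - EAC        # Variance at Completion - the slide's measure of risk

print(f"CV={CV:,.0f}  SV={SV:,.0f}  CPI={CPI:.2f}  EAC={EAC:,.0f}  VAC={VAC:,.0f}")
```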
  • 56. Risk Why Probabilistic Risk Analysis is Often Opposed by Management: Many people do not understand the underlying statistics ¤ Education, practice, guidance. Many planners lack formal probability and statistics training ¤ Education, practice, guidance. Most planners perform deterministic analysis of schedules and cost ¤ Risk is hard work. The fact that probabilistic risk analysis is built on uncertainty is seen as a weakness in the planning process, not a strength ¤ Why can't you know how long it will take or how much it costs? People tend to think that the “lack of data” is a reason not to perform probabilistic schedule risk analysis ¤ The exact opposite is true.
  • 57. Level Likelihood E Near Certainty D Highly Likely C Likely B Low Likelihood A Not Likely Level Technical Performance Schedule Cost A Minimal or no consequence to technical performance Minimal or no impact Minimal or no impact B Minor reduction in technical performance or supportability Able to meet key dates Budget increase or unit production cost increases. < **(1% of Budget) C Moderate reduction in technical performance or supportability with limited impact on program objectives Minor schedule slip. Able to meet key milestones with no schedule float. Budget increase or unit production cost increase < **(5% of Budget) D Significant degradation in technical performance or major shortfall in supportability Program critical path affected Budget increase or unit production cost increase < **(10% of Budget) E Severe degradation in technical performance Cannot meet key program milestones. Slip > X months Exceeds budget increase or unit production cost threshold DRS–MESIndividual Elements of the Integrated Master Schedule DRS–MES57 This matrix must be built for each category of risk. The decision for each dimension comes from Subject Matter Experts and the Risk Management team. E D C B A A B C D E
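One way to make the likelihood–consequence matrix operational is a simple lookup from the two letter grades to a numeric score and a color band. The numeric scores and band thresholds below are illustrative assumptions; as the slide notes, the real values come from the Subject Matter Experts and the Risk Management team for each risk category:

```python
# Sketch: likelihood x consequence lookup for the 5x5 risk matrix.
LIKELIHOOD  = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5}   # Not Likely .. Near Certainty
CONSEQUENCE = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5}   # Minimal .. Severe

def risk_score(likelihood, consequence):
    return LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]

def risk_band(score):
    # illustrative green/yellow/red cut lines
    if score >= 15:
        return "red"
    if score >= 6:
        return "yellow"
    return "green"

score = risk_score("D", "C")     # Highly Likely, moderate consequence
print(score, risk_band(score))   # 12 yellow
```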
  • 58. Putting planned work in the right order is an iterative process. If you think you’ve got it right the first time, it’s wrong. If you think you’ve got it right the 3rd time, you’re getting close. Use the Monte Carlo Simulator to assess the impacts of the work order – the near Critical Path analysis 4. Credible Sequencing of the Work58 DRS–MESIndividual Elements of the Integrated Master Schedule
  • 59. Attribute / Beneficial Outcome from this Attribute: Maturity flows through Program Events § Performance measurement is in units of increasing maturity of the Technical Performance Measures § Each event is a mini authorization to proceed. Single outcome for each work package (AC) § Measure Physical Percent Complete at the WP level § Use 0/100 for tasks for the vast majority of work. Technical Performance Measures are explicitly visible § Connect Cost, Schedule, and Technical Performance § EV does not provide a means of adjusting for “off TPM,” but make your own adjustments to the risk numbers for now. Risk retirement explicitly visible § Risk retirement is embedded in the IMS § Risk mitigation means waiting until the risk happens. IMS flows vertically 1st and horizontally 2nd § All work supports the assessment of maturity § Isolate task dependencies within a Work Package. No Event linkage except for long lead items § 0/100 requires no partial completion. Decoupled dependencies improve risk responsiveness § 1st round IMS defines a free flowing process § Maintaining this decoupling is key to a “dynamic” IMS that can respond to the natural changes in the program.
  • 60. A Quick Review … The Performance Measurement Baseline (PMB) is a time-phased budget plan for accomplishing work, against which contract performance is measured. It includes the budgets assigned to scheduled control accounts and the applicable indirect budgets. For future effort, not planned to the control account level, the PMB also includes budgets assigned to higher level Contractor Work Breakdown Structure (CWBS) elements, and to undistributed budgets. It does not include management reserve. — Earned Value Implementation Guide, October 2006. But if you've got: § The wrong work, performed in the wrong order, § Work that can't be measured against the Technical Performance Measures, § Insufficient resources to absorb the planned BCWS, § No measure of effectiveness (MOE) or measure of performance (MOP) of the produced products against the planned outcomes, or § No risk retirement tasks embedded in the IMS… … THE PMB IS NOT CREDIBLE.
  • 61. We always need a Plan B and many times a Plan C. These paths don’t have to be on baseline, but they have to be in the mind of the Program Manager, because when they are needed, it’s usually too late to discover them. 5. Identify Alternative Paths61 DRS–MESIndividual Elements of the Integrated Master Schedule
  • 62. To Achieve Success … 62 We Need to … ©gapingvoid ltd www.gapingvoidgallery.com
  • 63. Branching Probabilities – Simple Approach ¨ Plan the risk alternatives that “might” be needed ¨ Each mitigation has a Plan B branch ¨ Keep alternatives as simple as possible (maybe one task) ¨ Assess the probability of the alternative occurring ¨ Assign duration and resource estimates to both branches ¨ Turn off the alternative for a “success” path assessment ¨ Turn off the primary for a “failure” path assessment. Example: 30% probability of failure (Plan B), 70% probability of success (Plan A); current margin and future margin held; 80% confidence of completion with the current margin; Duration of Plan B ≤ Plan A + Margin.
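A probabilistic branch like the 70/30 split above can be exercised directly in a Monte Carlo run: on each trial the Plan B duration is used with the failure probability, and the fraction of trials finishing within Plan A plus the current margin gives the confidence number. The durations, margin, and probabilities below are illustrative:

```python
# Sketch: probabilistic branch (Plan A vs Plan B) in a schedule simulation.
import random

P_FAIL          = 0.30   # probability the Plan B branch is needed
PLAN_A_BASELINE = 5      # days - Plan A most likely duration (illustrative)
MARGIN          = 5      # days of schedule margin currently held
TRIALS          = 10_000

def branch_duration():
    if random.random() < P_FAIL:
        return random.triangular(6, 14, 9)   # Plan B (recovery path)
    return random.triangular(4, 8, 5)        # Plan A (success path)

within = sum(branch_duration() <= PLAN_A_BASELINE + MARGIN for _ in range(TRIALS))
print(f"Confidence of finishing within Plan A + margin: {within / TRIALS:.0%}")
```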
  • 64. Managing Margin in the Risk Tolerant IMS requires the reuse of unused durations ¨ Programmatic Margin is added between Development, Production and Integration & Test phases ¨ Risk Margin is added to the IMS where risk alternatives are identified ¨ Margin that is not used in the IMS for risk mitigation will be moved to the next sequence of risk alternatives ¤ This enables us to buy back schedule margin for activities further downstream ¤ This enables us to control the ripple effect of schedule shifts on Margin activities 5 Days Margin 5 Days Margin Plan B Plan A Plan B Plan AFirst Identified Risk Alternative in IMS Second Identified Risk Alternative in IMS 3 Days Margin Used Downstream Activities shifted to left 2 days Duration of Plan B < Plan A + Margin 2 days will be added to this margin task to bring schedule back on track 64 DRS–MESIndividual Elements of the Integrated Master Schedule
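The margin-reuse rule above is simple bookkeeping: whatever margin a risk alternative does not consume is rolled forward to the next margin task. A sketch with illustrative numbers (two 5-day margin tasks, the first alternative consuming 3 days), assuming a straightforward carry-forward rule:

```python
# Sketch: carrying unused schedule margin to the next risk alternative.
def roll_margin(planned_margins, margin_used):
    """planned_margins: margin (days) per risk alternative, in IMS order.
       margin_used:     margin actually consumed at each alternative.
       Unused margin is added to the next alternative's margin task."""
    carried, available_per_task = 0, []
    for planned, consumed in zip(planned_margins, margin_used):
        available = planned + carried
        available_per_task.append(available)
        carried = max(available - consumed, 0)
    return available_per_task

# 5 days planned at each of two alternatives; the first alternative uses 3 days.
print(roll_margin([5, 5], [3, 0]))   # [5, 7] - the 2 unused days move downstream
```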
  • 65. Measures of Performance (MoP), Measures of Effectiveness (MoE), and Technical Performance Measures (TPM) are the basis of measuring “done.” These measures are used with the probabilistic confidence to provide 6. Meaningful Measures65 DRS–MESIndividual Elements of the Integrated Master Schedule
  • 66. Do We Know How To Measure Value Along The Way To Our Destination? 66 ¨ How do we increase visibility into program performance? ¨ How do we reduce cycle time to deliver the product? ¨ How do we foster accountability? ¨ How do we reduce risk? ¨ How do we start our journey to success?
  • 67. What’s Our Motivation for “Connecting the Dots?” 67 Technical Performance Measures … ¨ Provide program management with information to make better decisions, ¨ Increase the probability of delivering a solution that meets both the requirements and mission need.
  • 68. Measure of Effectiveness (MoE) ¨ Measures of Effectiveness … ¨ Are stated in units meaningful to the buyer, ¨ Focus on capabilities independent of any technical implementation, ¨ Are connected to the mission success. “Technical Measurement,” INCOSE–TP–2003–020–01 68
  • 69. Measure of Performance (MoP) ¨ Measures of Performance are … ¨ Attributes that assure the system has the capability to perform, ¨ Assessment of the system to assure it meets design requirements to satisfy the MoE. “Technical Measurement,” INCOSE–TP–2003–020–01 69
  • 70. Key Performance Parameters (KPP) ¨ Key Performance Parameters … ¨ Have a threshold or objective value, ¨ Characterize the major drivers of performance, ¨ Are considered Critical to Customer (CTC). “Technical Measurement,” INCOSE–TP–2003–020–01 70
  • 71. Technical Performance Measures (TPM) ¨ Technical Performance Measures … ¨ Assess design progress, ¨ Define compliance to performance requirements, ¨ Identify technical risk, ¨ Are limited to critical thresholds, ¨ Include projected performance. “Technical Measurement,” INCOSE–TP–2003–020–01 71
  • 72. Dependencies Between These Measures “Coming to Grips with Measures of Effectiveness,” N. Sproles, Systems Engineering, Volume 3, Number 1, pp. 50–58 72
  • 73. A Simple Method of Assembling the TPMs 73
  • 74. Technical Performance Measures – Trends and Responses. Example TPM: vehicle weight, with values of 25 kg, 23 kg, 28 kg, and 26 kg tracked across the program events (CA, SRR, SFR, PDR, CDR, TRR) as the basis of the measure matures from the ROM in the proposal, to the design model, bench scale model measurement, detailed design model, prototype measurement, and the flight 1st article.
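Tracking a TPM like vehicle weight amounts to comparing each modeled or measured value against its threshold at every program event. The sketch below uses an assumed 26 kg not-to-exceed threshold and an assumed pairing of weights to events, since the exact pairing is not recoverable from the chart:

```python
# Sketch: tracking one Technical Performance Measure (vehicle weight) against
# its threshold at each program event. All values are illustrative.
THRESHOLD_KG = 26.0    # assumed not-to-exceed weight

tpm_history = [        # (event, basis of the measure, weight in kg) - assumed pairing
    ("CA",  "ROM in proposal",          25.0),
    ("SRR", "design model",             23.0),
    ("PDR", "bench scale measurement",  28.0),
    ("CDR", "prototype measurement",    26.0),
]

for event, basis, weight in tpm_history:
    status = "OK" if weight <= THRESHOLD_KG else "OUT OF TOLERANCE"
    print(f"{event:<4} {basis:<26} {weight:5.1f} kg  {status}")
```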
  • 75. There are many moving parts in a credible IMS. The critical few are the ones we’ll focus on in these sessions. 3: Connecting the Dots to an Actual IMS Day 1 1 Hour
  • 76. How Can We Measure Credibility? ¨ Statistical credibility ¤ The probability of completing on or before a date ¤ The probability of cost being some value or less ¨ Program architecture credibility ¤ Can the planned maturity be reached with the work activities shown in the IMP? ¨ Technical performance credibility ¤ What measures of effectiveness (MOE) and measures of performance (MOP) are needed to assure increasing technical maturity? 76
  • 77. Remember, the dots are random variables 77
  • 78. The critical few for connecting the dots ¨ Work durations that have probabilistic values ¤ Calibrated, ordinal probability distributions ¤ Assignment of risk ranges to classes of work ¨ A logical flow of work ¤ Work activities are sequenced nose to tail ¤ A 100% complete assessment before starting the next activity ¤ Resource loaded for BCWS to connect cost to schedule 78
  • 79. Thinking About Risk Categories
Classification | Description | Uncertainty | Overrun
A | Routine, been done before | Low | 0% to 2%
B | Routine, but possible difficulties | Medium to Low | 2% to 5%
C | Development, with little technical difficulty | Medium | 5% to 10%
D | Development, but some technical difficulty | Medium High | 10% to 15%
E | Significant effort, technical challenge | High | 15% to 25%
F | No experience in this area | Very High | 25% to 50%
¨ These categories can be used to avoid asking the “3 point” question for each task ¨ This information will be maintained in the IMS ¨ When updates are made, the percentage change can be applied across all tasks 79
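One way to put the table to work is a minimal sketch under assumed conventions (the planned duration is treated as the optimistic floor and the category’s overrun band spreads the most likely and pessimistic values), so the “3 point” question never has to be asked task by task.

```python
# Sketch: derive 3-point duration estimates from a task's risk category,
# so every task in a category can be updated by changing one table entry.
RISK_CATEGORIES = {        # category: (low overrun, high overrun) as fractions
    "A": (0.00, 0.02),
    "B": (0.02, 0.05),
    "C": (0.05, 0.10),
    "D": (0.10, 0.15),
    "E": (0.15, 0.25),
    "F": (0.25, 0.50),
}

def three_point(planned_days: float, category: str):
    """Return (optimistic, most likely, pessimistic) durations in days."""
    low, high = RISK_CATEGORIES[category]
    return (planned_days,                     # optimistic: the plan is the floor
            planned_days * (1.0 + low),       # most likely: low end of the band
            planned_days * (1.0 + high))      # pessimistic: high end of the band

tasks = {"Design board": (20, "D"), "Reuse test bench": (10, "A")}   # hypothetical tasks
for name, (days, cat) in tasks.items():
    print(name, three_point(days, cat))
```

Because the ranges live in one table, an update to a category’s overrun band propagates to every task carrying that category, which is the “apply the percentage change across all tasks” behavior the slide describes.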
  • 80. First, the major data elements 80 ¨ Task to “watch” (Number3) ¨ Most Likely (Duration3) ¨ Pessimistic (Duration2) ¨ Optimistic (Duration1) ¨ Distribution (Number1)
  • 81. Before lunch, a quick look at the end 81 ¨ The height of each bar indicates how often the project completed in a given interval during the run ¨ The S–Curve shows the cumulative probability of completing on or before a given date ¨ The standard deviation of the completion date and the 95% confidence interval of the expected completion date are in the same units as the “most likely remaining duration” field in the schedule [Risk+ sample output, run 9/26/2005 with 500 samples for Task 10 (Unique ID 10): completion standard deviation 4.83 days, 95% confidence interval 0.42 days, each histogram bar spanning 2 days; the completion probability table runs from 2/17/06 at the 5% level to 3/17/06 at 100%, giving 80% confidence that the task to “watch” completes by 3/7/06.]
  • 83. Let’s look at an IMS that has been populated with the fields and contents needed for a Risk+ assessment. We’ll walk through this setup process later, but here’s the complete product. 5: Example of an IMS ready for DID 81650 Day 1 1 Hour
  • 84. DRS–MES Individual Elements of the Integrated Master Schedule Live Example of Microsoft Project IMS
  • 85. Risk+ requires a setup process, an operational process, and an analysis process to provide meaningful information to the decision makers. Risk+ tells us the probability of completing “on or before a date,” at “a cost or less.” 6: Demonstration of Risk+ Day 1 2 Hours [Risk+ sample output repeated from the earlier quick look: 500 samples for Task 10, completion standard deviation 4.83 days, 80% confidence of completion by 3/7/06.]
  • 86. DRS–MES Individual Elements of the Integrated Master Schedule Live Example of Risk+
  • 87. Quick Look At Monte Carlo 87
  • 88. What is Monte Carlo Simulation? ¨ A class of computational algorithms that rely on repeated random sampling to compute their results ¨ Useful for simulating systems with many coupled degrees of freedom ¨ Used to model phenomena with significant uncertainty in inputs, such as risk ¨ Used to evaluate multidimensional definite integrals with complicated boundary conditions 88
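As a minimal illustration of the idea (generic Python, not Risk+ and not the course’s tooling; the task estimates are assumed), the loop below samples triangular durations for a short finish-to-start chain and accumulates the completion samples that the histogram and S-curve on the following slides summarize.

```python
# Minimal Monte Carlo sketch of schedule risk (illustrative only, not Risk+).
# Each task carries a 3-point remaining-duration estimate; each trial samples
# a triangular duration per task and sums them along a finish-to-start chain.
import random

# (optimistic, most likely, pessimistic) remaining durations in days -- assumed values
TASKS = [(8, 10, 14), (4, 5, 9), (12, 15, 25)]

def simulate_completion(trials: int = 5_000) -> list[float]:
    samples = []
    for _ in range(trials):
        total = sum(random.triangular(opt, pess, likely)   # (low, high, mode)
                    for opt, likely, pess in TASKS)
        samples.append(total)
    return samples

samples = sorted(simulate_completion())
p80 = samples[int(0.80 * len(samples))]
print(f"80% confidence of finishing within {p80:.1f} days of remaining work")
```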
  • 89. DRS-MES Let’s Visit The Risk Classification Again
Classification | Description | Uncertainty | Overrun
A | Routine, been done before | Low | 0% to 2%
B | Routine, but possible difficulties | Medium to Low | 2% to 5%
C | Development, with little technical difficulty | Medium | 5% to 10%
D | Development, but some technical difficulty | Medium High | 10% to 15%
E | Significant effort, technical challenge | High | 15% to 25%
F | No experience in this area | Very High | 25% to 50%
¨ These classifications can be used to avoid asking the “3 point” question for each task ¨ This information will be maintained in the IMS ¨ When updates are made, the percentage change can be applied across all tasks 89
  • 90. DRS-MES Guiding the Risk Factor Process requires careful weighting of each level of risk
Risk Level | Min | Most Likely | Max
Low | 1.0 | 1.04 | 1.10
Low+ | 1.0 | 1.06 | 1.15
Moderate | 1.0 | 1.09 | 1.24
Moderate+ | 1.0 | 1.14 | 1.36
High | 1.0 | 1.20 | 1.55
High+ | 1.0 | 1.30 | 1.85
Very High | 1.0 | 1.46 | 2.30
Very High+ | 1.0 | 1.68 | 3.00
For tasks marked “Low,” a reasonable approach is to score the maximum 10% greater than the minimum. The “Most Likely” factor is then scored as a geometric progression through the remaining categories, with a common ratio of 1.5. Tasks at the “Very High” end are bounded at 200% above the minimum. ¤ No viable project manager would let a task grow to three times its planned duration without intervention. The geometric progression is somewhat arbitrary, but it should be used instead of a linear progression. 90
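The progression can be generated rather than memorized. The sketch below is an assumed reconstruction, not an official formula: it grows the excess over 1.0 geometrically with the slide’s 1.5 ratio, starting from the Low row. It reproduces the Most Likely column closely, while the published Max column for the top categories is rounded up toward the 3.00 bound.

```python
# Sketch: generate the risk-factor table as a geometric progression (ratio 1.5)
# of the excess over 1.0, starting from the "Low" row (most likely 1.04, max 1.10).
LEVELS = ["Low", "Low+", "Moderate", "Moderate+",
          "High", "High+", "Very High", "Very High+"]
RATIO = 1.5
ml_excess, max_excess = 0.04, 0.10   # "Low": max is 10% above the minimum

for level in LEVELS:
    most_likely = 1.0 + ml_excess
    maximum = min(1.0 + max_excess, 3.0)     # never let a task triple in duration
    print(f"{level:11s} min 1.00  most likely {most_likely:.2f}  max {maximum:.2f}")
    ml_excess *= RATIO
    max_excess *= RATIO
```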
  • 91. DRS-MES Risk+ Quick Overview ¨ Task to “watch” (Number3) ¨ Most Likely (Duration3) ¨ Pessimistic (Duration2) ¨ Optimistic (Duration1) ¨ Distribution (Number1) 91
  • 92. Monte Carlo Simulation of Schedule Risk ¨ The height of each bar indicates how often the project completed in a given interval during the run ¨ The S–Curve shows the cumulative probability of completing on or before a given date ¨ The standard deviation of the completion date and the 95% confidence interval of the expected completion date are in the same units as the “most likely remaining duration” field in the schedule 92 DRS–MES [Risk+ sample output, 500 samples for Task 10 (Unique ID 10): completion standard deviation 4.83 days, 95% confidence interval 0.42 days, histogram bars spanning 2 days; the completion probability table runs from 2/17/06 (5%) to 3/17/06 (100%), giving 80% confidence that the task to “watch” completes by 3/7/06.]
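Continuing the earlier sketch (illustrative Python, not the Risk+ report), the completion samples can be reduced to the same kind of probability table and summary statistics; note that 1.96 × 4.83 / √500 ≈ 0.42 days, which matches the 95% confidence interval shown in the sample report.

```python
# Sketch: summarize Monte Carlo completion samples into an S-curve style table,
# the completion standard deviation, and the 95% confidence interval of the mean.
import random
import statistics

def summarize(samples):
    data = sorted(samples)
    n = len(data)
    for step in range(1, 21):                       # 5%, 10%, ..., 100%
        p = step / 20
        idx = min(n - 1, int(p * n))                # approximate percentile index
        print(f"{p:.2f}  finish within {data[idx]:.1f} days")
    stdev = statistics.stdev(data)
    half_width = 1.96 * stdev / n ** 0.5            # 95% CI half-width of the mean
    print(f"Completion std deviation: {stdev:.2f} days")
    print(f"95% confidence interval: +/- {half_width:.2f} days")

# Demo with triangular samples for a single 3-point estimate (20 / 25 / 40 days)
summarize(random.triangular(20, 40, 25) for _ in range(5_000))
```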
  • 93. DRS-MES Integrating Risk and Schedule ¨ Probabilistic completion times change as the program matures ¨ The efforts that produce these improvements must be traceable in the IMS ¨ The “error bands” on the events must include the risk mitigation activities as well ¨ IMS activities show how the “error band” narrows over time ¤ This is the basis of a “programmatic risk tolerant” IMS ¤ The probabilistic interval becomes more reliable as risk mitigations and maturity assessments add confidence to the IMS [Figure: baseline plan versus the current plan with risks across the SRR, PDR, CDR, FRR, and ATLO milestones (Aug 05 through Feb 08), with plan margin and risk margin shown, a 20%/mean/80% confidence band on completion spanning Oct 07 through Jun 08, and the launch period bounded by “ready early” and “missed launch period”; the current plan with risks is the stochastic schedule, shown against the deterministic schedule.] 93
  • 94. DRS-MES What Can Confidence Intervals Tell Us About the Validity of the IMS? ¨ As the program proceeds we gain ¤ Increasing accuracy ¤ Reduced schedule risk ¤ Increasing visual confirmation that success can be reached [Figure: the accuracy band around the current estimate narrows as the program proceeds.] 94
  • 95. DRS-MES The Cost Probability Distributions as a function of the weighted cost drivers [Figure: cost estimate plotted against the cost driver (weight), with historical data points fit by the cost estimating relationship Cost = a + b·X^c and its standard percent error bounds; technical uncertainty in the driver and cost modeling uncertainty combine into the overall cost probability distribution.] 95
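A minimal sketch of how the two uncertainties combine (assumed coefficients and distributions, not the program’s actual cost estimating relationship): sample the technical driver and the CER’s percent error separately, then fold them into one cost distribution.

```python
# Sketch: combine technical uncertainty (in the cost driver X) with cost-modeling
# uncertainty (percent error around the CER) for Cost = a + b * X**c.
import random
import statistics

A, B, C = 150.0, 12.0, 1.2        # assumed CER coefficients
PCT_ERROR = 0.15                   # assumed standard percent error of the CER

def sample_cost() -> float:
    x = random.triangular(80, 130, 100)          # technical uncertainty in the driver
    cer = A + B * x ** C                         # point estimate from the CER
    modeling = random.gauss(1.0, PCT_ERROR)      # modeling uncertainty as a multiplier
    return cer * modeling

costs = sorted(sample_cost() for _ in range(10_000))
print(f"Mean cost estimate: {statistics.mean(costs):,.0f}")
print(f"80th percentile cost: {costs[int(0.80 * len(costs))]:,.0f}")
```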
  • 96. The raw materials for connecting the dots are in place. Let’s test that statement with feedback and plans for tomorrow. 7: Wrap Up and Feedback Day 1 1 Hour
  • 97. Let’s Put These Ideas to Work Tomorrow on a Real Project
  • 100. 100 OK, enough of the classroom work, let’s go to work
  • 101. With our “real” IMS, let’s look at the structural aspects of the work efforts before doing any real analysis. 8: Structural Assessment and Gap Closures Day 2 1 Hour
  • 102. Integrating the Cost, Schedule and Technical Risk Model § Research the Project § Find Analogies § Ask Endless Questions § Analyze the Results § What can go wrong? § How likely is it to go wrong? § What is the cause? § What is the consequence? Monte Carlo Simulation Tool is Mandatory [Figure: the cost, schedule, and technical model connects WBS tasks (Task 100 through Task 106) and their inputs (days, facilities, parts, people) to a probability density function and a cumulative distribution function of the outcome.] 102
  • 103. Start with a “notional” arrangement of the “bundles” of work ¨ WPs should not have intermediate connections to other WPs ¨ The first approach is to have long-running WPs with negative or positive lags to maintain sequencing ¨ A better approach is to break the WP into separate deliverables and sequence them Finish-to-Start [Figure: a notional network of work packages with durations of 1 to 7 weeks.] 103
  • 104. Schedule Margin 104 ¨ DID 81650 defines schedule margin as a designated buffer and stipulates it is part of the baseline
  • 105. [Figure: placement of schedule margin tasks within the PMB, after “Applying Schedule Reserve to Software Project Management,” Walter Lipke, STSC CrossTalk, March 1999] 105
  • 106. The simple approach to risk categories is just that: simple. We’ll need to understand the concepts of ordinal risk ranking and the interaction between the risk Probability Distribution Function (PDF) and the Risk+ work processes. 9: Building Risk Categories Day 2 2 Hours
  • 107. Never calculate without first knowing the answer – John Archibald Wheeler
  • 108. Risk Ranking of Individual Tasks 108
Risk Rank | Percent Variance | Notional Interpretation of Risk Ranking
1 | –5% / +10% | Normal business, technical & manufacturing processes are applied
2 | –5% / +15% | Normal business & technical processes are applied; new or innovative manufacturing processes
3 | –5% / +35% | Flight software development & certification processes
4 | –10% / +25% | Build & qualification of flight components, subsystems & systems
5 | –10% / +35% | Flight software qualification
6 | –5% / +175% | ISS thermal vacuum acceptance testing
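These ranked variances map naturally onto the 3-point fields Risk+ reads. A minimal sketch under an assumed convention (the planned duration is taken as the most likely value, with the minus/plus variance setting the optimistic and pessimistic bounds):

```python
# Sketch: convert a task's risk rank into a 3-point estimate by applying
# the rank's minus/plus percent variance to the planned (most likely) duration.
RANK_VARIANCE = {      # rank: (minus, plus) as fractions of the planned duration
    1: (0.05, 0.10),
    2: (0.05, 0.15),
    3: (0.05, 0.35),
    4: (0.10, 0.25),
    5: (0.10, 0.35),
    6: (0.05, 1.75),
}

def three_point_from_rank(planned_days: float, rank: int):
    minus, plus = RANK_VARIANCE[rank]
    return (planned_days * (1 - minus),   # optimistic
            planned_days,                 # most likely = the plan
            planned_days * (1 + plus))    # pessimistic

print(three_point_from_rank(40, 4))   # e.g. (36.0, 40, 50.0)
```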
  • 109. Let’s take Risk+ out for a ride on our real schedule and discover how confident we are in the completion dates. 10: First run of Risk+ on a real schedule Day 2 1 Hour
  • 111. Now that we’ve seen the pictures, what do we do about them? What decisions can be made? What adjustments are needed to increase our confidence in meeting the completion dates? 12: Adjusting the IMS with this new information Day 2 1 Hour
  • 112. With this information, let’s define how much margin is needed, where to put this margin, and how to assess the “probability of completing on or before a specific date.” 13: Building a Baseline–able IMS compliant with 81650 Day 2 2 Hours
  • 113. IMS Improvement Opportunities ¨ All tasks arranged Finish-to-Start ¤ No leads or lags ¤ Allows re-sequencing with little or no effort ¤ Provides visibility into the flow of work ¤ All task work is complete before the next work starts ¨ Fidelity improved through complete vertical integration ¤ A clear boundary between logical flows ¤ Isolates interactions ¨ Risk distributions are optimized by risk class and program phase 113
  • 114. IMS Metrics 114
Model Statistics: Total activities; Total milestones; Total relationships; Average task duration; Summary tasks; Activities with 1-day duration; Activities with duration <5 days
Relationship Types: Finish to start; Start to start; Finish to finish; Start to finish
Lead/Lag Values: FS with positive lag; SS with no negative lag; FF with no negative lag
Target Dates: Records with any target date type; Hard targets (start on, finish on)
Network Status: Activities completed; Activities in progress; Activities past due; Activities without predecessors or successors; Activities with negative float; Activities with less than program-defined threshold; Activities with float >100 days
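Several of these counts are easy to automate once the schedule is exported. The sketch below is a hypothetical illustration (the Task and Link field names are assumptions, not any scheduling tool’s API) of computing a few of the network-health metrics.

```python
# Sketch: compute a handful of IMS health metrics from exported tasks and links.
# Field names (duration_days, float_days, link kind/lag) are assumed for illustration.
from dataclasses import dataclass, field

@dataclass
class Task:
    uid: int
    duration_days: float
    float_days: float
    predecessors: list[int] = field(default_factory=list)
    successors: list[int] = field(default_factory=list)

@dataclass
class Link:
    pred: int
    succ: int
    kind: str          # "FS", "SS", "FF", "SF"
    lag_days: float

def ims_metrics(tasks: list[Task], links: list[Link]) -> dict:
    return {
        "total activities": len(tasks),
        "total relationships": len(links),
        "non-FS relationships": sum(l.kind != "FS" for l in links),
        "relationships with leads/lags": sum(l.lag_days != 0 for l in links),
        "activities without predecessors or successors":
            sum(not t.predecessors and not t.successors for t in tasks),
        "activities with negative float": sum(t.float_days < 0 for t in tasks),
        "activities with float > 100 days": sum(t.float_days > 100 for t in tasks),
        "activities with duration < 5 days": sum(t.duration_days < 5 for t in tasks),
    }

# Example: two tasks linked FS with a 3-day lag
t1 = Task(uid=1, duration_days=10, float_days=0, successors=[2])
t2 = Task(uid=2, duration_days=3, float_days=120, predecessors=[1])
print(ims_metrics([t1, t2], [Link(pred=1, succ=2, kind="FS", lag_days=3)]))
```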
  • 115. Next steps, now that we have an understanding of what to do and what not to do. Final questions and plans Day 2 1 Hour