Process Maturity Key to Managing Complex Tech Projects
Introduction to Process Maturity
AAM Annual Conference
April 27, 2008
Michael Edson
Director, Web and New Media Strategy
Smithsonian Institution
From the session
Good Projects Gone Bad
Managing and Delivering Complex Technology Projects
Michael Edson, Director, Web and New Media Strategy, Smithsonian Institution
Nik Honeysett, Head of Administration, J. Paul Getty Museum
Table of Contents
Abstract .....................................................................................................................................................................1
Projects in Trouble .........................................................................................................................................................2
Technology management as core competency .........................................................................................................3
Process Maturity and Capability Maturity Model Integration .......................................................................................5
Understanding different levels of capability maturity ..............................................................................................6
Using CMMI ...................................................................................................................................................................9
First, figure out where you are ..................................................................................................................................9
Ratchet up one level at a time ................................................................................................................................11
Don’t try to skip levels .............................................................................................................................................11
Don’t slip back .........................................................................................................................................................12
Pick projects appropriate for your level ..................................................................................................................13
Assign responsibility and measure, measure, measure ............................................................................13
Some Practical Ways to increase process maturity .....................................................................................................14
Classic mistakes avoidance......................................................................................................................................14
Transparency through standardized reporting .......................................................................................................21
Governance Structure .............................................................................................................................................23
Consequences and phenomena ..................................................................................................................................24
Web 2.0: Lightweight Development Frameworks ...................................................................................................24
Capability mismatch ................................................................................................................................24
Governance and Control .........................................................................................................................26
Real World Examples ...................................................................................................................................................27
Capability Mismatch: Smithsonian Multimedia Guide ............................................................................................27
Lightweight Software Development: SAAM’s Eye Level Blog .................................................................................29
Matching goals to capacity and maturity: SAAM “Findability” project ...................................................................30
Conclusion ...................................................................................................................................................................31
Smithsonian Web and New Media Strategy
Abstract
Museum Web and New Media software projects offer tantalizing rewards, but the road to success can
be paved with uncertainty and risk. For small organizations these risks can be overwhelming, and even
large organizations with seemingly limitless resources can flounder in ways that profoundly affect staff
morale, public impact, the health and fitness of our partners in the vendor community, and our own
bottom lines. Something seems to happen between the inception of projects, when optimism and
beneficial outcomes seem clear and attainable, and somewhere down the road when schedules,
budgets, and outcomes go off course. What is it? And what can we do to gain control?
This paper, created for the 2008 annual conference of the American Association of Museums, describes
some common ways that technology projects get into trouble. It examines a proven project-process
framework called the Capability Maturity Model and how that model can provide insight and guidance
to museum leaders and project participants, and it tells how to improve real-world processes that
contribute to project success. The paper includes three brief case studies and a call-to-action which
argues that museum leaders should make technology stewardship an urgent priority.
The intended audience is people who are interested in understanding and improving how museum
technology gets done. The paper’s primary focus is Web and New Media software projects, but the core
ideas are applicable to projects of all kinds.
A note on style and formatting
I decided to write this paper, instead of just creating PowerPoint slides, to force myself to delve deeper
into these ideas, tie them together, and give them a home on the Web where others can find them, use
them, critique them, and improve them. I’m not a trained academic writer and I haven’t benefited from
editorial assistance, so I ask for the reader’s forgiveness for the errors in consistency and style that they
will certainly find. I suppose my paper-writing process maturity is “evolving”—a joke that will be funnier
when you’ve reached the final page.
Projects in Trouble
Back in the 1980s, the Federal Government had a problem. Software projects were failing—expensively,
painfully, publicly failing.
The US General Accounting Office, in a report titled “Mission-Critical Systems: Defense Attempting
to Address Major Software Challenges,”1 observed:
As systems become increasingly complex, successful software development
becomes increasingly difficult. Most major system developments are fraught with
cost, schedule, and performance shortfalls. We have repeatedly reported on costs
rising by millions of dollars, schedule delays of not months but years, and
multibillion-dollar systems that don’t perform as envisioned.
The problem wasn’t just that the government couldn’t complete software projects on time or on
budget, or that it couldn’t predict which of the projects it was currently working on would succeed or
fail (though these were both significant and severe problems). Most worrisome, from my perspective,
is that it couldn’t figure out which new projects it was capable of doing in the future. If a business case
or museum mission justifies an investment in technology, that justification is based on the assumption that
the technology can be competently implemented. If instead the assumption is that project execution is a
crap shoot, the business case and benefit-to-mission arguments crumble and managers are stuck,
unable to move forward (because of the risk of failure) and unable to not move forward because
business and mission needs still call.
The pace of change in foundational technologies and assumptions (or dreams) about what could be
done with software make it very difficult for managers to know what skills, competencies, and
capacities they need to ensure success.2 There’s little in most museum employees’ training or
experience to prepare them for making technology decisions, and there are few patterns to follow.
Unlike the building-construction and maintenance trades, software engineering is a relatively new
profession in museums and doesn’t benefit from generations of established practice, training,
certification, standards, and lessons learned through trial-and-error. One wouldn’t try to run a museum
building without a building manager and building engineer—your insurance company would probably
cancel your policy if you tried; yet few people would raise a red flag if you ran a museum without the
equivalent technology expertise.
1. General Accounting Office, 1992, IMTEC-93-13 Mission-Critical Systems: Defense Attempting to Address Major Software Challenges, http://archive.gao.gov/d36t11/148399.pdf, accessed 4/21/2008.
2. At the 2006 Gilbane Conference on Content Technologies in Government (Washington, DC, June 2006) CIOs confessed that they had given up multi-year planning because the pace of change in the industry was just too great. (I attended and presented at this conference.)
Technology management as core competency
As a case-in-point, the AAM accreditation process requires museums to provide detailed information
about the operation and maintenance of physical facilities—including detailed documentation of
physical security, health and safety programs, grounds and land-management plans, and something
called an “RC-AAM Standard Facility Report for the museum’s buildings”—but there are no questions
about information technology beyond a single item on “Internet-related interpretive activities.” It would
appear that a museum can become AAM accredited without an exploration of its information-technology
operations, or at the very least that the process does not draw museums’ attention to IT best practices
in the same way that it does for buildings and grounds.3 I expect this sends a message to museum
leaders that information technology is not significantly important to their long-term success.
But it is important—critically so. Even if you’re a small museum with fewer than ten employees,
somebody is involved in software development. They’re making Excel spreadsheets, Access databases,
collections of Word documents and forms. Maybe they’ve got you involved in YouTube, Flickr, or
Facebook. And if your staff isn’t doing this, your visitors are. I’d be surprised to find an AAM member
museum that doesn’t have a single technology initiative, and I’m sure that you could not find a museum
whose audiences were indifferent to technology.
And museums can’t choose not to focus on technology. Witness the story of Doug Morris, Chair and CEO
of Universal Music Group, which I offer as a cautionary tale.
Doug Morris, Chair and CEO Universal Music Group (Photo: Getty Images)4
Mr. Morris, by all appearances, is a successful tycoon, running a $7 billion-a-year pop culture empire 5
and hobnobbing with the rich-and-famous—he would be recognizable and comfortable as a donor and
3. Based on the AAM Accreditation Self-Study Questionnaire (2007) and the author’s recent experience with the AAM Accreditation process.
4. http://www.wired.com/entertainment/music/magazine/15-12/mf_morris, accessed 4/26/2008.
5. Universal’s CEO Once Called iPod Users Thieves. Now He’s Giving Songs Away, Wired, 11/27/2007, http://www.wired.com/entertainment/music/magazine/15-12/mf_morris, accessed 4/21/2008.
member on museum boards. (He was Director of the Rock-and-Roll Hall of Fame.6) Mr. Morris is also a
creative person: he wrote “Sweet Talkin’ Guy” for The Chiffons in 1966 and produced “Smokin’ In the
Boys Room” for Brownsville Station in 1973.7
But at the helm of his $7 billion-a-year business Mr. Morris chose to opt out of the technology business
in the 1990s, just when digital music and the Internet went supernova. The awkward stumbling of the
music business in the last 15 years, the acrimony caused by the relentless pursuit of its customers, and a
cascade of technology failures, missed boats, and squandered opportunities were the result.
From a Wired Magazine interview:
“There’s no one in the record company that’s a technologist,” Morris explains. “That’s a
misconception writers make all the time, that the record industry missed this. They
didn’t. They just didn’t know what to do. It’s like if you were suddenly asked to operate
on your dog to remove his kidney. What would you do?”

“We didn’t know who to hire,” he says, becoming more agitated. “I wouldn’t be able to
recognize a good technology person — anyone with a good bullshit story would have
gotten past me.”8
New York Entertainment’s blog Vulture observed this about Mr. Morris’s confession:
Even though we shouldn't be, we're actually a little shocked. We'd always assumed the
labels had met with a team of technology experts in the late nineties and ignored their
advice, but it turns out they never even got that far — they didn't even try!
New York Entertainment continues:
Understanding the Internet certainly isn't easy — especially for an industry run by a
bunch of technology-averse sexagenarians — but it's definitely not impossible. The
original Napster hit its peak in 1999 — kids born since then have hacked into CIA
computers. Surely it wouldn't have taken someone at Universal more than a month or
two to learn enough about the Internet to know who to call to answer a few questions.
They didn't even have any geeky interns?9
6. Vivendi board bio, http://www.vivendi.com/corp/en/governance/dir_morris.php, accessed 4/21/2008.
7. Universal’s CEO Once Called iPod Users Thieves. Now He’s Giving Songs Away, Wired, 11/27/2007, http://www.wired.com/entertainment/music/magazine/15-12/mf_morris, accessed 4/21/2008.
8. Ibid.
9. Apropos of Nothing, New York Entertainment, http://nymag.com/daily/entertainment/2007/11/universal_music_ceo_doug_morris.html, accessed 4/19/2008.
So what’s the headline here? It’s that large and small businesses have a lot to gain from focusing on how
to get good and stay good at technology; nobody is immune from failure, and nobody gets to opt out.
The irony is that many museums are drawn to complex technology initiatives and the risks of getting in
over their heads just as they reach the point where successful technology projects can have a positive
impact.10
Process Maturity and Capability Maturity Model Integration
Capability Maturity Model Integration (CMMI) grew out of the original Capability Maturity Model
(CMM), developed by the Software Engineering Institute (SEI) at Carnegie Mellon University
(http://www.sei.cmu.edu) in 1991 to help the Federal Government understand the capabilities of its
software vendors and deal proactively with the problem of out-of-control software projects. It became
and remains a best-practice software-development framework11
and its core ideas can help organizations of all kinds escape from, as Steve McConnell puts it in his
software development bible Rapid Development (Microsoft Press, 1996), the Gilligan’s Island cycle of
under-performing projects.
Figure 1. Use CMMI to help your team escape from Gilligan's Island
CMM posits that organizations, or groups or processes within organizations, function at one of five
levels of process maturity, with level 1 being the lowest or least mature level, and level 5 as the highest
or most mature level.12
10. Edson, Michael, Data Access Strategy, in J. Trant and D. Bearman (eds.), Museums and the Web 2006: Proceedings, Toronto: Archives & Museum Informatics, published March 1, 2006, http://www.archimuse.com/mw2006/papers/edson/edson.html.
11. Gartner Research, CMMI Remains the Standard for Software Process Frameworks, ID Number G00156315, published 4/18/2008.
12. The descriptions in this list are from Paulk, M., et al. (1995), The Capability Maturity Model: Guidelines for Improving the Software Process, New York: Addison-Wesley Professional, but the names of the five levels show the updated terminology established by the Software Engineering Institute in 2003.
1. Initial – Processes, if they are defined at all, are ad hoc. Successes depend on individual
heroics and are generally not repeatable.
2. Managed – Basic project management practices are established and the discipline is in place
to repeat earlier successes with similar projects.
3. Defined – Processes are documented and standardized and all projects use approved, tailored
versions of the standard processes.
4. Quantitatively Managed – The performance of processes and the quality of end-products are
managed with quantitative measurement and analysis.
5. Optimizing – Continuous process improvement is enabled by quantitative feedback from the
process and from piloting innovative ideas.
Figure 2. The five levels of the Capability Maturity Model
The five levels should be understood as a kind of staircase, lowest maturity on the bottom and highest
on the top, with each level serving as the foundation for the level above (figure 2).
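The staircase ordering can be made concrete in code. The following is a minimal sketch in Python (my own illustration; the names and the helper are not part of any SEI specification) that models the five levels as an ordered enumeration, with a helper that advances exactly one level at a time:

```python
from enum import IntEnum

class MaturityLevel(IntEnum):
    """The five CMM maturity levels, ordered lowest (1) to highest (5)."""
    INITIAL = 1                 # ad hoc; success depends on individual heroics
    MANAGED = 2                 # basic project management practices established
    DEFINED = 3                 # processes documented and standardized
    QUANTITATIVELY_MANAGED = 4  # processes measured and analyzed quantitatively
    OPTIMIZING = 5              # continuous, feedback-driven improvement

def next_target(current: MaturityLevel) -> MaturityLevel:
    """Advance exactly one level; never skip, and never exceed level 5."""
    return MaturityLevel(min(current + 1, MaturityLevel.OPTIMIZING))

print(next_target(MaturityLevel.INITIAL).name)  # MANAGED
```

Because each level is the foundation for the one above it, `next_target` deliberately refuses to jump more than one step.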
Understanding different levels of capability maturity
Paulk et al., in The Capability Maturity Model: Guidelines for Improving the Software Process (New York:
Addison-Wesley Professional, 1995) lay out a very useful chart that helps bring into focus how the model
relates to our own organizations and what our work worlds would look like if capability and maturity
were improved. (See Table 1.)
Typical in the low-maturity column (Level 1) are phrases like

success depends on individual heroics

and,

few stable processes exist or are used

Typical in the high-maturity column are phrases like

strong sense of teamwork exists across the organization

and,

everyone is involved in process improvement
I myself have worked on more than a few projects that functioned at level 1. At level 1, not much is
written down, nobody is sure who is doing what or who “owns” various parts of the project, nobody can
really tell what the schedule is or whether you are on schedule or running late, meetings are held but
nobody takes notes or records actionable assignments, and products are often only partially finished,
with lots of surprises and defects discovered at the last minute before “completion.”
This isn’t to say that you can’t have successes at level 1. I worked for a museum that produced several
award-winning technology projects this way, but the successes were built on individual heroics and the
effort almost killed them. (Just after winning awards for a major Web site this organization spent over
two years trying to complete what was to have been a six-month redesign.) Functioning at this maturity
level certainly diminished their willingness to stay together as a team or build on their success, and it
bred a distrust of technology-content initiatives in general that made it difficult to get buy-in for urgent
and necessary new projects.
Table 1. Implications of advancing through CMM levels. Which columns best describe your organization? (This table was very slightly modified to enhance clarity for non-software professionals.)

People
Level 1: Success depends on individual heroics. “Fire fighting” is a way of life. Relationships between disciplines are uncoordinated, perhaps even adversarial.
Level 2: Success depends on individuals. Commitments are understood and managed. People are trained.
Level 3: Project groups work together, perhaps as an integrated team. Training is planned and provided according to roles.
Level 4: Strong sense of teamwork exists within each project.
Level 5: Strong sense of teamwork exists across the organization. Everyone is involved in process improvement.

Processes
Level 1: Few stable processes exist or are used. “Just do it!”
Level 2: At the individual project level, documented and stable estimating, planning, and commitment processes are used. Problems are recognized and corrected as they occur.
Level 3: Integrated management and engineering (how things get built) processes are used across the organization. Problems are anticipated and prevented, or their impacts are minimized.
Level 4: Processes are quantitatively understood and stabilized. Sources of individual problems are understood and eliminated.
Level 5: Processes are continuously and systematically improved. Common sources of problems are understood and eliminated.

Measurement
Level 1: Data collection and analysis are ad hoc.
Level 2: Planning and management data are used by individual projects.
Level 3: Data are collected and used in all defined processes. Data are systematically shared across projects.
Level 4: Data definition and collection are standardized across the organization. Data are used to understand work processes quantitatively and stabilize them.
Level 5: Data are used to evaluate and select process improvements.

Technology
Level 1: Introduction of new technology is risky.
Level 2: Technology supports established, stable activities.
Level 3: New technologies are evaluated on a qualitative basis.
Level 4: New technologies are evaluated on a quantitative basis.
Level 5: New technologies are proactively pursued and deployed.
Using CMMI
Using the CMMI can be a relatively informal process that involves understanding and applying process-
improvement best practices to your organization. Or, it can be a formal process that involves extensive
training, creation of a process improvement infrastructure, appraisals, and more.
To avoid confusing people who are familiar with heavy-duty process-improvement efforts I must draw a
distinction between the formal CMMI process defined by the Software Engineering Institute and what
I’m talking about here. In this paper I argue that many organizations can benefit from what CMMI has to
offer, but I am not advocating a full-fledged CMMI program, which typically involves formal assessment
teams, rigid interpretations of CMMI, and a great deal of work; these kinds of efforts don’t deliver good
return-on-investment for organizations at emerging maturity levels.13 What I advocate is a kind of
CMMI-Lite in which organizations borrow the most useful aspects of CMMI without becoming overly
bound to the formal doctrine. As Gartner, Inc. says, “Organizations should use CMM as a guidebook, not
a ‘cookbook.’ Results-based improvement should be the key.”14
First, figure out where you are
Unless you’re working with a formal CMM assessment team, the first step to understanding and
improving your capability maturity is to look at Table 1 and identify the statements that best describe
how your team does work. You don’t have to think across every kind of project your organization does:
pick one or two projects or activities that you think would benefit from some improvement. Note that
it’s not uncommon for organizations to have some processes that are very mature and some that are
very immature. CMMI orthodoxy recognizes this and encourages a methodology of continuous
improvement at varying levels of maturity.
You may find it useful to modify table 1 or the overarching CMMI levels of maturity listed above and to
cast them in terms that better describe your organization, or your project.
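One informal way to run this self-assessment is to score how many of each level’s Table 1 statements describe your team, and take the highest level at which most statements still hold. The sketch below is hypothetical: the counts and the majority-vote rule are my own illustration, not a CMMI appraisal method.

```python
# Hypothetical self-assessment: for each maturity level, record how many of
# that level's Table 1 statements describe your team, then report the highest
# level at which a majority of statements hold.

# Illustrative counts: (statements agreed with, statements asked) per level
responses = {
    1: (4, 4),   # all the Level 1 descriptions fit us...
    2: (3, 4),   # ...and most Level 2 practices are in place
    3: (1, 4),   # only one Level 3 statement holds
    4: (0, 4),
    5: (0, 4),
}

def assessed_level(responses):
    """Highest level at which a majority of statements apply (default 1)."""
    level = 1
    for lvl in sorted(responses):
        agreed, asked = responses[lvl]
        if agreed / asked > 0.5:
            level = lvl
        else:
            break  # a gap means later levels lack their foundation
    return level

print(assessed_level(responses))  # 2
```

The `break` matters: a level whose statements mostly fail means the foundation for every higher level is missing, so agreement above the gap doesn’t count.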
13. Phone interview with Sanjeev Sharma, CMM specialist and IT Manager, Safety & Mission Assurance, NASA Goddard Space Flight Center, Greenbelt, MD, 3/11/08.
14. Gartner Research, CMMI Remains the Standard for Software Process Frameworks, Article ID #G00156315, 4/16/2008. The exact quote is “Internal development organizations should use CMMI as a guidebook, not a ‘cookbook.’ Results-based improvement should be the key.”
Figure 3. Figure out where you are on the Capability Maturity Model
For example, in 2006 I modified the out-of-the-box CMM level definitions to be more meaningful to a
data-strategy project at the Smithsonian American Art Museum.15 The definitions shown below helped
me understand the roadmap and projects that were needed to get us from where we were (level 2) to
where we wanted to be (levels 3 and 4).
Level 1 – Limited data federation; often with redundant and inconsistent data. Data strategy is
not even on the organizational radar.
Level 2 – Limited data consolidation; documenting redundancies and inconsistencies. Some
isolated departments are trying to raise awareness and initiate projects.
Level 3 – Data integration initiated; new ‘disintegration’ is discouraged. Multi-departmental
teams begin working on policies and procedures to advance a data strategy.
Level 4 – Data integration widely adopted; ‘disintegration’ is penalized. All projects in the
organization adhere to data integration policies and managers are held accountable for
variances.
If you conclude that you’re at a low level of maturity, you’re not alone. Gartner research finds that most
organizational software development teams function at Level 1 or Level 2, “which means that, at best,
they have some reasonably good project management practices,” and less than 25% of teams function
at level 3 or higher (Hotle, 'Just Enough Process' for Applications). Taken at face value, this means that
most software development efforts can be expected to produce inconsistent results with little control of
budget and timelines. Though this is appalling, the good news is that basic process improvement
initiatives could have a dramatic effect on the productivity and predictability of a great many software
projects.
15. Edson, M., Data Access Strategy, in J. Trant and D. Bearman (eds.), Museums and the Web 2006: Proceedings, Toronto: Archives & Museum Informatics, published March 1, 2006, http://www.archimuse.com/mw2006/papers/edson/edson.html. I’m not sure why I described four levels instead of five.
Ratchet up one level at a time
If you’re at level 1, what small steps can you take to get to level 2? The Software Engineering Institute
says that you can get from level 1 to level 2 just by establishing sound project management practices
(CMMI for Acquisition, 2007). Such practices might include activities such as tracking and communicating
project status, measuring effort and outcomes, or ensuring that roles and responsibilities are adequately
defined.
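Those level-2 practices can start as something very modest, such as a standard status record kept for every project. The sketch below is illustrative only; the field names and the example project are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class StatusReport:
    """A minimal weekly status record: enough to track schedule, effort, and ownership."""
    project: str
    week: str
    owner: str              # who is responsible for this project
    hours_planned: float
    hours_spent: float
    on_schedule: bool
    risks: list = field(default_factory=list)

    def effort_variance(self) -> float:
        """Positive = over plan, negative = under plan."""
        return self.hours_spent - self.hours_planned

report = StatusReport("Collection search revamp", "2008-W17", "Web team lead",
                      40, 52, False, risks=["content approval backlog"])
print(report.effort_variance())  # 12
```

Even a record this small makes status visible: someone owns the project, effort is measured against a plan, and schedule risk is stated rather than discovered.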
Figure 4. Improve maturity gradually, one level at a time
These process-improvement efforts don’t need to take a lot of time and effort. Matt Hotle of Gartner
says that he “very seldom sees a basic process improvement effort that takes more than a couple of
weeks” (interview with the author, 4/24/08).
The Software Engineering Institute notes that improvements that move a group from level 1 to level 2
may depend on “heroics” of individual staff members until the concepts of process improvements are
more widely understood and supported (CMMI for Acquisition, 2007).
Don’t try to skip levels
It’s very tempting to try to skip from low levels of maturity to high ones without going through the
intermediate steps. For example, if your organization really wants to use new technologies on the
cutting edge, but your current state is that the “introduction of new technology is risky” (Level 1 from
Table 1) then you would be well served to work first on ratcheting your technology adoption capabilities
up to level 2, “technology supports established, stable activities” and see how that goes.
Figure 5. Avoid the temptation to skip steps. It’s risky.
Trying to leapfrog from level 1 to level 4 or 5 doesn’t give your organization time to establish the core
competencies needed to succeed at high levels of expected performance. The Software Engineering
Institute (SEI) says “Because each maturity level forms a necessary foundation for the next level, trying
to skip maturity levels is usually counterproductive.” (CMMI Project Team, 2007.) The SEI further notes
that “processes without the proper foundation may fail at the point they are needed most—under
stress.” John P. Kotter, in the Harvard Business Review notes that “Skipping steps creates only an illusion
of speed and never produces a satisfying result.” (Kotter, 1995)
Don’t slip back
A recent book on evolution16 stated that Charles Darwin’s greatest contribution was not that he thought
up descent with modification (natural selection), but that his research and writing tied the idea down so
firmly that it could never drift away. There’s an important lesson here for process improvement: try
to ensure that whatever improvements you make to software development processes become
codified and formalized, so that as staff and managers come and go and teams adapt and change, your
hard-won progress doesn’t atrophy. Remember that every level is a foundation for the one that comes
next.
16. I read this somewhere recently but have not been able to track down the citation!
Figure 6. Solidify gains in maturity so that they're permanent.
Pick projects appropriate for your level
This is related to the “don’t try to skip levels” pattern, but is more focused on matching what you need
to get done with what you’re capable of doing. Usually, at lower levels of maturity this means breaking
ambitious visions into smaller, less costly, and less risky sub-projects that, together, achieve the vision.
This approach is harmonious with a lot of recent thinking, particularly in Web application development,
and it has significant beneficial consequences for organizations at all levels of maturity. (More on this later.)
Figure 7. Pick projects appropriate for your current capability maturity level
Assign responsibility and measure, measure, measure
Matt Hotle, Gartner’s CMMI expert, states that assigning responsibility for process improvement
initiatives is one of the highest-value steps an organization can take.17 Gartner strongly asserts that
assigning responsibility for process improvement and measuring those efforts are the most critical
steps.
17. Phone interview with the author, 4/24/2008.
Of measurement, Hotle writes:
Application people are generally terrible with measurement. This actually may be a
kind statement, because we believe that fewer than 20% of application
organizations have a usable measurement system. From a governance perspective,
it's a sin to have a set of processes that have been defined, but have no feedback
loop to understand whether the processes are doing well. (The 'Seven Deadly Sins'
of Application Governance. ID Number: G00155896, 2008)
So what should you measure at lower levels of maturity? For typical museum Web
development projects, start by measuring staff-hour projections, actual staff-hours spent,
defects (bugs and errors), and the movement of content through development, review, and
approval processes.
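As a concrete illustration, here is a minimal Python sketch of this kind of measurement. The task names, hour figures, and defect counts are hypothetical, not taken from any real project:

```python
from dataclasses import dataclass

@dataclass
class TaskMetrics:
    """Per-task record of the basic measurements suggested above."""
    name: str
    projected_hours: float
    actual_hours: float = 0.0
    defects: int = 0

def estimation_error(tasks):
    """Ratio of actual to projected staff-hours across all tasks.
    A value well above 1.0 signals systematic underestimation."""
    projected = sum(t.projected_hours for t in tasks)
    actual = sum(t.actual_hours for t in tasks)
    return actual / projected if projected else float("nan")

# Hypothetical tasks from a museum Web project
tasks = [
    TaskMetrics("Write exhibit pages", projected_hours=40, actual_hours=55, defects=3),
    TaskMetrics("Build photo gallery", projected_hours=20, actual_hours=30, defects=5),
]
print(f"Estimation error: {estimation_error(tasks):.2f}")  # 85/60, about 1.42
print(f"Total defects: {sum(t.defects for t in tasks)}")
```

Even a spreadsheet version of this roll-up gives a team the feedback loop Hotle describes: defined processes plus a way to see whether they are working.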
Some Practical Ways to Increase Process Maturity
Classic mistakes avoidance
Steve McConnell, in his classic book Rapid Development: Taming Wild Software Schedules (Microsoft
Press, 1996) uses the concept of Classic Mistakes to help software developers avoid commonly
encountered, and repeated, errors. Classic Mistakes identify things that often go wrong with People, Processes, and Technology, and they are often related to immature work processes. Avoiding Classic Mistakes is one of the best ways to move toward successful technology development.
The following list of classic mistakes is adapted from Rapid Development (McConnell, 1996).
Classic mistakes enumerated
• Process-Oriented Mistakes
– Lack of project management plan
Failure to define, up front, what project management practices will be used
– Failure to follow-through on project management plan
Good plan at start of project but not followed and implemented day-to-day
– Failure to define requirements up front
Team fails to define, in writing, what is to be delivered.
– Failure to accurately estimate time and resources
Related to requirements gathering
– Micromanagement
… by project sponsors or managers.
– Failure to define roles and responsibilities
Who is responsible for what?
– Failure to develop creative brief
Lack of codified creative direction leads to stress with sponsors and partners. Related to
requirements.
– Failure to empower creative team
Creative team hobbled by unclear sponsorship.
– Failure to maintain project visibility
Related to lack of project management plan
– Wishful thinking
Related to lack of requirements gathering, estimation
– Overly optimistic schedules
Related to wishful thinking, failure to estimate accurately
– Insufficient risk management
Known and obvious risks are not accounted for in management plan
– Wasted time upstream
Most projects waste time in beginning of project
– Insufficient quality assurance
Failure to produce and follow a test plan. (Stems from failure to define requirements in advance.)
– Feature Creep
Related to lack of management controls
– Insufficient management controls
Project metrics lacking, deliverables unclear, visibility poor
– Failure to produce a design
"Design" in the architecture and requirements sense, not graphic design.
– Gold-plating requirements
Unrealistic desire to have all bells & whistles
– Ineffective management of contractors
Related to requirements gathering, management plan.
• People Oriented Mistakes
– Friction within team
Unaddressed problem relationships within team lower productivity and morale for entire project
team
– Friction with customers/partners
#2 complaint of software development teams
– Weak personnel
#1 complaint of software development teams
– Reliance on heroics to complete a project
This is related to wishful thinking and lack of requirements, management controls, etc.
– Unrealistic expectations
“We can just code like hell and get this done.” Related to lack of requirements, management
controls, etc
– Lack of effective project sponsorship
Ambiguous or inconsistent direction/participation from sponsors
– Lack of stakeholder buy-in
Ramming a project down a stakeholder’s throat. Related to sponsorship.
– Lack of user input
Failure to maintain relationship with customers
• Technology Oriented Mistakes
– Switching tools or technologies in the middle of a project
False promise of productivity or performance improvements often derail projects
– Lack of content or source-code control
Developers/authors overwrite each other's documents.
– Silver-bullet syndrome
Too much faith put in the benefits of new technology, and not enough thought put into how well it would do in your organization
– Overestimated savings from new tools or methods
Organizations seldom improve in giant leaps
In 2004 I surveyed an experienced, award-winning project team about which classic mistakes they felt were likely to occur during a software project we were initiating. The results were sobering: team members identified 26 classic mistakes that they thought had a 1-in-3 or greater chance of occurring during the course of the project. As a result, the top ten most-likely classic mistakes, and how to avoid them, were described in the project's management plan.
Table 2. Top-ten Classic Mistakes, from a 2004 project management plan

Rank  Classic Mistake                                     Est. Probability  Action to Take
1     Lack of content or source-code control              68%  Implement source-code control practices
2     Failure to produce a design document                60%  Produce a design, ex post facto, starting week of August 25th
3     Lack of project management plan                     60%  Project plan v1.0 completed August 20th
4     Failure to maintain project visibility              60%  Project visibility addressed in project plan
5     Feature creep                                       60%  Produce a design; prioritize feature set
6     Wasted time upstream                                58%  The cow is already out of the barn on this one!
7     Reliance on heroics to complete a project           57%  Define roles and responsibilities; emphasize accurate estimation; implement management controls to track progress and anticipate delays
8     Friction within team                                53%  Address proactively with team members and management
9     Failure to accurately estimate time and resources   53%  Related to lack of design; having a project management plan should help; managers must ensure staff accurately defines and estimates tasks
10    Failure to define requirements up front             50%  Create requirements document, ex post facto
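The survey-and-rank procedure described above can be sketched in a few lines of Python. The mistake names and per-person probability estimates below are hypothetical, not the actual 2004 survey data:

```python
# Each team member estimates the probability of each classic mistake;
# estimates are averaged, and the highest-ranked mistakes go into the plan.
survey = {
    "Lack of source-code control": [0.70, 0.60, 0.75],  # one estimate per person
    "Feature creep": [0.50, 0.70, 0.60],
    "Weak personnel": [0.20, 0.30, 0.25],
}

avg = {mistake: sum(est) / len(est) for mistake, est in survey.items()}
top = sorted(avg, key=avg.get, reverse=True)[:10]  # top ten (or fewer)

for mistake in top:
    print(f"{avg[mistake]:.0%}  {mistake}")
```

The point is less the arithmetic than the conversation it forces: the team names its risks in writing before the project starts.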
Use Spiral Project Plans
If you're not familiar with any particular project management framework, you might want to start with a Spiral Project Plan. Spiral project plans are described by Steve McConnell as an iterative project-management approach that is particularly appropriate when you're not exactly sure of scope and functionality at the start of a project. (This is often the case with small-scale Web development projects.)
Spiral project plans are organized around loops of increasing effort and complexity. Initial loops are
brief: subsequent loops last longer, take more effort, and have more impact. Each loop includes
activities where requirements are described and analyzed, some tangible product is created, results are
evaluated, decisions are made, and the next loop is planned. In early loops the products created may be
simple purpose statements or paper prototypes that are tested quickly on sample users. Later loops may
involve significant blocks of code and functionality that are tested with automated test scripts or in
usability labs, or, the project may transition into some other project-management framework
(McConnell, 1996).
Figure 8. Spiral Project Plan (each loop runs Design → Build → Test → Evaluate, then plan the next loop), sometimes called "the cinnamon roll" because of the distinctive shape of the spiral
The beauty of spiral project plans is apparent in three ways. First, they provide a flexible, lightweight process that practically any team of adults can implement; it doesn't take a Project Management Institute certified engineer to work this way. Second, teams can use this process to structure projects at their earliest moments of planning, well before funds are committed and programmers are hired, when rational processes can have their greatest effect. Third, they provide a mechanism for reality checks at the end of each loop, where stakeholders can weigh in on whether the project is still aligned with its goals. This enables teams to make adjustments before outcomes are set in concrete.
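The loop structure can be sketched in Python. The phase names and the assumption that effort roughly doubles each loop are illustrative, not part of McConnell's definition:

```python
# Each loop runs the same phases, but later loops last longer,
# take more effort, and have more impact.
PHASES = ["design", "build", "test", "evaluate", "plan next loop"]

def spiral(n_loops, initial_effort_days=2):
    """Yield (loop number, phase, effort in days) for a spiral plan."""
    effort = initial_effort_days
    for loop in range(1, n_loops + 1):
        for phase in PHASES:
            yield loop, phase, effort
        effort *= 2  # illustrative: subsequent loops grow in scope

for loop, phase, effort in spiral(3):
    print(f"Loop {loop} ({effort} days): {phase}")
```

An early loop might produce only a purpose statement tested on a few sample users; a late loop might deliver working code tested in a usability lab, exactly as described above.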
Roles and Responsibilities
Most tasks that fail to get done fail because of unclear or non-existent ownership, and friction within projects is frequently caused by ambiguous responsibilities. Conversely, tasks that have clear owners are likely to get done. One of my favorite techniques for improving basic project management is to get project teams to define roles and responsibilities formally, before work begins in earnest.
I developed the following list of role definitions to clarify roles and responsibilities for Web projects at the Smithsonian.
Excerpt from Roles and Responsibilities Definition (Edson, Smithsonian)
• Managerial Roles
– Sponsor
• Internal client(s) for whom we're producing the project. Defines goals. Supervises the Project Owner and provides resources and direction to the Project Owner and team. Provides a "head above the trees" perspective of the overall effort.
– Project Owner
• Responsible for:
• high-level organization and execution of the project
• requirements analysis
• creative brief
• interface with project sponsors
• team selection
• high-level definition of the project lifecycle
• monitoring and periodic reviews of content/functionality over the entire project lifecycle
• Usually reports to the Project Management Team
Sample Roles and Responsibilities Template
Role and Responsibility Assignments
Roles are assigned to individuals for the purpose of a) ensuring that all roles have someone to play them, and b) promoting clarity for the purpose of project management. Many team members will have more than one role. In general, individuals are encouraged to participate/collaborate/contribute beyond their strict role assignments! (Table is partially filled out as an example.)

Team Members: Dennis, Cathy, Amy, Bob (add team members as appropriate)
• Managerial Roles: Sponsor; Project Owner; Project Management Team; Project Manager; Technical Director; Quality Control Manager; Partner
• Content Production Roles: Content Provider; Creative Director; Lead Writer/Editor; Creative Producer; Writer/Editor; Graphics Producer; Graphic Designer; Graphical User Interface Designer; Information Architect
• Technical Production Roles: Software Analyst; Programmer; Database Designer; Image Production; System Architect; Web Server Administrator
(In the matrix, each role is a row, each team member a column, and an X marks an assignment; the example fills in the first several managerial roles.)
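One simple way to make such a template operational is to check, before work begins, that every role has an owner. A minimal Python sketch, with hypothetical names and assignments:

```python
# Roles from the template (abbreviated) and example assignments.
ROLES = [
    "Sponsor", "Project Owner", "Project Manager",
    "Creative Director", "Writer/Editor", "Programmer",
]

assignments = {
    "Sponsor": ["Dennis"],
    "Project Owner": ["Cathy"],
    "Project Manager": ["Amy"],
    "Programmer": ["Bob"],
}

# Flag roles with no owner before work begins in earnest.
unassigned = [role for role in ROLES if not assignments.get(role)]
if unassigned:
    print("Roles with no owner:", ", ".join(unassigned))
```

Running the check surfaces the gaps (here, Creative Director and Writer/Editor) while they are still cheap to fix.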
Transparency through standardized reporting
Many projects are only transparent at their inception and completion. The goal of standardized
reporting is to give managers and participants insight into project status and direction so they can make
decisions and manage.
Some examples of simple project reporting methods are shown below.
Example 1, a weekly project status PowerPoint file for general consumption by stakeholders. This
template was filled out weekly by the project manager. The PowerPoint format encouraged brevity and
focus on the most important points.
Figure 9. Two slides showing project status for a Website redesign project
Example 2, a bi-weekly status report for parallel projects. This Microsoft Word template was used by 14 senior
managers to report on the status of their projects for the reopening of the Smithsonian American Art
Museum in 2006. Each manager had their own document in a network folder and individual documents
were rolled-up into a “master” document (using Word’s linking feature) for a bi-weekly progress-review
meeting.
Figure 10. Bi-weekly status reports
Example 3, weekly meeting minutes emphasizing assignments and decisions made. I used this type of
report for a network installation project. Note the use of the term “Action Required” to call attention to
specific assignments. The creation and tracking of Action Items is a highly effective process
improvement. These reports were typed in Microsoft Word’s Outline view as the meeting progressed.
Reports were distributed to team members and uploaded to a project extranet site.
Figure 11. Weekly meeting minutes emphasizing actions required and decisions made
Governance Structure
Many organizations lack a formal and uniformly understood mechanism for gathering input on proposed
technology projects and determining which should be submitted for consideration and approval by
senior decision makers. It doesn't take a lot of process to be effective in this area: "just enough," as Gartner says. At the Smithsonian American Art Museum, I instituted a simple Web site proposal form
that asked people initiating projects to answer basic questions about the goals and processes.
This process had several beneficial outcomes. First, it ensured that everyone involved in a project agreed
on a project’s scope and assumptions before it began. Second, it forced stakeholders to discuss
priorities, content, direction, and timing before resources were committed. Third, it elevated the
discussion of previously under-valued processes such as roles-and-responsibilities and maintenance
lifecycles. And finally, it provided a single, transparent gateway for all recommendations going to the
Director. Was the process perfect? No. But it was "just enough" process to allow Web-development projects to begin to be managed, rather than run ad hoc.
Sample Document: Web-site proposal form.
Purpose
The purpose of this form is to provide an overview of proposed objectives and production/maintenance lifecycles
for new Web content. This form requires information needed to support the editorial decision-making process. A
completed form serves as a contract between project sponsors, team members, and SAAM decision makers.
Process
This section is written with the project manager/project leader in mind
1. Somebody generates an idea and you take ownership of it: you are the project leader. You discuss the
idea with potential partners, team members, and SAAM management. You define a project and walk it
through the approval process.
2. You discuss the idea/project at the SAAM Web Weekly, and (optionally) at the SAAM Web Quarterly.
3. If the idea passes through informal discussions you formalize the creative and management aspects of the
project and fill out this form.
4. You present the project and this form to the SAAM Web Quarterly and lead a discussion. You can review simple projects via e-mail; more complex projects require a meeting of the Web Quarterly and may require several meetings.
5. The SAAM Web Quarterly approves the idea (or engages you in an iterative process of questions,
comments and review) and makes a recommendation to the Director.
6. The Director approves the idea.
7. You begin the next stages of planning and execution.
From this point on project management is handled at a detailed level by a Project Management Plan.
What kinds of projects should use this process?
It is hard to describe this categorically. We'll be using common sense, case by case.
Web Site Proposal Form
1. Who will be leading this idea through the approval process?
2. Who will be the project sponsor?
3. Who will be the project owner?
4. What other project “roles” are defined?
5. What is the title of the idea?
6. Please give an overview of the idea as you would pitch it to the Director and the Web Quarterly.
7. What deadlines are associated with this idea?
8. What partners (internal or external) will be involved?
9. Please describe the 3-year lifecycle of this idea.
10. What staff resources will be required for the 3-year lifecycle?
11. What financial resources will be required for the 3-year lifecycle?
12. What technological resources will be required for the 3-year lifecycle?
Consequences and phenomena
Three consequences and phenomena related to the pursuit of process improvements for museum-technology projects are worth noting: capability mismatches, the difficulty of getting buy-in for governance and control efforts, and what "lightweight" Web 2.0 software development practices have to offer museums.
Capability mismatch
Capability mismatch describes a situation in which different groups on a project have incompatible processes or radically different levels of process maturity. For example, capability mismatches often occur when small to medium-sized museums with few defined processes and not much project management expertise hire accomplished outside technology companies. Successful technology companies tend to be very process- and results-oriented and often have staff with formal training and advanced certification in project management, software development, measurement and analysis, and business-process engineering. These people speak a different language than most museum teams. This is not to say that they are always right, but the disconnect between intuitive decision-making cultures and structured business cultures can cause problems.
Capability mismatches aren’t found only in internal-external relationships. Mismatches are also found
between work groups within museums. In mature organizations it would be the responsibility of a
Project Management Office (PMO) to establish standard practices and resolve mismatches, but museum
technology projects seldom benefit from this kind of function.
Capability Maturity Mismatch
When you and your vendor have different capability maturity levels there can be a disruptive shearing effect on project processes.
Figure 12. Capability maturity mismatches create a disruptive shearing effect
In a mismatched engagement, technology vendors working with museum clients often see behavior on the museum side such as:
• conflicting institutional voices/opinions (client doesn't speak with one voice)
• adversarial relationships ("I don't feel like we're on the same team")
• wrong people in key positions
• unrealistic expectations
• content-approval deadlines are not met
• undefined decision-making processes
• little or no measurement of key performance indicators
• insufficient staffing for the task at hand
• completed projects are not maintained after delivery
I have interviewed vendors of all sizes to gain insight into this phenomenon. Most say that what they want most from their museum clients is a unified decision-making process and a willingness among senior managers to "hear what's realistic and act accordingly" when confronted with evidence of flawed internal processes or unrealistic expectations.
I have seen more than one museum technology project struggle, under-perform, or fail because of
capability mismatches, and this is an area where vendors and clients need to help each other out.
Capability Maturity Mismatch

Museum           Vendor
Low Maturity     Low Maturity     The blind leading the blind
Low Maturity     High Maturity    Common: most organizations select industry leaders
High Maturity    Low Maturity     Rare, except when there are non-business factors (like pleasing a VIP)

Figure 13. Mismatches are caused by differences in the way groups approach work
Governance and Control
Many work groups and departments balk at the idea of new rules, procedures, controls, or governance structures being imposed on them. As one museum professional I interviewed put it: "Museum workers often have a kind of entrenched eccentricity that treats all efforts to institute standard procedures as infringements on creativity." (Anonymous interview, 4/27/2008.) And museums are not alone. Matt Hotle writes, "Most [software] development organizations seem to have a clear avoidance mechanism when it comes to 'process.' However, using a 'just enough' approach to processes enables an organization's behaviors to match its goals." ('Just Enough Process' for Applications, 2007.)
Gartner’s “just enough” approach encourages managers to keep rules and governance to the absolute
minimum required to help get products completed the “right” way, and I have found that use of the
“just enough” phrase itself sends a positive and soothing message to concerned stakeholders.
Governance and control efforts need internal marketing and wise stewardship to get buy-in and
acceptance, but ultimately governance and control will be accepted by teams when they see that the
new rules and procedures benefit their work, reduce errors and rework, and free them up to perform
more creative and rewarding tasks. Most wisdom on this topic asserts that a light hand, “more carrots:
fewer sticks” (positive incentives rather than the threat of punishment) is the most successful way to
bring governance structures into an organization.
Web 2.0: Lightweight Development Frameworks
The way Websites are built and improved has changed dramatically in the last few years, and these
changes are good for small organizations wanting to have a greater impact online. In the client-server or
mainframe computing era, software applications were meticulously planned in excruciating detail months or years ahead of delivery; the final product either worked for the task it was designed for or it didn't, and that was more or less the end of the story. If requirements changed or new opportunities arose, not much could be done in the short term, and end-users had little or no opportunity to add or
change the product's functionality to suit their own distinctive needs. Making software this way required large teams working at high levels of process maturity. It made Microsoft rich in the '80s and '90s, but there are new models now.
There’s a phenomenal amount of hope and hype around the term Web 2.0, which is typically associated
with social networking Web sites, tagging, and user-created content. But publisher Tim O’Reilly sees
something deeper going on here in the way that these kinds of sites are being developed. In his
manifesto on the subject, What is Web 2.0: Design Patterns and Business Models for the Next
Generation of Software,18 O’Reilly describes how powerful, effective, and wildly profitable Web
applications can (and should) be built using lightweight, rapid-development processes and continuous
improvement and innovation fueled by interaction with (and contributions by) customers. In contrast
18. http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html, accessed 4/27/2008
with previous practices, these sites go public with basic functionality and are constantly modified, adjusted, and expanded according to what works and what doesn't.
O’Reilly writes,
Cal Henderson, the lead developer of Flickr, recently revealed that they deploy new builds up to every half hour. This is clearly a radically different development model! While not all web applications are developed in as extreme a style as Flickr, almost all web applications have a development cycle that is radically unlike anything from the PC or client-server era. It is for this reason that a recent ZDNet editorial concluded that Microsoft won't be able to beat Google: "Microsoft's business model depends on everyone upgrading their computing environment every two to three years. Google's depends on everyone exploring what's new in their computing environment every day."
I wouldn’t argue that Google doesn’t require mature software development processes—quite the
contrary in fact—but lightweight framework models do demonstrate that valuable software can be built
in small, manageable pieces by small, manageable teams. The acceptance and mainstreaming of free and open source software further lowers the barriers to entry: a team of two or three developers can produce flexible, high-quality Web applications in small iterative steps at low cost and low risk. (Or museums can opt out of software development altogether and adapt the same blog, wiki, tagging, and file-sharing software that's available to the general public right now.)
A full discussion of the Web 2.0 platform is beyond the scope of this paper, but Tim O’Reilly’s article is
required reading for anybody thinking seriously about the future of software development.
Real World Examples
The following brief case studies give concrete examples of how process maturity and an understanding
of capability maturity models can affect the direction and outcome of projects.
Capability Mismatch: Handheld Multimedia Guide
Overview
In 2004 the Smithsonian issued a request for proposals (RFP) for a pilot project for multimedia handheld
tours at six museums, with the hope that the successful system would eventually be extended to all
Smithsonian museums. (Four Smithsonian museums were in the process of implementing their own
handheld-guide pilot projects or had just completed them.)
A press release described the intended functionality of the device:
Visitors to the Smithsonian will have the option to rent a lightweight, wireless
handheld device that is the heart of the SIguide solution. With the SIguide handheld
device, Smithsonian patrons will be able to take preset tours or customized tours
that match their interests; view multimedia content such as documents, photos, and
audio and video clips; locate and be directed to exhibits, landmarks or other
members of their group; communicate with someone or everyone in their group;
create a schedule of activities and receive reminders when events are due to begin;
save content, messages, sketches and notes to a scrapbook they subsequently can
access via the Web; and much more.19
The RFP stipulated that the Smithsonian would contribute content and staff-hours to the project, but no
capital funds: development costs would be borne by the vendor and recouped by sharing revenue from
device rentals. Technical specifications and requirements were assembled from needs and wish-lists
submitted by participating Smithsonian museums. The project had executive-level sponsorship and high
visibility throughout the Institution.
The contract was awarded to a small startup with a compelling vision. Company founders demonstrated
a history of successful involvement in museum content/technology deployments, but no track record
delivering the technology required by this project on this scale. The technology specification included
installing multiple wireless networks; developing a system of automated kiosks to distribute, charge, and
synchronize the inventory of handheld devices; a complex database infrastructure; integration with e-
commerce and customer databases; and “wireless positioning” technology to relate the moment-to-
moment location of each handheld device with maps of artifacts and related content.
Process Maturity
To buffer the project from risk, the awardee created a project management office (PMO) through a sub-
contract with a local technology company with a highly mature project management group. The intent
of the PMO was to provide a system of checks and balances that could match the realities of day-to-day
execution and decision making with the idealized vision of the project. However, there was a significant
capability mismatch between the startup’s culture and that of the PMO and after several weeks the
awardee disbanded the PMO (with the Smithsonian’s approval). No similarly mature project-
management expertise was established to replace that which was lost.
Technology development was not vetted through previously defined processes but was fast-tracked. (Part of the rationale for this was, perhaps, the fact that the Smithsonian did not have capital investments at risk.) Furthermore, project success was contingent on several critical assumptions: a) the accuracy and performance of the wireless positioning system, b) the performance and reliability of the automated kiosks, c) revenue projections, and d) operating costs. These assumptions were not rigorously tested, and risk-mitigation or contingency plans, if they existed, were not well known.
19. Press release, labeled as coming from the Smithsonian, http://www.lynx-net.net/web/SIguide.php, accessed 4/26/2008. Also see From the Secretary: Guiding Light at http://www.smithsonianmag.com/travel/10013371.html, accessed 4/26/2008
Outcome and Lessons Learned
This was a complex project and a full description of its promise and flaws is beyond the scope of this
paper and the knowledge of its author. But it’s important to try to learn from what went wrong and two
process-maturity mistakes are apparent. Ineffective process controls allowed stakeholders to define a
project that was beyond the process maturity level of both the vendor and the Institution, and the lack
of a PMO (caused by a capability mismatch) allowed early warning signs to go unnoticed or insufficiently
addressed. Given the Smithsonian’s highly mature content-creation processes and the recognizably
bleeding-edge nature of this project’s technology and business models, a better approach might have
been to gradually increase investments in successful museum-based pilot projects to test theories about
audience acceptance, technology, and operations in a more controlled manner. This kind of evolutionary
roadmap has been tried successfully at several other institutions.
Lightweight Software Development: SAAM’s Eye Level Blog
Overview
In May of 2005 Smithsonian American Art Museum’s (SAAM’s) New Media Initiatives department
proposed creating a blog for SAAM’s reopening, which was fourteen months away. The New Media
team knew they needed to establish a new Web site to support the outreach goals of reopening and
build buzz leading up to opening day, but the museum’s normal content-creation teams were pinned-
down with day-to-day tasks pertaining to the bricks-and-mortar museum reopening and money was
tight. In addition, museum managers realized that high-visibility projects (such as Web and kiosk
applications for the Luce Foundation Center for American Art) would leave little capacity for complex
software and content development efforts.
Process Maturity
The blog was identified as an achievable objective specifically because it required a low level of process maturity, had a very small budget impact, and had a low risk of failure (and no enormous consequences if it did fail). Through a structured governance process, the project's goals, risks, roles and responsibilities, and project management methodology were articulated and reviewed. Some project stakeholders were uncomfortable with unknowns in the content-creation and editorial process, so the project was approved for a trial run in which the blog was available on a password-protected page accessible only to SAAM employees. After a short period running internally, stakeholders became comfortable with the production processes and the blog was approved for external publication.
Lessons Learned
This approach was effective for SAAM.
Matching goals to capacity and maturity: SAAM "Findability" project
Overview
The creation of new Web sites for the Smithsonian American Art Museum’s (SAAM’s) reopening in 2006
also created problems with navigation, branding, and information architecture. Rather than initiate a
redesign, SAAM chose to take an iterative approach to Web site improvement by focusing on findability
(making SAAM’s Web content easier to find) and structuring work so that results would be achieved
through a series of short, low-risk sub projects (rather than one large, monolithic project as SAAM had
done in the past).
Process Maturity
SAAM was extremely focused on conducting a controlled and managed development process that
avoided the pitfalls and distractions of traditional Web redesign projects. The SAAM Web team was
aware of capability and maturity weaknesses and did not have great confidence in its capacity to
manage a large Web redesign. The team was more comfortable and experienced with smaller projects
of two or three months' duration, so the RFP it issued explicitly required vendors to structure work into
short sub-projects and use processes similar to the Spiral Project Plan. (See text box below.)
Excerpt from RFP
C.1.2. STRUCTURE WORK TO REDUCE RISK
Proposed work plans shall be designed to achieve desired results through a series of contained,
low-risk sub-projects (as opposed to a monolithic, all-or-nothing methodology). Methodologies
should include continuous testing, measurement, assessment, and refinement. As the saying
goes, “teach us to fish” rather than build us some fancy boats and go away. This is especially
important if parts of the work plan include community-building or visitor-created content.
In addition, the RFP intentionally avoided the term "redesign" and instead focused attention on
making measurable improvements to end-user perceptions of findability, including the performance of
search engines, information architecture, labeling, and overall usability—but only if those facets could
be tied back to findability. Use of the word “redesign” was discouraged in project meetings, documents,
and discussion.
Lessons Learned
The project has not yet concluded, and its outcomes are not yet clear.
33. Conclusion
[Figure: the road to "Efficient Development" town. Redrawn from Rapid Development, Steve McConnell, Microsoft Press, 1996.]
34. References
Carroll, Sean B. 2005. Endless Forms Most Beautiful: The New Science of Evo Devo. New York : Norton,
2005.
CMMI Project Team. 2007. CMMI for Acquisition, Version 1.2: Improving Processes for Acquiring Better
Products and Services. Pittsburgh : Software Engineering Institute, Carnegie Mellon University, 2007.
—. 2006. CMMI for Development, Version 1.2. Pittsburgh : Software Engineering Institute, Carnegie
Mellon University, 2006.
Edson, Michael. 2006. Data Access Strategy. Museums and the Web: Conference Proceedings. [Online] 3
1, 2006. [Cited: 4 22, 2008.] http://www.archimuse.com/mw2006/papers/edson/edson.html.
General Accounting Office. 1992. IMTEC-93-13 Mission-Critical Systems: Defense Attempting to Address
Major Software Challenges. [Online] 1992. [Cited: 4 22, 2008.]
http://archive.gao.gov/d36t11/148399.pdf.
Hotle, Matthew. 2007. 'Just Enough Process' for Applications. ID Number: G00145561. s.l. : Gartner, Inc.,
2007.
—. 2007. The Little Big Application Organization: How a Big Organization Can Still Remain Small. ID
Number: G00146962. s.l. : Gartner, Inc., 2007.
—. 2008. The 'Seven Deadly Sins' of Application Governance. ID Number: G00155896. s.l. : Gartner, Inc.,
2008.
Jones, Peter. 2008. We Tried To Warn You, Part 2: Failure is a matter of timing. Boxes and Arrows.
[Online] 3 26, 2008. [Cited: 4 22, 2008.] http://www.boxesandarrows.com/view/we-tried-to-warn-
you32.
Keen, Andrew. 2007. The Cult of the Amateur: How today's internet is killing our culture. New York :
Doubleday, 2007.
Kopcho, Joanne and Hotle, Matthew. 2008. CMMI Remains the Standard for Software Process
Frameworks. ID Number: G00156315. Gartner.com. [Online] 4 18, 2008. [Cited: 4 22, 2008.]
http://gartner.com.
Kotter, John P. 1995. Leading Change: Why Transformation Efforts Fail. Harvard Business Review,
Mar/Apr 1995, Vol. 73, Issue 2, pp. 59-67.
McConnell, Steve. 1996. Rapid Development: Taming Wild Software Schedules. Redmond : Microsoft
Press, 1996.
O'Reilly, Tim. 2005. Web 2.0: Design Patterns and Business Models for the Next Generation of Software.
O'Reilly.com. [Online] 9 30, 2005. [Cited: 4 24, 2008.]
http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html.
2007. Universal Music CEO Doug Morris Speaks, Recording Industry in Even Deeper Shit Than We
Thought. Apropos of Nothing. [Online] 11 26, 2007. [Cited: 4 22, 2008.]
http://nymag.com/daily/entertainment/2007/11/universal_music_ceo_doug_morris.html.
2007. Universal's CEO Once Called iPod Users Thieves. Now He's Giving Songs Away. Wired.com. [Online]
11 27, 2007. [Cited: 4 22, 2008.] http://www.wired.com/entertainment/music/magazine/15-
12/mf_morris.
Vivendi Board Biography of Doug Morris. [Online] [Cited: 4 22, 2008.]
http://www.vivendi.com/corp/en/governance/dir_morris.php.