The document discusses how service level agreements are evolving from conventional models focused on individual services to outcome-based agreements measured by overall business outcomes. It introduces CAST software as a tool for objectively measuring key performance indicators like reliability, maintainability, and security risk at the application level to establish benchmarks and monitor performance over time in support of outcome-based pricing constructs. The document argues that standard software quality measurement creates visibility and leads to cost reduction and improved business agility.
CAST Confidential
Software is Eating the Business Process
1. Map the processes
2. Identify the dependencies
3. Identify failure points
4. Define SMART SLAs / KPIs / OLMs
5. Link SLAs / KPIs / OLMs to end-to-end (E2E) SLAs
[Diagram: a business outcome (billing timeliness) decomposed through its processes and the supporting stack: Application, Database & Middleware, Operating System, Compute, Storage, Network]
OUTCOMES
► Success and accuracy

PROCESSES & APPLICATIONS
► Straight Through Processing – number of re-keying steps, queue pauses, processes
► Availability (incl. Change Mgmt.)

LOGICAL INFRASTRUCTURE
► Query and procedure call level response
► Service session response

PHYSICAL INFRASTRUCTURE
► Availability (incl. Change Mgmt.)
► Utilization & Capacity
► Throughput (Input/Output)
Bad software cripples good processes, yet companies measure everything but the software.
PRODUCT MEASURES
► Quality – incidents, defect rates, system availability
► Risk – security risk, performance risks, robustness (risk of outages), maintainability

PROCESS MEASURES
► Time & Duration – project tracking, scheduling, time tracking
► Effort & Budget – budget reporting, earned value
► Function & Scope – functional testing, user acceptance tests, usability
Industry standard measures of good software
Transferability
Description: Determines how easily a new team or team member can be productive when assigned to work on the application.
Prevents: SME dependency and ramp-up delays; delivery inefficiency and reduced output; time-to-market delays.

Changeability
Description: Determines how easily and quickly an application can be modified.
Prevents: Correction and evolution delays; late delivery of new features; inability to resume services.

Robustness
Description: Determines the risk of failures or defects that could occur in production.
Prevents: Operational downtime; application outages; inability to test source code updates.

Performance Efficiency
Description: Determines the risk of performance issues of an application.
Prevents: Application degradation; response time degradation; denial of service and logic issues.

Security
Description: Determines the risk of security breaches for an application.
Prevents: Damage to business & operations; security failures.

Maintainability (SEI)
Description: Determines the cost and difficulty/ease of maintaining an application.
Prevents: Drift in maintenance costs.
Constructing Outcome-Based Measures

1. Identify outcomes that are critical to the business / business process
2. Map to KPIs / risk indices
3. Measure to establish a baseline / benchmark against industry
4. Monitor over time – before software is released into production

Example outcomes:
► End-user productivity
► Customer satisfaction
► Brand equity
► Uptime / availability
► Cycle time
► Time-to-market
► Business agility

Risk indices:
► SECURITY – risk of security breaches
► RELIABILITY – risk of failures in production
► PERFORMANCE – risk of performance issues
► MAINTAINABILITY – ease of modifying and learning
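The four steps above amount to a pre-release quality gate. A minimal sketch follows; the index names, the 1–4 scores, and the tolerance are invented for illustration and are not CAST's actual scale or API.

```python
# Hypothetical baseline established in step 3 (scores are invented examples).
BASELINE = {"security": 2.8, "reliability": 3.0, "performance": 2.9, "maintainability": 3.1}

def release_gate(current, baseline, tolerance=0.1):
    """Step 4: return the risk indices that regressed beyond the tolerance
    versus the baseline, so a release can be held before production."""
    return {k: v for k, v in current.items() if v < baseline[k] - tolerance}

# Latest pre-release measurement (invented): security has regressed.
latest = {"security": 2.5, "reliability": 3.2, "performance": 2.95, "maintainability": 3.08}
print(release_gate(latest, BASELINE))
```

An empty result would mean the release may proceed; any entry names an index that drifted below its baseline and should trigger investigation before go-live.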
Product Measures Lead to Cost Reduction and Improved Agility
A study of maintenance effort across 20 customers shows a tight correlation between maintenance fix tickets and the CAST Technical Quality Index (TQI): an increase in TQI of 0.24 corresponds to a roughly 50% decrease in maintenance activity.
[Chart: maintenance ticket volume (x-axis, 0–200) falling as Technical Quality Index (TQI, y-axis, 2.2–3.6) rises]
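The stated relationship can be illustrated with a short calculation. The data points below are invented for the sketch, not the 20-customer dataset from the study; the halving rule simply restates "0.24 TQI gain = 50% less maintenance activity" as a formula.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def expected_reduction(delta_tqi):
    """Fraction of maintenance activity eliminated if tickets halve
    for every 0.24-point TQI gain, as the study above suggests."""
    return 1 - 0.5 ** (delta_tqi / 0.24)

# Invented sample: higher Technical Quality Index, fewer maintenance tickets.
tqi     = [2.2, 2.4, 2.6, 2.8, 3.0, 3.2, 3.4, 3.6]
tickets = [190, 150, 110,  80,  55,  35,  20,  10]

r = pearson_r(tqi, tickets)
print(f"correlation between TQI and ticket volume: {r:.2f}")
print(f"expected reduction for +0.48 TQI: {expected_reduction(0.48):.0%}")
```

The strongly negative correlation mirrors the chart: quality improvements at the product level show up directly as reduced maintenance load.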
The current volume of chatter around outcome-based pricing reflects a phenomenon common to sourcing and enterprise IT: the conflation of terms, sometimes deliberate.
Output-based pricing is not the same as outcome-based pricing: Much of what is being called outcome-based pricing is actually output-based pricing per transaction, such as pricing per test script executed or per resolved ticket. Granted, the 'output' is often required to be a successful output, but it would be a stretch to accurately call such arrangements outcome-based pricing.
Outcome-based pricing never happens in application development-only projects: Application development is too far removed from business outcomes for the outcome-based model to work. ISG does not find much evidence of emerging pricing models in application development & maintenance (ADM). The age-old fixed-fee and time-and-materials models still account for the vast majority of application development contracts.
Outcome-based pricing requires end-to-end control: Outcome-based pricing frequently involves both ADM and business process outsourcing (BPO). Business processes link service-provider-managed tasks to business outcomes more directly than ADM does. Typically, the service provider manages both a multitude of business processes and the underlying systems, since establishing attribution requires control of significant swathes of the processes and systems influencing the outcome. In a few cases, ISG has seen the vendor providing business consulting as well, which goes further down the path of influencing outcomes.
The model is still rare: Instances of true outcome-based pricing are rare, accounting for fewer than 10% of contracts involving ADM. This does not imply the number is stagnant or that it will not register a much higher proportion in the long term. Rather, the absolute number of such ADM contracts is low, even among top-tier service providers who manage large end-to-end projects comprising both ADM and BPO.
Factors Supporting BPaaS Service Levels

Strong Service Integration Function
A mature, functioning Service Integration layer is established. Operating processes are clearly defined and fully supported with tools and automation.

Obligations Shared Amongst Providers
Where applicable, each provider (internal & external) that supports part of the solution achieving the BPaaS SLAs is contractually obligated to them.

Clear Measurements
The BPaaS SLAs have clearly defined, repeatable, and non-judgmental measurements established.

Achievable
The BPaaS SLA performance levels are achievable with the supporting systems and processes.

Financially Reasonable
The financial credits for missing the BPaaS SLAs are reasonable when compared to the potential business impact and investments.
Factors Inhibiting BPaaS Service Levels

Ineffective Service Integration Function
A Service Integration layer is only partially implemented, or is otherwise ineffective. Operating processes are not clearly defined, implemented, and executed.

Disproportionate Obligations Among Providers
Providers (internal & external) are not equally and clearly contractually obligated to achieve the BPaaS SLAs.

Unclear or Subjective Measurements
The BPaaS SLA measurements are not clearly defined, agreed to, and repeatable.

Not Achievable
The BPaaS systems and operations cannot reasonably be expected to achieve the defined performance expectations.

Unreasonable Financial Credits
The financial credits are too high in comparison to the potential business impact and the investments made in end-to-end supporting systems and processes.
SLAs should be business-outcome focused.
SLAs should be objective, measurable, forward-looking, and automated.
SLAs should be based on industry standards.
Metrics should be actionable:
► Stimulate conversations between buyer and seller
► Trigger additional investigations
► Trigger changes in process or product
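An objective, automated SLA check along the lines described above can be sketched as follows; the metric names, targets, and credit percentages are invented placeholders, not industry standards.

```python
# Hypothetical SLA definitions: each has an objective target and a service
# credit applied when the target is missed (all values are invented examples).
SLAS = [
    {"metric": "billing_timeliness_pct", "target": 99.5, "credit_pct": 2.0},
    {"metric": "application_availability_pct", "target": 99.9, "credit_pct": 5.0},
]

def evaluate(measured):
    """Compare measured values against targets and return actionable results:
    a miss names the follow-up action instead of just reporting a number."""
    results = []
    for sla in SLAS:
        value = measured[sla["metric"]]
        met = value >= sla["target"]
        results.append({
            "metric": sla["metric"],
            "met": met,
            "action": None if met else f"investigate; apply {sla['credit_pct']}% service credit",
        })
    return results

for row in evaluate({"billing_timeliness_pct": 99.7, "application_availability_pct": 99.4}):
    print(row)
```

Because the check is mechanical and repeatable, its output is exactly the kind of non-judgmental measurement the BPaaS factors above call for, and a miss carries its trigger for investigation with it.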