The document discusses different methods for communicating the business value of user experience research to stakeholders. It provides examples of quantifying value in terms of time saved, opportunity cost, errors avoided, false starts prevented, and safety improvements. Each method is described in terms of when it works best, challenges to consider, and examples. The overall goal is to make the case for engaging in early and frequent UX research.
17. Time Saved: Using the hourly rate of end users to quantify the value of faster processes and workflows
($/hr) x hrs saved x annual process count x total users
Works best with
• Internal users
• Low time saved / High user volume
• Stakeholders love numbers
• Need fast, reliable quantification
Challenges
• Assumes compensation only for productive time
• Need rate knowledge
• We've found this to be the lowest-value method
Moving data from one application to another required manual unit conversion.
$200/hr x 0.25 hrs x 30 x 400 = $600,000/yr.
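The Time Saved formula can be sanity-checked with a short script. This is a minimal sketch using the slide's illustrative figures; the function name and parameter names are my own:

```python
def annual_time_saved_value(hourly_rate, hours_saved_per_run,
                            runs_per_year, total_users):
    """Annual value of time saved:
    ($/hr) x hrs saved x annual process count x total users."""
    return hourly_rate * hours_saved_per_run * runs_per_year * total_users

# Slide example: $200/hr, 15 minutes saved per run, 30 runs/year, 400 users
value = annual_time_saved_value(200, 0.25, 30, 400)
print(f"${value:,.0f}/yr")  # prints $600,000/yr
```

Keeping the formula in one place makes it easy to rerun with different rate or volume assumptions when stakeholders push back on an input.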
18. Opportunity Cost: Calculating the value of what users would be doing with the time saved
What would your day look like if you didn’t have to do this?
What would you do with 2 extra hours? 2 extra days?
What do you wish you could be doing?
Works best with
• In-depth interviews, work shadows
• Users with specialized skill sets
• Making the case that improving efficiency is worth it
• Users who feel like there's "never enough time"
• Larger amounts of time saved
Challenges
• Quantification requires direct questioning
• Time saved may be non-productive
Routine processes to confirm wells were running as expected left no time for optimizing output.
$2M/day
19. Errors: Using the cost of past errors to predict the value of identifying and solving the cause
Did anything ever go wrong using this?
Did problems ever occur with your work-arounds?
Works best with
• Users relying on work-arounds
• Older software
• Data movement across products
• Manual processes in longer workflows
• High-risk/high-value activities
Challenges
• There may be no past costly errors
• Need to ask directly
• Users may be embarrassed or withholding
• Users may report others' experiences
A work-around for a cumbersome process allowed a typo to go unnoticed.
Resulted in a well shut-in for 14 days.
$20M
20. False Starts: Saving the cost of developing a product that wouldn't work
Design $ + development $ + implementation $
Works best with
• Research is engaged early
• Research drives design/development decisions
• Stakeholders value research
• Stakeholders already think they have the solution
• Evaluations to guide vendor product purchase decisions
Challenges
• Can be difficult to convince stakeholders findings are valid
• Need to estimate the cost of the potential product
Stakeholders believed a mobile application would improve complex calculation workflows. Users reported needing two screens and multiple hours.
$2M saved
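The False Starts formula is just a sum, but a small helper makes the scenario easy to rerun as cost estimates change. The breakdown below is hypothetical, chosen only to total the slide's $2M figure:

```python
def false_start_cost_avoided(design, development, implementation):
    """Cost avoided by halting a non-viable product before build-out:
    design $ + development $ + implementation $."""
    return design + development + implementation

# Hypothetical breakdown summing to the slide's $2M figure
saved = false_start_cost_avoided(300_000, 1_400_000, 300_000)
print(f"${saved:,.0f} saved")  # prints $2,000,000 saved
```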
21. Safety: Decreasing injury risk for users
Is there anything about this process that makes you nervous?
Have you ever felt discomfort or unsafe while doing this?
Works best with
• High-risk work processes
• Reducing user time in a dangerous activity or location
• Organizations with a safety culture
• Contextual inquiry
• Repetitive processes with ergonomic risks
Challenges
• Difficult to quantify
• Users may be hesitant to reveal unsafe practices or work-arounds
Data quality issues stemmed from analog dials and monthly pen-and-paper data collection. Higher-tech solutions risk being stolen by pirates.
Decreased boat time in piracy-prone waters.