1. Best Practices for Defining, Evaluating & Communicating Key Performance Indicators (KPIs) of User Experience
Meng Yang
User Experience Researcher
IBM Software Group
@IBMSocialBizUX
© Copyright IBM Corporation 2012
2. Agenda
Why measure user experience?
Which user experience KPIs or metrics to use?
Future work
3. If you cannot measure it, you cannot improve it.
-Lord Kelvin
4. Design: Intuition-driven or data-driven?
Reference: Metrics-Driven Design by Joshua Porter:
http://www.slideshare.net/andrew_null/metrics-driven-design-by-joshua-porter
5. 5 Reasons why metrics are a designer’s best friend
Metrics reduce arguments based on opinion.
Metrics give you answers about what really works.
Metrics show you where you’re strong as a designer.
Metrics allow you to test anything you want.
Clients love metrics.
Reference: Metrics-Driven Design by Joshua Porter
http://www.slideshare.net/andrew_null/metrics-driven-design-by-joshua-porter
7. Lean start-up/Lean UX movement
Reference: Lean Startup by Eric Ries http://theleanstartup.com/principles
8. What makes a good metric?
Eric Ries [1]:
- Actionable
- Accessible
- Auditable
- Powerful
- Low-cost
- Easy-to-use
Forrest Breyfogle [2]:
- Business alignment
- Honest assessment
- Consistency
- Repeatability and reproducibility
- Actionability
- Time-series tracking
- Predictability
- Peer comparability
References:
[1] Eric Ries: Lean Startup: http://theleanstartup.com/
[2] Forrest Breyfogle's book "Integrated Enterprise Excellence Volume II: Business Deployment"
9. User experience metrics used
Task-level:
- Task success rate
- Task easiness rating (SEQ)
- Task error rate
- Task time
- First click analysis
- Heat map
- Number of clicks
- Predictive human modeling (CogTool): task time and clicks for the optimal path
Product-level:
- System Usability Scale (SUS) score
- Net Promoter Score (NPS)
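Of the product-level metrics above, NPS has a simple published formula: the percentage of promoters (ratings of 9-10 on the 0-10 likelihood-to-recommend question) minus the percentage of detractors (0-6). A minimal sketch in Python (illustrative, not part of the original deck):

```python
def nps(ratings):
    """Net Promoter Score from 0-10 likelihood-to-recommend ratings.

    % promoters (9-10) minus % detractors (0-6), on a -100..100 scale;
    passives (7-8) count toward the total but neither group.
    """
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Example: 7 promoters and 3 detractors out of 10 respondents -> NPS 40.
print(nps([10] * 7 + [5] * 3))  # 40.0
```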
10. System Usability Scale (SUS) Positive Version
Why chosen?
- Free, short, valid, and reliable. [1]
- A single SUS score can be calculated and a grade can be assigned. [1]
- Over 500 user studies to compare against. [1]
- The most sensitive post-study questionnaire. [2]
References:
[1] Jeff Sauro's blog entry: Measuring Usability with the System Usability Scale (SUS): http://www.measuringusability.com/sus.php
[2] Jeff Sauro & James Lewis: Quantifying the User Experience: Practical Statistics for User Research. http://www.amazon.com/Quantifying-User-Experience-Practical-Statistics/dp/0123849683/ref=sr_1_1?s=books&ie=UTF8&qid=1327605730&sr=1-1
11. SUS score calculation
Calculation [1]:
- For odd-numbered items: subtract 1 from the user's response.
- For even-numbered items: subtract the user's response from 5.
- This scales all values from 0 to 4 (with 4 being the most positive response).
- Add up the converted responses for each user and multiply the total by 2.5. This converts the range of possible values from 0-40 to 0-100.
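The scoring steps above can be sketched in Python. This follows the standard SUS conversion the slide describes (illustrative code, not part of the original deck):

```python
def sus_score(responses):
    """Compute a SUS score from ten 1-5 Likert responses.

    Odd-numbered items (1st, 3rd, ...) contribute (response - 1);
    even-numbered items contribute (5 - response). The ten converted
    values (each 0-4) sum to 0-40, then x2.5 rescales to 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    converted = [
        (r - 1) if i % 2 == 0 else (5 - r)  # i=0 is item 1, an odd item
        for i, r in enumerate(responses)
    ]
    return sum(converted) * 2.5

# All-neutral responses (3s) land exactly in the middle of the scale.
print(sus_score([3] * 10))  # 50.0
```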
Calculation package:
Jeff Sauro provides a SUS guide & calculator package ($299.99 for a site license): http://www.measuringusability.com/products/SUSpack
Example SUS score: 66.7, level C
Reference:
[1] Jeff Sauro's blog entry: Measuring Usability with the System Usability Scale (SUS): http://www.measuringusability.com/sus.php
13. Task success rate
- Large-scale usability testing: self-reported success in UserZoom
- Small-scale moderated usability testing: through observation
[Figure: example task-based unmoderated usability testing through UserZoom]
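With the small samples of moderated testing, a task success rate is more informative when reported with a confidence interval. The deck does not prescribe a method; a common small-sample choice is the adjusted-Wald (Agresti-Coull) interval, sketched here for illustration:

```python
import math

def success_rate_ci(successes, trials, z=1.96):
    """Adjusted-Wald (Agresti-Coull) confidence interval for a task
    success rate, clamped to [0, 1]. z=1.96 gives a ~95% interval."""
    n_adj = trials + z * z
    p_adj = (successes + z * z / 2) / n_adj
    half = z * math.sqrt(p_adj * (1 - p_adj) / n_adj)
    return max(0.0, p_adj - half), min(1.0, p_adj + half)

# 9 of 10 participants succeeded: the observed 90% comes with a wide
# interval, a reminder not to over-read small-sample percentages.
low, high = success_rate_ci(9, 10)
print(round(low, 3), round(high, 3))
```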
14. SEQ (Single Ease Question)
Why chosen?
- Reliable, sensitive & valid. [1]
- Short, easy to respond to, easy to administer & easy to score. [1]
- The second most sensitive post-task question, next to SMEQ, but much simpler. [2]
References:
[1] Jeff Sauro's blog entry: If you could only ask one question, use this one: http://www.measuringusability.com/blog/single-question.php
[2] Jeff Sauro & James Lewis: Quantifying the User Experience: Practical Statistics for User Research. http://www.amazon.com/Quantifying-User-Experience-Practical-Statistics/dp/0123849683/ref=sr_1_1?s=books&ie=UTF8&qid=1327605730&sr=1-1
15. Example task performance table
Fake data for illustration purposes

Task performance data summary (% = percentage of all participants for each task):

Task    Successful with the task  Considered task easy  Highlights of the problem       Recommendation
Task 1  90%                       95%                   Easy task                       None
Task 2  61%                       42%                   Hard to find the action button  xx
Task 3  55%                       30%                   Too many clicks                 xx
Task 4  85%                       90%                   Relatively easy task            xx

Benchmark comparison between two studies on the same tasks (% = percentage of all participants for each task):

        Successful with the task       Considered task easy
Task    Study 1  Study 2  Change       Study 1  Study 2  Change
Task 1  89%      84%      -5.1%        60%      63%      3.0%
Task 2  89%      70%      -19.0%       65%      60%      -5.2%
Task 3  62%      55%      -6.8%        75%      87%      12.0%
Task 4  71%      90%      19.0%        56%      80%      24.0%
16. Use cases and tasks dashboard
Fake data for illustration purposes
17. Clickstream data in large-scale user testing
Good to have:
- Yet another way to visually illustrate the problems shown by other metrics such as task success rate and easiness ratings.
But hard to implement & analyze:
- Approach 1: asking participants to install a plugin, which reduces the participation rate.
- Approach 2: inserting a line of JavaScript code on every page of the website, which is hard to achieve.
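As a rough illustration of Approach 2, the "line of JavaScript on every page" would typically load a small click logger like the sketch below. The endpoint /clickstream and the record fields are hypothetical, not from the deck:

```javascript
// Minimal clickstream logger sketch (hypothetical /clickstream endpoint).
// buildRecord turns a clicked element into a loggable record.
function buildRecord(pathname, target, now) {
    return {
        page: pathname,           // which page the click happened on
        tag: target.tagName,      // e.g. "A", "BUTTON"
        id: target.id || null,    // element id, if any
        time: now                 // timestamp in ms
    };
}

// In a browser, one document-level listener is enough;
// navigator.sendBeacon posts without blocking navigation.
if (typeof document !== 'undefined') {
    document.addEventListener('click', function (event) {
        var record = buildRecord(location.pathname, event.target, Date.now());
        navigator.sendBeacon('/clickstream', JSON.stringify(record));
    });
}
```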
18. Task time and storyboard in CogTool
What is CogTool?
- Produces a valid cognitive model predicting how long it will take a skilled user to complete a task.
- Developed by Bonnie John at CMU (now at IBM Research).
Pros:
- Free and easy to install.
- Good way to get competitive data (task time).
- Task visualization shows the most time-consuming task steps.
Cons:
- Hard to learn at first.
- Doesn't address issues for novice users.
- Inter-rater validity and consistency.
Reference: CogTool website: http://cogtool.hcii.cs.cmu.edu/
19. Best practices
- Focus on core use cases and top tasks.
- Use standardized questions/metrics for peer comparability.
- You don't always need large-scale (unmoderated) usability studies to gather metrics.
- Visualization is the key to effective communication. (visual.ly is a good site to create infographics and visualizations.)
- KPIs/metrics catch people's attention, but qualitative information provides the insights.
20. Future work
- Align UX metrics with business goals.
- Communicate UX metrics to influence product strategy.
- Incorporate UX metrics in the agile development process.
- Collaborate with the analytics team to gather metrics such as engagement/adoption/retention.
- How do we measure usefulness (vs. ease of use)?
21. © Copyright IBM Corporation 2012
IBM Lotus Software
550 King St.
Littleton, MA 01460
U.S.A.
Produced in the United States of America
May 2012
All Rights Reserved
IBM, the IBM logo and ibm.com are trademarks or registered trademarks
of International Business Machines Corporation in the United States, other
countries, or both. If these and other IBM trademarked terms are marked
on their first occurrence in this information with a trademark symbol (® or
™), these symbols indicate U.S. registered or common law trademarks
owned by IBM at the time this information was published. Such trademarks
may also be registered or common law trademarks in other countries. A
current list of IBM trademarks is available on the Web at “Copyright and
trademark information” at ibm.com/legal/copytrade.shtml
Other company, product and service names may be trademarks or service
marks of others.
References in this publication to IBM products and services do not
imply that IBM intends to make them available in all countries in which
IBM operates.