BENCHMARKING MINI-SERIES PART #2
Conducting Quick, Cost-Effective
Benchmarking at Scale
Presented by: Dana Bishop, Sr. Director of UX Research
Quick Housekeeping
• Use the control panel on the side of your screen to submit any
comments or questions during the presentation
• Time at the end for Q&A
• Today’s webinar will be recorded for future viewing
• All attendees will receive a copy of the
slides/recording
• Continue the discussion using #uzwebinar
• Stay tuned until the end for the raffle winner announcement!
Let’s make sure you’re all set up for the webinar!
What We’re Covering in Part #2
Planning & conducting UX benchmarking
Collecting and tracking data with scorecards
UserZoom’s qxScore & Scorecard program
10 expert benchmarking tips
Dana Bishop
Sr. Director, UX Research
Dana has 25 years of industry experience,
with a focus on UX Benchmarking and
Syndicated Research.
Email dbishop@userzoom.com
Can UX be measured?
● UX benchmarking is critical: it tests your online properties to establish
baseline usability metrics for improvement.
● Those metrics are then re-tested and compared against the baseline over time
to understand how your site, app, or other digital product is progressing.
● The outcomes of benchmark studies also help identify problem areas for
improvement and build a vision for future releases.
Yes, absolutely!
MEASURE
Why Benchmark?
01 Understand How to Improve Your Products
02 Get a Holistic View of the Customer Journey
03 Quantify How Changes Have Impacted Product Performance
04 Get Competitive Intelligence
05 Generate Insights Worth Investigating
06 Engage Stakeholders & Leadership
Who should be benchmarking?
• UX Researchers
• Product Owners
• Product Marketing
What can you benchmark?
• Desktop site
• Mobile Site
• Mobile App
• Live sites or Prototypes
• Internal Sites or Intranets
Basically, anywhere your users and customers access your site or product
online can and should be benchmarked.
When should I be Benchmarking?
“We don’t know what our
research roadmap is for
next year”
“Our boss is obsessed with [x]
competitor”
“We want execs to know &
care about what we do”
“We’re about to start
journey mapping / new
product innovation”
TRIGGERS
Benchmarking can provide useful context before engaging in discovery.

When mapping out research for the year, benchmarking helps identify
areas of weakness against competitors across a range of customer journeys.

If stakeholders mention some competitors, or the HiPPO has a company they
often bring up as best practice, a benchmark can answer questions on the
best approach.

Benchmarking can be a useful exercise to excite and engage stakeholders
outside of UX within an org.
Why conduct UX benchmarking online?
✓ Fast, convenient, and cost-effective
✓ Flexibility: professional-services research or DIY
✓ Study templates
✓ Quick results
✓ Recruit your target audience
✓ Benchmark all devices / digital properties
✓ Get the complete picture with both quantitative and qualitative data
✓ Ease of including competitors
1. Create a Project Plan - Outline your goals, objectives, and timeline
2. Study Design - participant screener, tasks, follow-up questions, and make sure they tie back
to your project plan
3. Study Build in UZ – now you are ready to “build” it
4. QA your study build
5. Launch & Monitor
6. Analysis & Reporting
Next time is even easier, because you’ve already done the heavy lifting!
Just repeat steps 5 & 6!
How do I start? What does the process look like?
You’re ready to benchmark. Now what?
STAY FOCUSED
Don’t set too many goals
or ask questions that are
not directly tied back to
helping answer your top
questions.
ROADMAP
Collect the data you need
to assist in making sound
decisions.
KNOWN PROBLEMS
Address problems you or
your customers have
already identified.
YOUR HYPOTHESIS
Design your tasks and questions to get the
answer you need to confirm or refute your
hypothesis.
INDUSTRY BEST PRACTICE
DEFINE
Be clear and specific when you
establish your benchmarking goals.
Step 1: Project Plan | Set Your Research Goals & Objectives
Step 2: Study Design | Design Once, Run Multiple Times
DESIGNING A BENCHMARKING STUDY
Tasks should be consistent across all
competitors. Review each site to
confirm all tasks can be completed
across competitors.
Use the same participant profile.
CONSISTENCY
Carefully consider the usability
and business metrics you want
to track.
KEY PERFORMANCE INDICATORS (KPIs)
The UserZoom Platform
makes it quick and easy to
design and build your
benchmark, then repeat/ re-
run as needed in the future.
TEMPLATES
Include the primary tasks users
do on your site or app.
TASKS
Step 2: Study Design | Methodology
Our methodology is unique in that it combines a task-based, large-sample
usability study (n=50) with a smaller qualitative think-out-loud usability
study (n=10), along with a reliable way to measure and track the user
experience using consistent UX metrics (KPIs) via our qxScore
(Quality of Experience Score).

Study flow: Screener → Initial questionnaire → Task 1 → Task 2 → Task 3 → Final questionnaire
Step 3: Study Build in UZ | Templates, screener library
Step 4: QA Your Build | Preview the Participant Perspective
Step 5: Launch & Monitor | Start reviewing your results
Step 6: Analysis & Reporting | Work smarter, not harder
UserZoom Results
Step 6: Analysis & Reporting | Work smarter, not harder
Insights: Quantitative
Step 6: Analysis & Reporting | Work smarter, not harder
Insights: Qualitative
The non-success rate across all sites for this task was 62%.
Of those who were not successful:
• 42% used incomplete address information in the locator tool
• 20% had chosen a location without a physical branch
Participant ID 130:
“…It says Union Trust. Is that the name of the bank? I’m confused.”
[Sample participant video]
qxScore Calculations
Understanding the qxScore
WHAT’S IT BASED ON?
The score is a number plotted on a 100-point scale that references the UZ Index.

HOW’S IT CALCULATED?
The score combines behavioral task-success data with attitudinal data.

WHAT DO WE CALL THIS “SINGLE SCORE”?
We call this single score the Quality of Experience Score, better known as
the qxScore.
Industry Best Practice | UX KPIs
BEHAVIORAL (WHAT THEY DO): Task Success | Task Time | Page Views, Clicks
ATTITUDINAL (WHAT THEY SAY): Ease of Use | Trust & Credibility | Appearance | Loyalty, NPS | Problems & Frustrations
What does it look like?
Behavior: What they do
Task Success %, per task. Overall task success across all tasks is weighted at 50% of the qxScore.
Attitude: What they say, feel
Usability (Ease of Use)
Trust
Appearance
Loyalty, NPS
Top 2 Box
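The weighting described above can be sketched as a small calculation. This is a minimal illustration only: the 50/50 split between behavioral and attitudinal data is stated on the slide, but the equal weighting of the four attitudinal metrics within their half is an assumption, not UserZoom's published formula.

```python
def qx_score(task_success, attitudes):
    """Sketch of a qxScore-style calculation.

    task_success: per-task success rates (0-100); their mean is
    weighted at 50% of the score, as the slide states.
    attitudes: top-2-box percentages for ease of use, trust,
    appearance, and loyalty/NPS. Averaging them equally is an
    assumption for illustration.
    """
    behavioral = sum(task_success) / len(task_success)
    attitudinal = sum(attitudes) / len(attitudes)
    return 0.5 * behavioral + 0.5 * attitudinal

# August 2018 data from the example scorecard:
baseline = qx_score(
    [50, 83.3, 66.7, 83.3, 66.7, 66.7, 66.7, 100, 100],  # task success
    [69, 92, 69, 47],  # usability, trust, appearance, loyalty
)
# ≈ 72.6, which rounds to the reported August 2018 baseline of 73
```

Running the same function over the January 2019 scorecard data would give the post-redesign score for a like-for-like comparison.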
qxScore (Quality of Experience Score) Example
UserZoom calculated the qxScore based on both behavioral and attitudinal metrics.
We then plot this qxScore on the UZ Index to determine the comparative score.

Experience Score (qxScore) Range: Very Poor 0–45 | Poor 46–60 | Average 61–75 | Good 76–90 | Great 91–100

August 2018: 73 (BASELINE)
January 2019: 90 (TESTED AGAIN AFTER REDESIGN)
Longitudinal Comparison: qxScorecard

Quality of Experience Score Range (UZ Index): Very Poor 0–45 | Poor 46–60 | Average 61–75 | Good 76–90 | Great 91–100

Behavior: what they did (task success)   Aug 2018   Jan 2019
Task 1                                     50%       100%
Task 2                                     83%       100%
Task 3                                     66%       100%
Task 4                                     83%       100%
Task 5                                     66%        80%
Task 6                                     66%        60%
Task 7                                     66%       100%
Task 8                                    100%       100%
Task 9                                    100%        NA

Attitudes: what they feel, say
Usability (ease of use)                    69%        92%
Trust                                      92%        92%
Appearance                                 69%        87%
Loyalty                                    47%        79%

qxScore                                    73         90
Scorecard: Task-Level Effectiveness & Efficiency (Behavioral KPIs)
Participants’ task success rate improved on 6 out of 8 tasks; users had a 100% success rate on 6 tasks.
Participants’ average time on task was reduced by 50% or more for most tasks; average clicks and page views also decreased for most tasks.

(Aug-18 / Jan-19)       Task 1      Task 2       Task 3       Task 4       Task 5      Task 6      Task 7       Task 8      Task 9
Task Success            50%/100%    83.3%/100%   66.7%/100%   83.3%/100%   66.7%/80%   66.7%/60%   66.7%/100%   100%/100%   100%/NA
Time on Task (avg)      1:50/0:45   2:17/1:03    3:23/1:40    1:10/0:23    2:04/1:13   4:01/0:46   1:42/0:56    0:57/1:06   0:44/NA
Page Views (avg)        3.2/1       3.2/1        5.8/1        2.8/2        5.5/1       5.2/3       3.6/2        3/3         1.4/NA
Clicks (avg)            6.5/4       9.7/3        19.3/11      3.3/1        8.8/4       14/5        8/3          5.8/5       3.0/NA
Scorecard: Task-Level (Attitudinal KPIs)
Ease of use improved on 5 tasks; 6 tasks now have a 100% ease-of-use score.
The number of users who reported one or more problems or frustrations declined on 6 tasks.

(Aug-18 / Jan-19)                Task 1   Task 2     Task 3     Task 4    Task 5     Task 6    Task 7    Task 8     Task 9    Task 10
Ease of completing task
(% in top 2 box)                 NA/NA    84%/100%   50%/100%   67%/100%  83%/100%   67%/40%   50%/40%   100%/100%  83%/100%  100%/NA
Experienced one or more
problems or frustrations (%)     NA/NA    17%/0%     67%/14%    50%/28%   67%/20%    33%/40%   67%/40%   0%/43%     67%/28%   33%/NA
Competitive Example: qxScore (Quality of Experience Score)
UserZoom calculated the qxScore for the three sites based on both behavioral and attitudinal metrics.
We then plot each qxScore on the UZ Index to determine the comparative score.

Experience Score (qxScore) Range: Very Poor 0–45 | Poor 46–60 | Average 61–75 | Good 76–90 | Great 91–100

BRAND 1: 76 | BRAND 2: 73 | BRAND 3: 68
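The UZ Index bands above can be applied mechanically. Here is a small hypothetical helper, assuming whole-number scores, that maps a qxScore to its band label using the ranges shown on the slide:

```python
def uz_index_band(score):
    """Map a whole-number qxScore (0-100) to its UZ Index band,
    using the ranges from the slide."""
    bands = [
        (45, "Very Poor"),   # 0-45
        (60, "Poor"),        # 46-60
        (75, "Average"),     # 61-75
        (90, "Good"),        # 76-90
        (100, "Great"),      # 91-100
    ]
    for upper, label in bands:
        if score <= upper:
            return label
    raise ValueError("score must be between 0 and 100")

# The three brands from the competitive example:
for brand, score in [("Brand 1", 76), ("Brand 2", 73), ("Brand 3", 68)]:
    print(brand, score, uz_index_band(score))
```

This makes the comparison concrete: Brand 1 lands in the "Good" band while Brands 2 and 3 both fall in "Average", despite the five-point gap between them.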
Scorecard Calculator Available on Request
UX Scorecard Program
Business Leaders believe:
‘What cannot be measured,
cannot be managed…’
UX Scorecard Dashboard: Experience Score = qxScore
BU1 studies (average of 4 studies: 69)
Study 2: UX Scorecard
Task 2: Find X in My Account (Issues with Screenshots)
10 Benchmarking Tips
Get started measuring today!
Tip #1
Stay focused!
Be clear and specific when you define
your benchmarking goals.
Tip #2
Once is not enough!
Keep measuring to track how the
experience changes over time.
Tip #3
Keep in step with your users
Stay abreast of changes in feature
prioritization, usage, needs, and
expectations.
Tip #4
Your research is only as good as your
participants.
Tip #5
Consistency is imperative to yield
reliable longitudinal and competitive data.
Tip #6
Don’t miss out on the many benefits of
expanding your testing universe.
Tip #7
Carefully consider which
competitors you use
Tip #8
Plan accordingly for
larger sample sizes
Tip #9
Work smarter, not harder!
Tip #10
Who is the lucky winner?
Competitive UX Benchmark
Study: Online Banking Industry
UserZoom’s latest Competitive UX Benchmark Report
offers key recommendations to teams who are focused
on optimizing and delivering great customer
experiences within the Banking industry.
Download: info.userzoom.com/competitive-ux-banking-benchmark-report.html
Q&A
marketing@userzoom.com US Office: +1 866-599-1550
Would you like to schedule a live
benchmarking demo or meeting with Dana?
Let us show you how we’ve helped hundreds of companies unlock
user insights with UserZoom’s research platform
SCHEDULE A CALL: USERZOOM.COM/CONTACT-US
 

Benchmarking Mini-series Part #2: Conducting Quick, Cost-Effective UX Benchmarking at Scale feat. UserZoom's Scorecard Program

  • 1. BENCHMARKING MINI-SERIES PART #2 Conducting Quick, Cost-Effective Benchmarking at Scale Presented by: Dana Bishop, Sr. Director of UX Research
  • 2. Quick Housekeeping • Control panel on the side of your screen if you have any comments during the presentation • Time at the end for Q&A • Today’s webinar will be recorded for future viewing • All attendees will receive a copy of the slides/recording • Continue the discussion using #uzwebinar • Stay tuned until the end for the raffle winner announcement! Let’s make sure you’re all set up for the webinar!
  • 3. What We’re Covering in Part #2 Planning & conducting UX benchmarking Collecting and tracking data with scorecards UserZoom’s qxScore & Scorecard program 10 expert benchmarking tips Dana Bishop Sr. Director, UX Research Dana has 25 years of industry experience, with a focus on UX Benchmarking and Syndicated Research. Email dbishop@userzoom.com
  • 4. Can UX be measured? ● UX benchmarking means testing your online properties to establish baseline usability metrics. ● These metrics are then re-tested and compared against the baseline over time to understand how your site, app, or other digital product is progressing. ● The outcomes of benchmark studies also help identify problem areas for improvement and build a vision for future releases. Yes, absolutely! MEASURE
  • 5. Why Benchmark? 01 Understand How to Improve Your Products 02 Get a Holistic View of the Customer Journey 03 Quantify How Changes Have Impacted Product Performance 04 Get Competitive Intelligence 05 Generate Insights Worth Investigating 06 Engage Stakeholders & Leadership
  • 6. Who should be benchmarking? • UX Researchers • Product Owners • Product Marketing
  • 7. What can you benchmark? • Desktop site • Mobile Site • Mobile App • Live sites or Prototypes • Internal Sites or Intranets Basically, anywhere your users and customers access your site or product online can and should be benchmarked.
  • 8. When should I be Benchmarking? TRIGGERS “We don’t know what our research roadmap is for next year”: when mapping out research for the year, benchmarking helps identify areas of weakness against competitors across a range of customer journeys. “Our boss is obsessed with [x] competitor”: if stakeholders mention competitors, or the HiPPO has a company they often bring up as best practice, a benchmark can answer questions on the best approach. “We want execs to know & care about what we do”: benchmarking can be a useful exercise to excite and engage stakeholders outside of UX within an org. “We’re about to start journey mapping / new product innovation”: benchmarking can provide useful context before engaging in discovery.
  • 9. Why conduct UX benchmarking online? ✓ Fast, convenient, and cost-effective ✓ Flexibility: professional services research or DIY ✓ Study templates ✓ Quick results ✓ Recruit your target audience ✓ Benchmark all devices / digital properties ✓ Get the complete picture with both types of data ✓ Ease of including competitors
  • 10. 1. Create a Project Plan - Outline your goals, objectives, and timeline 2. Study Design - participant screener, tasks, follow-up questions, and make sure they tie back to your project plan 3. Study Build in UZ – now you are ready to “build” it 4. QA your study build 5. Launch & Monitor 6. Analysis & Reporting Next time is even easier, because you’ve already done the heavy lifting! Just repeat steps 5 & 6! How do I start? What does the process look like? You’re ready to benchmark. Now what?
  • 11. Step 1: Project Plan | Set Your Research Goals & Objectives DEFINE: Be clear and specific when you establish your benchmarking goals. STAY FOCUSED: Don’t set too many goals or ask questions that are not directly tied back to helping answer your top questions. ROADMAP: Collect the data you need to assist in making sound decisions. KNOWN PROBLEMS: Address problems you or your customers have already identified. YOUR HYPOTHESIS: Design your tasks and questions to get the answers you need to confirm or refute your hypothesis. INDUSTRY BEST PRACTICE.
  • 12. Step 2: Study Design | Design Once, Run Multiple Times DESIGNING A BENCHMARKING STUDY CONSISTENCY: Tasks should be consistent across all competitors. Review each site to confirm all tasks can be completed across competitors. Use the same participant profile. KEY PERFORMANCE INDICATORS (KPIs): Carefully consider the usability and business metrics you want to track. TASKS: Include the primary tasks users do on your site or app. TEMPLATES: The UserZoom platform makes it quick and easy to design and build your benchmark, then repeat/re-run as needed in the future.
  • 13. Step 2: Study Design | Methodology Our methodology is unique in that it combines a task-based, large-sample usability study (n=50) with a smaller qualitative think-out-loud usability study (n=10), along with a reliable way to measure and track the user experience through consistent UX metrics (KPIs) via our qxScore (Quality of Experience Score). Study flow: Screener → Initial questionnaire → Task 1 → Task 2 → Task 3 → Final questionnaire
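The 50-participant quantitative sample mentioned above is a common benchmarking choice, and it helps to know what precision it buys. As an illustration (not part of the UserZoom platform), the Adjusted Wald (Agresti-Coull) interval is a standard way to put a confidence range around an observed task success rate at this sample size:

```python
import math

def adjusted_wald_ci(successes, n, z=1.96):
    """Adjusted Wald (Agresti-Coull) 95% confidence interval
    for a task success rate observed in a benchmark wave."""
    n_adj = n + z ** 2
    p_adj = (successes + z ** 2 / 2) / n_adj
    margin = z * math.sqrt(p_adj * (1 - p_adj) / n_adj)
    return max(0.0, p_adj - margin), min(1.0, p_adj + margin)

# 40 of 50 participants succeeded: an 80% observed success rate
low, high = adjusted_wald_ci(40, 50)  # roughly (0.67, 0.89)
```

At n=50 an 80% observed success rate carries a margin of roughly ±11 points, which is one reason consistency across waves (same tasks, same participant profile, same sample size) matters so much for longitudinal comparisons.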
  • 14. Step 3: Study Build in UZ | Templates, screener library
  • 15. Step 4: QA Your Build | Preview the Participant Perspective
  • 16. Step 5: Launch & Monitor | Start reviewing your results
  • 17. Step 6: Analysis & Reporting | Work smarter, not harder UserZoom Results
  • 18. Step 6: Analysis & Reporting | Work smarter, not harder Insights: Quantitative
  • 19. Step 6: Analysis & Reporting | Work smarter, not harder Insights: Qualitative The non-success rate across all sites for this task was 62%. Of those who were not successful: • 42% used incomplete address information in the locator tool • 20% had chosen a location without a physical branch Participant ID 130: “…It says Union Trust. Is that the name of the bank? I’m confused.” Sample Participant Video
  • 21. Understanding the qxScore WHAT’S IT BASED ON? The score is a number plotted on a 100-point scale that references the UZ Index. HOW’S IT CALCULATED? The score combines behavioral task success data with attitudinal data. WHAT DO WE CALL THIS “SINGLE SCORE”? We call it the quality of experience score, better known as the qxScore.
  • 22. Industry Best Practice | UX KPIs BEHAVIORAL (WHAT THEY DO) ATTITUDINAL (WHAT THEY SAY) Ease of Use Trust & Credibility Appearance Loyalty, NPS Task Success Task Time Page Views, Clicks Problems & Frustrations
  • 23. What does it look like?
  • 24. Behavior: What they do Task Success %, per task. Overall task success across all tasks is weighted at 50% of the qxScore
  • 25. Attitude: What they say, feel Usability (Ease of Use) Trust Appearance Loyalty, NPS Top 2 Box
  • 26. qxScore (Quality of Experience Score) Example UserZoom calculated the qxScore based on both behavioral and attitudinal metrics. We then plot this qxScore on the UZ Index to determine the comparative score. Experience Score (qxScore) Range: Very Poor 0–45, Poor 46–60, Average 61–75, Good 76–90, Great 91–100. August 2018 (BASELINE): 73. January 2019 (TESTED AGAIN AFTER REDESIGN): 90.
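The slides describe the qxScore only as task success weighted at 50%, combined with attitudinal data, so the exact production formula is not public. Still, a simple 50/50 average of mean task success and mean top-2-box attitudinal scores reproduces both the 73 baseline and the 90 post-redesign score from the longitudinal scorecard, which makes it a useful mental model. A sketch under that assumption:

```python
def qx_score(task_success, attitudes):
    """Hypothetical qxScore: 50% mean task success + 50% mean
    top-2-box attitudinal score, scaled to a 100-point index.
    (Assumed formula; UserZoom's production calculation may differ.)"""
    behavioral = sum(task_success) / len(task_success)
    attitudinal = sum(attitudes) / len(attitudes)
    return round(100 * (0.5 * behavioral + 0.5 * attitudinal))

def uz_band(score):
    """Map a qxScore onto the UZ Index bands from the slide."""
    for ceiling, label in [(45, "Very Poor"), (60, "Poor"),
                           (75, "Average"), (90, "Good")]:
        if score <= ceiling:
            return label
    return "Great"

# August 2018 baseline figures from the longitudinal scorecard
aug = qx_score([.50, .833, .667, .833, .667, .667, .667, 1.0, 1.0],
               [.69, .92, .69, .47])
# January 2019 post-redesign figures (Task 9 not re-run)
jan = qx_score([1.0, 1.0, 1.0, 1.0, .80, .60, 1.0, 1.0],
               [.92, .92, .87, .79])
print(aug, uz_band(aug))  # 73 Average
print(jan, uz_band(jan))  # 90 Good
```

Note how the single number moves a product across UZ Index bands (Average to Good here), which is what makes it an effective executive-facing KPI.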
  • 27. qxScore: Longitudinal Comparison | qx Scorecard
Quality of Experience Score Range (UZ Index): Very Poor 0–45, Poor 46–60, Average 61–75, Good 76–90, Great 91–100

                                      Aug 2018   Jan 2019
  Behavior: what they did (task success)
    Task 1                               50%       100%
    Task 2                               83%       100%
    Task 3                               66%       100%
    Task 4                               83%       100%
    Task 5                               66%        80%
    Task 6                               66%        60%
    Task 7                               66%       100%
    Task 8                              100%       100%
    Task 9                              100%        NA
  Attitudes: what they feel, say
    Usability (ease of use)              69%        92%
    Trust                                92%        92%
    Appearance                           69%        87%
    Loyalty                              47%        79%
  qxScore                                 73         90
  • 28. Scorecard: Task-Level Effectiveness & Efficiency (Behavioral KPIs) Participants’ task success rate improved on 6 out of 8 tasks. Users had a 100% success rate on 6 tasks. Participants’ average time on task was reduced by 50% or more for most tasks; average # of clicks and page views also decreased for most tasks.

            Task Success       Time on Task    Page Views     Clicks
            (Aug-18 → Jan-19)  (Avg.)          (Avg. pages)   (Avg. clicks)
    Task 1    50% → 100%         1:50 → 0:45     3.2 → 1        6.5 → 4
    Task 2    83.3% → 100%       2:17 → 1:03     3.2 → 1        9.7 → 3
    Task 3    66.7% → 100%       3:23 → 1:40     5.8 → 1       19.3 → 11
    Task 4    83.3% → 100%       1:10 → 0:23     2.8 → 2        3.3 → 1
    Task 5    66.7% → 80%        2:04 → 1:13     5.5 → 1        8.8 → 4
    Task 6    66.7% → 60%        4:01 → 0:46     5.2 → 3       14 → 5
    Task 7    66.7% → 100%       1:42 → 0:56     3.6 → 2        8 → 3
    Task 8    100% → 100%        0:57 → 1:06     3 → 3          5.8 → 5
    Task 9    100% → NA          0:44 → NA       1.4 → NA       3.0 → NA
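Scorecard tables like this are easy to maintain in code between waves. The sketch below (a hypothetical data structure, using the task success figures from this slide) counts which tasks improved between the August baseline and the January re-test:

```python
def tasks_improved(baseline, follow_up):
    """Return the tasks whose success rate rose between two benchmark
    waves. Tasks missing from either wave (e.g. dropped tasks) are
    skipped, so comparisons stay apples-to-apples."""
    improved = []
    for task, before in baseline.items():
        after = follow_up.get(task)
        if after is not None and after > before:
            improved.append(task)
    return improved

aug = {"Task 1": 50.0, "Task 2": 83.3, "Task 3": 66.7, "Task 4": 83.3,
       "Task 5": 66.7, "Task 6": 66.7, "Task 7": 66.7, "Task 8": 100.0,
       "Task 9": 100.0}
jan = {"Task 1": 100.0, "Task 2": 100.0, "Task 3": 100.0, "Task 4": 100.0,
       "Task 5": 80.0, "Task 6": 60.0, "Task 7": 100.0, "Task 8": 100.0}
# Task 9 was not re-run in January
print(len(tasks_improved(aug, jan)))  # 6 of the 8 comparable tasks improved
```

This recovers the slide's headline claim: six of the eight comparable tasks improved (Task 6 regressed, Task 8 was already at 100%).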
  • 29. Scorecard: Task-Level (Attitudinal KPIs) Ease of use improved on 5 tasks; 6 tasks now have a 100% ease-of-use score. The number of users who reported one or more problems or frustrations declined on 6 tasks.

             Ease of completing task   Experienced one or more
             (% in top 2 box)          problems or frustrations (%)
             Aug-18 → Jan-19           Aug-18 → Jan-19
    Task 1     NA → NA                   NA → NA
    Task 2     84% → 100%                17% → 0%
    Task 3     50% → 100%                67% → 14%
    Task 4     67% → 100%                50% → 28%
    Task 5     83% → 100%                67% → 20%
    Task 6     67% → 40%                 33% → 40%
    Task 7     50% → 40%                 67% → 40%
    Task 8     100% → 100%               0% → 43%
    Task 9     83% → 100%                67% → 28%
    Task 10    100% → NA                 33% → NA
  • 30. Competitive Example: qxScore (Quality of Experience Score) UserZoom calculated the qxScore for the 3 sites based on both behavioral and attitudinal metrics. We then plot the qxScore on the UZ Index to determine the comparative score. Experience Score (qxScore) Range: Very Poor 0–45, Poor 46–60, Average 61–75, Good 76–90, Great 91–100. BRAND 1: 76. BRAND 2: 73. BRAND 3: 68.
  • 34. Business Leaders believe: ‘What cannot be measured, cannot be managed…’
  • 35. UX Scorecard Dashboard: Experience Score = qxScore
  • 36. BU1 Studies (Avg of 4 studies was at 69)
  • 37. Study 2: UX Scorecard
  • 38. Task 2: Find X in My Account (Issues with Screenshots)
  • 40. Get started measuring today! Tip #1
  • 41. Stay focused! Be clear and specific when you define your benchmarking goals. Tip #2
  • 42. Once is not enough! Keep measuring to track how the experience changes over time. Tip #3
  • 43. Keep in step with your users Stay abreast of changes in feature prioritization, usage, needs, and expectations. Tip #4
  • 44. Your research is only as good as your participants. Tip #5
  • 45. Consistency is imperative to yield reliable longitudinal and competitive data. Tip #6
  • 46. Don’t miss out on the many benefits of expanding your testing universe. Tip #7
  • 48. Plan accordingly for larger sample sizes Tip #9
  • 49. Work smarter, not harder! Tip #10
  • 51. Competitive UX Benchmark Study: Online Banking Industry UserZoom’s latest Competitive UX Benchmark Report offers key recommendations to teams who are focused on optimizing and delivering great customer experiences within the Banking industry. Download: info.userzoom.com/competitive-ux-banking-benchmark-report.html
  • 52. Q&A
  • 53. marketing@userzoom.com US Office: +1 866-599-1550 Would you like to schedule a live benchmarking demo or meeting with Dana? Let us show you how we’ve helped hundreds of companies unlock user insights with UserZoom’s research platform SCHEDULE A CALL: USERZOOM.COM/CONTACT-US