Does this sound familiar? Researchers sitting around a meeting table, arguing about which method to use (especially unmoderated vs. moderated remote testing), usually without any empirical data?
In this webinar we'll give you the power of data to say "ELMO!" (Enough, let's move on!) and end the argument once and for all.
We collected this data by conducting 10 moderated and 10 unmoderated remote sessions across six tasks on Patagonia.com, to show how the two approaches compare in the number and severity of usability issues they surface.
Register for this upcoming webinar to discover the theoretical and actual strengths and weaknesses of various user research methods, so you can stop the argument before it even begins.
Moderated vs Unmoderated Research: It’s time to say ELMO (Enough, let’s move on!)
1. It’s time to say ELMO:
Unmoderated vs. Moderated Showdown
Asha Fereydouni, Research Partner
23 January 2020
2. Quick Housekeeping
• Use the control panel on the side of your screen
to submit comments during the presentation
• Time at the end for Q&A
• Today’s webinar will be recorded for future
viewing
• All attendees will receive a copy of the
slides/recording
• Continue the discussion using #UZwebinar
Let’s make sure you’re all set up for the webinar!
3. Today's Presentation:
I. Review of Unmoderated vs. Moderated Literature
II. Research Scope and Project Overview
III. Discussion of Severity Matrices and Sample Size
IV. Most/Least Problematic Tasks, Key Insights, & Takeaways
V. Saying ELMO (and Next Steps)
VI. Q&A
15. “When considering doing unmoderated user research, it’s
important to keep in mind that unmoderated user research is never
as good as moderated user research.
You should always avoid attempting to replace necessary
moderated user research with unmoderated user research.”
- Actual quote from a leading publication -
21. Primary Research Question
Are the (1) number and (2) severity of usability issues discovered
through 10 remote moderated sessions and 10 remote unmoderated
sessions different?
Note: Not looking at generative insights, but usability issues.
22. Tasks:
1. First impressions
2. Find a specific item w/o Search
3. Find a specific item w/ Search
and add to cart
4. Find key site info – policies
around organic cotton
5. Find a store
6. Describe site to a friend
Study 1: Remote Moderated - 10 sessions; 5 completed Saturday, 5 Sunday.
Study 2: Remote Unmoderated - 10 sessions; set-up Friday, launched Saturday.
23. Severity Matrices - Past Work
These exist to help researchers more consistently rate and present (1) the number and
(2) the severity of usability issues.
Molich & Jeffries, 1-3 point scale:
1. Minor: delays user briefly;
2. Serious: delays user significantly but eventually allows them to complete the task;
3. Catastrophic: prevents user from completing their task.
Jakob Nielsen, 0-4 point scale:
0. I don’t agree that this is a usability problem at all;
1. Cosmetic problem only: need not be fixed unless extra time is available on project;
2. Minor usability problem: fixing this should be given low priority;
3. Major usability problem: important to fix, so should be given high priority;
4. Usability catastrophe: imperative to fix this before product can be released.
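Scales like these map naturally onto a small enum, which makes an issue log sortable and comparable across sessions. Below is a minimal Python sketch of Nielsen's 0-4 scale; the logged issues are hypothetical examples loosely based on the Patagonia.com tasks in this study, not the study's actual data.

```python
from enum import IntEnum

class NielsenSeverity(IntEnum):
    """Jakob Nielsen's 0-4 usability severity scale."""
    NOT_A_PROBLEM = 0  # not a usability problem at all
    COSMETIC = 1       # fix only if extra time is available
    MINOR = 2          # fixing should be given low priority
    MAJOR = 3          # important to fix; high priority
    CATASTROPHE = 4    # imperative to fix before release

# Hypothetical issue log: (description, severity) pairs
issues = [
    ("Unclear first click when hunting for policy info", NielsenSeverity.MAJOR),
    ("Dense wall of text on the organic-cotton page", NielsenSeverity.MINOR),
    ("Ambiguous colors on store-locator pins", NielsenSeverity.COSMETIC),
]

# Sort worst-first so the team triages majors and catastrophes first
for desc, sev in sorted(issues, key=lambda i: i[1], reverse=True):
    print(f"[{sev.value}] {sev.name}: {desc}")
```

Using `IntEnum` (rather than a plain `Enum`) lets severities be compared and sorted numerically while keeping readable names in reports.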
25. Most & Least Problematic Tasks
Most problematic task: Task 4 (Find key site info – policies around organic cotton):
• Number of issues: Most participants had the same difficulty with:
1. The first click
2. Navigating successive pages on the site with no clear site-level-guidance
3. Understanding the mass of text about Patagonia’s policies around organic cotton
• Severity of issues: Most participants had level 2 serious problems with this task – problems
that “delay[ed] user[s] significantly but eventually allow[ed] them to complete the task.”
Least problematic task: Task 5 (Find a store):
• Number of issues: The only notable issue was understanding the meaning of the different-colored location pins.
• Severity of issues: A few participants, for each method, had level 1 minor problems with this
task – that delayed the “user briefly.”
28. Key Insights
Key Insights:
• Number of issues: Moderated discovered 20% more usability issues,
largely due to my ability to probe and ask follow-up questions.
• Severity of issues: Of those additional issues uncovered most were
minor usability issues.
• Method-to-method comparison: On both the number and the severity of
issues uncovered, the moderated method came out ahead.
29. Takeaways for UX practitioners
If you need to identify the maximum number and severity of issues – a
remote moderated test is best.
Practitioner’s opinion: While moderated may be the “winner,” practical
limits, including time (to moderate, to review videos) and cost (to pay
participants, to pay a recruiting firm, etc.), mean that an
unmoderated test may be more practical. Researchers should inform
themselves about the theoretical and actual strengths and weaknesses
of various methods.
37. Research Overview
Study 1: Remote Moderated - 10 sessions; 5 on Saturday, 5 on Sunday in September.
Study 2: Remote Unmoderated - 10 sessions; test set up Friday, launched Saturday in September.
Stimulus: Patagonia.com public e-commerce website.
Tasks:
1. First impressions of site
2. Find a specific item w/o Search
3. Find a specific item w/ Search and add to cart
4. Find key site info – policies around organic cotton
5. Find a store
6. Describe site to a friend