Recently, there has been a surge in the number of tools available for conducting unmoderated (“automated” or “asynchronous”) remote usability testing. This surge is changing the user experience industry, and it forces us to take a closer look at the benefits and drawbacks of unmoderated testing and whether we should incorporate it into our usability toolbox.
In this session we will cover: what you can learn from unmoderated testing, how actionable the data is, how and when it should be conducted, its benefits and drawbacks, and an overview of some unmoderated testing tools that are currently available. If time allows, attendees will also discuss their own experiences with unmoderated testing. The content of this session is aimed at anyone who conducts usability testing or is interested in it, and the material is suitable for both novice and advanced UX practitioners.
The content in this session is aimed at being 100% practical and applicable. Attendees should be able to immediately apply what they learn in this session to evaluate whether or not unmoderated usability testing is a good fit for them. This session will also give attendees the information they need to start conducting their own unmoderated tests. The material that will be covered in this session is an extension of an article that was written by the presenter for UXmatters.com: http://www.uxmatters.com/mt/archives/2010/01/unmoderated-remote-usability-testing-good-or-evil.php
43. Nothing beats watching participants in real time and being able to ask probing questions about what they are doing as it’s happening.
Image source: facit digital
44. Some participants may only be interested in earning the honorarium you’ve provided as an incentive.
46. What participants report on surveys can be very different from what they actually do.
It is simple to use:
Strongly Disagree ---1---2---3---4---5---6---7 Strongly Agree
Image source: http://www.hiero.com/web-analytics.html
47. It’s possible for participants to think they’ve successfully completed a task when they haven’t.
48. Does it matter if it took someone longer to complete a task?
68. Resources
• RemoteUsability.com
• Book: Remote Research by Nate Bolt & Tony Tulathimutte
• Article: Unmoderated, Remote Usability Testing: Good or Evil? by Kyle Soucy
http://www.uxmatters.com/mt/archives/2010/01/unmoderated-remote-usability-testing-good-or-evil.php