
User Testing

In the webinar these slides accompany, we explore different approaches to integrating user testing into the development of legal content for diverse audiences. Examples include user testing in the following contexts: the development of a website and mobile app in the immigration sphere, the rollout of a pro bono mobilization website, content development for a statewide website, and enhancements to the user experience when navigating online forms for courts.



  1. If you joined the training via telephone, please select Telephone and enter your audio pin if you haven't already. If you joined with a microphone and headset or speakers (VoIP), please select Mic & Speakers. We will start promptly at the hour.
  2. A few logistics before we start: maximize/minimize the control panel with the orange arrow. VoIP users, select Mic & Speakers. Telephone users, select Telephone and then enter the audio pin. Ask a question or tell us something in the Questions box. Raise your hand by clicking on the Hand at the bottom of the toolbar if you want to talk. (We will stop after presenters.)
  3. LSNTAP is recording this training and will post it to their SlideShare account for the LSNTAP and SWEB websites. Registered attendees will receive an email with a link to this information once it has been posted.
  4. Let the User Be Your Guide (October 19, 2016)
  5. Presenters: Mike Grunenwald (moderator), Pro Bono Net; Tony Lu, Immigration Advocates Network; Candice Farha and Melissa Nolte, Kansas Legal Services; Dina Nikitaides, Illinois Legal Aid Online; Claudia Johnson, LawHelp Interactive
  6. What is user testing?
  7. Why do user testing?
  8. Let's look at some methods…
  9. Challenging Your Assumptions About Your Users (Tony Lu, Product Manager, Immigration Advocates Network)
  10. User Personas: A Case Study. How do you create a platform that: ● is targeted at low-income, low computer-literacy users? ● helps users learn about complex immigration benefits and requirements? ● is accessible and welcoming to users and lay-advocates?
  11. Personas in Software Design. Persona: a representation of a particular audience segment for a website/product/service you are designing, based on various types of qualitative and quantitative research. It captures a person's motivations, frustrations, and the "essence" of who they are.
  12. User Journeys. User journey: a series of steps (typically 4-12) which represent a scenario in which a user might interact with the thing you are designing.
  13. User Journeys Applied to Legal Services (from The Open Law Lab)
  14. Personas as a Way to Organize Legal Knowledge. Relief types and their key criteria: DACA: under 16 y.o. when arrived; SIJS: abandoned by parents; Asylum: well-founded fear of persecution; VAWA: survivor of domestic violence by a US citizen spouse. Example persona: ● young woman brought to the U.S. as a child ● parents have died ● escaped an abusive marriage ● member of an ethnic minority in her home country
  15. Focus Groups. Incentives by target group: Lawyers/advocates: a stake/say in functionality and requirements. Lay-advocates: potential partnership to expand their capacity. Target users: gift cards.
  16. Focus Group Takeaways ● Information needs to be curated and organized (education via Google search is not good enough) ● A lot of people don't know what they don't know (they don't know where to start) ● People often learn by relating to others ("I learned I was able to get Special Immigrant Juvenile Status because my cousin got it.")
  17. Personas: From Design Tool to Feature
  18. Never Stop Challenging Your Assumptions. Build prototypes early and often. ● Customer feedback is invaluable testing data. ● Your customer support staff are internal advocates for your users. Listen to them. Empower them to help set priorities.
  19. Usability Testing Methods & Process (Melissa Nolte & Candice Farha of Kansas Legal Services)
  20. Some methods we use in Kansas: ✖ Surveys ✖ Focus groups, in person and web-based ✖ Intern projects ✖ Continual feedback option on site pages ✖ Google Analytics ✖ Usability testing model
  21. BIG CONCEPT: Testing should be an organic, continual process (not just pre- and post-testing).
  22. Testing Methods: a cornucopia of options
  23. Surveys
  24. Surveys ✖ SurveyMonkey for web-based surveys; URLs on the website and in emails ✖ One-page paper surveys for in-person surveys, e.g., at meetings ✖ For both kinds: 5 questions at most, 5-point Likert scale, with space for comments at the end
  25. Surveys ✖ Measure attitudes, knowledge, and satisfaction in separate surveys and at different times ✖ Measure at regular intervals, after each enhancement, throughout the project and beyond
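The 5-question, 5-point Likert approach described above lends itself to simple scripted summaries. A minimal sketch in Python (the `summarize_likert` helper and the sample ratings are hypothetical illustrations, not part of the KLS toolkit):

```python
# Summarize 5-point Likert-scale survey responses
# (1 = strongly disagree ... 5 = strongly agree).

def summarize_likert(responses):
    """Return the mean score and a count per point on the 1-5 scale."""
    counts = {point: 0 for point in range(1, 6)}
    for r in responses:
        if r not in counts:
            raise ValueError(f"response {r} outside 1-5 scale")
        counts[r] += 1
    mean = sum(responses) / len(responses)
    return mean, counts

# Hypothetical satisfaction ratings collected after a site enhancement.
ratings = [4, 5, 3, 4, 5, 2, 4, 4, 5, 3]
mean, counts = summarize_likert(ratings)
print(f"mean satisfaction: {mean:.1f}/5")   # mean satisfaction: 3.9/5
print(f"distribution: {counts}")
```

Running the same summary after each enhancement, as the slide suggests, makes quarter-over-quarter comparisons straightforward.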
  26. Focus Groups
  27. Focus Groups ✖ Focus groups, or discussion groups, may be in person or online ✖ You usually get more data in person ✖ Try for 6 to 10 people ✖ Start with a broad question, then get more narrow (wide-angle to close-up)
  28. Focus Groups ✖ Explain the reasons for the discussion and express appreciation ✖ Limit time to 20-30 minutes ✖ Let people talk and discuss questions with each other ✖ Tell them you'll share the results later
  29. Intern Testing Tasks
  30. Intern Testing Tasks. To evaluate search on the website: ✖ Give interns a list of search items, e.g., domestic violence, power of attorney, etc. ✖ Ask them to take screenshots of what is at the top of the search results. ✖ Ask them to try the searches on other legal aid websites and discuss how the results compare with ours.
  31. Continual Website Feedback
  32. Feedback option: this is what the feedback option looks like at the bottom of an average KLS website page
  33. Continual Website Feedback ✖ Monitor feedback and respond to requests/concerns; note changes over time when pages or user tools are added or changed ✖ Take note when feedback points to systemic problems and plan for remedies, e.g., add a "guide" to help users' searches
  34. Google Analytics
  35. Google Analytics: a great tool to monitor usage before, during, and after website enhancements; to monitor changing issues in users' legal needs via pageview measures and unique visitors; and to monitor users' evolving use of browsers and technology
  36. Usability Testing Model
  37. Usability Testing Model ✖ What is it? How does it work? ✖ What is the value? What do you learn? Steps: ✖ Reminders/intro, e.g., "think out loud" ✖ Pre-testing questions ✖ Testing tasks and scenarios ✖ Post-testing questions
  38. Usability Testing Model. Example task: ✖ Open the KLS homepage. ✖ Find the search box. ✖ Conduct a search on the topic of wills. ✖ Examine the available results. ✖ Narrow your search to living wills. ✖ Examine what the new results are. ✖ Attempt to print out a PDF of one of the results. ✖ Return to the KLS homepage.
  39. Usability Testing Model. Example scenario: ✖ You are a Kansas tenant living in an apartment building in Topeka. ✖ Your landlord won't fix a problem you're having with your water pipes. ✖ You've sent him many requests, but he keeps saying it's not his problem and that you have to fix it. ✖ You want to know whose responsibility it is to fix the water.
  40. Thanks for listening!
  41. User Experience Testing & Design (Dina Nikitaides, User Experience Manager)
  42. Where we started ● 5 websites for 5 "audiences" ● Too many pieces of content ● Good ideas that were never used ● No easy browsing or way to see the breadth of information ● Terrible mobile usability
  43. Where we wanted to go ● Easy to use ● Less content, still covering the same areas and depth ● Findable via search and browse ● A balance between enough and too much information
  44. Road to becoming user centered ● How people really work vs. how they say they work ● Ask and observe ● Reach all types of users
  45. Road to becoming user centered ● Continually improve
  46. Testing methods: observation, click tests, card sorts, tree tests, comprehension evaluation, surveys, focus groups, un-moderated observation
  47. Card sort ● What: users sort a list of topics (cards) into a set of categories ● How: in person on paper and online via software ● Why: provides an understanding of how users organize information ● When: likely best at the beginning of a project, but always good to reassess
  48. Card sort
  49. Card sort
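Card sort results are commonly analyzed by counting how often participants placed two cards in the same pile. A minimal sketch of that co-occurrence count (the `cooccurrence` helper and the sample sorts are invented for illustration; card-sorting software typically produces this kind of matrix for you):

```python
from itertools import combinations
from collections import Counter

def cooccurrence(sorts):
    """Count, across participants, how often each pair of cards
    was placed in the same category."""
    pairs = Counter()
    for sort in sorts:                  # one dict per participant
        for cards in sort.values():     # cards grouped under one category
            for a, b in combinations(sorted(cards), 2):
                pairs[(a, b)] += 1
    return pairs

# Hypothetical results from three participants sorting legal topics;
# participants invent their own category names in an open sort.
sorts = [
    {"Family": ["divorce", "custody"], "Housing": ["eviction"]},
    {"Family law": ["divorce", "custody", "eviction"]},
    {"Kids": ["custody"], "Home": ["eviction"], "Marriage": ["divorce"]},
]
pairs = cooccurrence(sorts)
print(pairs[("custody", "divorce")])    # 2 of 3 participants paired them
```

High co-occurrence counts suggest topics that belong together in the site's navigation, even when participants label the groups differently.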
  50. Click test ● What: users are shown a prototype and indicate where they would look for info and features ● How: online via software and in person on paper ● Why: provides an understanding of users' prior experiences and expectations ● When: anytime
  51. Click test
  52. Click test: users were asked where they would click to find a lawyer.
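One common way to score a click test like the "find a lawyer" task is the share of first clicks that land on the intended target. A minimal sketch (the coordinates, target box, and `hit_rate` helper are all hypothetical):

```python
def hit_rate(clicks, target):
    """Fraction of first clicks landing inside the target rectangle.
    target = (x_min, y_min, x_max, y_max) in page coordinates."""
    x0, y0, x1, y1 = target
    hits = sum(1 for (x, y) in clicks if x0 <= x <= x1 and y0 <= y <= y1)
    return hits / len(clicks)

# Hypothetical first-click data for the "find a lawyer" task;
# the target is the nav link's bounding box (coordinates invented).
clicks = [(120, 40), (118, 45), (300, 400), (125, 38), (500, 60)]
find_lawyer_link = (100, 30, 180, 60)
print(f"{hit_rate(clicks, find_lawyer_link):.0%} found the link")  # 60%
```

A low hit rate, or a tight cluster of clicks somewhere else on the page, points to where users expect the feature to live.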
  53. User Experience Testing & Design (Dina Nikitaides, User Experience Manager)
  54. Making online forms go! (Claudia Johnson, Program Manager, LawHelp Interactive)
  55. It is more than just the forms: initial conditions matter. When creating forms, we know that plain language, form design, clear instructions, process maps, and complete instructions after printing all matter!
  56. What are form initial conditions? Most people who use an LHI form come from an approved webpage: 70% of our users are new users who come from referring webpages; 30% are frequent users who come directly to our landing page.
  57. Prior thoughts on online form starting pages
  58. Ongoing focus: • The 2011 staging page survey came up with a list of six aspects to include in a staging page • Remember the "form finder" approaches? • Other ideas included "mini guides" (grouping all resources, forms and others, around a high-volume problem such as eviction or divorce), generic forms videos, and visual FAQs
  59. In 2015: what makes a landing page effective, with a focus on SRLs and first-time users • How does page design impact what the SWEB visitor chooses on a page? • New tools developed from 2011 to 2014: the widget and tabbed approaches
  60. How do we measure the impact of an intentional layering of these tools and new approaches?
  61. At the same time, our partners were in this scene: you're in a staff meeting, lamenting the low traffic to the awesome new resource you've just built. Susie says, "Hey! What if we changed how we showcase that project on our homepage?"
  62. Ok, but… • What changes do we try? • How will we know which changes will work the best? • Will changing how this looks negatively affect other parts of our website? • Can we get clear data about what we try? • How will we know if better outcomes are because of these changes or from something else? • What's the cost?
  63. A/B to the Rescue! A/B testing is a way of conducting controlled, randomized experiments with the goal of improving a website metric (e.g., clicks). (Source retrieved 1/4/16)
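The definition above implies a statistical comparison between the two variants. A minimal sketch of a two-proportion z-test on click-through rates, one standard way to judge an A/B result (the traffic numbers are invented; tools like Optimizely run this kind of calculation for you):

```python
import math

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """z-statistic for the difference between two click-through rates,
    using a pooled standard error. |z| > 1.96 ~ significant at p < .05."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical traffic: variant B (a redesigned homepage feature)
# vs. the original A, each shown to 1,000 randomized visitors.
z = two_proportion_z(clicks_a=40, n_a=1000, clicks_b=70, n_b=1000)
print(f"z = {z:.2f}")   # well above 1.96, so B is likely the better design
```

The randomization is what answers the "is it these changes or something else?" question from the previous slide: both variants see the same mix of visitors at the same time.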
  64. Synergy! Combining: 1. user behavior on SWEB, based on design options; 2. Google Analytics data for LHI; 3. anonymized user data from interview users; 4. LHI data, to come up with a great design for LHI form staging pages
  65. A B
  66. OR
  67. WOAH!
  68. TIG 14043: What else are we looking at? • The amount of content we include on staging pages • The format for the content on our staging pages • Featuring multiple short pages vs. one long page • Posting resources along with the instruction pages • Posting info about technical requirements • Including a link to just the static form • Using the LHI widget • Other TBD
  69. OPTIMIZELY
  70.–75. [Screenshots of the Optimizely interface; source retrieved 1/4/16]
  76. What's the cost? Starter account so far! • Pay-as-you-go ($) and enterprise-level ($) plans let you do more, including multi-page/funnel tests • Staff time: ~2 hrs of interactive training sessions before starting; the first few experiments took ~2 hrs each; now we can set something up in ~30 min to 1 hr
  77. IT'S A BETTER WAY TO B!
  78. RESULTS
  79. Minnesota forms use is up • Over 22,000 documents created since January • One of the fastest-growing states in 2016! • Q1: 6,635, up from 3,093 in 2015 • Q3: 7,909, up from 7,820 in 2015 • And higher rates of assembly for some of the forms. But for some forms there is a slowdown in use, which might be due to better instructions before people come to the form.
  80. Health Directive, Q1 vs. Q3: from a 59.08% rate of assembly to a 61.39% rate of assembly. It is now provided through an LHI widget (as of 9/9/2016), in a tabbed approach, with strong instructions before users get to the LHI link. Of note: it is a short form that takes only a few minutes, so not a lot of accounts are created in LHI; it is a "create and go" form.
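The growth figures in the Minnesota slides reduce to simple percent-change arithmetic, sketched here using the Q1 document counts and Health Directive assembly rates reported above:

```python
def pct_change(old, new):
    """Percent change from old to new."""
    return (new - old) / old * 100

# Figures from the Minnesota slides: Q1 documents created in 2015
# vs. 2016, and the Health Directive assembly rates (Q1 vs. Q3).
q1_growth = pct_change(3093, 6635)
rate_change = 61.39 - 59.08
print(f"Q1 documents up {q1_growth:.0f}%")           # up ~115%
print(f"assembly rate up {rate_change:.2f} points")  # up 2.31 points
```

Tracking both the raw volume and the assembly rate matters: volume shows adoption, while the assembly rate shows whether people who start a form actually finish it.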
  81. Next steps • Will track results through LHI metrics • Will share the report widely, so that other legal nonprofits can replicate some of these innovations and track them, to help us increase our understanding of how all of these factors impact the adoption of online forms.
  82. THANKS! With special thanks to our Minnesota partners, and to Mary Kaczorek and Jenny Singleton for sharing their TIG A/B slides and for their leadership and vision in doing this project. For more info contact: Claudia Johnson, Pro Bono Net
  83. THANK YOU FOR ATTENDING TODAY! More information on additional webinars can be found at
  84. Contact Information: Brian Rowe ( or via chat on Don't forget to take our feedback survey!