UX Research
2013.5.13
InnoUX CEO 최병호
InnoUX@InnoUX.com, @ILOVEHCI
© 2013 InnoUX & Innodesign. All rights reserved. UX Research
Table of Contents
• Big Thinking
• Definition
• Case Study
• Methodology
 Overview
 Contextual Inquiry
 Diary Study
 Field Study
 Card Sorting
 Usability Test
 Remote Test
 Eye Tracking
 Heuristic Evaluation
 Cognitive Walkthrough
• Introduction to 12 Innovation Game Techniques
• References
1
Big Thinking
If we interpret it, what is the truth…
If we interpret it, what is the truth?
Distortion of facts born of bias, strange insights…
Can decision-making be manipulated with the interviewer's cup?... If so, then what?
Steering the experimental result: the product of ignorance, or the intrusion of political intent
4 Common Biases
in Customer Research
• Confirmation Bias
• Framing Effect
• Observer-expectancy Effect
• Recency Bias
Confirmation Bias
Your tendency to search for or interpret
information in a way that confirms your
preconceptions or hypotheses.
Framing Effect
When you and your team draw different
conclusions from the same data based on your
own preconceptions.
Observer-expectancy
When you expect a given result from your
research which makes you unconsciously
manipulate your experiments to give you that
result
Recency Bias
This results from disproportionate salience
attributed to recent observations (your very last
interview) – or the tendency to weigh more
recent information over earlier observations
If we approach UX/UI design and research with a scientist's attitude: denying authority, experimenting, making leaps, accepting challenges to authority, keeping an open mind?!
http://j.mp/10moy77
Making a living / practical work / general knowledge / craftsman / expert / researcher / …
Grueling training / 'Indian training' / attention to detail / …
Manager / outsourcing / decision-making / …
Basic training
Market Research
User Research
UX Research
Customer Research
Definition
http://www.nngroup.com/articles/return-on-investment-for-usability/
Case Studies
Methodology:
Overview
You need to gather:
• Factual information
• Behavior
• Pain points
• Goals
You can document this on the persona validation board
As well as…
Photos, video, audio, journals…document everything
http://www.nngroup.com/articles/how-many-test-users/
http://www.nngroup.com/articles/how-to-conduct-a-heuristic-evaluation/
http://www.nngroup.com/articles/guerrilla-hci/
http://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/
http://www.nngroup.com/articles/quantitative-studies-how-many-users/
http://www.nngroup.com/articles/outliers-and-luck-in-user-performance/
http://www.nngroup.com/articles/card-sorting-how-many-users-to-test/
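These articles build on the problem-discovery model popularized by Nielsen and Landauer: the share of usability problems found by n users is roughly 1 - (1 - L)^n, where L, the share a single user reveals, averaged about 31% in Nielsen's data. A minimal Python sketch, purely illustrative and not part of the original slides:

# Problem-discovery model behind the "5 users" guideline (illustrative only).
L = 0.31  # average share of problems one user reveals; varies widely by study

def problems_found(n, l=L):
    """Expected proportion of usability problems uncovered by n test users."""
    return 1 - (1 - l) ** n

for n in (1, 3, 5, 10, 15):
    print(f"{n:2d} users -> {problems_found(n):.0%} of problems")
# With L = 0.31, five users already uncover roughly 85% of the problems,
# which is the argument for small, iterative tests.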
Types of Research Methods
Plotted along two axes: Quantitative vs. Qualitative, and Generative vs. Evaluative:
1 Contextual Observation
2 Remote Ethnography
4 Focus Groups
5 Ergonomic Observation
6 Lab-Based Testing
7 Large-Sample Online Behavior Testing
8 Professional Heuristics
9 Online Card Sorting
10 Eye Tracking
11 Online UX Concept Surveys
12 fMRI Brain Imaging
http://www.nngroup.com/articles/which-ux-research-methods/
Methodology: Contextual Observation/Ethnography
► Business problem
► How are people actually using products versus how they were designed?
► Description
► In-depth, in-person observation of tasks & activities at work or home. Observations
are recorded.
► Benefits
► Access to the full dimensions of the user experience (e.g. information flow,
physical environment, social interactions, interruptions, etc)
► Limitations
► Time-consuming research; travel involved, Smaller sample size does not provide
statistical significance, Data analysis can be time consuming
► Data
► Patterns of observed behavior and verbatims based on participant response,
transcripts and video recordings
► Tools
► LiveScribe (for combining audio recording with note-taking)
Cost / respondent: Low – Moderate – High
Statistical validity: None – Some – Extensive
52
Methodology: Remote Ethnography
► Business problem
► How are people actually using products in their environment, in real time?
► Description
► Participants self-record activities over days or weeks with pocket video cameras or
mobile devices, based on tasks provided by researcher.
► Benefits
► Allows participants to capture activities as they happen and where they happen
(away from computer), without the presence of observers. Useful for longitudinal
research & geographically spread participants.
► Limitations
► Dependence on participants' ability to articulate and record activities; relatively high
data-analysis effort relative to the small sample size
► Data
► Patterns based on participant response, transcripts and video recordings
► Tools
► Qualvu.com
Cost / respondent: Low – Moderate – High
Statistical validity: None – Some – Extensive
53
Methodology: Large-Sample Online Behavior Tracking
► Business problem
► Major redesign of a large complex site that is business-critical?
► Description
► 200-10,000+ respondents do tasks using online tracking / survey tools
► Benefits:
► Large sample size, low cost per respondent, extensive data possible
► Limitations
► No direct observation of users, survey design complex…other issues
► Data
► You name it (data exports to professional analysis tools).
► Tools of Choice
► Keynote WebEffective, UserZoom,
Cost / respondent: Low – Moderate – High
Statistical validity: None – Some – Extensive
54
Methodology: Lab-based UX Testing
► Business problem
► Are there show-stopper (CI) usability problems with your user experience?
► Description
► 12-24 Respondents undertake structured tasks in controlled setting (Lab)
► Benefits
► Relatively fast, moderate cost, very graphic display of major issues
► Limitations
► Small sample, study design, recruiting good respondents
► Data
► Summary data in tabular and chart format PLUS video out-takes
► Tools
► Leased testing room, recruiting service and Morae (Industry Standard)
Cost / respondent: Low – Moderate – High
Statistical validity: None – Some – Extensive
56
Methodology: Eye-Tracking
Business Problem
Do users see critical content and in what order?
Description
Respondents view content on a specialized workstation or glasses.
Benefits
Very accurate tracking of eye fixations and pathways.
Limitations
Relatively high cost, analysis is complex, data can be deceiving.
Data
Live eye fixations, heat maps…etc.
Tools of Choice
Tobii - SMI
Cost / respondent: Low – Moderate – High
Statistical validity: None – Some – Extensive
57
Methodology: Automated Online Card Sorting
► Business problem
► Users cannot understand where the content they want is located.
► Description
► Online card sorting based on terms you provide (or users create)
► Benefits
► Large sample size, low cost, easy to field
► Limitations
► Sorting tools can confuse users; the data can be hard to interpret
► Data
► Standard cluster analysis charts and more
► Tools of Choice
► WebSort…and others
Cost / respondent: Low – Moderate – High
Statistical validity: None – Some – Extensive
58
Methodology: fMRI (Brain Imaging)
► Business problem
► What areas of the brain are activated by the UX design?
► Description
► Respondents are given a visual stimulus while in an fMRI scanner
► Benefits
► Maps design variables to core functions of the human brain
► Limitations
► Expensive and data can be highly misleading
► Data
► Brain scans
► Tools
► Major medical centers and research services (some consultants)
Cost / respondent: Low – Moderate – High
Statistical validity: None – Some – Extensive
59
Methodology: Professional Heuristics
► Business problem
► Rapid feedback on UX design based on best practices or opinions
► Definition
► “Heuristic is a simple procedure that helps find adequate, though often imperfect,
answers to difficult questions (same root as: eureka)”
► Benefits
► Fast, low cost, can be very effective in some applications
► Limitations
► No actual user data, analysis only as good as expert doing audit
► Data
► Ranging from verbal direction to highly detailed recommendations
► Tools of Choice
► Written or verbal descriptions and custom tools used by each expert.
Cost / respondent: NA
Statistical validity: None – Some – Extensive
60
Methodology: Focus Groups
► Business problem
► What are perceptions and ideas around products/concepts?
► Description
► Moderated discussion group to gain concept/product feedback and inputs; can
include screens, physical models and other artifacts
► Benefits
► Efficient method for understanding end-user preferences and for getting early
feedback on concepts, particularly for physical or complex products that benefit
from hands-on exposure and explanation
► Limitations
► Lacks realistic context of use; Influence of participants on each other
► Data
► Combination of qualitative observations (like ethnographic research) with
quantitative data (e.g. ratings, surveys)
► Tools
► See qualitative data analysis
Cost / respondent: Low – Moderate – High
Statistical validity: None – Some – Extensive
A / B Testing
What
A testing procedure in which two (or
more) different designs are evaluated in
order to see which one is the most
effective. Alternate designs are served to
different users on the live website.
Why
Can be valuable in refining elements on a
web page. Altering the size, placement, or
color of a single element, or the wording
of a single phrase can have dramatic
effects. A / B Testing measures the results
of these changes.
Resources
A/B testing is covered in depth in the book
Always Be Testing: The Complete Guide to
Google Website Optimizer by Bryan
Eisenberg and John Quarto-von Tivadar.
http://www.testingtoolbox.com/
You can also check out the free A/B
testing tool Google Optimizer.
https://www.google.com/analytics/siteopt/preview
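Tools like those above report significance automatically, but the underlying comparison is usually a simple two-proportion test. A hedged Python sketch with invented numbers, not tied to any specific tool mentioned here:

import math

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test comparing conversion counts of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided
    return p_a, p_b, z, p_value

# Hypothetical data: 120/2400 conversions for design A, 156/2380 for design B
print(ab_test(120, 2400, 156, 2380))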
A / B Testing
http://www.flickr.com/photos/danielwaisberg/
Kano Analysis
What
Survey method that determines
how people value features and
attributes in a known product
domain. Shows what features are
basic must-haves, which features
create user satisfaction, and which
features delight.
Why
Allows quantitative analysis of
feature priority to guide
development efforts and
specifications. Ensures that
organization understands what is
valued by users. Less effective for
new product categories
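For illustration, a minimal sketch of the commonly used Kano evaluation table: each feature gets a functional question ("How do you feel if the feature is present?") and a dysfunctional one ("...if it is absent?"), and the answer pair maps to a category. The table follows the standard textbook form, not anything specific to this deck:

# A = Attractive (delighter), O = One-dimensional (performance), M = Must-be,
# I = Indifferent, R = Reverse, Q = Questionable response
ANSWERS = ("like", "must-be", "neutral", "live-with", "dislike")
KANO_TABLE = {
    "like":      ("Q", "A", "A", "A", "O"),
    "must-be":   ("R", "I", "I", "I", "M"),
    "neutral":   ("R", "I", "I", "I", "M"),
    "live-with": ("R", "I", "I", "I", "M"),
    "dislike":   ("R", "R", "R", "R", "Q"),
}

def classify(functional, dysfunctional):
    """Map one respondent's functional/dysfunctional answers to a Kano category."""
    return KANO_TABLE[functional][ANSWERS.index(dysfunctional)]

print(classify("like", "dislike"))  # 'O': performance feature
print(classify("like", "neutral"))  # 'A': delighter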
Kano Analysis
Six Thinking Hats
What
A tactic that helps you look at decisions
from a number of different perspectives.
The white hat focuses on data; the red on
emotion; the black on caution; the yellow
on optimism; the green on creativity; and
the blue on process.
Why
Can enable better decisions by
encouraging individuals or teams to
abandon old habits and think in new or
unfamiliar ways. Can provide insight into
the full complexity of a decision, and
highlight issues or opportunities which
might otherwise go unnoticed.
Resources
Lateral thinking pioneer Edward de Bono
created the Six Thinking Hats method.
http://www.edwdebono.com/
An explanation from Mind Tools.
http://www.mindtools.com/pages/article/newTED_07.htm
Six Thinking Hats
http://www.flickr.com/photos/daijihirata/
Methodology:
Contextual Inquiry
What is
Contextual Inquiry?
65
What is Ethnography?
• Defined as:
– a method of observing human interactions in social
settings and activities (Burke & Kirk, 2001)
– as the observation of people in their ‘cultural context’
– the study and systematic recording of human cultures;
also : a descriptive work produced from such research
(Merriam-Webster Online)
• Rather than studying people from the outside, you
learn from people from the inside
(Anderson, 1997; Malinowski, 1967, 1987; Kuper, 1983)
Who Invented Ethnography?
• Invented by Bronislaw Malinowski in 1915
– Spent three years on the Trobriand Islands (New
Guinea)
– Invented the modern form of fieldwork and
ethnography as its analytic component
(Salvador & Mateas, 1997)
Traditional VS Design Ethnography
Traditional
• Describes cultures
• Uses local language
• Objective
• Compare general
principles of society
• Non-interference
• Duration: Several Years
Design
• Describes domains
• Uses local language
• Subjective
• Compare general
principles of design
• Intervention
• Duration: Several
Weeks/Months
Contextual inquiry is a field data-gathering technique
that studies a few carefully selected individuals in
depth to arrive at a fuller understanding of the work
practice across all customers.
Through inquiry and interpretation, it reveals
commonalities across a product’s customer base.
What is contextual inquiry?
~ Beyer & Holtzblatt
Contextual Inquiry:
When to do it
Every ideation and design cycle should start with a contextual inquiry into the full experience of a customer.
Contextual inquiry clarifies and focuses the problems a customer is experiencing by discovering:
• The precise situation in which the problems occur
• What the problems entail
• How customers go about solving them
What is your focus?
Who is your audience?
Recruit & schedule participants
Learn what your users do
Develop scenarios
Conduct the inquiry
Interpret the results
Evangelize the findings
Rinse, repeat (at least monthly)
Contextual Inquiry:
How to do it
(Nielsen, 2002)
Dos & Don’ts
Don’t
• Ask simple Yes/No
questions
• Ask leading questions
• Use unfamiliar jargon
• Lead/guide the ‘user’
Do
• Ask open-ended questions
• Phrase questions properly
to avoid bias
• Speak their language
• Let user notice things on
his/her own
Analyzing
the results
“The output from customer research is not a
neat hierarchy; rather, it is narratives of
successes and breakdowns, examples of use
that entail context, and messy use artifacts”
Dave Hendry
74
Research Analysis
What are people’s values?
People are driven by their social and cultural contexts as much as their rational
decision making processes.
What are the mental models people build?
When the operation of a process isn't apparent, people create their own models of it.
What are the tools people use?
It is important to know what tools people use since you are building new tools to
replace the current ones.
What terminology do people use to describe what they do?
Words reveal aspects of people’s mental models and thought processes
What methods do people use?
Flow of work is crucial to understanding what people's needs are and where
existing tools are failing them.
What are people’s goals?
Understanding why people perform certain actions reveals an underlying
structure of their work that they may not be aware of themselves.
Affinity
Diagrams
“People from different teams engaged in affinity
diagramming is as valuable as tequila shots and
karaoke. Everyone develops a shared
understanding of customer needs, without the
hangover or walk of shame”
76
Research Analysis: Affinity Diagrams
Creates a hierarchy of all observations, clustering them
into themes.
From the video observations, 50-100 singular
observations are written on post-its
(observations ranging from tools, sequences, interactions,
work-arounds, mental models, etc)
With the entire team, notes are categorized by their relationships into
themes and trends.
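If the notes are digitized, the resulting structure is simply observations grouped under team-assigned themes; a hypothetical sketch (themes and notes invented for illustration):

from collections import defaultdict

# Each post-it: (observation text, theme assigned by the team during the session)
notes = [
    ("uses a paper list to track orders because the tool loses them", "Workarounds"),
    ("copies data between two systems by hand",                       "Workarounds"),
    ("asks a colleague whenever the tool shows an error",             "Social support"),
]

affinity = defaultdict(list)
for text, theme in notes:
    affinity[theme].append(text)

for theme, items in affinity.items():
    print(f"{theme} ({len(items)} notes)")
    for item in items:
        print("  -", item)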
Methodology:
Diary Study
Customer Validation
Diary Studies help here
Users record thoughts,
comments, etc. over time
http://www.flickr.com/photos/vanessabertozzi/877910821
http://www.flickr.com/photos/yourdon/3599753183/
http://www.flickr.com/photos/stevendepolo/3020452399/
http://www.flickr.com/photos/jevnin/390234217/
Interview users
Gather feedback, data
Organise and analyse
(affinity maps, analytics)
Participants keep a record of
“When” data
Date & time
Duration
Activity / task
“What" data
Activity / task
Feelings / mood
Environment / setting
No one right way to collect data
Structured
Yes/no
Select a category
Date & time
Multiple choice
Unstructured
Open-ended
Opinions / thoughts / feelings
Notes / comments
Combine / mix & match
http://www.flickr.com/photos/roboppy/9625780/
http://www.flickr.com/photos/vanessabertozzi/877910821
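The "when" and "what" fields above can be combined into a single entry template. A hypothetical sketch of one diary record mixing structured and open-ended data (field names are illustrative, not from the deck):

from dataclasses import dataclass
from datetime import datetime

@dataclass
class DiaryEntry:
    timestamp: datetime   # structured "when" data
    duration_min: int
    activity: str         # structured "what" data
    mood: str             # e.g. picked from a fixed category list
    setting: str          # environment / setting
    notes: str = ""       # unstructured: open-ended thoughts and comments

entry = DiaryEntry(datetime(2013, 5, 13, 8, 30), 15,
                   activity="checked train times on phone",
                   mood="rushed", setting="commuting", notes="the app felt slow")
print(entry)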
“Hygiene” aspects
At the beginning
•Introduction / get-to-know-you
•Demographics & psychographics, profiling
•Instructions / Setting expectations
At the end
•Follow-up
•Thanks / token gift
•Reflection
Pitfalls
• Belief bias
• Behavior adjustment
• Ramp-up time
• Failure to recall
Methodology:
Field Study
Methodology:
Usability Test
Usability Tests
1. Start a new test
2. Identify 3-5 tasks to test
3. Observe test participants performing the tasks
4. Identify the 2-3 easiest things to fix
5. Make changes to the site, then repeat
Identify Tasks for the Test
• Known problem areas
• Most common activities
• Popular pages
• New pages or services
Recommended Gear
• Computer (or paper prototype)
• Screen recording software (or observer)
• Microphone (or observer)
Screen Recording Software
Price
Features
Staff of One
• Recruits test participants
• Runs the test
• Records the test (screen recording software & mic)
• Preps the test environment before & after the test
Staff of Two
• #1: Recruits test participants, runs the test
• #2: Observes the test, preps the test environment before & after each test
Working Through Tasks
• Stick to the script
• Encourage the participant to speak aloud
• Don't help the participant
Methodology:
Remote Test
New technologies and
techniques allow for
Remote:
– Moderated testing
– Unmoderated testing
– Observation
Irrelevance of
Place
Remote Moderated Testing
Products like GoToMeeting allow connecting the test (or observation) computer over the Internet; VoIP can carry voice cheaply.
For screen sharing: LiveMeeting, WebEx, GoToMeeting
For VoIP audio: Skype, GoogleTalk
Roles: Translator, Moderator, Participant, Observers
Remote Unmoderated Testing
'Task-based' Surveys
> Online/remote usability studies (unmoderated)
> Benchmarking (competitive / comparison)
> UX dashboards (measure ROI)
Online Card Sorting
> Open or closed
> Stand-alone, or integrated with task-based studies & surveys
Online Surveys
> Ad hoc research
> Voice of Customer studies
> Integrated with web analytics data
User Recruiting Tool
> Intercept real visitors (tab or layer)
> Create your own private panel
> Use a panel provider*
Robust Set of Services
110
• Saves time
o Lab study takes 2-4 weeks from start to finish, unmoderated typically takes hours to
a few days*
• Saves money
o Participant compensation is typically a lot less ($10 vs. $100)
o Tools are becoming very inexpensive
• Reliable metrics
o Only (reasonable) way to collect UX data from large sample sizes
• Geography is not a limitation
o Collect feedback from customers all over the world
• Greater Customer insight
o Richest dataset about the customer experience
Why Should You Care?
111
Common Research Questions:
• What are the usability issues, and how big?
• Which design is better, and by how much?
• How do customer segments differ?
• What are user design preferences?
• Is the new design better than the old design?
• Where are users most likely to abandon a transaction?
Types of Studies:
• Comprehensive evaluation
• UX benchmark
• Competitive evaluation
• Live site vs. prototype comparison
• Feature/function test
• Discovery
Overview
Typical Metrics:
• Task success
• Task time
• Self-report ratings such as ease of use,
confidence, satisfaction
• Click paths
• Abandonment
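For the task-success metric in particular, small samples are usually reported with a confidence interval. A sketch using the adjusted-Wald interval often recommended for small usability samples; illustrative only, with made-up numbers:

import math

def task_success_ci(successes, n, z=1.96):
    """Adjusted-Wald (Agresti-Coull) confidence interval for a task success rate."""
    n_adj = n + z ** 2
    p_adj = (successes + z ** 2 / 2) / n_adj
    margin = z * math.sqrt(p_adj * (1 - p_adj) / n_adj)
    return max(0.0, p_adj - margin), min(1.0, p_adj + margin)

# Hypothetical result: 9 of 12 participants completed the task
low, high = task_success_ci(9, 12)
print(f"observed 75% success, 95% CI roughly {low:.0%}-{high:.0%}")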
112
Full Service Tools – Online Usability Testing
Card Sorting / IA Tools
Online Surveys
Click and Mouse Tools
Video Tools
Report Tools
Expert Review
Methodology:
Card Sorting
STEPS IN A CARD SORT
1. Decide what you want to learn
2. Select the type of Card Sort (open vs closed)
3. Choose Suitable Content
4. Choose and invite participants
5. Conduct the sort (online or in-person)
6. Analyze Results
7. Integrate results
WHAT DO YOU WANT TO LEARN?
• New Intranet vs Existing?
• Section of Intranet?
• Whole organization vs single department?
• For a project? For a team?
Example cards for an intranet sort: Product Targets, CRM, Project Review, Organization Chart, Christmas Party, Walkathon Results, Year in Review Meeting, Vacation Policy, Pay Days, Vacation Request Form.
OPEN VS CLOSED
Open sort: participants group the cards and invent their own category labels (e.g. Company News, Departments or Events, Human Resources, Projects).
Closed sort: participants place the same cards into predefined categories (Company News, Departments, Human Resources, Projects).
SELECTING CONTENT
Do’s
•30 – 100 Cards
•Select content that can be
grouped
•Select terms and concepts
that mean something to
users
Don’ts
• More than 100 cards
• Mix functionality and
content
• Include both detailed and
broad content
ANALYSIS
LOOK AT
• What groups were created
• Where the cards were placed
• What terms were used for labels
• Organization scheme used
• Whether people created accurate or inaccurate groups
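Most of that analysis starts from a card-by-card co-occurrence (similarity) matrix: how often each pair of cards landed in the same group across participants. A minimal sketch with invented sort data (group labels and cards are illustrative):

from itertools import combinations
from collections import Counter

# One dict per participant: their group label -> cards they placed in that group
sorts = [
    {"HR": ["Vacation Policy", "Pay Days"], "Events": ["Christmas Party"]},
    {"People stuff": ["Vacation Policy", "Pay Days", "Christmas Party"]},
]

co_occurrence = Counter()
for participant in sorts:
    for cards in participant.values():
        for a, b in combinations(sorted(cards), 2):
            co_occurrence[(a, b)] += 1

for (a, b), count in co_occurrence.most_common():
    print(f"{a} + {b}: grouped together by {count} of {len(sorts)} participants")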
INTEGRATE RESULTS: CREATE YOUR IA
Our Company: Executive Blog; New York; Vancouver; Mission and Values
Projects: Project Name 1; Project Name 2; Project Name 3; Project Name 4
Departments: Executive; Operations; Operations Support; Vessel Planning; Yard Planning; Rail Planning; Finance & Administration; Human Resources; Corporate Communications; IT
Community & Groups: Events; Charitable Campaigns; Vancouver Carpool
Employee Resources: Vacation & Holidays; Expenses; Travel; Health & Safety; Wellness; Benefits; Facilities; Payroll; Communication Tools
Centers of Excellence: Project Management Professionals; Engineering; Terminal Technologies; NAVIS; Lawson; IT; Yard Planning
Traditional Card Sort
Online Card Sorting
Card Sorting is as common as Lab based Usability
Testing
Source: 2011 UxPA Salary Survey
Terms & Concepts
• Open Sort: Users sort items into groups and give the
groups a name.
• Closed Sort: Users sort items into previously defined
category names.
• Reverse Card Sort (Tree Test) : Users are asked to locate
items in a hierarchy (no design)
• Most Users Start Browsing vs Searching: Across 9
websites and 25 tasks we found on average 86% start
browsing
http://www.measuringusability.com/blog/card-sorting.php
http://www.measuringusability.com/blog/search-browse.php
Methodology:
Eye Tracking
Set-up of an eye tracking test
User tests are often run in 45 to 60
minute sessions with 6 to 15
participants:
1. Participants are given a number of typical tasks to complete, using the
website, design or product you want
to test.
2. The user’s intuitive interaction is
observed, comments and reactions
are recorded.
3. The participant's impressions are
captured in an interview at the end
of the test.
132
Eye tracking results: Heatmaps
Heatmaps show what participants
focus on.
In this example, 'hot spots' are the
picture of the shoes, the central entry
field and the two right-hand tiles
underneath.
The data of all participants is
averaged in this map.
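Conceptually, a heatmap is just fixation data binned onto a grid and weighted by fixation duration; a rough sketch with invented coordinates (real eye-tracking tools do this, plus smoothing and rendering, for you):

# Fixations as (x, y, duration_ms) in screen pixels; accumulate gaze time per cell.
fixations = [(410, 220, 310), (430, 240, 180), (900, 560, 90)]
CELL = 100  # grid cell size in pixels

heat = {}
for x, y, dur in fixations:
    cell = (x // CELL, y // CELL)
    heat[cell] = heat.get(cell, 0) + dur

for cell, total in sorted(heat.items(), key=lambda kv: -kv[1]):
    print(f"cell {cell}: {total} ms of fixation time")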
133
Eye tracking results: Gazeplot
Gaze plots show the 'visual path' of
individual participants. Each bubble
represents a fixation.
The bubble size denotes the length
or intensity of the fixation.
Additional results are available in
table format for more detailed
analysis.
134
The key visual and a box at the bottom
Note: Telstra Clear have since re-designed their homepage.
The key
visual got
lots of
attention.
Surprising: This box got
heaps of attention. It
reads:
“If you are having trouble
getting through to us on
the phone, please click
here to email us, we'll get
back to you within 2
business days”.
Participants got the
impression that Telstra Clear
has trouble with their
customer service.
The main
navigation and
its options got
almost no
attention.
135
The Face effect – an example
bunnyfoot
Yep, there's attention on certain… areas; the face, however, is the strongest point of focus!
136
Using the Face effect
humanfactors.com
Eye tracking results for ad Version A:
• We see a face effect: The model's face draws a lot of attention.
• The slogan is the other hot spot of the design. Participants will likely have read it.
• The product and its name get some, but not a lot of attention.
137
Using the Face effect
Eye tracking results for ad Version B:
• Again, we see a strong face effect. BUT: In this version, the model's gaze is in line with the product and its name.
• The product image and name get considerably more attention!
• Additionally, even the product name at the bottom is noticed by a number of participants.
humanfactors.com 138
Ways to focus attention
usableworld.com.au
Same effect: If the baby faces you, you'll look at the baby. But if the baby faces the ad message, you pay attention to the message. You basically follow the baby's gaze.
139
Banner blindness
… or are they?
In this test, participants were
given a task: Find the nearest
ATM.
Participants focused on the
main navigation and the
footer navigation – this is where they found the 'ATM locator'.
So, when visiting a site with a
task in mind – as you
normally do – the central
banner can be ignored!
140
Compare the visual paths: Task versus browse
When browsing, the central banner gets lots of attention. But how often do you visit a bank
website just to browse?
One participant was asked just to look at the homepage; the other was given a task ('Find the nearest ATM')
141
Main focus: Navigation options
Eye tracking results show:
When looking for
something on a
website, the main
focus of attention are
the navigation options.
Maybe users have learned that they're unlikely to find what they're looking for in a central banner image.
Task: 'What concerts are happening in Auckland this month?'  Task: 'You want to send an email to customer service'
142
Task: 'You want to get in touch with customer service'
When do users look at banners?
In this example, participants looked at the banner even though they were looking for
something specific. What's different?
Participant was asked just to look at the homepage
143
Methodology:
Heuristic
Evaluation
1. Visibility of system status
2. Match between system and real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation
J. Nielsen and R. Mack, eds. Usability Inspection Methods, 1994
Nielsen’s 10 heuristics
http://www.slideshare.net/AbbyCovert/information-architecture-heuristics
HE output
• A list of usability problems
• Tied to a heuristic or rule of practice
• A ranking of findings by severity
• Recommendations for fixing problems
• Oh, and the positive findings, too
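That output is easy to capture as a small structured record per finding; a hypothetical sketch using the 0-4 severity scale Nielsen commonly recommends (field names invented for illustration):

from dataclasses import dataclass

SEVERITY = {0: "not a problem", 1: "cosmetic", 2: "minor", 3: "major", 4: "catastrophe"}

@dataclass
class Finding:
    description: str
    heuristic: str          # which heuristic or rule of practice it relates to
    severity: int           # 0-4, see SEVERITY
    recommendation: str
    positive: bool = False  # record the good practices too

f = Finding("Module goals are generic and do not say what users will accomplish",
            "Match between system and the real world", 3,
            "State concrete objectives tied to the assessment")
print(f"[{SEVERITY[f.severity]}] {f.description}")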
HE sample findings page
Finding Description | Recommendation | H | C | S | Severity Rating
Findings (one row of the sample table):
• Objectives/goals for the modules
• Reason content is being presented
• Conciseness of presentation
• Definitions required to work with the module/content
• Evaluation criteria and methods
• Direct tie between content and assessment measure
• Sequence of presentation follows logically from introduction
• Quizzes challenge users
Recommendations:
• Develop a consistent structure that defines what's noted in the bulleted points, above.
• Avoid generic statements that don't focus users on what they will be accomplishing.
• Advise that there is an assessment used for evaluation and indicate if it's at the end or interspersed in the module.
• Connect ideas in the goals and objectives with outcomes in the assessment.
• Follow the order of presentation defined at the beginning.
• Develop interesting and challenging questions.
• Re-frame goals/objectives at the end of the module.
Applies to: H ✓, C ✓, S ✓; Severity Rating: 3
H = Hyperspace; C = Cardiac Arrest; S = Shock
Hyperspace, Shock, and Cardiac Arrest all require more clearly defined goals and objectives.
Methodology:
Cognitive
Walkthrough
Introduction to 12 Innovation Game Techniques
http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game
Playful activities done together with customers
in order to understand customer needs better
http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game
Prune the Product Tree
(Sets the direction the product should take)
Thick branches = major capabilities
Leaves near the trunk = existing features; outer leaves = new features
Beyond the boundary = the future
Tree roots = customer support and service infrastructure
Focus on balanced growth: the branches should grow only as much as the roots do
http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game
http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game
(Sets the direction the product should take)
Remember the Future
Remember the future of 2013 through 2015:
"Which features did our product provide that left you satisfied?" (mark them on a timeline)
http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game
(Sets the direction the product should take)
20/20 Vision
http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game
(Sets the direction the product should take)
Buy a Feature
Put prices on candidate features
Customers negotiate the purchase of the next release with play money
http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game
(Understand how customers use and relate to your products and services)
My Day
Describe when the product is needed
Observe how the product supports or hinders the customer's day
http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game
http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game
(Understand how customers use and relate to your products and services)
Me and My Shadow
Observe by becoming the customer's shadow
Bring other customers along and use them as interpreters
http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game
(Understand how customers use and relate to your products and services)
The Apprentice
Experience the work first-hand at the customer's site
http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game
(Understand how customers use and relate to your products and services)
Show and Tell
Like a child showing off a treasured possession in class
http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game
(Understand how customers use and relate to your products and services)
Spider Web
Draw both your own and competitors' products to map the relationships
(Express the important relationships with different line colors/weights/shapes)
When, how, and why each is used
http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game
http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game
(Identify the capabilities your products and services should provide)
Identify the anchors that keep the boat from racing ahead
Customers write what they dislike about the product on index cards and place them as anchors under the boat
Other customers add their agreement to anchors already placed
Speed Boat
http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game
http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game
Give Them a Hot Tub
Offer a feature that makes no sense at all
Observe how customers react when they encounter it
http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game
Product Box
Ask customers to design the box of the product they would buy themselves
The box can include any information: marketing slogans, pictures, price, and so on
Customers then sell the box they designed to other customers
References
Sources cited and referenced
• Ethnography (Santosh Bhandari, Mar 29, 2013)
• Cultural probes in real life (gerrygaffney, Jun 11, 2012)
• UX of User Stories Workshop (Anders Ramsay, Aug 14, 2012)
• Usability behaviors: Usability and the SDLC (Ted Tschopp, Nov 04, 2012)
• Know Thy User: The Role of Research in Great Interactive Design (frog, Sep 2012)
• The Mobile Frontier (Rachel Hinman, Feb 2012)
• Introduction to AgileUX: Fundamentals of Customer Research (Will Evans, Jan 2012)
• Customer Research & Persona Development (Will Evans, Oct 2012)
• Introduction to UX Research: Conducting Focus Groups (Will Evans, Jan 2012)
• Midwest UX '12: Mapping the Experience (Chris Risdon, Jun 2012)
• Eye Tracking & User Research (Optimal Usability, Apr 2012)
• Taking it to the streets: Investigating mobile and web UX through field studies (Emma Rose, Jun 2012)
• NYTECH "Measuring Your User Experience Design“ (New York Technology Council, Mar 2012)
• How to Conduct UX Benchmarking (UserZoom, May 2012)
• Customer validation with Diary Studies (Boon Chew, Jan 2012)
• The Science of Great Site Navigation: Online Card Sorting + Tree Testing (UserZoom, Jul 2012)
• Introduction to Card Sorting (ThoughtFarmer, Sep 2012)
• Usability Testing Basics (Stephen Francoeur, Mar 2012)
• Storytelling: Rhetoric of heuristic evaluation (Southern Polytechnic State University, Mar 2012)
• Cognitive and pluralistic (Aarushi Mishra, Oct 2012)
• How to Quantitatively Measure Your User Experience (Richard Dalton, May 2012)
• Information Architecture Heuristics (Abby Covert, Jul 24, 2011)
• Diary Studies in HCI & Psychology (UPABoston, Jul 13, 2011)
• Remote Testing Methods & Tools Webinar (UserZoom, Dec 2011)
• Beyond User Research (Louis Rosenfeld, Mar 2011)
• Using Ethnographic User Research to Drive Knowledge Management and Intranet Strategy (NavigationArts, Dec 01, 2010)
• Remote Research, The Talk. (bolt peters, May 27, 2010)
• User Interview Techniques (Liz Danzico, May 2010)
• The new digital ethnographer’s toolkit: Capturing a participant’s lifestream (Christopher Khalil, Sep 04, 2009)
• Design Research For Everyday Projects - UX London (leisa reichelt, Jun 2009)
• Contextual Inquiry V1 (Rajesh Jha, Sep 11, 2008)
176
Thank you for listening!
Más contenido relacionado

La actualidad más candente

A&D - Fact Finding Techniques
A&D - Fact Finding TechniquesA&D - Fact Finding Techniques
A&D - Fact Finding Techniquesvinay arora
 
System analysis and design
System analysis and designSystem analysis and design
System analysis and designDiana Oommachan
 
Carma internet research module: Survey reduction
Carma internet research module: Survey reductionCarma internet research module: Survey reduction
Carma internet research module: Survey reductionSyracuse University
 
Elicitation Techniques
Elicitation TechniquesElicitation Techniques
Elicitation TechniquesSwati Sinha
 
Research methods - qual or quant
Research methods - qual or quantResearch methods - qual or quant
Research methods - qual or quantTracy Harwood
 
Evaluation techniques in HCI
Evaluation techniques in HCIEvaluation techniques in HCI
Evaluation techniques in HCIsawsan slii
 
Basics of-usability-testing
Basics of-usability-testingBasics of-usability-testing
Basics of-usability-testingWBC Software Lab
 
Reading 1 need assessment
Reading 1 need assessmentReading 1 need assessment
Reading 1 need assessmentAlex Tsang
 
Tech Market Research Guide
Tech Market Research GuideTech Market Research Guide
Tech Market Research GuideBlaine Mathieu
 
Design methods for emotions
Design methods for emotionsDesign methods for emotions
Design methods for emotionsCarles Debart
 
Probsolv2007 engineering design processes pp ws
Probsolv2007   engineering design processes pp wsProbsolv2007   engineering design processes pp ws
Probsolv2007 engineering design processes pp wsvideoteacher
 
Modelling guidance 2014
Modelling guidance 2014Modelling guidance 2014
Modelling guidance 2014chazsmith
 
Combining Estimates with Planning Poker *- An Empirical Study
Combining Estimates with Planning Poker *- An Empirical StudyCombining Estimates with Planning Poker *- An Empirical Study
Combining Estimates with Planning Poker *- An Empirical StudyKjetil Moløkken-Østvold
 
User Research Techniques by Vikram Rao, RSA
User Research Techniques by Vikram Rao, RSAUser Research Techniques by Vikram Rao, RSA
User Research Techniques by Vikram Rao, RSASTC India UX SIG
 
Usability Testing Fundamentals
Usability Testing FundamentalsUsability Testing Fundamentals
Usability Testing Fundamentalsdebcook
 
Qualitative Research in Segmentation
Qualitative Research in SegmentationQualitative Research in Segmentation
Qualitative Research in SegmentationSusan Abbott
 

La actualidad más candente (20)

A&D - Fact Finding Techniques
A&D - Fact Finding TechniquesA&D - Fact Finding Techniques
A&D - Fact Finding Techniques
 
System analysis and design
System analysis and designSystem analysis and design
System analysis and design
 
Carma internet research module: Survey reduction
Carma internet research module: Survey reductionCarma internet research module: Survey reduction
Carma internet research module: Survey reduction
 
Elicitation Techniques
Elicitation TechniquesElicitation Techniques
Elicitation Techniques
 
Research methods - qual or quant
Research methods - qual or quantResearch methods - qual or quant
Research methods - qual or quant
 
Ch04
Ch04Ch04
Ch04
 
Evaluation techniques in HCI
Evaluation techniques in HCIEvaluation techniques in HCI
Evaluation techniques in HCI
 
Session 3 sample design
Session 3   sample designSession 3   sample design
Session 3 sample design
 
Basics of-usability-testing
Basics of-usability-testingBasics of-usability-testing
Basics of-usability-testing
 
Reading 1 need assessment
Reading 1 need assessmentReading 1 need assessment
Reading 1 need assessment
 
Chap008
Chap008Chap008
Chap008
 
Surveys
SurveysSurveys
Surveys
 
Tech Market Research Guide
Tech Market Research GuideTech Market Research Guide
Tech Market Research Guide
 
Design methods for emotions
Design methods for emotionsDesign methods for emotions
Design methods for emotions
 
Probsolv2007 engineering design processes pp ws
Probsolv2007   engineering design processes pp wsProbsolv2007   engineering design processes pp ws
Probsolv2007 engineering design processes pp ws
 
Modelling guidance 2014
Modelling guidance 2014Modelling guidance 2014
Modelling guidance 2014
 
Combining Estimates with Planning Poker *- An Empirical Study
Combining Estimates with Planning Poker *- An Empirical StudyCombining Estimates with Planning Poker *- An Empirical Study
Combining Estimates with Planning Poker *- An Empirical Study
 
User Research Techniques by Vikram Rao, RSA
User Research Techniques by Vikram Rao, RSAUser Research Techniques by Vikram Rao, RSA
User Research Techniques by Vikram Rao, RSA
 
Usability Testing Fundamentals
Usability Testing FundamentalsUsability Testing Fundamentals
Usability Testing Fundamentals
 
Qualitative Research in Segmentation
Qualitative Research in SegmentationQualitative Research in Segmentation
Qualitative Research in Segmentation
 

Similar a UX Research

11 - Evaluating Framework in Interaction Design_new.pptx
11 - Evaluating Framework in Interaction Design_new.pptx11 - Evaluating Framework in Interaction Design_new.pptx
11 - Evaluating Framework in Interaction Design_new.pptxZahirahZairul2
 
Qualitative Research vs Quantitative Research - a QuestionPro Academic Webinar
Qualitative Research vs Quantitative Research - a QuestionPro Academic WebinarQualitative Research vs Quantitative Research - a QuestionPro Academic Webinar
Qualitative Research vs Quantitative Research - a QuestionPro Academic WebinarQuestionPro
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARMark Billinghurst
 
HCI 3e - Ch 9: Evaluation techniques
HCI 3e - Ch 9:  Evaluation techniquesHCI 3e - Ch 9:  Evaluation techniques
HCI 3e - Ch 9: Evaluation techniquesAlan Dix
 
General Tips to Fast-Track Your Quantitative Methodology
General Tips to Fast-Track Your Quantitative MethodologyGeneral Tips to Fast-Track Your Quantitative Methodology
General Tips to Fast-Track Your Quantitative MethodologyStatistics Solutions
 
Usability Testing for Qualitative Researchers - QRCA NYC Chapter event
Usability Testing for Qualitative Researchers - QRCA NYC Chapter eventUsability Testing for Qualitative Researchers - QRCA NYC Chapter event
Usability Testing for Qualitative Researchers - QRCA NYC Chapter eventKay Aubrey
 
Human-centered AI: how can we support lay users to understand AI?
Human-centered AI: how can we support lay users to understand AI?Human-centered AI: how can we support lay users to understand AI?
Human-centered AI: how can we support lay users to understand AI?Katrien Verbert
 
UX Burlington 2017: Exploratory Research in UX Design
UX Burlington 2017: Exploratory Research in UX DesignUX Burlington 2017: Exploratory Research in UX Design
UX Burlington 2017: Exploratory Research in UX DesignSarah Fathallah
 
Users are Losers! They’ll Like Whatever we Make! and Other Fallacies.
Users are Losers! They’ll Like Whatever we Make! and Other Fallacies.Users are Losers! They’ll Like Whatever we Make! and Other Fallacies.
Users are Losers! They’ll Like Whatever we Make! and Other Fallacies.Carol Smith
 
Pitfalls and Countermeasures in Software Quality Measurements and Evaluations
Pitfalls and Countermeasures in Software Quality Measurements and EvaluationsPitfalls and Countermeasures in Software Quality Measurements and Evaluations
Pitfalls and Countermeasures in Software Quality Measurements and EvaluationsHironori Washizaki
 
Chapter 2 Consumer Reserch
Chapter 2 Consumer ReserchChapter 2 Consumer Reserch
Chapter 2 Consumer ReserchAvinash Kumar
 
ASA conference Feb 2013
ASA conference Feb 2013ASA conference Feb 2013
ASA conference Feb 2013mrkwr
 
Prototyping and Usability Testing your designs
Prototyping and Usability Testing your designsPrototyping and Usability Testing your designs
Prototyping and Usability Testing your designsElizabeth Snowdon
 
Usability Testing Basics: What's it All About? at Web SIG Cleveland
Usability Testing Basics: What's it All About? at Web SIG ClevelandUsability Testing Basics: What's it All About? at Web SIG Cleveland
Usability Testing Basics: What's it All About? at Web SIG ClevelandCarol Smith
 
Market and Social Research Part 2
Market and Social Research Part 2Market and Social Research Part 2
Market and Social Research Part 2bestsliders
 
CX Survival Guide for 2019
CX Survival Guide for 2019CX Survival Guide for 2019
CX Survival Guide for 2019UserTesting
 

Similar a UX Research (20)

11 - Evaluating Framework in Interaction Design_new.pptx
11 - Evaluating Framework in Interaction Design_new.pptx11 - Evaluating Framework in Interaction Design_new.pptx
11 - Evaluating Framework in Interaction Design_new.pptx
 
UX Research
UX ResearchUX Research
UX Research
 
NYTECH "Measuring Your User Experience Design"
NYTECH "Measuring Your User Experience Design"NYTECH "Measuring Your User Experience Design"
NYTECH "Measuring Your User Experience Design"
 
Qualitative Research vs Quantitative Research - a QuestionPro Academic Webinar
Qualitative Research vs Quantitative Research - a QuestionPro Academic WebinarQualitative Research vs Quantitative Research - a QuestionPro Academic Webinar
Qualitative Research vs Quantitative Research - a QuestionPro Academic Webinar
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise AR
 
classmar2.ppt
classmar2.pptclassmar2.ppt
classmar2.ppt
 
HCI 3e - Ch 9: Evaluation techniques
HCI 3e - Ch 9:  Evaluation techniquesHCI 3e - Ch 9:  Evaluation techniques
HCI 3e - Ch 9: Evaluation techniques
 
General Tips to Fast-Track Your Quantitative Methodology
General Tips to Fast-Track Your Quantitative MethodologyGeneral Tips to Fast-Track Your Quantitative Methodology
General Tips to Fast-Track Your Quantitative Methodology
 
Usability Testing for Qualitative Researchers - QRCA NYC Chapter event
Usability Testing for Qualitative Researchers - QRCA NYC Chapter eventUsability Testing for Qualitative Researchers - QRCA NYC Chapter event
Usability Testing for Qualitative Researchers - QRCA NYC Chapter event
 
Human-centered AI: how can we support lay users to understand AI?
Human-centered AI: how can we support lay users to understand AI?Human-centered AI: how can we support lay users to understand AI?
Human-centered AI: how can we support lay users to understand AI?
 
UX Burlington 2017: Exploratory Research in UX Design
UX Burlington 2017: Exploratory Research in UX DesignUX Burlington 2017: Exploratory Research in UX Design
UX Burlington 2017: Exploratory Research in UX Design
 
Users are Losers! They’ll Like Whatever we Make! and Other Fallacies.
Users are Losers! They’ll Like Whatever we Make! and Other Fallacies.Users are Losers! They’ll Like Whatever we Make! and Other Fallacies.
Users are Losers! They’ll Like Whatever we Make! and Other Fallacies.
 
Pitfalls and Countermeasures in Software Quality Measurements and Evaluations
Pitfalls and Countermeasures in Software Quality Measurements and EvaluationsPitfalls and Countermeasures in Software Quality Measurements and Evaluations
Pitfalls and Countermeasures in Software Quality Measurements and Evaluations
 
Chapter 2 Consumer Reserch
Chapter 2 Consumer ReserchChapter 2 Consumer Reserch
Chapter 2 Consumer Reserch
 
ASA conference Feb 2013
ASA conference Feb 2013ASA conference Feb 2013
ASA conference Feb 2013
 
Prototyping and Usability Testing your designs
Prototyping and Usability Testing your designsPrototyping and Usability Testing your designs
Prototyping and Usability Testing your designs
 
Usability Testing Basics: What's it All About? at Web SIG Cleveland
Usability Testing Basics: What's it All About? at Web SIG ClevelandUsability Testing Basics: What's it All About? at Web SIG Cleveland
Usability Testing Basics: What's it All About? at Web SIG Cleveland
 
Market and Social Research Part 2
Market and Social Research Part 2Market and Social Research Part 2
Market and Social Research Part 2
 
Rosenhan "User Research"
Rosenhan "User Research"Rosenhan "User Research"
Rosenhan "User Research"
 
CX Survival Guide for 2019
CX Survival Guide for 2019CX Survival Guide for 2019
CX Survival Guide for 2019
 

Más de Billy Choi

현재 출시된 생성형 AI 모델 기반 서비스 UX 쪼개기
현재 출시된 생성형 AI 모델 기반 서비스 UX 쪼개기현재 출시된 생성형 AI 모델 기반 서비스 UX 쪼개기
현재 출시된 생성형 AI 모델 기반 서비스 UX 쪼개기Billy Choi
 
용산FM 라디오 방송 with 최병호 교수
용산FM 라디오 방송 with 최병호 교수용산FM 라디오 방송 with 최병호 교수
용산FM 라디오 방송 with 최병호 교수Billy Choi
 
특강_인공지능 기반 소셜벤처의 비즈니스모델 개발(Artificial intelligence and social venture busines...
특강_인공지능 기반 소셜벤처의 비즈니스모델 개발(Artificial intelligence and social venture busines...특강_인공지능 기반 소셜벤처의 비즈니스모델 개발(Artificial intelligence and social venture busines...
특강_인공지능 기반 소셜벤처의 비즈니스모델 개발(Artificial intelligence and social venture busines...Billy Choi
 
HUMAN-AI INTERACTION 관점에서 새로운 HCI/UX 씽킹 전략
HUMAN-AI INTERACTION 관점에서 새로운 HCI/UX 씽킹 전략HUMAN-AI INTERACTION 관점에서 새로운 HCI/UX 씽킹 전략
HUMAN-AI INTERACTION 관점에서 새로운 HCI/UX 씽킹 전략Billy Choi
 
인공지능을 활용한 비즈니스 전략 사례
인공지능을 활용한 비즈니스 전략 사례인공지능을 활용한 비즈니스 전략 사례
인공지능을 활용한 비즈니스 전략 사례Billy Choi
 
스타트업 메타씽킹
스타트업 메타씽킹스타트업 메타씽킹
스타트업 메타씽킹Billy Choi
 
소상공인 데이터 기반 경영을 위한 빅데이터 플랫폼 구축 방안
소상공인 데이터 기반 경영을 위한 빅데이터 플랫폼 구축 방안소상공인 데이터 기반 경영을 위한 빅데이터 플랫폼 구축 방안
소상공인 데이터 기반 경영을 위한 빅데이터 플랫폼 구축 방안Billy Choi
 
죽느냐 사느냐; AI 시대, 우리에게 필요한 질문
죽느냐 사느냐; AI 시대, 우리에게 필요한 질문죽느냐 사느냐; AI 시대, 우리에게 필요한 질문
죽느냐 사느냐; AI 시대, 우리에게 필요한 질문Billy Choi
 
인공지능 마이크로 트렌드 및 통찰
인공지능 마이크로 트렌드 및 통찰인공지능 마이크로 트렌드 및 통찰
인공지능 마이크로 트렌드 및 통찰Billy Choi
 
AI based BM 평가 및 개선 체크리스트
AI based BM 평가 및 개선 체크리스트AI based BM 평가 및 개선 체크리스트
AI based BM 평가 및 개선 체크리스트Billy Choi
 
2020 UX 화두 및 통찰
2020 UX 화두 및 통찰2020 UX 화두 및 통찰
2020 UX 화두 및 통찰Billy Choi
 
인공지능(AI)과 사용자 경험(UX)
인공지능(AI)과 사용자 경험(UX)인공지능(AI)과 사용자 경험(UX)
인공지능(AI)과 사용자 경험(UX)Billy Choi
 
사회문제 해결형 R&D 트렌드와 통찰
사회문제 해결형 R&D 트렌드와 통찰사회문제 해결형 R&D 트렌드와 통찰
사회문제 해결형 R&D 트렌드와 통찰Billy Choi
 
커머스 시장의 인공지능(AI) 활용과 사용자 경험(UX)
커머스 시장의 인공지능(AI) 활용과 사용자 경험(UX)커머스 시장의 인공지능(AI) 활용과 사용자 경험(UX)
커머스 시장의 인공지능(AI) 활용과 사용자 경험(UX)Billy Choi
 
모두를 위한 AI 도전 과제: Intelligent DSI(Digital Social Innovation)
모두를 위한 AI 도전 과제: Intelligent DSI(Digital Social Innovation)모두를 위한 AI 도전 과제: Intelligent DSI(Digital Social Innovation)
모두를 위한 AI 도전 과제: Intelligent DSI(Digital Social Innovation)Billy Choi
 
인공지능을 HCI/UX에 접목할 때 알아야 할 변화와 방향성
인공지능을 HCI/UX에 접목할 때 알아야 할 변화와 방향성인공지능을 HCI/UX에 접목할 때 알아야 할 변화와 방향성
인공지능을 HCI/UX에 접목할 때 알아야 할 변화와 방향성Billy Choi
 
우리의 미래,(지능형) 질문력과 (지능형) 통찰력에 달려있다?!
우리의 미래,(지능형) 질문력과 (지능형) 통찰력에 달려있다?!우리의 미래,(지능형) 질문력과 (지능형) 통찰력에 달려있다?!
우리의 미래,(지능형) 질문력과 (지능형) 통찰력에 달려있다?!Billy Choi
 
인공지능시대?! 지금, 무슨 일이 벌어지고 있는가? 우리는, 무엇을 질문하고 통찰해야 하는가?
인공지능시대?! 지금, 무슨 일이 벌어지고 있는가? 우리는, 무엇을 질문하고 통찰해야 하는가?인공지능시대?! 지금, 무슨 일이 벌어지고 있는가? 우리는, 무엇을 질문하고 통찰해야 하는가?
인공지능시대?! 지금, 무슨 일이 벌어지고 있는가? 우리는, 무엇을 질문하고 통찰해야 하는가?Billy Choi
 
2019년 이후의 커머스 디자인 트렌드 전망
2019년 이후의 커머스 디자인 트렌드 전망2019년 이후의 커머스 디자인 트렌드 전망
2019년 이후의 커머스 디자인 트렌드 전망Billy Choi
 
사회혁신 담론과 행동변화를 유도할 수 있는 HCI/UX이론
사회혁신 담론과 행동변화를 유도할 수 있는 HCI/UX이론사회혁신 담론과 행동변화를 유도할 수 있는 HCI/UX이론
사회혁신 담론과 행동변화를 유도할 수 있는 HCI/UX이론Billy Choi
 

Más de Billy Choi (20)

현재 출시된 생성형 AI 모델 기반 서비스 UX 쪼개기
현재 출시된 생성형 AI 모델 기반 서비스 UX 쪼개기현재 출시된 생성형 AI 모델 기반 서비스 UX 쪼개기
현재 출시된 생성형 AI 모델 기반 서비스 UX 쪼개기
 
용산FM 라디오 방송 with 최병호 교수
용산FM 라디오 방송 with 최병호 교수용산FM 라디오 방송 with 최병호 교수
용산FM 라디오 방송 with 최병호 교수
 
특강_인공지능 기반 소셜벤처의 비즈니스모델 개발(Artificial intelligence and social venture busines...
특강_인공지능 기반 소셜벤처의 비즈니스모델 개발(Artificial intelligence and social venture busines...특강_인공지능 기반 소셜벤처의 비즈니스모델 개발(Artificial intelligence and social venture busines...
특강_인공지능 기반 소셜벤처의 비즈니스모델 개발(Artificial intelligence and social venture busines...
 
HUMAN-AI INTERACTION 관점에서 새로운 HCI/UX 씽킹 전략
HUMAN-AI INTERACTION 관점에서 새로운 HCI/UX 씽킹 전략HUMAN-AI INTERACTION 관점에서 새로운 HCI/UX 씽킹 전략
HUMAN-AI INTERACTION 관점에서 새로운 HCI/UX 씽킹 전략
 
인공지능을 활용한 비즈니스 전략 사례
인공지능을 활용한 비즈니스 전략 사례인공지능을 활용한 비즈니스 전략 사례
인공지능을 활용한 비즈니스 전략 사례
 
스타트업 메타씽킹
스타트업 메타씽킹스타트업 메타씽킹
스타트업 메타씽킹
 
소상공인 데이터 기반 경영을 위한 빅데이터 플랫폼 구축 방안
소상공인 데이터 기반 경영을 위한 빅데이터 플랫폼 구축 방안소상공인 데이터 기반 경영을 위한 빅데이터 플랫폼 구축 방안
소상공인 데이터 기반 경영을 위한 빅데이터 플랫폼 구축 방안
 
죽느냐 사느냐; AI 시대, 우리에게 필요한 질문
죽느냐 사느냐; AI 시대, 우리에게 필요한 질문죽느냐 사느냐; AI 시대, 우리에게 필요한 질문
죽느냐 사느냐; AI 시대, 우리에게 필요한 질문
 
인공지능 마이크로 트렌드 및 통찰
인공지능 마이크로 트렌드 및 통찰인공지능 마이크로 트렌드 및 통찰
인공지능 마이크로 트렌드 및 통찰
 
AI based BM 평가 및 개선 체크리스트
AI based BM 평가 및 개선 체크리스트AI based BM 평가 및 개선 체크리스트
AI based BM 평가 및 개선 체크리스트
 
2020 UX 화두 및 통찰
2020 UX 화두 및 통찰2020 UX 화두 및 통찰
2020 UX 화두 및 통찰
 
인공지능(AI)과 사용자 경험(UX)
인공지능(AI)과 사용자 경험(UX)인공지능(AI)과 사용자 경험(UX)
인공지능(AI)과 사용자 경험(UX)
 
사회문제 해결형 R&D 트렌드와 통찰
사회문제 해결형 R&D 트렌드와 통찰사회문제 해결형 R&D 트렌드와 통찰
사회문제 해결형 R&D 트렌드와 통찰
 
커머스 시장의 인공지능(AI) 활용과 사용자 경험(UX)
커머스 시장의 인공지능(AI) 활용과 사용자 경험(UX)커머스 시장의 인공지능(AI) 활용과 사용자 경험(UX)
커머스 시장의 인공지능(AI) 활용과 사용자 경험(UX)
 
모두를 위한 AI 도전 과제: Intelligent DSI(Digital Social Innovation)
모두를 위한 AI 도전 과제: Intelligent DSI(Digital Social Innovation)모두를 위한 AI 도전 과제: Intelligent DSI(Digital Social Innovation)
모두를 위한 AI 도전 과제: Intelligent DSI(Digital Social Innovation)
 
인공지능을 HCI/UX에 접목할 때 알아야 할 변화와 방향성
인공지능을 HCI/UX에 접목할 때 알아야 할 변화와 방향성인공지능을 HCI/UX에 접목할 때 알아야 할 변화와 방향성
인공지능을 HCI/UX에 접목할 때 알아야 할 변화와 방향성
 
우리의 미래,(지능형) 질문력과 (지능형) 통찰력에 달려있다?!
우리의 미래,(지능형) 질문력과 (지능형) 통찰력에 달려있다?!우리의 미래,(지능형) 질문력과 (지능형) 통찰력에 달려있다?!
우리의 미래,(지능형) 질문력과 (지능형) 통찰력에 달려있다?!
 
인공지능시대?! 지금, 무슨 일이 벌어지고 있는가? 우리는, 무엇을 질문하고 통찰해야 하는가?
인공지능시대?! 지금, 무슨 일이 벌어지고 있는가? 우리는, 무엇을 질문하고 통찰해야 하는가?인공지능시대?! 지금, 무슨 일이 벌어지고 있는가? 우리는, 무엇을 질문하고 통찰해야 하는가?
인공지능시대?! 지금, 무슨 일이 벌어지고 있는가? 우리는, 무엇을 질문하고 통찰해야 하는가?
 
2019년 이후의 커머스 디자인 트렌드 전망
2019년 이후의 커머스 디자인 트렌드 전망2019년 이후의 커머스 디자인 트렌드 전망
2019년 이후의 커머스 디자인 트렌드 전망
 
사회혁신 담론과 행동변화를 유도할 수 있는 HCI/UX이론
사회혁신 담론과 행동변화를 유도할 수 있는 HCI/UX이론사회혁신 담론과 행동변화를 유도할 수 있는 HCI/UX이론
사회혁신 담론과 행동변화를 유도할 수 있는 HCI/UX이론
 

Último

ARt app | UX Case Study
ARt app | UX Case StudyARt app | UX Case Study
ARt app | UX Case StudySophia Viganò
 
原版美国亚利桑那州立大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree
原版美国亚利桑那州立大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree原版美国亚利桑那州立大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree
原版美国亚利桑那州立大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degreeyuu sss
 
西北大学毕业证学位证成绩单-怎么样办伪造
西北大学毕业证学位证成绩单-怎么样办伪造西北大学毕业证学位证成绩单-怎么样办伪造
西北大学毕业证学位证成绩单-怎么样办伪造kbdhl05e
 
Top 10 Modern Web Design Trends for 2025
Top 10 Modern Web Design Trends for 2025Top 10 Modern Web Design Trends for 2025
Top 10 Modern Web Design Trends for 2025Rndexperts
 
办理学位证(SFU证书)西蒙菲莎大学毕业证成绩单原版一比一
办理学位证(SFU证书)西蒙菲莎大学毕业证成绩单原版一比一办理学位证(SFU证书)西蒙菲莎大学毕业证成绩单原版一比一
办理学位证(SFU证书)西蒙菲莎大学毕业证成绩单原版一比一F dds
 
原版1:1定制堪培拉大学毕业证(UC毕业证)#文凭成绩单#真实留信学历认证永久存档
原版1:1定制堪培拉大学毕业证(UC毕业证)#文凭成绩单#真实留信学历认证永久存档原版1:1定制堪培拉大学毕业证(UC毕业证)#文凭成绩单#真实留信学历认证永久存档
原版1:1定制堪培拉大学毕业证(UC毕业证)#文凭成绩单#真实留信学历认证永久存档208367051
 
Call Girls in Ashok Nagar Delhi ✡️9711147426✡️ Escorts Service
Call Girls in Ashok Nagar Delhi ✡️9711147426✡️ Escorts ServiceCall Girls in Ashok Nagar Delhi ✡️9711147426✡️ Escorts Service
Call Girls in Ashok Nagar Delhi ✡️9711147426✡️ Escorts Servicejennyeacort
 
Pharmaceutical Packaging for the elderly.pdf
Pharmaceutical Packaging for the elderly.pdfPharmaceutical Packaging for the elderly.pdf
Pharmaceutical Packaging for the elderly.pdfAayushChavan5
 
毕业文凭制作#回国入职#diploma#degree澳洲弗林德斯大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree
毕业文凭制作#回国入职#diploma#degree澳洲弗林德斯大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree 毕业文凭制作#回国入职#diploma#degree澳洲弗林德斯大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree
毕业文凭制作#回国入职#diploma#degree澳洲弗林德斯大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree ttt fff
 
(办理学位证)埃迪斯科文大学毕业证成绩单原版一比一
(办理学位证)埃迪斯科文大学毕业证成绩单原版一比一(办理学位证)埃迪斯科文大学毕业证成绩单原版一比一
(办理学位证)埃迪斯科文大学毕业证成绩单原版一比一Fi sss
 
Design principles on typography in design
Design principles on typography in designDesign principles on typography in design
Design principles on typography in designnooreen17
 
Call Girls Aslali 7397865700 Ridhima Hire Me Full Night
Call Girls Aslali 7397865700 Ridhima Hire Me Full NightCall Girls Aslali 7397865700 Ridhima Hire Me Full Night
Call Girls Aslali 7397865700 Ridhima Hire Me Full Nightssuser7cb4ff
 
Top 10 Modern Web Design Trends for 2025
Top 10 Modern Web Design Trends for 2025Top 10 Modern Web Design Trends for 2025
Top 10 Modern Web Design Trends for 2025Rndexperts
 
办理(麻省罗威尔毕业证书)美国麻省大学罗威尔校区毕业证成绩单原版一比一
办理(麻省罗威尔毕业证书)美国麻省大学罗威尔校区毕业证成绩单原版一比一办理(麻省罗威尔毕业证书)美国麻省大学罗威尔校区毕业证成绩单原版一比一
办理(麻省罗威尔毕业证书)美国麻省大学罗威尔校区毕业证成绩单原版一比一diploma 1
 
group_15_empirya_p1projectIndustrial.pdf

UX Research

  • 1. UX Research 2013.5.13 InnoUX CEO 최병호 InnoUX@InnoUX.com, @ILOVEHCI
  • 2. © 2013 InnoUX & Innodesign All rights reserved. UX Research Table of Contents • Big Thinking • Definition • Case Study • Methodology - Overview - Contextual Inquiry - Diary Study - Field Study - Card Sorting - Usability Test - Remote Test - Eye Tracking - Heuristic Evaluation - Cognitive Walkthrough • Introduction to 12 Innovation Game Techniques • References
  • 5. If we interpret it, what is the truth? Distortion of facts born of bias, and strange insights…
  • 6. Could you manipulate a decision with the interviewer's cup?... If so, then what? Leading the experiment's results: a product of ignorance, or the intrusion of political intent.
  • 7. 4 Common Biases in Customer Research • Confirmation Bias • Framing Effect • Observer-expectancy Effect • Recency Bias
  • 8. Confirmation Bias Your tendency to search for or interpret information in a way that confirms your preconceptions or hypotheses.
  • 9. Framing Effect When you and your team draw different conclusions from the same data based on your own preconceptions.
  • 10. Observer-expectancy When you expect a given result from your research which makes you unconsciously manipulate your experiments to give you that result
  • 11. Recency Bias This results from disproportionate salience attributed to recent observations (your very last interview) – or the tendency to weigh more recent information over earlier observations
  • 13. If we approach UX/UI design and research with a scientist's attitude: rejecting authority, experimenting, making leaps, accepting challenges to authority, keeping an open mind?!
  • 16. Market Research User Research UX Research Customer Research
  • 38. You need to gather: • Factual information • Behavior • Pain points • Goals You can document this on the persona validation board As well as… Photos, video, audio, journals…document everything
  • 47. Types of Research Methods (plotted along Quantitative vs Qualitative and Generative vs Evaluative axes): 12 fMRI Brain Imaging, 10 Eye Tracking, 6 Lab-Based Testing, 8 Professional Heuristics, 1 Contextual Observation, 2 Remote Ethnography, 11 Online UX Concept Surveys, 9 Online Card Sorting, 7 Large Sample On-Line Behavior Testing, 5 Ergonomic Observation, 4 Focus Groups
  • 49. 48 Methodology: Contextual Observation/Ethnography ► Business problem ► How are people actually using products versus how they were designed? ► Description ► In-depth, in-person observation of tasks & activities at work or home. Observations are recorded. ► Benefits ► Access to the full dimensions of the user experience (e.g. information flow, physical environment, social interactions, interruptions, etc) ► Limitations ► Time-consuming research; travel involved, Smaller sample size does not provide statistical significance, Data analysis can be time consuming ► Data ► Patterns of observed behavior and verbatims based on participant response, transcripts and video recordings ► Tools ► LiveScribe (for combining audio recording with note-taking) Cost / respondent: Low – Moderate – High Statistical validity: None – Some – Extensive
  • 53. 52 Methodology: Remote Ethnography ► Business problem ► How are people actually using products in their environment in real time? ► Description ► Participants self-record activities over days or weeks with pocket video cameras or mobile devices, based on tasks provided by the researcher. ► Benefits ► Allows participants to capture activities as they happen and where they happen (away from computer), without the presence of observers. Useful for longitudinal research & geographically spread participants. ► Limitations ► Dependence on participant ability to articulate and record activities, Relatively high data analysis to small sample size ratio ► Data ► Patterns based on participant response, transcripts and video recordings ► Tools ► Qualvu.com Cost / respondent: Low – Moderate – High Statistical validity: None – Some – Extensive
  • 54. 53 Methodology: Large-Sample Online Behavior Tracking ► Business problem ► Major redesign of a large complex site that is business-critical? ► Description ► 200-10,000+ respondents do tasks using online tracking / survey tools ► Benefits: ► Large sample size, low cost per respondent, extensive data possible ► Limitations ► No direct observation of users, survey design complex…other issues ► Data ► You name it (data exports to professional analysis tools). ► Tools of Choice ► Keynote WebEffective, UserZoom, Cost / respondent: Low – Moderate – High Statistical validity: None – Some – Extensive
  • 55. 54 Methodology: Lab-based UX Testing ► Business problem ► Are there show-stopper (CI) usability problems with your user experience? ► Description ► 12-24 Respondents undertake structured tasks in controlled setting (Lab) ► Benefits ► Relatively fast, moderate cost, very graphic display of major issues ► Limitations ► Small sample, study design, recruiting good respondents ► Data ► Summary data in tabular and chart format PLUS video out-takes ► Tools ► Leased testing room, recruiting service and Morae (Industry Standard) Cost / respondent: Low – Moderate – High Statistical validity: None – Some – Extensive
  • 57. 56 Methodology: Eye-Tracking Business Problem Do users see critical content and in what order? Description Respondents view content on a specialized workstation or glasses. Benefits Very accurate tracking of eye fixations and pathways. Limitations Relatively high cost, analysis is complex, data can be deceiving. Data Live eye fixations, heat maps…etc. Tools of Choice Tobii - SMI Cost / respondent: Low – Moderate – High Statistical validity: None – Some – Extensive
  • 58. 57 Methodology: Automated Online Card Sorting ► Business problem ► Users cannot understand where the content they want is located ► Description ► Online card sorting based on terms you provide (or users create) ► Benefits ► Large sample size, low cost, easy to field ► Limitations ► Use of sorting tools can confuse users, data hard to understand ► Data ► Standard cluster analysis charts and more ► Tools of Choice ► WebSort…and others Cost / respondent: Low – Moderate – High Statistical validity: None – Some – Extensive
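To make the "cluster analysis charts" step above concrete, here is a minimal Python sketch (assuming numpy and scipy are installed) of one common way open card-sort results are analyzed: count how often each pair of cards is grouped together, turn that into distances, and run hierarchical clustering. The cards and sorts below are illustrative only.

```python
# Minimal sketch: hierarchical clustering of open card-sort data.
# Assumes numpy/scipy; cards and sorts are made-up example data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

cards = ["Vacation Policy", "Pay Days", "CRM Project Review", "Product Targets"]
# One entry per participant: a list of groups, each group a set of card indices.
sorts = [
    [{0, 1}, {2, 3}],
    [{0, 1}, {2}, {3}],
    [{0}, {1}, {2, 3}],
]

n = len(cards)
co_occurrence = np.zeros((n, n))
for participant in sorts:
    for group in participant:
        for i in group:
            for j in group:
                if i != j:
                    co_occurrence[i, j] += 1

similarity = co_occurrence / len(sorts)   # fraction of participants pairing i with j
distance = 1.0 - similarity
np.fill_diagonal(distance, 0.0)

Z = linkage(squareform(distance), method="average")
groups = fcluster(Z, t=0.5, criterion="distance")  # cut the tree at distance 0.5
for card, g in zip(cards, groups):
    print(f"{card} -> group {g}")
```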
  • 59. 58 Methodology: fMRI (Brain Imaging) ► Business Problem? ► What areas of the brain are being activated by UX design ► Description ► Respondents given visual stimulus while in FMRI scanner ► Benefits ► Maps design variables to core functions of the human brain ► Limitations ► Expensive and data can be highly misleading ► Data ► Brain scans ► Tools ► Major medical centers and research services (some consultants) Cost / respondent: Low – Moderate – High Statistical validity: None – Some – Extensive
  • 60. 59 Methodology: Professional Heuristics ► Business problem ► Rapid feedback on UX design based on best practices or opinions ► Definition ► “Heuristic is a simple procedure that helps find adequate, though often imperfect, answers to difficult questions (same root as: eureka)” ► Benefits ► Fast, low cost, can be very effective in some applications ► Limitations ► No actual user data, analysis only as good as expert doing audit ► Data ► Ranging from verbal direction to highly detailed recommendations ► Tools of Choice ► Written or verbal descriptions and custom tools by each experts. Cost / respondent: NA Statistical validity: None – Some – Extensive
  • 61. 60 Methodology: Focus Groups ► Business problem ► What are perceptions and ideas around products/concepts? ► Description ► Moderated discussion group to gain concept/product feedback and inputs; can include screens, physical models and other artifacts ► Benefits ► Efficient method for understanding end-user preferences and for getting early feedback on concepts , particularly for physical or complex products that benefit from hands-on exposure and explanation ► Limitations ► Lacks realistic context of use; Influence of participants on each other ► Data ► Combination of qualitative observations (like ethnographic research) with quantitative data (e.g. ratings, surveys) ► Tools ► See qualitative data analysis Cost / respondent: Low – Moderate – High Statistical validity: None – Some – Extensive
  • 62. A / B Testing What A testing procedure in which two (or more) different designs are evaluated in order to see which one is the most effective. Alternate designs are served to different users on the live website. Why Can be valuable in refining elements on a web page. Altering the size, placement, or color of a single element, or the wording of a single phrase can have dramatic effects. A / B Testing measures the results of these changes. Resources A/B testing is covered in depth in the book Always Be Testing: The Complete Guide to Google Website Optimizer by Bryan Eisenberg and John Quarto-von Tivadar. http://www.testingtoolbox.com/ You can also check out the free A/B testing tool Google Optimizer. https://www.google.com/analytics/siteopt/pr eview A / B Testing http://www.flickr.com/photos/danielwaisberg/
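As a hedged illustration of how A/B results are usually judged, the sketch below runs a two-proportion z-test on made-up conversion counts; it assumes the statsmodels package, and any equivalent proportions test would serve the same purpose.

```python
# Minimal sketch: significance test for an A/B test on conversion counts.
# The numbers are invented; statsmodels is assumed to be installed.
from statsmodels.stats.proportion import proportions_ztest

conversions = [132, 168]   # variant A, variant B
visitors = [2400, 2350]

z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# A small p-value (commonly < 0.05) suggests the difference between the two
# designs is unlikely to be due to chance alone.
```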
  • 63. Kano Analysis What Survey method that determines how people value features and attributes in a known product domain. Shows what features are basic must-haves, which features create user satisfaction, and which features delight. Why Allows quantitative analysis of feature priority to guide development efforts and specifications. Ensures that organization understands what is valued by users. Less effective for new product categories Kano Analysis
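As an illustration, the sketch below classifies one respondent's answers using a common form of the Kano evaluation table (functional question: "How would you feel if the feature were present?"; dysfunctional question: "...if it were absent?"). The exact table and answer wording vary between practitioners, so treat this as an assumption rather than a standard.

```python
# Minimal sketch: mapping Kano questionnaire answer pairs to categories.
# The lookup table is one common variant of the Kano evaluation table.
ANSWERS = ["like", "expect", "neutral", "tolerate", "dislike"]

KANO_TABLE = {
    # dysfunctional answer:  like          expect         neutral        tolerate       dislike
    "like":     ["Questionable", "Attractive",  "Attractive",  "Attractive",  "Performance"],
    "expect":   ["Reverse",      "Indifferent", "Indifferent", "Indifferent", "Must-be"],
    "neutral":  ["Reverse",      "Indifferent", "Indifferent", "Indifferent", "Must-be"],
    "tolerate": ["Reverse",      "Indifferent", "Indifferent", "Indifferent", "Must-be"],
    "dislike":  ["Reverse",      "Reverse",     "Reverse",     "Reverse",     "Questionable"],
}

def kano_category(functional: str, dysfunctional: str) -> str:
    """Map one respondent's (functional, dysfunctional) answer pair to a category."""
    return KANO_TABLE[functional][ANSWERS.index(dysfunctional)]

print(kano_category("like", "dislike"))    # Performance (one-dimensional)
print(kano_category("expect", "dislike"))  # Must-be
```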
  • 64. Six Thinking Hats What A tactic that helps you look at decisions from a number of different perspectives. The white hat focuses on data; the red on emotion; the black on caution; the yellow on optimism; the green on creativity; and the blue on process. Why Can enable better decisions by encouraging individuals or teams to abandon old habits and think in new or unfamiliar ways. Can provide insight into the full complexity of a decision, and highlight issues or opportunities which might otherwise go unnoticed. Resources Lateral thinking pioneer Edward de Bono created the Six Thinking Hats method. http://www.edwdebono.com/ An explanation from Mind Tools. http://www.mindtools.com/pages/article/newTED_07.htm Six Thinking Hats http://www.flickr.com/photos/daijihirata/
  • 68. What is Ethnography? • Defined as: – a method of observing human interactions in social settings and activities (Burke & Kirk, 2001) – as the observation of people in their ‘cultural context’ – the study and systematic recording of human cultures; also : a descriptive work produced from such research (Merriam-Webster Online) • Rather than studying people from the outside, you learn from people from the inside
  • 69. (Anderson, 1997; Malinowski, 1967; 1987; Kuper, 1983) Who Invented Ethnography? • Invented by Bronislaw Malinowski in 1915 – Spent three years on the Trobriand Islands (New Guinea) – Invented the modern form of fieldwork and ethnography as its analytic component
  • 70. (Salvador & Mateas, 1997) Traditional VS Design Ethnography Traditional • Describes cultures • Uses local language • Objective • Compare general principles of society • Non-interference • Duration: Several Years Design • Describes domains • Uses local language • Subjective • Compare general principles of design • Intervention • Duration: Several Weeks/Months
  • 71. Contextual inquiry is a field data-gathering technique that studies a few carefully selected individuals in depth to arrive at a fuller understanding of the work practice across all customers. Through inquiry and interpretation, it reveals commonalities across a product’s customer base. What is contextual inquiry? ~ Beyer & Holtzblatt
  • 72. Contextual Inquiry: When to do it Every ideation and design cycle should start with a contextual inquiry into the full experience of a customer and his/her. Contextual inquiry clarifies and focuses the problems a customer is experiencing by discovering the • Precise situation in which the problems occur. • What the problem entails. • How customers go about solving them.
  • 73. What is your focus? Who is your audience? Recruit & schedule participants Learn what your users do Develop scenarios Conduct the inquiry Interpret the results Evangelize the findings Rinse, repeat (at least monthly) Contextual Inquiry: How to do it
  • 74. (Nielsen, 2002) Dos & Don’ts Don’t • Ask simple Yes/No questions • Ask leading questions • Use unfamiliar jargon • Lead/guide the ‘user’ Do • Ask open-ended questions • Phrase questions properly to avoid bias • Speak their language • Let user notice things on his/her own
  • 75. Analyzing the results “The output from customer research is not a neat hierarchy; rather, it is narratives of successes and breakdowns, examples of use that entail context, and messy use artifacts” Dave Hendry 74
  • 76. Research Analysis What are people's values? People are driven by their social and cultural contexts as much as their rational decision-making processes. What are the mental models people build? When the operation of a process isn't apparent, people create their own models of it. What are the tools people use? It is important to know what tools people use since you are building new tools to replace the current ones. What terminology do people use to describe what they do? Words reveal aspects of people's mental models and thought processes. What methods do people use? Flow of work is crucial to understanding what people's needs are and where existing tools are failing them. What are people's goals? Understanding why people perform certain actions reveals an underlying structure of their work that they may not be aware of themselves.
  • 77. Affinity Diagrams “People from different teams engaged in affinity diagramming is as valuable as tequila shots and karaoke. Everyone develops a shared understanding of customer needs, without the hangover or walk of shame” 76
  • 78. Research Analysis: Affinity Diagrams Creates a hierarchy of all observations, clustering them into themes. From the video observations, 50-100 singular observations are written on post-its (observations ranging from tools, sequences, interactions, work-arounds, mental models, etc) With entire team, notes are categorized by relations into themes and trends.
  • 83. Users record thoughts, comments, etc. over time http://www.flickr.com/photos/vanessabertozzi/877910821 http://www.flickr.com/photos/yourdon/3599753183/ http://www.flickr.com/photos/stevendepolo/3020452399/ http://www.flickr.com/photos/jevnin/390234217/ Interview users Gather feedback, data Organise and analyse (affinity maps, analytics)
  • 84. Participants keep a record of “When” data Date & time Duration Activity / task “What" data Activity / task Feelings / mood Environment / setting
  • 85. No one right way to collect data Structured Yes/no Select a category Date & time Multiple choice Unstructured Open-ended Opinions / thoughts / feelings Notes / comments Combine / mix & match http://www.flickr.com/photos/roboppy/9625780/ http://www.flickr.com/photos/vanessabertozzi/877910821
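One way to keep the structured and unstructured parts of a diary entry side by side is a small record type. The sketch below is a possible schema only; the field names and the example entry are assumptions, not a prescribed format.

```python
# Minimal sketch: a diary-study entry combining structured "when/what" fields
# with open-ended notes. Field names and the example entry are illustrative.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class DiaryEntry:
    participant_id: str
    timestamp: datetime            # "when": date & time
    duration_minutes: int          # "when": duration
    activity: str                  # "what": activity / task
    mood: Optional[str] = None     # "what": feelings / mood
    setting: Optional[str] = None  # "what": environment / setting
    notes: str = ""                # unstructured: open-ended comments

entry = DiaryEntry(
    participant_id="P03",
    timestamp=datetime(2013, 5, 2, 8, 15),
    duration_minutes=10,
    activity="Checked account balance on phone",
    mood="rushed",
    setting="commuting",
    notes="App felt slow on 3G; gave up after two tries.",
)
print(entry)
```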
  • 87. “Hygiene” aspects At the beginning •Introduction / get-to-know-you •Demographics & psychographics, profiling •Instructions / Setting expectations At the end •Follow-up •Thanks / token gift •Reflection
  • 88. Pitfalls • Belief bias • Behavior adjustment • Ramp-up time • Failure to recall
  • 100. Usability Tests Start new test Identify 3-5 tasks to test Observe test participants performing tasks Identify the 2-3 easiest things to fix Make changes to site
  • 101. Identify Tasks for the Test Known problem areas Most common activities Popular pages New pages or services
  • 102. Recommended Gear Computer (or paper prototype) Screen recording software (or observer) Microphone (or observer)
  • 104. Staff of One Recruits test participants Runs the test Records the test (screen recording software & mic) Preps test environment before & after test
  • 105. Staff of Two Recruits test participants Runs the test #1 Observes the test Preps test environment before & after each test #2
  • 106. Working Through Tasks Stick to the script Encourage participant to speak aloud Don’t help the participant
  • 108. New technologies and techniques allow for Remote: – Moderated testing – Unmoderated testing – Observation Irrelevance of Place
  • 109. Remote Moderated Testing Products like GotoMeeting allow connections to the test (or observation) computer to the Internet. VoIP can carry voice cheaply. LiveMeeting WebEx GoToMeeting For screen VoIP Audio Skype GoogleTalk Translator Moderator Participant Observers
  • 110. Remote Unmoderated Testing ‘Task-based’ Surveys > Online/remote Usability Studies (unmoderated) > Benchmarking (competitive/comparison) > UX Dashboards (measure ROI) Online Card Sorting > Open or closed > Stand alone or > Integrated with task-based studies & surveys Online Surveys > Ad hoc research > Voice of Customer studies > Integrated with Web Analytics data User Recruiting Tool > Intercept real visitors (tab or layer) > Create your own private panel > Use a panel provider* Robust Set of Services
  • 111. 110 • Saves time o Lab study takes 2-4 weeks from start to finish, unmoderated typically takes hours to a few days* • Saves money o Participants compensation typically a lot less ($10 vs. $100) o Tools are becoming very inexpensive • Reliable metrics o Only (reasonable) way to collect UX data from large sample sizes • Geography is not a limitation o Collect feedback from customers all over the world • Greater Customer insight o Richest dataset about the customer experience Why Should You Care?
  • 112. 111 Common Research Questions: • What are the usability issues, and how big? • Which design is better, and by how much? • How do customer segments differ? • What are user design preferences? • Is the new design better than the old design? • Where are users most likely to abandon a transaction? Types of Studies: • Comprehensive evaluation • UX benchmark • Competitive evaluation • Live site vs. prototype comparison • Feature/function test • Discovery Overview Typical Metrics: • Task success • Task time • Self-report ratings such as ease of use, confidence, satisfaction • Click paths • Abandonment
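To show how these typical metrics are often summarized per task, here is a minimal pandas sketch; the column names and data are invented purely for illustration.

```python
# Minimal sketch: per-task summary of unmoderated-study metrics
# (task success, task time, self-report ease rating). Data is invented.
import pandas as pd

df = pd.DataFrame({
    "task":    ["find_atm", "find_atm", "contact_us", "contact_us"],
    "success": [1, 0, 1, 1],            # 1 = completed, 0 = failed/abandoned
    "seconds": [42.0, 95.0, 30.5, 28.0],
    "ease":    [6, 3, 7, 6],            # e.g. 1-7 self-report rating
})

summary = df.groupby("task").agg(
    completion_rate=("success", "mean"),
    median_time_s=("seconds", "median"),
    mean_ease=("ease", "mean"),
)
print(summary)
```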
  • 113. 112 Full Service Tools – Online Usability Testing
  • 114. 113 Card Sorting / IA Tools
  • 121. STEPS IN A CARD SORT 1. Decide what you want to learn 2. Select the type of Card Sort (open vs closed) 3. Choose Suitable Content 4. Choose and invite participants 5. Conduct the sort (online or in-person) 6. Analyze Results 7. Integrate results
  • 122. WHAT ARE YOU WANTING TO LEARN? • New Intranet vs Existing? • Section of Intranet? • Whole organization vs single department? • For a project? For a team?
  • 123. OPEN VS CLOSED. Open sort: participants group the cards (Product Targets, CRM Project Review, CRM Organization Chart, Christmas Party, Walkathon Results, Year in Review Meeting, Vacation Policy, Pay Days, Vacation request form) and name the groups themselves. Closed sort: participants place the same cards into predefined categories (the example categories in the diagram include Company News, Departments, Events, Human Resources, Projects).
  • 124. SELECTING CONTENT Do’s •30 – 100 Cards •Select content that can be grouped •Select terms and concepts that mean something to users Don’ts • More than 100 cards • Mix functionality and content • Include both detailed and broad content
  • 126. LOOK AT • What groups were created • Where the cards were placed • What terms were used for labels • Organization scheme used • Whether people created accurate or inaccurate groups
  • 127. INTEGRATE RESULTS: CREATE YOUR IA Our Company Executive Blog New York Vancouver Mission and Values Projects Project Name 1 Project Name 2 Project Name 3 Project Name 4 Departments Executive Operations Operations Support Vessel Planning Yard Planning Rail Planning Finance & Administration Human Resources Corporate Communications IT Community & Groups Events Charitable Campaigns Vancouver Carpool Employee Resources Vacation & Holidays Expenses Travel Health & Safety Wellness Benefits Facilities Payroll Communication Tools Centers of Excellence Project Management Professionals Engineering Terminal Technologies NAVIS Lawson IT Yard Planning
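A fragment of the resulting IA can be captured as a simple nested structure, which is convenient for handing to developers or feeding into a tree-testing tool. Only a few branches of the example above are shown, and this representation is just one possible choice.

```python
# Minimal sketch: part of the example intranet IA as a nested dict.
# Only a handful of branches are included, purely for illustration.
ia = {
    "Our Company": ["Executive Blog", "New York", "Vancouver", "Mission and Values"],
    "Projects": ["Project Name 1", "Project Name 2", "Project Name 3", "Project Name 4"],
    "Employee Resources": ["Vacation & Holidays", "Expenses", "Travel",
                           "Health & Safety", "Wellness", "Benefits",
                           "Facilities", "Payroll"],
}

for section, pages in ia.items():
    print(section)
    for page in pages:
        print("  -", page)
```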
  • 130. Card sorting is as common as lab-based usability testing. Source: 2011 UXPA Salary Survey
  • 131. Terms & Concepts • Open Sort: Users sort items into groups and give the groups a name. Closed Sort: Users sort items into previously defined category names. • Reverse Card Sort (Tree Test) : Users are asked to locate items in a hierarchy (no design) • Most Users Start Browsing vs Searching: Across 9 websites and 25 tasks we found on average 86% start browsing http://www.measuringusability.com/blog/card-sorting.php http://www.measuringusability.com/blog/search-browse.php
  • 133. Set-up of an eye tracking test User tests are often run in 45 to 60 minute sessions with 6 to 15 participants: 1. Participants are given a number of typical tasks to complete, using the website, design or product you want to test. 2. The user's intuitive interaction is observed; comments and reactions are recorded. 3. The participant's impressions are captured in an interview at the end of the test.
  • 134. Eye tracking results: Heatmaps Heatmaps show what participants focus on. In this example, 'hot spots' are the picture of the shoes, the central entry field and the two right-hand tiles underneath. The data of all participants is averaged in this map.
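For a sense of how such heatmaps are produced, the sketch below turns raw fixations (x, y in pixels, duration in ms) into a duration-weighted, Gaussian-smoothed map. It assumes numpy, scipy and matplotlib; in practice the result is overlaid on a screenshot of the page, and all values here are illustrative.

```python
# Minimal sketch: build a fixation heatmap from (x, y, duration) samples.
# Assumes numpy/scipy/matplotlib; coordinates and durations are invented.
import numpy as np
from scipy.ndimage import gaussian_filter
import matplotlib.pyplot as plt

width, height = 1280, 800
fixations = [(640, 300, 420), (655, 310, 380), (200, 700, 150)]  # (x, y, ms)

grid = np.zeros((height, width))
for x, y, duration in fixations:
    grid[y, x] += duration                 # weight each fixation by its duration

heatmap = gaussian_filter(grid, sigma=40)  # spread fixations over nearby pixels
plt.imshow(heatmap, cmap="hot")            # overlay on a page screenshot in practice
plt.axis("off")
plt.savefig("heatmap.png", dpi=150)
```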
  • 135. Eye tracking results: Gazeplot Gaze plots show the 'visual path' of individual participants. Each bubble represents a fixation. The bubble size denotes the length or intensity of the fixation. Additional results are available in table format for more detailed analysis.
  • 136. The key visual and a box at the bottom Note: Telstra Clear have since re-designed their homepage. The key visual got lots of attention. Surprising: this box got heaps of attention. It reads: "If you are having trouble getting through to us on the phone, please click here to email us, we'll get back to you within 2 business days". Participants got the impression that Telstra Clear has trouble with their customer service. The main navigation and its options got almost no attention.
  • 137. The Face effect: an example (bunnyfoot) Yep, there's attention on certain… areas, … the face, however, is the strongest point of focus!
  • 138. Using the Face effect (humanfactors.com) Eye tracking results for ad Version A: • We see a face effect: the model's face draws a lot of attention. • The slogan is the other hot spot of the design; participants will likely have read it. • The product and its name get some, but not a lot of attention.
  • 139. Using the Face effect (humanfactors.com) Eye tracking results for ad Version B: • Again, we see a strong face effect. BUT: in this version, the model's gaze is in line with the product and its name. • The product image and name get considerably more attention! • Additionally, even the product name at the bottom is noticed by a number of participants.
  • 140. Ways to focus attention (usableworld.com.au) Same effect: if the baby faces you, you'll look at the baby. But if the baby faces the ad message, you pay attention to the message. You basically follow the baby's gaze.
  • 141. Banner blindness … or are they? In this test, participants were given a task: find the nearest ATM. Participants focused on the main navigation and the footer navigation; this is where they found the 'ATM locator'. So, when visiting a site with a task in mind, as you normally do, the central banner can be ignored!
  • 142. Compare the visual paths: Task versus browse When browsing, the central banner gets lots of attention. But how often do you visit a bank website just to browse? Participant was asked just to look at the homepage. Participant was given a task ('Find the nearest ATM').
  • 143. Main focus: Navigation options Eye tracking results show: when looking for something on a website, the main focus of attention is the navigation options. Maybe users have learned that they're unlikely to find what they're looking for in a central banner image. Task: 'What concerts are happening in Auckland this month?' Task: 'You want to send an email to customer service'
  • 144. Task: 'You want to get in touch with customer service' When do users look at banners? In this example, participants looked at the banner even though they were looking for something specific. What's different? Participant was asked just to look at the homepage.
  • 147. 1. Visibility of system status 2. Match between system and real world 3. User control and freedom 4. Consistency and standards 5. Error prevention 6. Recognition rather than recall 7. Flexibility and efficiency of use 8. Aesthetic and minimalist design 9. Help users recognize, diagnose, and recover from errors 10. Help and documentation J. Nielsen and R. Mack, eds. Usability Inspection Methods, 1994 Nielsen’s 10 heuristics Slide 146
  • 149. HE output Slide 148 • A list of usability problems • Tied to a heuristic or rule of practice • A ranking of findings by severity • Recommendations for fixing problems • Oh, and the positive findings, too
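A lightweight way to capture this output so findings can be ranked by severity is sketched below; the 0-4 severity scale follows Nielsen's convention, while the record fields and example findings are assumptions for illustration.

```python
# Minimal sketch: recording heuristic-evaluation findings and ranking them
# by severity (0 = not a problem ... 4 = usability catastrophe).
from dataclasses import dataclass

@dataclass
class Finding:
    description: str
    heuristic: str        # which of the 10 heuristics it relates to
    severity: int         # 0-4 severity rating
    recommendation: str

findings = [
    Finding("No feedback after form submission",
            "Visibility of system status", 3,
            "Show a confirmation message and the next steps."),
    Finding("Jargon in error messages",
            "Match between system and real world", 2,
            "Rewrite errors in the user's language."),
]

for f in sorted(findings, key=lambda f: f.severity, reverse=True):
    print(f"[{f.severity}] {f.heuristic}: {f.description} -> {f.recommendation}")
```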
  • 150. HE sample findings page (usability.spsu.edu, UPA 2011)
  • 151. HE sample finding, severity rating 3 (applies to H = Hyperspace, C = Cardiac Arrest, S = Shock). Finding: Hyperspace, Shock, and Cardiac Arrest all require more clearly defined goals and objectives. Description: objectives/goals for the modules; reason content is being presented; conciseness of presentation; definitions required to work with the module/content; evaluation criteria and methods; direct tie between content and assessment measure; sequence of presentation follows logically from introduction; quizzes challenge users. Recommendation: develop a consistent structure that defines what's noted in the points above; avoid generic statements that don't focus users on what they will be accomplishing; advise that there is an assessment used for evaluation and indicate whether it is at the end or interspersed in the module; connect ideas in the goals and objectives with outcomes in the assessment; follow the order of presentation defined at the beginning; develop interesting and challenging questions; re-frame goals/objectives at the end of the module.
  • 160. http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game Prune the Product Tree (sets the direction the product will grow): thick branches = major capabilities; leaves near the trunk = existing features; outer leaves = new features; beyond the boundary = the future; the roots = customer support and the service infrastructure. Focus on balanced growth: the branches should grow as much as the roots.
  • 162. http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game Remember the Future (sets the direction the product will grow): remember the future from 2013 to 2015 and ask, "Which features did our product provide that made you satisfied?" (marked on a timeline).
  • 164. http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game Buy a Feature (sets the direction the product will grow): put prices on candidate features; customers negotiate the purchase of the next release with play money.
  • 165. http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game My Day (understand how customers use and relate to the product and service): customers describe when they need the product; observe how the product supports or interrupts their day.
  • 167. http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game Me and My Shadow (understand how customers use and relate to the product and service): observe by becoming the customer's shadow; bring other customers along to act as interpreters.
  • 168. http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game The Apprentice (understand how customers use and relate to the product and service): experience the work first-hand at the customer's site.
  • 169. http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game Show and Tell (understand how customers use and relate to the product and service): like a child showing off a treasure in class.
  • 170. http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game Spider Web (understand how customers use and relate to the product and service): draw both your own and competitors' products and map the relationships (use different line colors, thicknesses and shapes to mark the important relationships); capture when, how and why each is used.
  • 172. http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game Speed Boat (identify the features the product and service should provide): identify the anchors that keep the boat from speeding ahead; customers write what they dislike about the product on index cards and place them as anchors under the boat, then add their agreement to anchors placed by other customers.
  • 175. http://www.slideshare.net/Enthiosys/collaborating-with-customers-using-innovation-game Product Box: ask customers to design the box of a product they would buy themselves; the box can include any information such as marketing slogans, pictures and price; customers then sell the box they made to other customers.
  • 177. © 2013 InnoUX & Innodesign All rights reserved.UX 리서치 인용/참조 문헌 • Ethnography (Santosh Bhandari, Mar 29, 2013) • Cultural probes in real life (gerrygaffney, Jun 11, 2012) • UX of User Stories Workshop (Anders Ramsay, Aug 14, 2012) • Usability behaviors: Usability and the SDLC (Ted Tschopp, Nov 04, 2012) • Know Thy User: The Role of Research in Great Interactive Design (frog, Sep 2012) • The Mobile Frontier (Rachel Hinman, Feb 2012) • Introduction to AgileUX: Fundamentals of Customer Research (Will Evans, Jan 2012) • Customer Research & Persona Development (Will Evans, Oct 2012) • Introduction to UX Research: Conducting Focus Groups (Will Evans, Jan 2012) • Midwest UX '12: Mapping the Experience (Chris Risdon, Jun 2012) • Eye Tracking & User Research (Optimal Usability, Apr 2012) • Taking it to the streets: Investigating mobile and web UX through field studies (Emma Rose, Jun 2012) • NYTECH "Measuring Your User Experience Design“ (New York Technology Council, Mar 2012) • How to Conduct UX Benchmarking (UserZoom, May 2012) • Customer validation with Diary Studies (Boon Chew, Jan 2012) • The Science of Great Site Navigation: Online Card Sorting + Tree Testing (UserZoom, Jul 2012) • Introduction to Card Sorting (ThoughtFarmer, Sep 2012) • Usability Testing Basics (Stephen Francoeur, Mar 2012) • Storytelling: Rhetoric of heuristic evaluation (Southern Polytechnic State University, Mar 2012) • Cognitive and pluralistic (Aarushi Mishra, Oct 2012) • How to Quantitatively Measure Your User Experience (Richard Dalton, May 2012) • Information Architecture Heuristics (Abby Covert, Jul 24, 2011) • Diary Studies in HCI & Psychology (UPABoston, Jul 13, 2011) • Remote Testing Methods & Tools Webinar (UserZoom, Dec 2011) • Beyond User Research (Louis Rosenfeld, Mar 2011) • Using Ethnographic User Research to Drive Knowledge Management and Intranet Strategy (NavigationArts, Dec 01, 2010) • Remote Research, The Talk. (bolt peters, May 27, 2010) • User Interview Techniques (Liz Danzico, May 2010) • The new digital ethnographer’s toolkit: Capturing a participant’s lifestream (Christopher Khalil, Sep 04, 2009) • Design Research For Everyday Projects - UX London (leisa reichelt, Jun 2009) • Contextual Inquiry V1 (Rajesh Jha, Sep 11, 2008) 176

Editor's notes

  1. Business question: Is anyone working on a major (large-scale) site launch or redesign that your company depends on for survival?
  2. Audience question: "How many of you have a project at the point where it is ready for a major commitment (round A, new release, major new upgrade)?" The scenario: I have a web site, software, or product, and I am about to commit major funding or resources to the next phase of development. Do I have usability problems with the user experience that are basically show stoppers? For example: users cannot download the application, users cannot log in, users cannot set up a profile page, users cannot navigate to critical content. Typically 1-3 critical tasks in 60 min.
  3. Business question: How are users actually viewing your content (in what order, for how long, and in what specific patterns or pathways)? Audience question: Have you wondered whether critical links, buttons, or content messaging is being viewed on a critical page? Description: this methodology is very useful when trying to determine why certain homepage metrics from analytics programs are of concern (e.g. users not clicking on a value-proposition element). The respondent sits at a specialized computer screen and undergoes a simple calibration sequence; the respondent is given a stimulus question or task (active or passive, e.g. viewing the homepage for a set period such as 15 seconds); the system tracks eye pathways and fixations and produces a data file for that task. Important things to know about eye-tracking: the equipment (e.g. Tobii) is not designed for changing visual stimuli, which makes testing of web navigation (moving from page to page) complex to analyze accurately. It is very effective for single-stimulus presentations of fixed duration, excellent for detailed analysis of home pages or critical landing pages and forms, and very insightful for assessing the impact of advertising on homepage visual scanning.
  4. Business problem: How do I organize information (content, navigation, overall IA) so that users understand it? Description: this is an automated version of the classic card-sorting study, where you give users a pile of index cards with your content descriptions on them and ask them to sort the cards into groups according to how they relate. Example: suppose you have a site selling women's underwear and you want to create a navigation structure that matches users' mental models. Do you organize the site by type of underwear at the top level and then by style, color, and price, or do you organize the navigation by lifestyle (athletic, everyday, intimate) and then by type of article, color, and price? Respondents are invited to an online study via email. They see a screen with a list of labels or terms in one column and are asked to sort the terms into groups they find organizationally relevant; when they are finished, you can give them another card sort or end the study. Once the required number of respondents have finished, you can view the data. Card-sorting data is analyzed through cluster analysis (not that easy to understand, but very useful).
  5. Business question: Do any of you have a new development team with minimal UX/usability experience? Is your team employing best practices, and is it aware of the key UX and usability performance issues an effective solution must meet? Description: a highly experienced usability / UI design expert conducts a structured audit of your system or product and rates it on best practices and estimated performance. Interview and select an expert who has direct experience in your product category and sector. The expert gathers information from your development team, conducts a structured audit based on predetermined best practices, and presents findings to your team (sometimes not a happy experience for UX design teams without knowledge of formal UCD methods). Very effective early in development, and can be repeated with updates at less cost.
  6. Pattern name: A/B Testing. Classification: Continuous Improvement. Intent: can be valuable in refining elements on a web page. Altering the size, placement, or color of a single element, or the wording of a single phrase, can have dramatic effects; A/B testing measures the results of these changes.
  7. Pattern name: Kano Analysis (also known as the Kano Model). Classification: Business Requirements Management. Intent: allows quantitative analysis of feature priority to guide development efforts and specifications, and ensures the organization understands what is valued by users; less effective for new product categories. Motivation: you need to categorize features into basic must-haves, features that create satisfaction, and features that delight. Applicability: you have a list of business requirements but know you will not get everything done in the current phase; you are using a cyclical methodology and need to know which features users see as basic must-haves, which will excite them, and which are low impact. In any given release you will want to include at least one delightful feature, and in the first release as many must-have features as possible; use Kano Analysis to identify which are which. Participants: potential users, surveyor. Consequences: this tool tells you about user perceptions; remember this limitation, as you might want to measure something else as well. Implementation: a survey method that determines how people value features and attributes in a known product domain, showing which features are basic must-haves, which create user satisfaction, and which delight.
  8. Pattern name: Six Thinking Hats. Classification: Business Requirements Management. Intent: can enable better decisions by encouraging individuals or teams to abandon old habits and think in new or unfamiliar ways, and can provide insight into the full complexity of a decision, highlighting issues or opportunities that might otherwise go unnoticed.
  9. Note: give an example here
  10. Usability tests are really not such a big deal. Here's a quick overview of the steps: come up with a set of 3-5 different tasks that you'll ask users to perform; round up 5-10 volunteers to act as test participants, and bring them one at a time into a testing area where you observe them performing the predetermined tasks. After you've observed all the test participants, you'll have a pretty good idea of what needs to be fixed and what seems to be working OK. Make the easiest 2-3 fixes, then go back and do another round of testing and tweaking, and so on.
  11. OK, so now that you have an idea of what service or resource you're going to test, think about what actual tasks you want your test participants to do. Pick tasks that will reveal useful information. One obvious place to look for tasks is the pages or services that you and your colleagues already know need work, such as your interlibrary loan form or the way library hours are displayed. Another strategy is to think about the most common activities among patrons in your library: look at your site statistics to see which pages are most popular, and consider testing there. Or maybe you're about to launch a new page or service; those are great opportunities for testing.
  12. OK. So the gear you need is not too complicated. You'll need a computer; a desktop or a laptop will do. Last year, I had test participants use my smartphone when I was testing a mobile web site. If you really want to get serious about user-centered design, you may want to do usability testing on paper sketches that precede any actual website coding; this is perfectly acceptable and commonly done, and a great way to surface basic page layout and site architecture problems. You'll also want to install screen recording software on the computer your test participants use, so you can capture as a movie all the mouse movements, page clicks, and characters typed; this is really rich data to return to when the tests are done and you are writing up your report. I'll talk in a minute about software options. Another option that has worked for me is simply to have a second person on hand helping with the test, whose sole responsibility is to closely observe the test participant and take detailed notes. Finally, if you have screen recording software, you might as well get a USB microphone to capture the conversation between the test participant and the test facilitator. Encourage the participant to think aloud as much as possible as they perform tasks.
  13. Here are five options for screen recording software. I've used CamStudio a lot, mostly because it is free and can be installed on any machine. With the others, you'll get a much richer feature set but will be limited by the number of machines you can install them on.
  14. OK, so if you are doing the tests all by your lonesome (not the best situation but certainly still doable), you'll be in charge of recruiting test participants, running the test, recording the test (you'll definitely need screen recording software and a mic), and prepping the test environment.
  15. If you can get another person to help you out with the testing, you can break up the tasks in rational ways.
  16. It’s essential that you ask the participant to speak aloud so you can hear them express any frustrations or surprises they’ve had.
  17. Saves time – Very fast, thousands on panels, Money – essence of quick and dirty. Techniques for dealing with noise, unrealistic to be in the lab that long. Combines both qual/quant and attitudes and behavior.
  18. All the flexibility you need to set up a study and analyze the data, with significant support in study design and analysis. Pricing is all project-based and typically very expensive; a good choice for a large benchmark study.
  19. - Sort into groups
  20. OPEN SORT: good for getting ideas on how content might be grouped. CLOSED SORT: useful to see where people would put the content.
  21. Card sorting as a method in HCI largely took off during the internet boom of the late 1990’s with the proliferation of website navigation.
  22. Today it’s one of the most popular methods UX professionals use. In fact, practitioners report using Card Sorting as frequently as task oriented lab-based usability testing.
  23. This effect can be used to direct attention, for example on an ad. Here two different versions of an ad were eye-tracked. In this case, the model is looking directly at the viewer.
  24. And in this version, the model looks at the product, forming a straight line between her eye and the product name on the package.
  25. Using the cards post-task or post-test: the participant walks the table and chooses, then returns to discuss the meaning. Log comments for later analysis.