5. Why can’t we get findability right?
• Semantic illiteracy
• Siloed organizations
• Ill-equipped decision-makers prone to short-term thinking
• We don’t know how to diagnose
• We don’t know how to measure
6. Information architecture: 8 better practices for findability
1. Diagnosing the important problems
2. Balancing our evidence
3. Designing for the long term
4. Measuring engagement
5. Supporting contextual navigation
6. Improving search across silos
7. Combining design approaches effectively
8. Tuning our designs over time
8. A little goes a long way: a handful of queries/tasks/ways to navigate/features/documents meet the needs of your most important audiences
9. Not all queries are distributed equally
11. Nor do they diminish gradually
13. The 80/20 rule isn’t quite accurate
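The skewed distribution described above can be made concrete by computing cumulative coverage over a query log. A minimal sketch, using entirely hypothetical queries and volumes, shows how a handful of distinct queries can account for most search traffic:

```python
from collections import Counter

# Hypothetical search-log sample: one entry per query submitted.
log = (["checkout"] * 40 + ["pricing"] * 25 + ["login"] * 15 +
       ["returns"] * 8 + ["api docs"] * 5 + ["careers"] * 4 +
       ["gdpr", "sitemap", "logo"])  # the long tail

counts = Counter(log).most_common()          # distinct queries, most frequent first
total = sum(n for _, n in counts)

# Cumulative share of traffic covered by the top-k distinct queries.
covered = 0
for k, (query, n) in enumerate(counts, start=1):
    covered += n
    print(f"top {k:2d} queries cover {covered / total:.0%} of searches")
```

Here the top 3 of 9 distinct queries cover 80% of searches, though as the slides note, real logs rarely match the 80/20 rule exactly.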
29. Balanced research leads to true insight, new opportunities
from Christian Rohrer: http://is.gd/95HSQ2
30. Lou’s TABLE OF OVERGENERALIZED DICHOTOMIES
What they analyze:
  Web Analytics: users’ behaviors (what’s happening)
  User Experience: users’ intentions and motives (why those things happen)
What methods they employ:
  Web Analytics: quantitative methods to determine what’s happening
  User Experience: qualitative methods for explaining why things happen
What they’re trying to achieve:
  Web Analytics: helps the organization meet goals (expressed as KPIs)
  User Experience: helps users achieve goals (expressed as tasks or topics of interest)
How they use data:
  Web Analytics: measure performance (goal-driven analysis)
  User Experience: uncover patterns and surprises (emergent analysis)
What kind of data they use:
  Web Analytics: statistical data (“real” data in large volumes, full of errors)
  User Experience: descriptive data (in small volumes, generated in lab environment, full of errors)
40. The missing metrics of engagement
• Orientation (“What can I do here?”)
• Authority (“I trust this”)
• Social (“Who else likes this?”)
• Connection/cross-promotion (“What goes with this?”)
• and many more...
41. Conversation architecture uncovers levels of engagement
Level 0: I visit site (unauthenticated)
Level 1: I ask site a question (e.g., a search)
Level 2: Site asks me a question (“can we save those settings?”)
Level 3: Site suggests something to me (“you might also like this”)
Level 4: Site acts on my behalf (“I’ve added this to your favorites list in case you’d like to reorder”)
Trust and value grow progressively.
44. 1. Choose a content type (e.g., events)
2. Ask: “Where should users go from here?”
3. Analyze the frequent queries from this content type
from aiga.org
45.
Analyze frequent queries generated from each content sample
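One way to sketch this analysis is to tally queries by the content type of the page they were issued from; the frequent queries for each type then suggest contextual-navigation links. The log rows, content types, and query strings below are hypothetical:

```python
from collections import Counter, defaultdict

# Hypothetical log rows: (content type of the page searched from, query).
searches = [
    ("event", "buy tickets"), ("event", "venue map"),
    ("event", "buy tickets"), ("event", "speaker bios"),
    ("article", "related articles"), ("article", "author"),
]

# Tally query frequency separately for each content type.
by_type = defaultdict(Counter)
for content_type, query in searches:
    by_type[content_type][query] += 1

# Frequent queries per content type suggest "where users want to go from here".
for content_type, counter in by_type.items():
    print(content_type, counter.most_common(3))
```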
46. Content models emerge (example: BBC): concert calendar, album pages, artist descriptions, TV listings, album reviews, discography, artist bios
52. ...convert “advanced” search into refinement
Search session patterns:
1. solar energy → 2. how solar energy works
1. solar energy → 2. energy
1. solar energy → 2. solar energy charts
1. solar energy → 2. explain solar energy
1. solar energy → 2. solar energy news
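A rough way to spot such patterns in a search log is to flag consecutive queries in a session that share terms. This is a simplified sketch only (plain term overlap; real refinement detection would handle stemming, stopwords, and synonyms):

```python
# Classify a consecutive query pair within a session: if the second query
# shares terms with the first (but differs), it is a refinement or
# generalization the search UI could have offered directly.
def is_refinement(first: str, second: str) -> bool:
    a, b = set(first.lower().split()), set(second.lower().split())
    return bool(a & b) and a != b

# Hypothetical session pairs, echoing the slide's examples.
sessions = [
    ("solar energy", "how solar energy works"),
    ("solar energy", "energy"),
    ("solar energy", "solar energy charts"),
    ("solar energy", "boat rental"),   # unrelated follow-up, not a refinement
]
for first, second in sessions:
    print(f"{first!r} -> {second!r}: refinement={is_refinement(first, second)}")
```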
70. Treat your content like an onion
Each layer is cumulative; most important content is at the core.
Layer | Information architecture | Usability | Content strategy
0 | indexed by search engine | leave it alone | leave it alone
1 | tagged by users | squeaky-wheel issues addressed | refresh annually
2 | tagged by experts (non-topical tags) | test with a service (e.g., UserTesting.com) | refresh monthly
3 | tagged by experts (topical tags) | “traditional” lab-based user testing | titled according to guidelines
4 | content models for contextual navigation | A/B testing | structured according to schema
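The onion table above could be encoded as a lookup structure. In this sketch the field names (`ia`, `usability`, `content_strategy`) are my own labels, and `treatments` returns all treatments a layer receives, since layers are cumulative:

```python
# Hypothetical encoding of the onion model: layer number -> treatment
# per discipline. Layers are cumulative, so layer-3 content also gets
# every treatment from layers 0-2.
ONION = {
    0: {"ia": "indexed by search engine", "usability": "leave it alone",
        "content_strategy": "leave it alone"},
    1: {"ia": "tagged by users",
        "usability": "squeaky-wheel issues addressed",
        "content_strategy": "refresh annually"},
    2: {"ia": "tagged by experts (non-topical tags)",
        "usability": "test with a service (e.g., UserTesting.com)",
        "content_strategy": "refresh monthly"},
    3: {"ia": "tagged by experts (topical tags)",
        "usability": "'traditional' lab-based user testing",
        "content_strategy": "titled according to guidelines"},
    4: {"ia": "content models for contextual navigation",
        "usability": "A/B testing",
        "content_strategy": "structured according to schema"},
}

def treatments(layer: int) -> list[str]:
    """All treatments applied at a given layer, including inner layers."""
    return [ONION[i][d] for i in range(layer + 1) for d in ONION[i]]
```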
82. Summary:
8 IA better practices
1. Diagnosing the important problems
2. Balancing our evidence
3. Designing for the long term
4. Measuring engagement
5. Supporting contextual navigation
6. Improving search across silos
7. Combining design approaches effectively
8. Tuning our designs over time
83. Say hello
Lou Rosenfeld
lou@louisrosenfeld.com
Rosenfeld Media
www.louisrosenfeld.com | @louisrosenfeld
www.rosenfeldmedia.com | @rosenfeldmedia
Editor’s notes
http://xkcd.com/773/
Amazing drawing by Eva-Lotta Lamm: www.evalotta.net