This document discusses the importance of website performance and user experience. Some key points:
- Studies show that users perceive website speed as 15-20% slower than it actually is, and slowdowns have a bigger impact on perception than speed-ups.
- A study of mobile shopping found that speed was a top priority for users, with functional issues on sites like Zalando shaping perceptions.
- To measure performance, companies should establish performance SLAs by defining key metrics and targets based on user tasks, and collect both synthetic and real user monitoring data.
- Performance analytics can show how sites compare to competitors and identify pages for optimization. Metrics like revenue risked from poor performance help prioritize that work.
6-9. How we perceive performance....
- 15%-20% worse than in reality
- Slowdowns have more impact
- Task completion has a positive impact
- Always compared to past experiences
10. A simple online business model:
[Diagram] Marketing (search, tweets, mentions, ads seen) brings in new visitors; conversion optimization works on conversion rate, bounce rate, pages per visit, and time on site. The number of visits feeds growth or loss, and revenue = number of visits x conversion rate x order value.
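The funnel above reduces to a simple multiplication, which also shows why a speed-driven dip in conversion rate translates directly into revenue at risk. A minimal sketch; the function name and all numbers are illustrative, not taken from the study:

```python
# Sketch of the business model above:
# revenue = number of visits x conversion rate x order value.
# All figures are made up for illustration.

def monthly_revenue(visits: int, conversion_rate: float, avg_order_value: float) -> float:
    """Revenue produced by a funnel of visits."""
    return visits * conversion_rate * avg_order_value

baseline = monthly_revenue(100_000, 0.02, 45.0)   # 90000.0
# A slowdown that shaves 10% off the conversion rate risks:
slowed = monthly_revenue(100_000, 0.018, 45.0)    # 81000.0
print(f"revenue at risk: {baseline - slowed:.0f}")  # prints: revenue at risk: 9000
```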
17-18. Response times and perception (source: Jakob Nielsen):
- 0 to 0.3 sec: Instantaneous: I like it!
- 1 sec: Interaction: Let's converse...
- 3 sec: Mmm, shall I click away?
- 10 sec: Only if the task/content is relevant
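Nielsen's thresholds can be expressed as a small lookup for tagging measured response times. A sketch; the band labels paraphrase the slide and the function name is my own:

```python
# Map a measured response time (seconds) to Nielsen's perception bands.
# Thresholds (0.3s, 1s, 3s, 10s) are from the slide; labels are paraphrased.

def perception(seconds: float) -> str:
    if seconds <= 0.3:
        return "instantaneous"   # "I like it!"
    if seconds <= 1.0:
        return "interactive"     # flow of conversation preserved
    if seconds <= 3.0:
        return "hesitating"      # "shall I click away?"
    if seconds <= 10.0:
        return "waiting"         # only if the task/content is relevant
    return "abandoned"           # attention is lost

print(perception(0.2))   # instantaneous
print(perception(2.5))   # hesitating
```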
32. Over 60% of people who experienced a slow website rated its design 2 points lower than after a fast experience.
[Chart: % of respondents (0-100) per design score (1 = bad, 5 = beautiful), split by rated website speed: slow, average, fast.]
36. Desktop shopping experience setup....
Task: buy a summer holiday
- 100 users
- From 5 selected websites
- Rank from fastest to slowest after task execution
- No measurement tools allowed ;-)
- Desktop only, either via LAN or WiFi
37-39. Technology vs. Perceived
Rank  Technology (full page download)  Perceived (user ranking)
1     Sunweb                           Sunweb
2     D-reizen                         Globe
3     Neckermann                       Neckermann
4     Arke                             D-reizen
5     Globe                            Arke
Study performed by MeasureWorks for Emerce eTravel, 2013
45. Functional issues reported with Zalando
[Chart: feedback categories (Design, Speed, Mobile Readiness, Other) as % of respondents for Zalando, HM, V&D, and Tom Tailor, across two rounds: 1. Buy a T-shirt; 2. Feedback.]
Mobile-only research:
- Task completion using only a smartphone
- N = 100, users aged 20-65
50. [Chart: task "1. Buy a book": % of respondents per store (Bol.com, Selexyz, Bruna, Other); "2. Feedback bol.com" and "3. Feedback from different stores", broken down by Design, Speed, Mobile Readiness, and Other.]
Mobile-only research:
- Task completion: only use a smartphone to buy a book
- N = 100, users aged 20-65
57-59. Complete Web Monitoring
"Hard" data:
- Web Analytics (what did they do on the site?)
- Usability (how did they interact with it?)
- Performance (could they do what they wanted to?)
"Soft" data:
- VoC (what were their motivations?)
- Social Media (what were they saying?)
- Competition (what are they up to?)
65-66. Establish a baseline:
a pre-defined set of metrics that describes normal behavior, in order to detect variances and to be comparable within a historic context.
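One way to make that concrete: derive the baseline (mean and spread) from historic samples, then flag new measurements that deviate from it. A minimal sketch; the function names, sample data, and the 3-sigma threshold are illustrative choices, not from the deck:

```python
import statistics

# Baseline = mean and standard deviation of historic load times (seconds).
# A new sample counts as a variance if it falls more than k standard
# deviations from the mean; k = 3 is an illustrative default.

def baseline(history: list[float]) -> tuple[float, float]:
    return statistics.mean(history), statistics.stdev(history)

def is_variance(sample: float, history: list[float], k: float = 3.0) -> bool:
    mean, stdev = baseline(history)
    return abs(sample - mean) > k * stdev

history = [1.9, 2.1, 2.0, 2.2, 1.8, 2.0, 2.1]
print(is_variance(2.1, history))  # False: within normal behavior
print(is_variance(4.5, history))  # True: worth investigating
```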
68. Anatomy of a performance SLA:
- Purchasing a book (customer journey)
- must be completed (metric: speed)
- where every page loads under 3 sec. (target: seconds)
- using IE9 and higher (user scenario)
- from any location in the Netherlands (user locations)
- for 90% of all users (percentile)
- every day between 6am and 12pm (window)
- measured with Real User Monitoring (collection type)
Read more: Metrics 101, Velocityconf 2010
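An SLA like the one above can be checked mechanically against RUM samples. A sketch, assuming `load_times` holds per-page-view load times in seconds, already filtered to the SLA's browsers, locations, and measurement window; the nearest-rank percentile method and the sample data are my own choices:

```python
import math

# Check the SLA target: 90% of all users see the page load in under 3 seconds,
# i.e. the 90th percentile of RUM load times must be below the target.

def percentile(samples: list[float], pct: float) -> float:
    """Nearest-rank percentile of a non-empty list of samples."""
    ordered = sorted(samples)
    rank = math.ceil(pct * len(ordered) / 100)
    return ordered[rank - 1]

def meets_sla(load_times: list[float], target_sec: float = 3.0, pct: float = 90.0) -> bool:
    return percentile(load_times, pct) < target_sec

load_times = [1.2, 2.8, 1.9, 2.4, 3.6, 2.2, 2.0, 1.7, 2.9, 2.5]
print(percentile(load_times, 90.0))  # 2.9
print(meets_sla(load_times))         # True
```

Note that the check uses a percentile rather than the mean: one slow outlier (the 3.6s sample) does not break the SLA as long as 90% of users stay under target.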
73-75. [Diagram: the end-to-end delivery chain]
(CLOUD) DATA CENTER: network, storage, mainframe, DB servers, web servers, middleware servers, app servers, load balancers. This is what you control...
INTERNET: third-party/cloud services, major ISP, local ISP, content delivery networks, mobile carriers.
INTERNAL USERS and CUSTOMERS: the full chain, end to end, is what you're blamed for...