2. Good UX is {usually} great for business
• Historically customer experience has not been tied
to business outcomes
• Universal need for data driven experiences, or
measurable user experience
• The Quantitative vs. Qualitative Debate
3. • Define: Analyze business goals & translate them into UX KPIs
• Benchmark: Measure the current UX to establish a baseline
• Forecast: Predict the likely impact of improvement
• Design & Build: Design & develop guided by UX KPIs
• Measure: Test & collect results data to measure success
• Analyze: Use results to provide actionable insights, make
recommendations and quantify a value of success
Ensure experiences are intentional and tailored to
deliver business results
Methodology
5. Question: To offer or not to offer?
Test Plan:
• Competitive research
• What are customer expectations?
• Test – A/B test of the “free returns” offering
• Measure
– Qualitative: Industry research, user testing
– Quantitative: Conversion Rate, Revenue per Visit and
Returns Rate
Free Shipping, Free Returns
6. “…be generous with
your return policies—
both your brand
reputation and your
margins will benefit.”
— IR, February 28, 2016
“57% of
shoppers consider having to
pay for return shipping as an
issue when making returns”
— UPS Pulse of the Online Shopper,
June 2015
“If customers are forced to pay for
returns, they will either:
• keep a product they don’t want, or
• reluctantly shell out to get one they
do want.
Either way this will likely reduce
customer satisfaction and
ultimately, customer loyalty.”
— IR, December, 2015
“Customers who paid for their own
return decreased their post-return
spending at that retailer 75%-100% by
the end of two years. In contrast,
returns that were free to the consumer
resulted in post-return customer
spending that was 158%-457% of
pre-return spending.”
— Journal of Marketing, September 2012
Ice breaker:
Quick show of hands: who here has ever bought a pair of shoes online? Keep your hands up if you’ve had to return a pair because you didn’t like them or they didn’t fit when you received them. Now, how many of you enjoyed the process of returning that purchase?
Introduce myself:
Haley Nemann – Director of Digital Experience at Crocs
Crocs is a casual footwear company based in Boulder, Colorado, with 11 sites globally
The main focus of my role is Global Ecommerce Customer & User Experience & Optimization.
You’ll notice that I jump between the terms user experience and customer experience. The two are not one and the same, but in the world of ecommerce they’re pretty close. Our customers are also our users, so by improving user experience we are essentially improving our customer experience at the same time.
A main focus of my job is prioritizing and approving all front-end feature improvements we make to our global platform and online experience.
To aid in this process we do a lot of testing at Crocs – everything from user testing, to MVT, to testing configurations within our platform. Today I am going to talk about how to use data to influence changes to your CX that may seem unfavorable to your business’s bottom line.
BULLET 1)
Historically customer experience has not been tied to business outcomes – however, I seek to change that!
A large part of my role at Crocs is to focus on using data to drive insights and power better experiences for our consumers. Happy customers come back! Unhappy customers often spread the word on social platforms.
To be successful, I must first understand user behavior patterns and align our site experience accordingly.
BULLET 2)
There is a universal need for data-driven experiences, or what I call measurable UX. This is challenging for today’s business leaders across most verticals and industries, not just retail.
BULLET 3)
The Quantitative vs. Qualitative Debate – this is an ongoing debate within our ecomm department, and I’m sure many of you experience it as well.
Accessing the data itself can be challenging, and that data can often be inaccurate, laden with biases, flaws and outdated information.
In my case, data isn’t just about numbers – It’s a blend of quantitative and qualitative information from the field, our sites, testing and customer feedback. Looking at the data from a variety of vantage points helps minimize biases. In a few minutes we will go into a real world example of how we combine these two types of data to inform business decisions.
SLIDE 3
Bullet 1)
There is no exact science to creating data driven user experience, but here’s one approach to help ensure the experiences are intentional and tailored to deliver business results:
Subsequent bullets)
Define: Analyze the business objectives and translate them into UX KPIs and metrics
- Example of this: “We think our checkout process is too long. Our goal is to reduce fallout and have users complete this process in X number of seconds without making errors along the way.”
Benchmark: Measure the current experience to establish a baseline
- Following along with our checkout example, this would entail user testing the current experience, watching videos to identify friction points, and understanding how long it currently takes the user to complete checkout. It would also involve looking at your checkout funnel analytics and identifying drop-off rates for each step of the process.
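As a sketch of that funnel analysis, here is how step-by-step drop-off rates could be computed. The step names and counts below are hypothetical, purely for illustration:

```python
# Hypothetical checkout-funnel step counts (illustrative numbers only).
funnel = [
    ("cart", 10_000),
    ("shipping", 7_200),
    ("payment", 5_400),
    ("confirmation", 4_590),
]

# Drop-off rate between each consecutive pair of steps:
# the fraction of users who entered a step but did not reach the next one.
drop_offs = {}
for (name, entered), (_, advanced) in zip(funnel, funnel[1:]):
    drop_offs[name] = 1 - advanced / entered

for name, rate in drop_offs.items():
    print(f"{name} -> next step: {rate:.0%} drop off")
```

The step with the steepest drop-off is where benchmarking effort (session recordings, user testing) pays off first.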
Forecast: Predict the likely impact of improvement
At Crocs we call this PIE, an idea we adapted from WiderFunnel. This is a calculation of Potential, Impact and Effort that helps us estimate the effect of a particular feature.
Our analytics team came up with a fantastically nerdy calculation that looks at a combination of traffic, which section of the shopping journey we are attempting to impact, and the expected improvement to feed this PIE calculation.
Again in our example, you may expect checkout to have a high potential for impact because users are so far down the funnel, however the effort and risk are very high. Therefore this will likely extend your effort timeline to account for additional risk mitigation strategies like extensive QA & user testing.
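A minimal sketch of what a PIE-style prioritization score might look like in code. The formula, weights, and example scores below are illustrative assumptions, not Crocs’ actual calculation:

```python
# Hypothetical PIE-style prioritization score.
# The weighting scheme and 1-10 scale are assumptions for illustration.

def pie_score(potential, impact, effort):
    """Score a candidate feature, rating each dimension 1-10.

    potential: room for improvement in the page or step
    impact:    how much traffic/revenue flows through it
    effort:    cost and risk of building it (higher = harder)
    """
    # Average the upside dimensions, then penalize by effort so that
    # risky checkout changes rank below cheap wins of similar upside.
    return (potential + impact) / 2 - effort / 2

# Checkout redesign: big upside and traffic, but high effort and risk.
checkout = pie_score(potential=8, impact=9, effort=9)
# Product-page tweak: modest upside, low effort.
pdp_tweak = pie_score(potential=6, impact=7, effort=3)

print(checkout, pdp_tweak)
```

Note how the effort penalty lets a low-risk tweak outrank a high-upside but high-risk checkout change, matching the point above about extended timelines and risk mitigation.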
Design & Build: design and develop the new experience guided by the UX KPIs
- At this point you may build the entire feature, or just a prototype to put in front of testers and iterate upon
Measure: Test and collect results data to measure success
- This may be in the form of an A/B test prior to launching a feature at 100%, or as post-launch analysis
Analyze: Use results to provide actionable insights, make recommendations and quantify a value of success
REAL WORLD Example:
We know offering free returns to our consumers is the correct CX, however we were unsure of the impact to our bottom line.
We needed to determine if offering free returns would increase our conversion rate (CR) and revenue per visit (RPV) enough to offset the cost of the return shipping and the touch cost of re-stocking the product.
In the past we have done extensive testing on our free shipping thresholds to determine the optimal price. Now, however, we needed to understand the impact of free returns.
We set out to answer these questions with an extensive test.
We combined both qualitative and quantitative results data to answer our questions.
For the qualitative – we did competitive research and user testing
For the quantitative – we ran an A/B test of the offer, presenting different messaging to a test group of live customers on the front end of our site.
This was a bit tricky because we don’t have an order management system sophisticated enough to know which messaging the customer saw on our website when she made her purchase – “free shipping, free returns” or just “free shipping.”
To solve this problem fairly easily, we decided not to charge any customers for returns of purchases made during the test period. Customers who thought they would be charged for return shipping were simply delighted when they received a full refund.
To measure results, we summarized our research and looked at the CR, RPV and return rate.
Industry research around the topic heavily supports the case for offering free returns.
We also did a competitive analysis of other branded footwear manufacturers and found that 7 of the 8 offered free returns.
As a part of understanding user expectations we ran a series of user tests that included the very broad question –
“what factors influence your online purchase decision?”.
This question was intentionally asked this way as to not sway the user’s answers.
I’m going to show a series of user testing clips where users discuss their general expectations when shopping for shoes online. The video jumps around a bit, as it’s edited down to users answering just this question.
Customer experience was measured qualitatively as a win. This was based both on the findings from the user testing videos and the Journal of Marketing research mentioned earlier, which found post-return customer spending upwards of 158% of pre-return spending when returns were free. We know our consumers shop with us an average of 1.5 times per year, so alienating current customers could significantly increase attrition.
However, in this particular case the quantitative data alone was enough to justify free returns as a win. The increase in RPV and CR was enough to offset the increase we saw in returns, because our returns baseline is low.
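The underlying break-even logic can be sketched like this. Every figure below is made up for illustration; these are not the actual test results:

```python
# Illustrative break-even check for a free-returns A/B test.
# All figures here are hypothetical, not actual test results.

def rpv(revenue, visits):
    """Revenue per visit."""
    return revenue / visits

# Control: free shipping only.  Variant: free shipping + free returns.
control_rpv = rpv(revenue=50_000, visits=25_000)  # $2.00 per visit
variant_rpv = rpv(revenue=56_250, visits=25_000)  # $2.25 per visit

rpv_lift_per_visit = variant_rpv - control_rpv    # $0.25 per visit

# Cost side: extra returns the offer generates, expressed per visit.
extra_return_rate = 0.004   # hypothetical 0.4-point rise in return rate
cost_per_return = 12.00     # return shipping + restocking touch cost
extra_cost_per_visit = extra_return_rate * cost_per_return

# The offer pays for itself if the RPV lift exceeds the added return cost.
print(rpv_lift_per_visit > extra_cost_per_visit)
```

With a low returns baseline, the cost side of this inequality stays small, which is why the RPV and CR lift carried the decision.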
This is just a small example of how you can use data to make the case to the executive suite that the right customer experience can also be great for business, even though it may appear otherwise.