Mixed Method Experimentation to Eliminate Siloes & Uncover Business Growth

Mixed Methods Experimentation
How to break down organizational silos and uncover business growth
Hello!
Alex Mason, Director of Experimentation Strategy, Conversion
Harriet Swan, Director of UX Research, Conversion
Evidence-based decision-making
We have over 15 years’ experience advising and partnering with leading brands
Mixed Methods Experimentation
Two axes of customer insight:
Behavioral (what people do) vs. Attitudinal (what people say)
Quantitative (how many & how much) vs. Qualitative (why & how)
Mixed Methods Experimentation
Behavioral (what people do) vs. Attitudinal (what people say); Quantitative (how many & how much) vs. Qualitative (why & how)
Methods plotted across these axes: A/B Tests, Analytics, Onsite Polls, Session Recordings, Diary Studies, Surveys, Usability Tests, Contextual Interviews
Many experimenters do a lot of A/B testing and analytics… but little qualitative, attitudinal research
This creates various challenges
● Lack of supporting evidence behind decisions
● Difficulties prioritizing experiments & initiatives
● Limited customer understanding
● Diminishing returns from investment
● Information silos
● Teams limited by their own creativity
● Challenges in informing certain business questions
● Redundant/repeated workstreams
Audience Poll: Which of these is the biggest challenge for your organization?
What impact can mixed methods experimentation have for your business?
It increases confidence
It focuses on the customer
Source: Bain Customer-led Growth diagnostic questionnaire
It improves experiment outcomes
Win Rate: 35% vs. 51%
Analysis of experiment outcomes (win rates) applying a mixed-methods approach
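The win-rate comparison above can be sanity-checked with a standard two-proportion z-test. A minimal sketch, using hypothetical experiment counts (the deck does not report how many experiments sat behind each rate):

```python
from math import sqrt

def two_proportion_z(wins_a, n_a, wins_b, n_b):
    """Z-statistic for the difference between two win rates."""
    p_a, p_b = wins_a / n_a, wins_b / n_b
    p_pool = (wins_a + wins_b) / (n_a + n_b)  # pooled win rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical: 35% win rate over 200 tests vs. 51% win rate over 200 tests.
z = two_proportion_z(70, 200, 102, 200)
print(round(z, 2))  # → 3.23 (|z| > 1.96 suggests significance at the 5% level)
```

With these made-up counts the gap would be very unlikely under equal win rates; with only a few dozen experiments per group, the same percentages could easily be noise.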
It acts as an innovation hub
(Diagram: Explore / Innovation Engine and Exploit / Execution Engine, with stages Idea, Idea w/ Evidence, Launch, Scale, Test, Optimize, Pivot, Evolve)
It creates more business value
Without mixed methods experimentation vs. with mixed methods experimentation
So how can you put this into practice?
3 Keys to success
✓ Cross-team processes & collaboration
✓ Enablement through tools
✓ Shared language
Establishing use cases
Context | Sequence
Inform Test Hypothesis | UXR → Test Hypothesis → A/B/n Test
Gain Additional Insight | Test Hypothesis → A/B/n Test → UXR
Gain Additional Insight | Test Hypothesis → UXR + A/B/n Test
Pilot Test Design Concepts | Design Concept → Pilot Test → A/B/n Test
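One way to operationalize the use cases above is a simple lookup from research context to method sequence, so teams trigger the same workflow for the same kind of question. A sketch; the data structure and names are illustrative, not from the deck:

```python
# Research context → ordered method sequences (illustrative encoding of the table).
USE_CASES = {
    "Inform Test Hypothesis": [
        ["UXR", "Test Hypothesis", "A/B/n Test"],
    ],
    "Gain Additional Insight": [
        ["Test Hypothesis", "A/B/n Test", "UXR"],
        ["Test Hypothesis", "UXR + A/B/n Test"],
    ],
    "Pilot Test Design Concepts": [
        ["Design Concept", "Pilot Test", "A/B/n Test"],
    ],
}

def sequences_for(context):
    """Render the sequences for a context as arrow chains."""
    return [" → ".join(steps) for steps in USE_CASES[context]]

print(sequences_for("Inform Test Hypothesis"))
# → ['UXR → Test Hypothesis → A/B/n Test']
```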
Use Case: Gaining additional insight
(Methods matrix slide; methods shown: A/B Tests, Analytics)
A very successful experiment in one area of the business surprised everyone when it didn’t succeed in a parallel area of the business. The team was stuck trying to explain why.
A successful experiment in one area of the business (Contact Submissions: +44.04%)…
…didn’t replicate in another (Contact Submissions: -0.64%).
“Can you speak her language, and match her tone, to make sure that you are customizing something strictly for her situation? That’s what would make the difference for me.”
“I feel like treatment centers are all the same. How much choice is there? What can I expect as a parent?”
UXR Insight: When speaking about treatment more broadly, participants stated the need to ensure treatment is aligned with the specific individual and their context.
Use Case: Informing a test hypothesis
(Methods matrix slide; methods shown: Contextual Interviews, A/B Tests, Analytics)
Contact Submissions: +20%
UXR Insight: When speaking about treatment more broadly, participants stated the need to ensure treatment is aligned with the specific individual and their context. ✅ Confirmed!
Use Case: Informing a test hypothesis
(Methods matrix slide; methods shown: Contextual Interviews)
What people say ≠ What people do
“I would be more inclined to go somewhere that aligned with issues that are important to me, even if it was more expensive”
“That tagline attracts my attention, what are you going to give me in return?”
UXR Insight: Getting something in return (like contributions towards social causes) would incentivize customers to purchase.
Use Case: Informing a test hypothesis
(Methods matrix slide; methods shown: A/B Tests, Contextual Interviews, Surveys)
Control vs. Variation A
Audience Poll: Which variation resulted in more orders?
A/B Test Insight: Social causes & messaging are not a motivating factor in driving purchases.
Result: Orders -0.5% (Variation A vs. Control)
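The -0.5% figure is a relative lift in orders for the variation versus control. A minimal sketch of how such a number is derived from raw counts, using hypothetical traffic and conversion numbers (the deck does not publish them):

```python
def relative_lift(control_conv, control_n, variant_conv, variant_n):
    """Relative change in conversion rate: variant vs. control."""
    cr_control = control_conv / control_n
    cr_variant = variant_conv / variant_n
    return (cr_variant - cr_control) / cr_control

# Hypothetical counts producing a small negative lift like the slide's -0.5%:
lift = relative_lift(2000, 100_000, 1990, 100_000)
print(f"{lift:+.1%}")  # → -0.5%
```

A lift this small is usually indistinguishable from zero without a significance check, which is consistent with the slide’s conclusion that the social-causes messaging did not drive purchases.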
(Diagram slides: UXR and Experimentation working together)
3 Keys to success
✓ Cross-team processes & collaboration
✓ Enablement through tools
✓ Shared language
Liftmap®
Program reporting
Idea & test prioritization
Workflow management
Roadmapping
Improving collaboration
Knowledge Database & repository
3 Keys to success
✓ Cross-team processes & collaboration
✓ Enablement through tools
✓ Shared language
Whirlpool Case Study
An example of a mixed methods program in action
Read the Case study →
Q&A
Thank you!

Speaker notes

  1. Thank you for joining us today, and thank you to VWO for the opportunity. The topic we will be discussing today is Mixed Methods Experimentation, and in particular the combination of A/B testing & UX research methodologies. Until about 5 years ago, as an organization we were predominantly focused on A/B testing and CRO, doing some qualitative research outside of this, but it was conducted very much separately from the A/B testing side of things. We started seeing a real opportunity and a clear impact when these methodologies were applied together and seen as truly complementary. However, something that we experienced, and have seen other organizations face too, is how challenging it can be to bring together processes, structures, and teams in a cohesive and productive way. Today we want to use our experience in this space to share the value that can be unlocked by overcoming these challenges, along with some tangible examples of how we have broken down silos to bring the two together, which we hope is valuable for others. We’d love this to be interactive and have left time at the end for Q&A; please feel free to add any thoughts or questions in the chat so we can address them then. We also have a couple of polls to keep things dynamic.
  2. First a quick introduction…
  3. And for those who might not know of us, our core focus is on supporting our clients in making decisions that are rooted in evidence. These decisions can be tactical, focused on improving specific parts of a digital experience, or they can be bigger, supporting marketing or strategic direction. There are, of course, lots of different types of evidence that can be valuable in different contexts, and as mentioned our particular focus is on the combination of UX research with AB testing and analytics.
  4. We have been around for 15 years, and have offices in North America & the UK, and over our years have partnered closely with a wide range of brands.
  5. First we want to spend a little bit of time defining what we mean when we talk about Mixed Methods Experimentation: the different types of insights you can generate about your audience, Behavioral vs. Attitudinal and Quantitative vs. Qualitative.
  6. You can then plot some of the different activities you can carry out to generate these insights, as seen here. For example, AB testing sits top left (Behavioral, Quantitative), while contextual interviews sit bottom right (Attitudinal, Qualitative). But it's not enough to just be doing these activities separately…
  7. What is key, and where the value is unlocked, is when these methodologies are applied together. Considering & applying a mix of methodologies together helps us confidently answer a range of business questions, drive growth and obtain a more robust picture of our customers.
  8. However, many experimentation programs are largely focused on AB testing & analytics. Focusing solely on the top left means you are getting a strong sense of what, but having to infer the why behind it. And vice versa: many researchers are not combining those activities with behavioral ones to measure and validate through this lens. What this means is that you aren't able to validate whether what customers are saying translates into their actual behavior. For many organizations, both AB testing and UX research will be happening, but often they are siloed from each other, with the different teams working quite separately. These silos often evolve very naturally, given the differing skill sets people might have, where they sit in the organization, etc.
  9. Allowing these siloed ways of working to persist creates many challenges for both your experimentation & research programs, as well as your business more broadly. This isn't an exhaustive list, but these are some key challenges that we observed in our own company as well as with others.
     Lack of supporting evidence behind decisions: one type of evidence informing a decision rather than multiple. On the AB testing side, for example, it can mean biased hypotheses, or hypotheses not backed by evidence.
     Difficulties prioritizing experiments & initiatives: lack of confidence in knowing what might be most impactful.
     Limited customer understanding: only understanding customers in terms of either their behaviors or their attitudes.
     Diminishing returns from investment: difficulty moving from optimizations to bigger swings.
     Information silos: without setting up processes & platforms to work together, information is often not shared across the organization and teams aren't on the same page.
     Teams limited by their own creativity: if we work just in testing or research teams we might not be challenging ourselves or each other as much, and not bringing in diverse perspectives.
     Challenges in informing certain business questions: methodologies are best suited to certain business questions, so if you are only using one type you are limited in the questions you can answer.
     Redundant/repeated workstreams: related to information silos, teams might be replicating each other's work and not making the best use of resources.
  10. We wanted to do a quick poll to understand which of these might be the biggest challenges you see in your organizations.
  11. We’ve spoken about what mixed methods experimentation is, and some of the common challenges that occur when these are kept as fairly siloed activities. Next we want to talk about the value of adopting this approach, and why it is worth the effort to break down these silos. I’ll talk to 5 key areas of impact, before I hand it over to Alex to share our perspective on how you can move from working in silos to an integrated way of working together.
  12. The first area of impact is that adopting a mixed methods approach allows us to have more confidence in the decisions we are making. One data point means we only have a one-dimensional view of our customer, but if we can start to observe customers from different perspectives and see the same insight across AB testing, usability tests and contextual interviews, for example, then we have much more confidence in this insight and the opportunity it presents.
  13. The second point is that this approach means you are focusing on the customer, which has the side effect of removing ego within your organization. No longer is it based on what we think customers want, but rather on a more accurate picture of what customers actually need. Companies that don't take this approach can be oblivious to key customer problems, creating a gap between what customers need and what they're offering. And this is exactly what Bain & Company found when they surveyed over 400 companies and their customers: many businesses had an artificially high level of confidence that what they were delivering was the best experience, when in actuality they weren't close enough to, or addressing, the real customer problem, resulting in this delivery gap.
  14. In addition, the inclusion of UX research in your experimentation program improves the outcomes of AB testing, both the win rate and the conclusivity rate, based on data shown on this slide. Here you can see that for AB tests informed by UXR, the win rate was 51% on average, compared to 35% for those not informed by UXR. The proportion that were inconclusive is also lower among those informed by UXR. An analogy we find helpful in explaining this is Battleship: by informing your AB tests with UXR, these are much more likely to address real customer problems, which makes finding and sinking more ships easier. It effectively gives you a “cheat sheet”, helping inform where you focus and reducing the need to take stabs in the dark.
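As an aside for anyone wanting to sanity-check portfolio statistics like these: a win-rate gap between two groups of tests can itself be checked with a two-proportion z-test. The sketch below is illustrative only; the counts (102/200 vs. 70/200) are invented to match the 51% and 35% rates, since the actual sample sizes behind the slide were not stated.

```python
# Hedged sketch: two-proportion z-test for a difference in win rates.
# Counts are hypothetical; only the 51% / 35% rates come from the talk.
from math import sqrt
from statistics import NormalDist

def two_proportion_z(wins_a, n_a, wins_b, n_b):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = wins_a / n_a, wins_b / n_b
    pooled = (wins_a + wins_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical portfolio: 102/200 UXR-informed wins vs. 70/200 without.
z, p = two_proportion_z(102, 200, 70, 200)
print(f"z = {z:.2f}, p = {p:.4f}")  # z ≈ 3.23, p ≈ 0.001
```

With a few hundred tests per group, a 16-point gap is far outside what chance alone would produce; with only a few dozen tests per group, the same gap could easily be noise.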
  15. Incorporating UX research into your process also allows for more innovation. If we are just conducting AB tests, we might do some tests which are more exploratory, but the majority will be on the exploit side of the process, focused on optimization. UX research allows us to vet ideas and hypotheses before putting money and resources against them. We can put half-baked ideas in front of people, pilot test low-fidelity wireframes, and get sentiment and feedback before putting too many resources against an idea or an experience. This allows us to be much more innovative, spending time doing generative work and gathering user sentiment early on to help build out the best hypotheses and experiences.
  16. And finally, but importantly, it creates more business value. By gathering user feedback and vetting ideas earlier in the process, we are able to take bigger risks. As our work is backed by multiple sources and we have a higher level of confidence in our decisions, we MAXIMIZE business value. The bets we make have lower inherent risk because they are built on compounding and convergent learnings that paint a fuller customer picture, reducing the downside when we do have a failure.
  17. So you may be asking yourself, how do I actually put this into practice?
  18. We’ll talk through 3 strategies that we’ve had success with
  19. The first is understanding how and when these teams work together. At the most basic level, we created cross-team meetings & forums to enable these teams to collaborate and share learnings. In addition, because each team has a very specific skill set and knowledge base, cross-team learning and education was also important to establish a baseline understanding of each other's work. Lastly, we built a process to facilitate mixed methods: the CRO and UXR teams would kick off the research together. After the kick-off, the teams would work separately so as not to bias each other's research and interpretations. Once the work was complete, the teams would come back together to compare learnings and insights. This would help expose any learnings that were convergent (those that validate one another or point in the same direction) or divergent (those that contradict one another). This process may seem very intuitive, but many businesses fail to recognize that bias can creep in.
  20. We’ve seen success by creating common use cases or templates for how these two teams can work together and leverage each other to help inform business decisions. Not only do these use cases provide a template for how these teams should interact, but they also help establish a cadence and how to actually sequence the different methodologies. Which sequence you use ultimately depends on the business question or questions you’re asking. The first is to inform a hypothesis for an AB test. The UXR team could conduct quantitative surveys or contextual interviews, find relevant insights, and from those create hypotheses to test on the website, which eventually become an AB test. In this way, we can overcome one of the challenges Harriet mentioned earlier about being limited by our own creativity: the customer can tell us what to do. We don’t have to infer or put ourselves in the customer’s shoes, which can be very hard to do. The next is to gain additional insight from a test. In this sequence, you could create a hypothesis and then try to validate it with either an AB test or a user research study, or BOTH, and this can even be done simultaneously in most cases. The last use case is pilot testing. Going back to Harriet’s example of the innovation engine, this sequence allows you to rapidly test bold (and sometimes scary) business initiatives or ideas: creating a concept of a webpage or experience, testing it with customers through a pilot test or contextual interviews, and then refining the idea to be run as an AB test to validate behavioral responses from customers. A great example of this is when a company wants to create a new product or service and wants rapid feedback to drive iterations and refinement.
  21. Today, we’ll talk through 2 of the use cases with white-labeled examples to really drive this home and illustrate how these templates can help your teams implement a mixed methods system.
  22. The first example is about gaining additional insight from an AB test and analytics. Let’s put ourselves in the shoes of a CRO team…
  23. There are 2 sites that we actively manage and test. One is geared towards helping people treat eating disorders, while the other is aimed at supporting mental health. We ran a test on one site and saw fantastic results. However, when we ported the learnings over to the other site, like many businesses do, the win didn’t translate: we didn’t see that same fantastic result! What gives? By only running one of these methods, we only had an inferred WHY. We didn’t know, with a high level of certainty, why users did not respond to this refreshed messaging. We sought answers by turning to the UXR team.
  24. Running a number of interviews with users who matched the target audience, we discovered that people considering mental health treatment cared much more about personalized treatment options. When comparing that to the copy that we ran, there was a huge mismatch and we weren’t addressing this key customer need.
  25. Going back to the quadrants, we can then begin to triangulate the insights across these methodologies. From the AB tests, we know the previous treatment was ineffective at driving change in user behavior. From the contextual interviews, we know that customers want personalized care.
  26. So we brought that into the experience: we swapped out the copy & value propositions for those that focused on personalized care, and saw a dramatic increase in leads. Going back to the Battleship example, using BOTH of these methods enabled us to find the battleship (or lack thereof), get hints as to where it’s located, and actually sink it (or get a win).
  27. On the flip side, let’s put ourselves into the shoes of the UX team. If we are conducting contextual interviews, we get a strong WHY about how users rationalize their decisions and feel about certain things, but we don’t have a high level of confidence that our findings apply to the majority of our customers, and the reason for that is…
  28. READ SLIDE. Humans are innately complex, and research settings especially can be tricky due to the observer (or Hawthorne) effect: what people say doesn’t always replicate or equate to what they would do in the real-life scenario. Engaging the experimentation team is a great way to validate whether the attitudes actually match the behavior. This seems intuitive, but many companies overlook it and end up making changes to their business or their site that may not actually resonate with the majority of their customer base, which could be detrimental to their business.
  29. To better illustrate this use case, let’s bring in an example. We were conducting UX research for a global ecommerce retailer of consumer appliances and wanted a better understanding of what would motivate first-time customers to purchase, and previous purchasers to purchase AGAIN! We’d done contextual interviews with customers and found that getting something in return, like contributions to social causes or organizations, would help incentivize customers to consider purchasing. That’s a fantastic little nugget, but we need to prove it out before investing time and resources into scaling this.
  30. We then ran surveys to further quantify the attitudes of customers, and then ran an AB test to see how customers would actually behave when prompted. In this example, we used a number of different sources for a very specific reason: users may still be biased because of existing social pressures or obligations to support social causes. Having a respondent answer a question could still be hypothetical, or allow bias to creep into their answers. We need to measure how users WOULD ACTUALLY BEHAVE, which is why we paired it with an AB test.
  31. So the UXR team brought this to the experimentation team and asked: how could we validate this insight? Does this actually hold true for customers? Would they be more likely to purchase if we incentivized them by supporting a social cause when they purchase? Is this enough of a “return” to trigger that kind of behavior? The experimentation team took this question and designed a test to validate the insight. On the product detail page, a section was added to promote the partnership of this ecommerce retailer with a well-known charity.
  32. Now, turning to the audience—How do we think this did? Did promoting this charity persuade users to convert? Did the variation drive more orders than the control?
  33. The result is a little disappointing in that it didn’t drive any meaningful change in user behavior. Users were no more likely to purchase those products because of the social cause highlight. Interesting! What we learned in the UXR did not actually hold true when tested behaviorally. Now, looking at this result, it’s very easy to dismiss it and chuck it in the bin, but it’s important to recognize that there are a few other unknowns which may have contributed to this null result. The first is execution. The test targeted the PDP, but maybe that’s too far down the funnel; maybe the social causes are more relevant or important for top-of-funnel awareness. Similarly, the added content was lower on the page, so maybe users aren’t seeing it enough to make a difference. The second is the content itself. What if this charity specifically did not resonate with users? What if we promoted different social causes? Would this cause users to alter their behavior at all? Remember to take those into account so as not to lose any potential winning bets or experiences.
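There is also a statistical angle to "no meaningful change": tiny effects need enormous samples to distinguish from zero. The rough sample-size sketch below illustrates this; the 3% baseline order rate, significance level and power are assumptions for illustration, not figures from the talk.

```python
# Hedged sketch: approximate per-arm sample size for a two-sided
# two-proportion test. Baseline rate and targets are assumed values.
from math import ceil
from statistics import NormalDist

def sample_size_per_arm(base_rate, rel_lift, alpha=0.05, power=0.8):
    """Approximate n per arm to detect a relative lift in a conversion rate."""
    nd = NormalDist()
    z_alpha = nd.inv_cdf(1 - alpha / 2)
    z_power = nd.inv_cdf(power)
    p1 = base_rate
    p2 = base_rate * (1 + rel_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_power) ** 2 * variance / (p1 - p2) ** 2)

# Detecting a 0.5% *relative* change on a hypothetical 3% order rate
# would take tens of millions of visitors per arm.
print(sample_size_per_arm(0.03, 0.005))
```

In other words, at realistic traffic levels a ±0.5% movement is indistinguishable from zero, which is why treating it as a null result (while probing execution and content as other explanations) is the right read.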
  34. Now, we’ve talked through 2 examples of how to apply this. Let’s zoom out a bit and consider what this looks like for a business over time. It can get quite complex when these two teams begin to work with one another…
  35. Running experiments on the website, through emails and then correlating those with the insights from the user research team…
  36. Connecting disparate learnings, insights and findings from the different activities…
  37. We get a much more robust picture of the customer…
  38. But it can get overwhelming and complex. How do you capture all of these learnings in a coherent way? It’s necessary to capture and categorize these insights so that you can conduct meta analysis and realize the compounding effects of a mixed methods program.
  39. The way in which we’ve had success doing this is by leveraging tools to capture and categorize our insights and learnings…
  40. At Conversion, we use our proprietary software, Liftmap. Teams use it not only to manage work and production flows, report on programs, and prioritize studies and tests, but also to store all the results, key learnings and insights.
  41. In that way, we’re able to capture the data, categorize it with the use of tags, and conduct meta analysis across all of our clients to see what ACTUALLY works. With the closer integration of UXR with the experimentation team, we’ve had to be more innovative in how we categorize our tests, to enable these two teams to talk to one another and create these connections between disparate activities.
  42. And to do that, we create a shared language or framework that spans these two disciplines and is rooted in customer psychology.
  43. It is called the Levers Framework. We’ve developed this hierarchical framework over the last 16 years of working with our clients. It aims to describe changes to user experience at three levels of granularity. Put simply, the Levers Framework is a comprehensive taxonomy of the different features of user experience that influence user behavior. It allows us to categorize any experiment, UXR study, and the resulting insights. And by doing so, it enables us to conduct richer meta analysis by linking insights and learnings; through this meta analysis, to identify and understand why users convert; and from this, to ideate with precision. ___ Ideation: we were finding that once we’d identified a problem on a website, we were often spending time reinventing user experience solutions that already existed. Structured iteration: though we were focused on learning and iteration, we had no means of converting insights into a common language to permit more structured, longer-term forms of iteration. Data: we had mountains of experiment data, from countless clients and industries, but we were struggling to operationalize this resource and integrate it with our machine learning tool, ConfidenceAI.
  44. It creates a common language between these two teams by connecting disparate insights and learnings from the different activities and methods. Through this tagging & linking, we can conduct richer meta analysis, seeing which insights are validated by multiple sources and gaining more substantial support for why we should pursue an idea…
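As a toy illustration of the mechanism (not Liftmap's actual data model), tagging each insight with a shared lever taxonomy and the method that produced it makes convergence a simple query: levers backed by multiple independent methods stand out. The lever names, method labels and records below are all invented.

```python
# Hedged sketch: finding "convergent" levers, i.e. insights validated
# by more than one research method. All records here are hypothetical.
from collections import defaultdict

insights = [
    {"lever": "personalization", "method": "ab_test"},
    {"lever": "personalization", "method": "contextual_interview"},
    {"lever": "social_proof", "method": "survey"},
]

def convergent_levers(records, min_methods=2):
    """Return levers supported by at least `min_methods` distinct methods."""
    methods_by_lever = defaultdict(set)
    for record in records:
        methods_by_lever[record["lever"]].add(record["method"])
    return sorted(
        lever
        for lever, methods in methods_by_lever.items()
        if len(methods) >= min_methods
    )

print(convergent_levers(insights))  # only "personalization" converges
```

The same tagging structure supports the meta analysis described above: counting wins per lever across many tests to see which levers reliably drive outcomes.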
  45. With the ultimate goal of understanding what makes users convert. Through the closer integration of CRO and UXR, the rich tagging and categorization systems via tools, and the Levers framework, we have a high level of confidence as to which levers drive the best outcomes for specific businesses, industries, products, or even experiences.
  46. If you’d like to learn more about how we apply this with our clients, we’ve included a case study that you can access through this deck. Thank you! We’ll now open it up to questions.