7. PART I
A model for policy experimentation
9.30 Welcome
Embracing double-edged complexity
Towards a new conception of the role of government
A holistic approach to policy experiments
From challenges and opportunities to impact
Horizon scanning
Sensing the next policy challenge
Co-designing hypotheses to test
How to build an experiment portfolio
Co-producing by experimentation
Prototyping, programming and scaling using design approaches
Learning from experiments
Measuring outcomes
10.20 Q&A
10.30 End session
10. Public policy: A design problem
“How can you make sensible policy or strategy in a nondeterministic,
evolutionary, highly complex world, that is, a world where the most
desirable outcomes are unknown but there may be many possible
acceptable outcomes, where change is characterized by both path
dependence and unpredictability, and where there are many diverse
components, interactions, and feedback among components and
multiple dimensions to each problem? This is the design problem with
respect to public policy.”
Carlsson (2004:36)
12. When government policy fails…
• United States: Obamacare digital platform
• Denmark: Runaway applications for solar energy scheme
• Germany: Voluntary Technical Year
• Singapore: Relationship programmes
“The state has not just fixed markets, but actively created them.” Mariana Mazzucato
16. “[We] suggest institutional changes that shift innovation policy
towards a more experimental conception of the role of the state in
facilitating entrepreneurship, and thereby innovation.” Hasan Bakhshi
17. “The country needs and, unless I
mistake its temper, the country
demands bold, persistent
experimentation. It is common sense
to take a method and try it. If it fails,
admit it frankly and try another. But
above all, try something.”
Franklin Delano Roosevelt
18. “An appalling piece of political
stupidity.”
Louis Howe, adviser to Franklin Delano Roosevelt
22. Horizon scanning
What?
• Sensing coming trends and developments with potential policy or organisational
consequence
• Establishing insight, foresight and scenarios to visualize plausible futures
Why?
• Creating awareness of context factors of importance to the organisation
• Preparedness, resilience in view of possible disruptions
• Basis for policy planning and action
Key questions?
• Which political, economic, environmental, societal and technological factors
should we care about?
• How could these driving forces influence us in the future?
• What should we do now to shape our future in a desirable direction?
23. Horizon scanning
Cases
• The Singaporean Government: 2050 foresight strategy
• Policy Horizons Canada: IMPACT, a serious foresight board game for public servants
• OECD: Schooling for tomorrow
• Danish Design Center: Scenarios for Healthcare Denmark 2050
• UAE: Museum of government futures
• Dubai Future Foundation
26. Co-design
What?
• Exploring problems from end user perspective
• Co-creating new ideas with users and stakeholders
• Prototyping and testing early ideas “in the lab”
Why?
• To build an early validation of fit and function of a policy idea
• Create basis for redesign and ultimately for decision-making
Key questions?
• Who are the end users?
• How might this policy intervention work for them?
• Which other aspects do we need to take into account?
35. Co-producing by experimentation
What?
• Organising and implementing policy through collaborative networks
• Leveraging all relevant resources to produce policy outcomes
• Establishing hypotheses of change as the basis for experimenting with policy through co-production
• Ensuring rigorous collection of qualitative and quantitative data
Why?
• To be explicit about which actions and factors we expect will create intended change
• Raise awareness about critical success factors
• To know what to measure to track changes, including unintended consequences
Key questions?
• Based on our co-design process, which hypothesis are we now testing?
• What inputs, activities and outputs do we expect to realize?
• What would outcomes look like, if we are successful?
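The inputs-activities-outputs-outcomes chain asked for above can be made concrete as a small data structure. This is a minimal sketch with hypothetical names and illustrative example values, not an implementation from the slides:

```python
# A "hypothesis of change" record: the inputs -> activities -> outputs ->
# outcomes chain made explicit before experimenting. Names and example
# values are illustrative only.
from dataclasses import dataclass, field


@dataclass
class ChangeHypothesis:
    statement: str                  # what we expect will create the intended change
    inputs: list[str]               # resources committed
    activities: list[str]           # what we will actually do
    outputs: list[str]              # direct, countable products
    outcomes: list[str]             # the change we hope to observe
    indicators: list[str] = field(default_factory=list)  # what to measure

    def measurement_plan(self) -> list[str]:
        # Track every output and outcome, so unintended effects are not missed
        return self.outputs + self.outcomes + self.indicators


h = ChangeHypothesis(
    statement="Mentoring start-ups raises survival rates",
    inputs=["2 advisers", "DKK 500k"],
    activities=["monthly mentoring sessions"],
    outputs=["40 firms mentored"],
    outcomes=["higher 2-year survival rate"],
    indicators=["survival rate vs. non-mentored firms"],
)
print(h.measurement_plan())
```

Writing the hypothesis down in this form forces the team to answer the three key questions above before the experiment starts.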
36. Co-producing by experimentation
What we do
View all policy interventions as essentially experimental.
Realise co-production at three scales:
• Prototype: high on experimenting
Key questions: How does the intervention work? Who does it work for (who benefits)?
• Program: high on learning
Key questions: How can we learn from this now that the design is being realized?
• Scale: high on sharing
Key questions: How can we share our insights and tools? Which actors can embed activities to go to scale? How can we reach more people/businesses?
37. Co-producing by experimentation
Cases
• Finland PMO: Government experimentation programme and funding platform for citizen-led experiments
• UK Cabinet Office: Government Digital Service
• UAE: Dubai Future Accelerators Programme
41. “If you don’t measure outcomes,
you cannot tell the difference
between success and failure. That
means you might be rewarding
failure.”
Ray Rist, former senior advisor, World Bank
42. Outcome measurement
What?
• Establishing a systematic set of methodologies to document inputs, activities, outputs, and short- and long-term outcomes of interventions
• Establishing key performance indicators: the best indications of what success could look like
• Collecting data systematically
Why?
• Using data to document for accountability and transparency
• Drive continuous learning, and increase organisational performance
• Produce stronger outcomes
Key questions?
• Do our hypotheses hold?
• Are we achieving the positive change and outcomes we intended?
• What are unintended consequences - what should we adjust?
45. Measuring outcomes
What do we do?
Outcome measurement system
Measures outcomes systematically around three overall strategic objectives:
• Contribution to business growth (economic value)
• Contribution to branding of Danish design (economic value)
• Contribution to societal impacts (societal outcomes in education and sustainability)
This is done by assessing progress against a logic model of hypothesised effect chains.
Quantitatively: Surveys among businesses, media impact data, etc.
Qualitatively: Observation studies, interviews, design research
Enables cost-benefit analysis: What is the return on investing in the Danish Design Centre?
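The cost-benefit question reduces to simple arithmetic once benefits and costs are monetised. A hedged sketch; the figures are illustrative, not actual Danish Design Centre data:

```python
# Benefit-cost ratio: documented benefits per unit of cost.
# A ratio above 1.0 means the investment pays off.
def benefit_cost_ratio(total_benefits: float, total_costs: float) -> float:
    return total_benefits / total_costs


# Illustrative only: DKK 30m of documented business growth
# against DKK 12m of public funding
ratio = benefit_cost_ratio(30_000_000, 12_000_000)
print(f"{ratio:.1f}")  # benefits returned per krone invested
```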
User feedback surveys
• Net Promoter Scores for evaluation of projects
• Measures loyalty from participants in seminars and events
Measuring design impact for business
• Comprehensive case research methodology
• Survey data
• National statistical data
46. Measuring outcomes
How does it differ in experimental policy?
Dimension    | Traditional policy (operations)            | Experimental policy (innovation)
Purpose      | Documentation, accountability, performance | Learning, adaptation, redesign (or termination)
Focus        | Optimizing the use of existing resources   | Discovering additional resources to be leveraged
Data         | Mainly quantitative                        | Quantitative and qualitative
Tools        | Statistics, surveys, other                 | A/B tests, RCTs, cases, design research, future probes, etc.
Time horizon | Long-term, ongoing systematic measurement  | Tailored to the concrete prototype or programme design
Challenge    | Setting the right KPIs and meeting them    | Capturing the causal elements of hypotheses of change
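The A/B tests and RCTs listed as experimental-policy tools boil down to comparing outcome rates between a treatment and a control group. A minimal stdlib-only sketch of one common analysis, a two-proportion z-test; the counts are illustrative, not real trial data:

```python
# Two-proportion z-test for an A/B test or simple RCT:
# is the take-up rate in the treatment group significantly different
# from the control group? |z| > 1.96 ~ significant at the 5% level.
from math import sqrt


def two_proportion_z(success_t: int, n_t: int,
                     success_c: int, n_c: int) -> float:
    p_t, p_c = success_t / n_t, success_c / n_c
    p_pool = (success_t + success_c) / (n_t + n_c)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_t + 1 / n_c))
    return (p_t - p_c) / se


# Illustrative: 120/400 take-up under the new scheme vs. 90/400 under the old
z = two_proportion_z(120, 400, 90, 400)
print(round(z, 2))
```

A z above 1.96 would suggest the new scheme genuinely raises take-up; the same arithmetic underpins most A/B testing dashboards.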
48. Towards experimental government?
• Which approaches do you use today, from design to measurement?
• How could systematic experimentation become part of the “new normal” of governing?
• What would be the benefits?
• Which challenges do you foresee?