This document summarizes a presentation on evaluating engagement activities. The presentation aimed to help participants develop evaluation strategies and make strong cases for engagement. It covered why evaluation is important, how to identify what to evaluate using logic models, who evaluations are for, and making the case for engagement through evaluation. The presentation included activities where participants discussed their experiences with evaluation and worked through examples of logic models and evaluation plans.
2. This presentation is developed from a number of presentations originally created by the National Coordinating Centre for Public Engagement and the Beacons for Public Engagement through the HE STEM Programme.
http://www.publicengagement.ac.uk/evaluating-stem-outreach
As such, this presentation is released under the same Creative Commons licence: Attribution-ShareAlike 3.0 Unported.
http://creativecommons.org/licenses/by-sa/3.0/
3. Aims of the day
• To help develop a shared set of approaches to evaluating engagement across Wellcome Trust Centres
• To build participants' skills in developing high-quality evaluation strategies
• To help participants make strong cases for engagement with research, and for the evaluation of this activity
4. Timetable
9am Arrive (tea and coffee)
9.15 Introductions
9.20 Why Evaluate?
9.40 How do I know what I am evaluating?
10.30 Break
10.45 How do I know what I am evaluating? (cont)
11.45 Who is the evaluation for?
12.30 Lunch
13.30 Making the case for engagement (and evaluation of engagement)
15.00 End
5. Introductions
• Who you are
• What experience you have in evaluation
• What you are hoping to get from the day
10. Source: Ingenious evaluations: A guide for grant holders, The Royal Academy of Engineering
2a: What are you aiming to do?
Beginner’s Guide to Evaluation
11. 1. Aim (what do you want to achieve? Big picture!)
2. Objectives (what do you need to do to achieve your aim?)
3. Evaluation questions (what do you want to know?)
4. Methodology (what strategy will you use?)
5. Data collection (what techniques will you use to collect your evidence?)
6. Data analysis (how will you analyse your data?)
7. Reporting (who will be reading your report?)
What goes in an evaluation plan?
Beginner’s Guide to Evaluation
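For teams that keep project records digitally, the seven plan components above could be captured in a lightweight structure like this (an illustrative sketch only: the field names simply mirror the numbered list, and the example values are loosely drawn from the Corrosion Summer Ball case study later in the deck, not from any real plan):

```python
from dataclasses import dataclass

@dataclass
class EvaluationPlan:
    """Mirrors the seven components of the Beginner's Guide evaluation plan."""
    aim: str                    # 1. What do you want to achieve? Big picture!
    objectives: list[str]       # 2. What you need to do to achieve your aim
    questions: list[str]        # 3. What do you want to know?
    methodology: str            # 4. What strategy will you use?
    data_collection: list[str]  # 5. Techniques for collecting your evidence
    data_analysis: str          # 6. How you will analyse your data
    report_audience: list[str]  # 7. Who will be reading your report?

# Hypothetical example values for illustration only.
plan = EvaluationPlan(
    aim="Inspire the public with an introduction to corrosion",
    objectives=["Run 4 table-top interactive experiences at the festival"],
    questions=["Did visitors' understanding of corrosion improve?"],
    methodology="Pre/post self-report survey",
    data_collection=["Short exit questionnaire", "Observer notes"],
    data_analysis="Descriptive statistics plus coding of free-text comments",
    report_audience=["Funders", "Project team"],
)
print(plan.aim)
```

Writing the plan down in a fixed structure like this (1–2 sides of A4 is enough on paper) makes it harder to skip a component by accident.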
12. • Pick an activity that you know well
• Pair up with someone you do not know and explain your activity to each other
– Why you do the activity
– What you hope to achieve by doing the activity
Activity
14. • Inputs: public sector resources required to achieve the policy objective (e.g. resources used to deliver the policy)
• Activities: what is delivered on behalf of the public sector to the recipient (e.g. provision of seminars, training events, consultations)
• Outputs: what the recipient does with the resources, advice/training received, or intervention relevant to them (e.g. the number of training courses completed)
• Intermediate Outcomes: the intermediate outcomes of the policy produced by the recipient (e.g. jobs created, turnover, reduced costs or training opportunities provided)
• Impacts: wider societal and economic outcomes (e.g. the change in personal incomes and ultimately wellbeing)
HM Treasury Definitions (p22)
15. • Pick a project you are familiar with and start to work out the steps of a logic model for it
• Consider:
– Inputs – Resources Used
– Activities – What the project did/does
– Outputs – What the participants did/do
– Intermediate Outcomes – What changed in the
participants
– Impact – Wider societal effects
Your activity
16. • Understanding the theory of the change you
are aiming for improves evaluation
– You can see what you need to evaluate
– You can see what you do not need to evaluate
– You can see the assumptions you may be making
Logic Models & Evaluation
17. [Image: a more complex logic model, on the control of Striga in maize crops, with multiple causal links]
18. Job Training scheme example
Pool of long-term unemployed who lack skills → Obtain placements and undertake training → Improve qualifications and workplace skills of attendees → Obtain interviews and job offers → Increase in jobs and incomes → Lower overall unemployment
HM Treasury, Magenta Book, p23
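The linear chain above can also be written down as an ordered list of (stage, content) pairs, which makes it easy to see what each evaluation question attaches to. This is a minimal sketch; the assignment of each step in the chain to a named logic-model stage is my own reading of the Magenta Book example, not something the slide states:

```python
# Job training scheme expressed as a simplified, single-path logic model.
# Stage labels are an assumed mapping onto the HM Treasury definitions.
logic_model = [
    ("Context", "Pool of long-term unemployed who lack skills"),
    ("Activities", "Obtain placements and undertake training"),
    ("Outputs", "Improve qualifications and workplace skills of attendees"),
    ("Intermediate outcomes", "Obtain interviews and job offers"),
    ("Outcomes", "Increase in jobs and incomes"),
    ("Impacts", "Lower overall unemployment"),
]

# Walking the chain stage by stage mirrors asking the evaluation
# questions on the next slide: each question probes one link.
for stage, description in logic_model:
    print(f"{stage}: {description}")
```

Note that a real logic model is rarely this linear: the list form only works for the simplified single-path version shown here.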
19. • What evaluation questions might you want to
ask about this project?
– Are we promoting it sufficiently to the target
audience?
– Are the training courses at the right level?
– Are they improving the skills and qualifications of
attendees?
– Are they getting more interviews? If not, why not?
– Etc.
Job Training scheme example
21. 1. Analysis of the project’s Context
2. Stakeholder Analysis
3. Problem Analysis/Situation Analysis
4. Objectives Analysis
5. Plan of Activities
6. Resource Planning
7. Indicators/Measurements of Objectives
8. Risk Analysis and Risk Management
9. Analysis of the Assumptions
Other Templates – Logical Framework
22. • Participatory approach to evaluation
• Looks to understand the contribution of a
project to changes in practice of stakeholders
• Needs skilled facilitation and a budget
Other Templates – Outcome Mapping
23. • “So That” chains
• UNDP template:
– Identify the desired change
– Identify the agents of change
– Identify the assumptions
– Pathways to Change
– Indicators of Change
• Theory U (www.presencing.com)
Other Templates
24. • HM Treasury, The Magenta Book: Guidance for Evaluation (2011)
• Annie E. Casey Foundation, Theory of Change: A Practical Tool for Action, Results and Learning (2004) http://www.aecf.org/upload/publicationfiles/cc2977k440.pdf
• www.outcomemapping.ca
• www.theoryofchange.org
References
26. Why and Who of evaluation
Post up as many audiences as you can think of for our evaluation work on the WHO flipchart.
27. What do these audiences want?
In groups consider the following questions:
• Why is this audience interested in your
evaluation?
• What are the top three things they would
want to know?
• What are the things you, as the organiser,
would like them to know?
29. Activity
Look at the example evaluation strategies.
Is the approach suitable for all the potential
audiences of evaluation?
What else could the organisers do to help improve
their evaluation plan?
Read the feedback – do you agree/ disagree with
the suggestions?
31. Corrosion Summer Ball
A family activity during the Manchester Science Festival with 4 table-top interactive experiences related to corrosion science. The aims of the activity were to:
• inspire the general public with an introduction to corrosion.
• communicate that corrosion is interesting and relevant to people's daily lives.
• provide an exciting and memorable learning experience.
• make universities more accessible to the general public.
What could be the possible outputs, outcomes and impact of this activity?
http://www.publicengagement.ac.uk/how/case-studies/corrosion-summer-ball
Outputs, outcomes and impact
32. Challenges of measuring impact
What are the key challenges to measuring
impact?
Results
Behaviour
Learning
Reaction
(the four levels of the Kirkpatrick model)
34. 3. Making an impact with your evaluation
How can you make use of evaluation?
• Self reflection
• Reports
– Case studies and other formats, e.g. presentations, video, audio, etc.
• Making a case for future funding/ support
35. What worked well?
Why?
What did not work well?
Why not?
What will I do the same next time?
What will I do differently next time?
36. What are the key things you need to include in a report?
Reports
37. Other ways of reporting
Case studies/ Video etc
What are the pros and cons of using
case studies as a way of reporting on
your evaluation?
38. Making a case
• Things you can evidence
– History of evaluative practice informing development of activities
– Learning (self/team reflection)
– Approach is informed by target audiences
– Effective practice
– Commitment to future evaluation to inform activity
39. Beyond the report
What are the opportunities for sharing your
evaluation with others?
• On your website
• With funders
• With partners
• With others e.g. NCCPE; Collective Memory
40. Top tips
• Think about your audience
• Develop your evaluation plan at the beginning
• Don’t collect data you can’t use
• Beware of misrepresenting your data
• Back up qualitative data with quantitative data
• Don’t hide mistakes – learn from them
• Reflect on what you would do differently next time
• Recognise the challenges of measuring impact
• Be realistic about what you can measure
• Remember the value of using evaluation during the project
• Share what you have learnt
43. NCCPE http://www.publicengagement.ac.uk/how/guides/evaluation/resources
Manchester Beacon Evaluation Guide http://www.manchesterbeacon.org/about/
UCL Evaluation Toolkit http://www.ucl.ac.uk/public-engagement/research/toolkits/Event_Evaluation
RCUK Evaluation Guide http://www.rcuk.ac.uk/documents/publications/evaluationguide.pdf
HE STEM http://www.hestem.ac.uk/evaluation
Inspiring Learning for All http://www.inspiringlearningforall.gov.uk/toolstemplates/
Useful Resources
Editor's notes
Sophie. Image: [Wriggly Rangoli Project – Manchester Science Festival: The project was a collaboration between Manchester Development Education Project, UoM researchers and Inspired Sisters (a group of Asian women and their children from Longsight) to raise awareness of parasitic infections and global poverty. A workshop with scientists and a group of Asian women informed them of the science and discussed their experiences. The science then inspired designs which were translated into large-scale public art (Rangoli) in Longsight and Manchester Museum.]
Introduction activity – when people arrive and register, ask them to take part in this activity.
Suzanne: Introduce the evaluation plan as part of the project plan – i.e. you need to think about it at the start. Develop your evaluation plan alongside your event/activity plan. This will help you plan your project, as thinking about aims and objectives is clearly part of developing your project plan anyway. It does not have to be long, e.g. 1–2 sides of A4. It helps keep you focused and clear: what you want to know; how you will collect the data; what data you need to collect; etc.
Talk through the slide – details of all the things you need in an evaluation plan. Point out there is one in the participant handbook, and a copy will be available on the ning site for download.
5 mins
This is a very simplified version of a logic model. It assumes that there is a linear and singular relationship between the various stages. We will look at some more complex versions later. There is often some confusion between outputs and outcomes, and I like the definitions the Treasury uses in its guide to evaluating policy impact, although, like many Civil Service definitions, they can be a little circular.
10 mins
Once you have an understanding of how what your project does links with what you are hoping it will change, it becomes easier to plan the evaluation. It also helps you see how there are different aspects that may be evaluated.
An example of a more complicated logic model on the control of Striga (a weed that infests maize crops in Africa) with multiple causal links
Each of these steps has a set of assumptions and it is often these assumptions that are the key to the evaluation that you want to do.
Take answers (5 mins)
This exercise is an opportunity for participants to explore what they already know about the key audiences for their evaluation. Each group is given one of the key audiences: You and your team; senior managers; funders; stakeholders (which could include participants in the activity). Feedback onto a grid – with key things people want to know from the evaluation and four columns which we can tick to say which of the audiences each is relevant to ....
Unless you know the purpose and audiences for your evaluation, it is impossible to come up with a good plan. For example, if you are not prepared to learn from the evaluation in order to inform your own practice, then you may want to reconsider whether to evaluate your activity at all. If the funder expects certain things from your evaluation, then you need to ensure your plan enables you to collect the relevant data to address the questions the funder has. Finally, an understanding of the limits of budget and resources means that when you put your funding proposal together, you don't overclaim what the evaluation can show you. This is particularly pertinent if you are claiming impacts that are measurable over a long time period and have no plan to evaluate after the end of the project. So how do you go about putting an evaluation plan together? This is a quick overview of what we cover in our day course, A Beginner's Guide to Evaluation. We won't have much time to dwell on this, but it is important framing for the rest of the day.
See strategies from the case study doc.
Use the Corrosion Summer Ball to quickly group-brainstorm and then define outputs, outcomes and impact. Possible answers could include:
Outputs = number of people who took part; type of people who took part, e.g. families; how the activity could be improved; any unexpected outputs.
Outcomes = public knowledge of corrosion improved; people developed a better understanding of how corrosion affects their everyday lives; people enjoyed the experience; young people were exposed to an area of science (corrosion); any unexpected outcomes.
Impact = was there an impact? If so, what type of impact? Could the impact have been greater?
Using the quotes – which are good examples of impact? What else would you need to know?
In groups, come up with the top challenges of measuring impact. These could include:
– Difficulty of proving or measuring causality
– Resources needed to properly measure long-term impacts (i.e. longitudinal studies are resource intensive)
– Attrition
– Variety of factors involved, many of which are nothing to do with you
– Lots of things outside of your control
– Many impacts are unexpected, and it is therefore difficult to set up a system to measure them
– Difficulty of setting up a control group, to ensure differences in outcomes are not attributed to the intervention if they are in fact the result of changes in the 'state of the world'.
A useful tool for thinking about the methodology of an evaluation is the Kirkpatrick model. This model is helpful for thinking about how much evaluation to undertake for a particular initiative. There are four levels of potential impact of an initiative according to the Kirkpatrick model:
a. Reaction – the initial response to participation (e.g. immediate feedback on the initiative, including things like enjoyment, facilities, and best and worst aspects of the initiative)
b. Learning – changes in people's understanding, or raising their awareness of an issue (this might include a comparison group to measure how things have changed as a result of the initiative, or use a baseline to establish changes). For instance, comparing a group of participating pupils with an otherwise similar set of non-participating pupils. This is further detailed in the RCUK 'Evaluation: Practical Guidelines' document mentioned above.
c. Behaviour – whether people substantially modify what they do (a longer-term assessment of changes and measurement of their extent)
d. Results – tracking longer-term impacts of the initiative on measurable outcomes, e.g. exam results (a longer-term, more complex analysis – it might be difficult to separate the effects of the initiative from other things which have an impact on the relevant results)
Making use of evaluation – what are the key ways we make use of it
How can you use evaluation to improve your own practice? To maximise the benefits of your evaluation you need to critically reflect on your project. It is about learning from experience.
Activity: Brainstorm what makes up critical reflection? Questioning; seeking alternatives; keeping an open mind; comparing and contrasting; viewing from various perspectives; asking "what if...?"; asking for others' ideas and viewpoints; considering consequences; hypothesising; synthesising and testing; seeking, identifying, and resolving problems. [Source: Roth, R. A. "Preparing the Reflective Practitioner: Transforming the Apprentice through the Dialectic." Journal of Teacher Education 40, no. 2 (March-April 1989): 31-35.]
A well-used reflection model is based on three questions: WHAT? (what happened?); SO WHAT? (what did you learn?); AND WHAT? (what will you do as a result of the experience?)
When reflecting yourself, or facilitating others to reflect, remember to select a style that suits your audience and situation; be creative; cater for different learning styles; and give people time to think!
Activity: Think of an activity/event or project that you have been involved in. Summarise what your activity was in one sentence (WHAT). Draw a picture that illustrates what you learnt (SO WHAT). State one thing you will do, or have done, as a result of the activity (AND WHAT). Ask people to share in groups or pairs.
To summarise, the KEY QUESTIONS to ask are: What worked well? Why? What did not work well? Why not? What will I do the same next time? What will I do differently next time?
Consider using different creative methods to facilitate reflection, e.g. drawing, images or objects; posing questions, interviewing each other, video; written diaries, logs, stories, journals; scrapbooks, graffiti walls.
In the final report it is important to consider strengths and weaknesses and lessons learned for the future. There is no point collecting data unless you are going to make use of it and share it with colleagues/stakeholders. Producing a written report of the evaluation process can be very useful, even if it is a short summary.
WHAT DO YOU NEED TO THINK ABOUT WHEN WRITING YOUR REPORT? Ask participants to suggest things needed in a report.
Audience – who will be reading your report?
Structure – should be built around the evaluation questions/objectives your evaluation set out to address, and include: the context of the evaluation; aim, objectives and evaluation questions; description of the activity/event; methodology; summary of evidence (the data itself may form an appendix); overview of the activity/event; conclusions and recommendations.
Layout – standard report including an exec summary; case study approach.
Critical reflection – reflect on what you have learned from the experience. What changes will you make next time?
Public – if possible, remember to feed back findings to those involved, value their contribution and thank them.
Next steps – make sure the findings are acted upon.
Think about your audiences (potential participants and audiences for your evaluation). Develop your evaluation plan at the beginning. Don't collect data you can't use. Beware of misrepresenting your data. Back up qualitative data with quantitative data. Don't hide mistakes – learn from them. Reflect on what you would do differently next time. Recognise the challenges of measuring impact – but don't let this put you off. Be realistic about what you can measure with your evaluation – and don't overpromise. Remember the value of using evaluation during the project, to create more effective activities and events. Share what you have learnt with others and use it to make a case for the future.