Evaluation of outcomes of CGIAR’s CRPs 
ECoP training session 
25-26 September 2014 
Burt Perrin
Burt@BurtPerrin.com
La Masque, 30770 Vissec, FRANCE
+33 4 67 81 50 11
Purpose of the training session 
Consider approaches to the evaluation of outcomes 
Complex programmes/initiatives 
Focus on the CRPs 
Outcomes of the session 
Better understanding: what’s involved in evaluation of outcomes in complex environments 
Appreciation of challenges – and opportunities 
Ideas that you can use 
Topics to explore 
How to plan evaluations 
Complexity: what it is, implications for evaluation of research, CRPs 
Evaluation vs. other related activities 
Some tools for evaluation planning (evaluability assessment, TOC, outcome trajectories) 
Focus on evaluation use 
Evaluation designs and methods 
Analysis and interpretation 
What this means for evaluation of outcomes of CRPs
Characteristics of Evaluation 
Why do evaluation? 
Raison d’être of evaluation:
• Social betterment
• Sensemaking
More generally, rationale for evaluation:
• To be used!
• Improved policies, programmes, projects, services, thinking
Evaluation – some key aspects 
Systematic, data-based, “objective”
• Evidence can come from multiple sources
Can consider any aspect of a strategy, policy, programme, project
Major focus on outcomes that follow from the intervention (i.e. attribution, cause)
E-valua-tion
Different types of evaluation 
Ex-ante vs. ex-post 
Process vs. outcome 
Formative vs. summative 
Descriptive vs. judgemental 
Accountability vs. learning (vs. advocacy vs. pro-forma) 
Short-term actions vs. long-term thinking 
Etc. 
Maximising evaluation value & use 
The right questions! 
Outcome focus 
Emergent – not restricted to pre-determined objectives/indicators 
Respects context, identifies how it interacts with what is done 
Identifies alignment of activities/projects/programmes with strategy/goal 
Assesses results orientation as well as actual results achieved
What is “complexity”? 
Emergent vs. predetermined outcomes 
Feedback loops 
Indirect, non-linear trajectories; tipping points 
Unpredictability, random events 
Multiple components: partners, levels, causal package (complicated) 
(But: try to explain complex situations as simply as possible!) 
Nature of intervention and logic chain (e.g. Rogers) 
Simple
• E.g. following a recipe
• Linear cause-and-effect chain
Complicated
• E.g. sending a rocket to the moon
• Multiple factors happening simultaneously
Complex
• E.g. raising a child
• Recursive (feedback loops), emergent outcomes that can’t be identified in advance
• Tipping points
Some characteristics of non-linear change (complexity science) 
Cause-effect distance (outcome trajectory): long (or short) in time 
Depends upon a large number of intervening variables 
Usually several causes for any effect 
Change not proportional, incremental; qualitative leaps and bounds
• Sometimes initial ‘negative’ effects (e.g. the J-curve) – implications for evaluation?
Feedback loops
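One stock image of such non-proportional change – offered here as our own illustration, not material from the session – is the logistic adoption curve, where uptake creeps along and then surges through a tipping point:

$$A(t) = \frac{K}{1 + e^{-r(t - t_0)}}$$

Here $A(t)$ is adoption at time $t$, $K$ the eventual ceiling, $r$ the growth rate and $t_0$ the tipping point. Early monitoring data sit on the flat part of the curve, so linear extrapolation from them can badly understate (or overstate) eventual change.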
Werner Herzog: 
1. “Man is a god when he dreams, but a beggar when he reflects.” 
2. “Facts do not constitute the truth. There is a deeper stratum.” 
Agree or not? 
Implications for evaluation? 
Future orientation - Dilemma 
“The greatest dilemma of mankind is that all knowledge is about past events and all decisions about the future. 
The objective of this planning, long-term and imperfect as it may be, is to make reasonably sure that, in the future, we may end up approximately right instead of exactly wrong.” 
Similarities, differences, complementarities 
Evaluation and:
• Research
• Monitoring
• Audit
Evaluation vs. Research 
Research
• Primary objective: long-term knowledge generation (a single study is rarely sufficient)
• Theory creation/testing/revision
• Evidence needs: the more the better
Evaluation
• Reference to a particular type of situation
• Practical application/utilisation in some form is an essential component
• Evidence needs: as little as necessary to support meaningful use (level of confidence required)
But: evaluation makes use of research methodologies – from diverse disciplines
Monitoring – the concept and common definitions 
Tracking progress in accordance with previously identified objectives, indicators, or targets (plan vs. reality)
• RBM, performance measurement, performance indicators …
In French: “suivi” vs. “contrôle”
Some other uses of the term:
• Any ongoing activity involving data collection and performance (usually internal, sometimes seen as self-evaluation)
Monitoring and Evaluation 
Monitoring
• Periodic, using data routinely gathered or readily obtainable; generally internal
• Assumes appropriateness of programme, activities, objectives, indicators
• Tracks progress against a small number of targets/indicators (one at a time)
• Usually quantitative
• Cannot indicate causality
• Difficult to use for impact assessment
Evaluation
• Generally episodic, often external
• Can question the rationale and relevance of the programme and its objectives
• Can identify unintended as well as planned impacts and effects
• Can address “how” and “why” questions
• Can provide guidance for future directions
• Can use data from different sources and from a wide variety of methods
How Monitoring and Evaluation can be complementary 
Ongoing monitoring:
• Can identify questions and issues for (in-depth) evaluation
• Can provide data for evaluation
Nature of the intervention
Evaluation:
• Can identify what should be monitored in the future
Monitoring, Evaluation and Impact Evaluation
• Inputs: investments (resources, staff …) and activities
• Outputs: products
• Outcomes: intermediate achievements of the project
• Impact: long-term, sustainable changes
Monitoring: what has been invested, done and produced, and how are we progressing towards the achievement of the objectives?
Evaluation: what occurred and what has been achieved as a result of the project?
Impact evaluation: what long-term, sustainable changes have been produced (e.g. poverty reduction)?
Evaluation vs. audit
Audit
• Compliance focus: rules and procedures; divergence between planned and actual
• Main attention to process
• Identify transgressions
• Standardised approach
• Outside scrutiny
Evaluation
• Outcome orientation, context and rationale, attribution
• Constructive guidance
• “Why” and “how” as well as “what” considerations
• Unintended as well as planned impacts and effects
• Wide range of potential approaches and methods
Evaluability assessment (including theory of change) 
What is an evaluability assessment (EA)? 
Essentially, an evidence-based plan for the evaluation
What aspects of the programme are evaluable – and when?
• E.g. coherent programme logic, data availability, conducive environment …
What the programme needs to do
Expected outcome trajectories
A TOC that includes the above considerations
Elements in an EA 
(Involve stakeholders – build buy-in) 
Review/clarify programme intent; identify varying perspectives 
Help articulate the TOC; identify the soundness of the programme logic, including gaps 
Identify evaluation priorities and questions 
Identify evaluation implications for the programme 
Explore feasibility of addressing potential questions (data availability, cost, other considerations) 
Explore alternative evaluation designs
Outcome focus: what is this? 
Change that follows from the intervention in some way
• OECD/DAC: the likely or achieved short-term and medium-term effects of an intervention’s outputs
Can/should consider other factors/interventions
Consider the “whys”
Outcomes (vs. process, impact) 
Process
• What it is: activities, outputs – what was done
• Farmer-training example: programme set up and implemented (as expected, or differently), needs assessment carried out, curriculum developed, outreach, training delivered
Outcomes
• What it is: changes following from the programme
• Farmer-training example: learning/expertise, confidence, planting practices, increased yields, new markets, increased revenues
Impact
• What it is: long-term effects following from the intervention, invariably in combination; the raison d’être
• Farmer-training example: sustainability of short-term gains; poverty, hunger and malnutrition reduction; natural-resources sustainability
Questions for evaluation 
Start with the questions
• Choice of methods to follow
How to identify questions:
• Who can use evaluation information?
• What information can be used? How?
• Different stakeholders – different questions
• Consider responses to hypothetical findings
• Develop the theory of change
How many questions?
Three key evaluation questions 
What’s happening? 
(planned and unplanned, little or big at any level) 
Why? 
So what? 
UNEG’s three evaluation questions 
Are we doing the right thing? 
Are we doing it right? 
Are there better ways of achieving the results? 
OECD/DAC Evaluation Criteria 
Relevance 
Effectiveness 
Efficiency 
Impact 
Sustainability 
• Evaluation criteria vs. evaluation questions 
• Breadth vs. focus 
• Intelligent vs. mechanical use
Some uses for evaluation 
Programme improvement 
Identify new policies, programme directions, strategies 
Programme formation 
Decision making at all levels 
Accountability 
Learning 
Identification of needs 
Advocacy 
Instilling evaluative/questioning culture 
Some priorities for an EA 
Focus on outcomes
• Identify expected/potential outcomes
• Be open to unintended outcomes
• Outcome trajectories
Evaluation priorities and questions
Surface and question assumptions
• Implicit and explicit
Be realistic (priorities, expectations of the programme and the evaluation)
• Don’t set up the programme for failure
Theory of Change 
Why a useful tool for planning an evaluation 
Alternative terms (intervention logic, logic model, results chain …) 
Linear vs. models that reflect complexity 
Results chain 
Impact 
Outcomes 
Reach 
Outputs 
Processes 
Inputs 
Intervention logic model 
Inputs → Activities → Outputs → Results/Intermediate Outcomes → Ultimate Impacts
Generic logic model (simplified) 
Generic logic model – in context 
[Diagram: Inputs → Activities → Intermediate results (1) → Intermediate results (2) → Impacts, set within needs, knowledge, outputs and the environment and context, and surrounded by other results, other factors and other interventions]
IMPACT ON CHILDREN
[Diagram: IPEC/partner initiatives – targeted interventions and capacity building – reaching children, families and communities, and the enabling environment (institutions, policies & programmes, legislation, awareness, mobilization …)]
Outline of factors affecting maternal and child health and nutrition
Fig. from Victora, Cesar G., Robert E. Black, J. Ties Boerma, and Jennifer Bryce (2010). Measuring impact in the Millennium Development Goal era and beyond: a new approach to large-scale effectiveness. The Lancet, published online July 9, 2010.
AAS Theory of Change
AAS Theory of Change: Stakeholder engagement workshop
Design, analysis and method considerations 
Alternative models of causality (all recognised in the physical and social sciences)
Successionist (factual) causality
• Counterfactual logic
• All but one possible explanation ruled out
Generative (physical) causality
• Focus on underlying processes, the “signature”
Simultaneous or alternative causal strands
• “INUS” conditions: an insufficient but necessary part of a condition that is itself unnecessary but sufficient
Non-linear (e.g. “tipping point”) causality
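As a hedged formalisation of the INUS idea (Mackie’s term; the farming example is ours, not the slide’s): suppose the outcome $Y$ can be produced by either of two causal packages,

$$Y \leftarrow (A \land B) \lor (C \land D).$$

Then $A$ (say, farmer training) is Insufficient on its own but a Necessary part of the package $A \land B$ (training plus seed access), which is itself Unnecessary (the rival package $C \land D$ would also do) but Sufficient for $Y$.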
Some considerations in choice of design (and methods) 
Addresses, somehow, priority questions 
Simplest approach – at needed confidence 
Internal/external validity 
Face validity, construct validity 
Gets at “the whys” as well as “the whats” 
Engages stakeholders, partners 
Practicality (resources, time, data …) 
Determining attribution – some alternatives 
Experimental/quasi-experimental designs (counterfactual, randomisation) 
Eliminate rival plausible hypotheses 
Generative (physical) causality, INUS, non-linear (“tipping point”) 
Theory of change approach 
“Reasonable attribution” 
“Contribution” vs. “cause” 
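As a minimal sketch of the counterfactual logic behind experimental designs – our own toy example with made-up numbers, not material from the session – the randomly assigned comparison group stands in for what would have happened without the intervention:

```python
import random

random.seed(1)

# Hypothetical yields (t/ha) for farmers randomly assigned to
# training (treated) or no training (control). Random assignment
# makes the control group approximate the counterfactual: what the
# treated farmers would have achieved without the programme.
treated = [random.gauss(3.2, 0.5) for _ in range(200)]
control = [random.gauss(2.9, 0.5) for _ in range(200)]

def mean(values):
    return sum(values) / len(values)

# The difference in group means estimates the average effect
# attributable to the training.
estimated_effect = mean(treated) - mean(control)
print(f"Estimated average effect: {estimated_effect:.2f} t/ha")
```

A real design would add uncertainty estimates and the checks on rival explanations and validity threats discussed on the next slide.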
Eliminate rival plausible hypotheses (Donald T. Campbell) 
Identify plausible alternative explanations 
Plausible to multiple stakeholders 
Anticipate possible questions of sceptics 
Consider threats to both internal and external validity 
Use the simplest means possible to rule out likelihood of alternative explanations 
Contribution Analysis (Mayne: Using performance measures sensibly) 
1. Develop the results chain
2. Assess the existing evidence on results
3. Assess the alternative explanations
4. Assemble the performance story
5. Seek out additional evidence
6. Revise and strengthen the performance story
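To make steps 1–3 concrete, here is a small illustrative sketch – ours, not Mayne’s – of a results chain in which each link carries its supporting evidence and the rival explanations still to be assessed:

```python
from dataclasses import dataclass, field

@dataclass
class Link:
    """One causal link in a results chain."""
    cause: str
    effect: str
    evidence: list = field(default_factory=list)  # sources supporting the link
    rivals: list = field(default_factory=list)    # alternative explanations to assess

chain = [
    Link("training delivered", "improved planting practices",
         evidence=["follow-up survey", "field observation"],
         rivals=["parallel extension work by other agencies"]),
    Link("improved planting practices", "increased yields",
         evidence=["yield records"],
         rivals=["unusually favourable rainfall"]),
]

# Steps 2-3: review the evidence behind each link and flag the rival
# explanations that the performance story (step 4) must address.
for link in chain:
    print(f"{link.cause} -> {link.effect}")
    print(f"  evidence: {link.evidence}")
    print(f"  rivals to assess: {link.rivals}")
```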
Further considerations for meaningful outcome evaluation 
Need information about inputs and activities as well as about outcomes
• Check, don’t assume, that what is mandated in (Western) capitals is what actually takes place on the ground
• Check: are data sources really accurate?
Dealing with responsiveness – a problem or a strength? (Internal vs. external validity)
Some alternative approaches 
Theory based 
Realist evaluation 
Most Significant Change, Success Case Method, Appreciative Inquiry 
Participative 
Outcome mapping/harvesting 
Anthropological 
Etc. etc. etc. 
To bear in mind 
“For every complex question, there is a simple answer – and it is wrong.” – H.L. Mencken 
“One cannot succeed on visible figures alone… The most important figures that one needs for management are unknown or unknowable.” – W. Edwards Deming 
“Not everything that can be counted counts, and not everything that counts can be counted.” – Einstein 
And … 
“Assessment of many of the most common activities in government requires soft judgment… Measurement often misses the point, sometimes causing awful distortions.” – Mintzberg 
“Better an approximate answer to the right question than an exact answer to the wrong question that can always be made precise.” – Tukey 
Methods for data gathering: possible options 
Surveys 
Panel studies/longitudinal 
(experimental/quasi-experimental) 
Interviews, group interviews 
Documentation, analysis of records 
Observation (quantitative, qualitative) 
Community members as researchers 
Alternative methods 
Multiple methods 
Making evaluation useful - 1 
Be strategic
• E.g. start with the big picture – identify questions arising
Focus on priority questions and information requirements
Consider needs and preferences of key evaluation users
Don’t be limited to stated/intended effects
Be realistic; don’t set programmes up for failure
Don’t try to do everything in one evaluation
Making evaluation useful - 2 
Primary focus: how evaluation can be relevant and useful 
Bear the beneficiaries in mind 
Take into account diversity, including differing world views, logics, and values 
Be an (appropriate) advocate 
Don’t be too broad 
Don’t be too narrow 
How else can one practice evaluation so that it is useful? 
Follow the Golden Rule:
• “There are no golden rules.” (European Commission)
• Art as much as science
Be future oriented – focused on use
Involve stakeholders
Use multiple and complementary methods, qualitative and quantitative
Recognize differences between monitoring and evaluation
Conclusion 
Primary focus: helping to make a difference (think strategically!)
• Requires focus of some form on outcomes
• What happens when, why, and so what
• Use evaluation to embrace complexity – as simply as possible
• Questions are more important than the “right” method
Thank you / grazie / merci / gracias
Burt Perrin
Burt@BurtPerrin.com
