Webinar on final evaluation and impact assessment of the Governance and Transparency Fund Programme
Tuesday 23rd April 2013
WA/FAN GTF programme
Costa Rica
Honduras
Nicaragua
Guatemala
Burkina Faso
Ghana
Mali
Nigeria
Ethiopia
Kenya
Madagascar
Malawi
Uganda
Zambia
India
Bangladesh
Background and purpose of the two exercises
• The evaluation is primarily for Accountability and the impact assessment is primarily for Learning
• Complementary exercises
• CPs and partners are primary users of results
• Results for communication, fundraising etc.
• Progress against the baseline data is critical
Overview of all evaluation and learning processes and how they link together
• Mid Term Review – WHAT so far?
• Evaluation – WHAT?
• Impact assessment – So WHAT?
• Learning review – HOW?
• Most significant change analysis – the WHAT about the SO WHAT?
Key Stakeholders in the process
DFID/KPMG
Which country programme is doing what
Different levels and how to deal with this
• 7 countries are doing a full-scale evaluation
• 9 countries are doing a small-scale evaluation
• All countries are doing an impact assessment except Kenya
• Small scale = updating the Mid Term Review
• Full scale = in-depth assessment based on key areas
Length of the consultancy and how to use your time
• Total number of days = 25, to be shared between the two exercises
• Rough guideline:
Step 1: Understanding the context - understanding of the problem in country that GTF is addressing
• Background reading - 1 day
• Working with country programme staff and key informants - reinforcing understanding of the programme, stakeholders and intervention design, findings of the MTR (if there was one) and conclusions - up to 3 days
Step 2: Enquiry - conducting self-assessment, semi-structured interviews, FGDs - 8 days
Step 3: Analysis - self-assessment collation of results, coding of qualitative data - 2.5 days for accountability and 2.5 days for learning analysis
Step 4: Write the first draft of the report - 4 days
Step 5: Revisions and redraft of the report - 4 days
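As a quick illustrative check (not part of the official guidance), the rough guideline above exactly fills the 25-day budget when Step 1 uses the full "up to 3 days" with country programme staff; any saving there becomes contingency for the other steps. A minimal Python sketch, with step names paraphrased from the slide:

```python
# Illustrative only: sanity-check that the rough day guideline above
# sums to the 25-day consultancy budget (Step 1b taken at its
# "up to 3 days" maximum).
day_allocation = {
    "Step 1a: background reading": 1,
    "Step 1b: work with country programme staff and key informants": 3,  # "up to 3 days"
    "Step 2: enquiry (self-assessment, interviews, FGDs)": 8,
    "Step 3a: analysis for accountability": 2.5,
    "Step 3b: analysis for learning": 2.5,
    "Step 4: first draft of the report": 4,
    "Step 5: revisions and redraft": 4,
}

total_budget = 25
allocated = sum(day_allocation.values())
print(f"Allocated {allocated} of {total_budget} days")  # Allocated 25.0 of 25 days

# Any days saved on Step 1b become contingency for the remaining steps.
assert allocated <= total_budget
```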
Timeline
Dates and actions:
• 4th April: ToR for Evaluation and Impact Assessment sent out to all countries
• 19th April: Each country to sign contracts with local consultants
• 23rd April: Webinar with all consultants
• May 17th: Local consultants to submit draft Impact Assessment section of report
• May 31st: Local consultants to submit draft Evaluation reports
• June 24th: Local consultants to submit final Evaluation report including Impact Assessment
• July 11th: MoF to submit global consolidated impact assessment
• Week of July 22nd: CM to submit draft Evaluation report and share report with KPMG
• Week of July 29th: Last Annual Learning Meeting
• Mid Sept: CM to submit Final Global Evaluation report
• End of Sept / mid October: Papa to share report with KPMG
• End of October: Submission of WaterAid PCR to KPMG
The difference between an evaluation and an impact assessment
In summary....
Why do we do it?
• Monitoring: measures on-going activities
• Evaluation: measures performance against objectives
• Impact Assessment: assesses change in people's lives
What is the main focus?
• Monitoring: focus on programme interventions
• Evaluation: focus on programme interventions
• Impact Assessment: focus on stakeholders
At what level?
• Monitoring: outputs
• Evaluation: outcomes/impact
• Impact Assessment: impact and change
What are the key questions to ask?
• Monitoring: What is being done? Is our programme progressing as planned?
• Evaluation: What happened? Did we achieve what we set out to achieve in terms of effectiveness, efficiency, relevance, sustainability and impact?
• Impact Assessment: So what actually changed? For whom? How significant is it for them? Will it last? What, if anything, did our programme contribute?
The Evaluation Component
Catherine Currie
GTF Summative Evaluation
• We are conducting a critical analysis of the GTF Programme in order to assess whether or not it achieved its goals:
– Whether the planned activities occurred
– Whether the activities led to achievement of goals
– How effective the project was
– How costly the project was; etc.
• This is a summative, or end-of-programme, evaluation
Purpose of evaluation
• For accountability - to enable beneficiaries, board members, etc. to know how funds have been used
• Our country evaluations will assess:
– objectives against logframe targets and milestones;
– programme performance by the OECD DAC criteria of effectiveness, efficiency, relevance, sustainability, replicability and impact
A useful Global Evaluation Report
• Top tips to support i) the process of collation, analysis and write-up, and ii) enhanced rigour and comparability of results and reports... so:
• A consistent stance
• Support and advice through the online forum
• Verification of evidence
• Step-by-step validation of evaluation results
• Quality assurance processes
• Prioritise the use of country systems
• Use a set of agreed working definitions for key terms
• Use the WaterAid report template
Evaluation Questions
Relevance
• What we expect: Details of the programme's significance with respect to increasing voice, accountability and responsiveness within the local context.
Evaluation Questions:
1. How well did the programme relate to governance priorities at local, national or international levels? Please demonstrate with examples in relation to: i) increasing voice; ii) accountability; and iii) responsiveness within the local context.
2. How well did the programme relate to the Country Strategy Paper aims and objectives? Of WaterAid and, where applicable, of the FAN network (i.e. regional secretariats) and of DFID.
3. How logical is the current theory of change?
Effectiveness
• What we expect: An assessment of how far the intended outcomes were achieved in relation to targets set in the original logical framework.
Evaluation Questions:
1. Have interventions achieved the objectives? At country, regional and global levels.
2. How effective and appropriate was the programme approach? How effective was the MEL system and framework?
3. With hindsight, how could it have been improved?
Partnership
• What we expect: How well did the partnership and management arrangements work, and how did they develop over time? Please consider areas such as monitoring, evaluation and learning arrangements. If possible, consider from a regional perspective.
Advocacy
• What we expect: To what extent has GTF contributed to WaterAid influencing targets?
Evaluation Questions:
1. How has the programme helped implement successful advocacy strategies? Are there any lessons learned about measuring influencing?
2. How has the programme contributed to the overall in-country advocacy strategy?
Equity
• What we expect: Discussion of social differentiation (e.g. by gender, ethnicity, socio-economic group, disability, etc.) and the extent to which the programme had a positive impact (from an accountability perspective) on the more disadvantaged groups.
Evaluation Questions:
1. How did the programme actively promote gender equality?
2. What was the impact of the programme on children, youth and the elderly?
3. What was the impact of the programme on ethnic minorities?
4. If the programme involved work with children, how were child protection issues addressed?
5. How were the needs of excluded groups, including people with disabilities and people living with HIV/AIDS, addressed within the programme?
Value for Money
• What we expect: Good value for money is the optimal use of resources to achieve the intended outcome.
Evaluation Questions:
1. Has economy been achieved in the implementation of programme activities?
2. Could the same inputs have been purchased for less money?
3. Were salaries and other expenditures appropriate to the context?
4. What are the costs and benefits of this programme?
5. Is there an optimum balance between Economy, Efficiency and Effectiveness? Overall, did the programme represent good value for money?
Efficiency
• What we expect: How far funding, personnel, regulatory, administrative, time and other resources and procedures contributed to or hindered the achievement of outputs.
Evaluation Questions:
1. Are there obvious links between significant expenditures and key programme outputs? How well did the partnership and management arrangements work and how did they develop over time?
2. How well did the financial systems work?
3. Were the risks properly identified and well managed?
4. For advice on measuring value for money in governance programmes, see DFID's Briefing Note (July 2011), Indicators and VFM in Governance Programming, available at: www.dfid.gov.uk
Sustainability
• What we expect: Potential for the continuation of the impact achieved, and of the delivery mechanisms, following the withdrawal of existing funding.
Evaluation Questions:
1. What are the prospects for the benefits of the programme being sustained after the funding stops? Did this match the intentions?
2. How have collaboration, networking and influencing of opinion supported sustainability?
Innovation & Replicability
• What we expect: How replicable is the process that introduced the changes/impact? Refer especially to innovative aspects which are replicable.
Evaluation Questions:
1. What aspects of the programme are replicable elsewhere?
2. Under what circumstances and/or in what contexts would the programme be replicable?
Expected impact and change
• What we expect: Details of the broader economic, social and political consequences of the programme and how it contributed to the overall objectives of the Governance and Transparency Fund (increased capability, accountability and responsiveness) and to poverty reduction.
Evaluation Questions:
1. It is critical to demonstrate progress in relation to the indicators included in the GTF programme logframe. The focus is on accountability for the impact.
2. What was the programme's overall impact and how does this compare with what was expected? Please demonstrate, from an accountability perspective, whether the perceived impact was achieved and, if not, why not.
3. Did the programme address the intended target group and what was the actual coverage? Again from an accountability perspective, was the coverage reached? If not, why not? If yes, how?
4. Who were the direct and indirect/wider beneficiaries of the programme? Again, the importance here is to set out who these were for accountability purposes.
5. What difference has been made to the lives of those involved in the programme? Describe the impact.
6. As you are aware, the Consultant is also conducting more detailed critical analysis of Impact for learning purposes.
Vertical Logic of Programme
• Impact is the higher-level situation that the project contributes towards achieving
• Outcome identifies what will change and who benefits during the lifetime of the project
• Outputs are specific deliverables
• Inputs are the human resource and financial inputs
LEARNING: For the GTF Global Consultants this requires evidence of 'so what?'
ACCOUNTABILITY: For the GTF Global Consultants this requires evidence against programme-specific objectives
The Impact Assessment Component
Why conduct the impact assessment component?
• To learn and improve:
• To enable country programme staff, in-country stakeholders, WaterAid staff and others to really understand what changed as a result of the programme and to apply this to future plans
• To test and refine our understanding of how change happens and how successful we have been in supporting positive changes for our stakeholders:
– To what extent did we work with the right people? In the right way? How did this all link up?
– To what extent did the changes we expected to see along the way support the long-term changes we were aiming to influence?
– What does this tell us about the way we think we can influence change?
– What should we do differently next time?
The Impact Assessment
– Focus on the "so what?" question:
• What's changed?
• For whom?
• How significant/lasting are these changes for different stakeholder groups?
• In what ways did the programme contribute?
– Expect the unexpected - we are looking for evidence of positive/negative, intended and unintended changes
– Prioritise analysis over gathering information - need for open and probing questions
Key questions for the Impact Assessment
Background and context - what we need to know (Country Programme Theory of Change):
• The local and national context, including key social, political and environmental conditions and how they have changed over the lifetime of the programme
• Key issues that the programme planned to address
• The target groups who would ultimately benefit from the programme, and how each would benefit
• The process or sequence of changes that would lead to the desired long-term goal
• The assumptions that the programme made about the anticipated process of change
• The other actors/factors with the potential to influence the changes sought, either positively or negatively
Four Domains of Change
1. Changes in the ways in which CSOs function and network, and their capacity to influence the design, implementation and evaluation of effective WASH policies at all levels
2. Changes in the ways that CSOs, including those representing marginalised groups, are able to engage in decision-making processes affecting the WASH sector
3. Changes in the ways in which members of local communities demand accountability and responsiveness from governments and service providers in the WASH sector
4. Changes in the ways that Governments and service providers are accountable to citizens and end users in the WASH sector
Each Domain is broken down further into "areas of enquiry"
These are the key questions you need to explore across all Domains:
1. What has actually changed for each of the different stakeholder groups, especially the poorest and most marginalized communities, in relation to WASH (positive, negative, intended and/or unintended changes)?
2. How significant and/or sustainable are these changes for the different target groups?
3. To what extent do these changes compare with baselines and the changes that were planned and expected?
4. How do they link together and/or influence each other?
5. To what extent did the GTF programme contribute to these changes? How?
6. Who or what else might have contributed to these changes? How?
7. How confident are you in these findings (levels of evidence)?
Areas of Enquiry: Domain 1
Domain 1: Changes in the ways in which CSOs function and network, and their capacity to influence the design, implementation and evaluation of effective WASH policies at all levels
Key Areas of Enquiry:
• Ways in which networks have developed and function over time
• Shifts in CSO capacity
• How this capacity change has influenced policy and practice at local and national levels
Note: we will provide further guidelines on how to assess these areas of enquiry in the next week
Areas of Enquiry: Domain 2
Domain 2: Changes in the ways that CSOs, including those representing marginalized groups, are able to engage in decision-making processes affecting the WASH sector
Key Areas of Enquiry:
• Shifts in awareness, knowledge and confidence of marginalized groups
• Shifts in the ways that people have been able to demand their rights
• The extent to which the voices of marginalized people are making a difference to policy and practice
• Ways in which different CSO strategies have influenced change (e.g. budget tracking, participation in stakeholder reviews, etc.)
Note: we will provide further guidelines on how to assess these areas of enquiry in the next week
Areas of Enquiry: Domain 3
Domain 3: Changes in the ways in which members of local communities demand accountability and responsiveness from governments and service providers in the WASH sector
Key Areas of Enquiry:
• Levels of awareness of rights in local communities
• Ways in which media coverage supports understanding of rights
• Ways in which citizens are influencing policy and practice over time
• Changes in community access to WASH
• Changes in community influence over natural resources
Note: we will provide further guidelines on how to assess these areas of enquiry in the next week
Areas of Enquiry: Domain 4
Domain 4: Changes in the ways that Governments and service providers are accountable to citizens and end users in the WASH sector
Key Areas of Enquiry:
• Changing levels of governance, transparency and compliance
• Changes in policy and regulation (e.g. new policies, laws, standards, political and institutional frameworks) and the consequences of these
• Changes in practice relating to WASH (e.g. delivery of new services and systems) and the consequences
Note: we will provide further guidelines on how to assess these areas of enquiry in the next week
Methodology for both components
Restrict yourselves to using a few tried and tested tools. We suggest:
– Facilitated self-assessment, building on the MTR, which will support the evaluation component
• How to do this and who should be involved
• Note: we will be adding some more change questions this time
– Follow-up workshop to validate findings and focus on the impact assessment element
• How to do this and who should be involved
– Other in-depth interviews/FGDs with key informants as required
• This might enable a deeper understanding of, for example, how changes affected particular target groups
Guiding Principles for Methodology
• Create an atmosphere where informants feel able to be honest and provide critical feedback. Use an appreciative enquiry approach
• Ensure a mix of both qualitative and quantitative data is gathered
• For the impact assessment - ask open and probing questions for a deeper understanding of change
• Findings must be backed up with evidence and be set against the original baseline
Consideration of sample size
Questions:
• How many people to interview?
• What is a "good enough" sample?
Answers:
• Be pragmatic (you have limited time but need to be representative).
• Plan with the in-country focal point:
– Include people/groups/interventions which represent good/strong, medium and poor/weak performance
• Explain your sampling decisions in the methodology section of the report (with an indication of the level of rigour you believe this provides)
The Report
We will provide more detailed guidance over the next week, but this is the guide.
35-40 pages to include:
• Executive Summaries x 2 (4 pages in total)
• Contents + Abbreviations (1 page)
• Methodology + challenges and limitations (2 pages)
• Country Context and introduction to the programme (3 pages)
• Evaluation Report (10 pages) - findings and conclusions under the following headings:
– Relevance, Effectiveness, Partnership, Advocacy, Equity, Value for Money, Efficiency, Sustainability, Innovation and Replicability, Expected Impact and Change
• Impact Assessment Report (10 pages) - findings and conclusions under the following headings:
– Changes under each Domain
– Overall analysis of impact for different target groups
– What difference the programme has made overall
• Overall conclusions and learning for Country Programmes, for the sector and globally (4 pages)
• Annexes
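As a rough illustration only (the slide does not state how annexes are counted; the assumption here is that annexes and front matter make up the remainder), the indicative page counts above total 34 pages before annexes, just under the 35-40 page guide:

```python
# Illustrative only: the indicative page budget above, excluding annexes,
# totals 34 pages; annexes are assumed to account for the remainder of
# the 35-40 page guide.
page_budget = {
    "Executive Summaries x 2": 4,
    "Contents + Abbreviations": 1,
    "Methodology + challenges and limitations": 2,
    "Country Context and introduction to the programme": 3,
    "Evaluation Report": 10,
    "Impact Assessment Report": 10,
    "Overall conclusions and learning": 4,
}

core_pages = sum(page_budget.values())
print(f"Core sections: {core_pages} pages (guide: 35-40 pages including annexes)")
assert core_pages <= 40
```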
Other ways to present findings
• Opportunity for WaterAid to share findings and learning with a wide group of stakeholders
• Target groups:
– Country programme staff
– Networks in country
– Partners
– WaterAid donors
– Sector specialists
• Supplementary ways of presenting findings (optional):
– Case studies
– Video footage
– Photos
Next steps
• Maureen and Catherine to send more detailed guidelines by Friday April 26th
• In-country evaluation team to take stock of the outcomes of the webinar, and
• Prepare and send a proposal to Catherine and Maureen, copied to Marta and Papa, by Friday April 26th, with a brief overview of your plan including:
– Timeline
– Your methodology
– Key informants
– Sample size and rationale for this
Support and assistance
• Guidance notes to follow:
– Some thematic guidance
– Self-assessment format to use
– Learning questions
– Report format
• Online forum: you will be able to post questions, and debate issues and findings with other consultants involved in the GTF final evaluation and impact assessment.
– Please expect an email in your inbox with a password and user details from either Catherine or her colleague Erica Packington; Erica will manage access to the private forum.
– Please action the email immediately to guarantee access.
• Email contacts:
– Technical advice:
• Catherine Currie catherine@iodparc.com
• Maureen O'Flynn maureen@oflynn.demon.co.uk
– Logistics:
• Papa Diouf PapaDiouf@wateraid.org
• Marta Barcelo MartaBarcelo@wateraid.org
Please copy everything to Marta
Questions?
Más contenido relacionado

La actualidad más candente

Addressing the Resolution Gap
Addressing the Resolution GapAddressing the Resolution Gap
Addressing the Resolution GapJordan Teague
 
Conference evaluation at a glance
Conference evaluation at a glanceConference evaluation at a glance
Conference evaluation at a glanceGlenn O'Neil
 
Global Grants: Moving from Good to Great
Global Grants: Moving from Good to GreatGlobal Grants: Moving from Good to Great
Global Grants: Moving from Good to GreatRotary International
 
2012 AERA Presentation
2012 AERA Presentation2012 AERA Presentation
2012 AERA PresentationLiam Goldrick
 
7b highlights of the expert meeting to assess progress on na ps
7b highlights of the expert meeting to assess progress on na ps7b highlights of the expert meeting to assess progress on na ps
7b highlights of the expert meeting to assess progress on na psNAP Events
 
Programmatic approach: External presentation may 2011
Programmatic approach: External presentation may 2011Programmatic approach: External presentation may 2011
Programmatic approach: External presentation may 2011ICCO Cooperation
 
Achieving Measurable Collective Impact with Results-Based Accountability - Mu...
Achieving Measurable Collective Impact with Results-Based Accountability - Mu...Achieving Measurable Collective Impact with Results-Based Accountability - Mu...
Achieving Measurable Collective Impact with Results-Based Accountability - Mu...Clear Impact
 
PEG M&E tool: a tool for monitoring and reviewing Progress, Effectiveness & G...
PEG M&E tool: a tool for monitoring and reviewing Progress, Effectiveness & G...PEG M&E tool: a tool for monitoring and reviewing Progress, Effectiveness & G...
PEG M&E tool: a tool for monitoring and reviewing Progress, Effectiveness & G...Tariq A. Deen
 
Knowledge and awareness WP7 and Capacity development WP6_steven downey_28 aug
Knowledge and awareness WP7 and Capacity development WP6_steven downey_28 augKnowledge and awareness WP7 and Capacity development WP6_steven downey_28 aug
Knowledge and awareness WP7 and Capacity development WP6_steven downey_28 augGlobal Water Partnership
 
New Directions for the Quality Matters Program
New Directions for the Quality Matters ProgramNew Directions for the Quality Matters Program
New Directions for the Quality Matters ProgramMarylandOnline
 
Monitoring the process using the PEG M&E Tool
Monitoring the process using the PEG M&E ToolMonitoring the process using the PEG M&E Tool
Monitoring the process using the PEG M&E ToolNAP Events
 

La actualidad más candente (20)

Addressing the Resolution Gap
Addressing the Resolution GapAddressing the Resolution Gap
Addressing the Resolution Gap
 
Evaluation of Nutrition-Sensitive Programs
Evaluation of Nutrition-Sensitive ProgramsEvaluation of Nutrition-Sensitive Programs
Evaluation of Nutrition-Sensitive Programs
 
MPA: Some perspectives on project delivery
MPA: Some perspectives on project deliveryMPA: Some perspectives on project delivery
MPA: Some perspectives on project delivery
 
Framework user guide presentation cpw dec132015
Framework user guide presentation cpw dec132015Framework user guide presentation cpw dec132015
Framework user guide presentation cpw dec132015
 
Monitoring and Evaluation System for CAADP Implementation_2010
Monitoring and Evaluation System for CAADP Implementation_2010Monitoring and Evaluation System for CAADP Implementation_2010
Monitoring and Evaluation System for CAADP Implementation_2010
 
Conference evaluation at a glance
Conference evaluation at a glanceConference evaluation at a glance
Conference evaluation at a glance
 
Challenges and Opportunities of Trilateral Partnership to Promote Greater Loc...
Challenges and Opportunities of Trilateral Partnership to Promote Greater Loc...Challenges and Opportunities of Trilateral Partnership to Promote Greater Loc...
Challenges and Opportunities of Trilateral Partnership to Promote Greater Loc...
 
Global Grants: Moving from Good to Great
Global Grants: Moving from Good to GreatGlobal Grants: Moving from Good to Great
Global Grants: Moving from Good to Great
 
2012 AERA Presentation
2012 AERA Presentation2012 AERA Presentation
2012 AERA Presentation
 
7b highlights of the expert meeting to assess progress on na ps
7b highlights of the expert meeting to assess progress on na ps7b highlights of the expert meeting to assess progress on na ps
7b highlights of the expert meeting to assess progress on na ps
 
infob106
infob106infob106
infob106
 
Programmatic approach: External presentation may 2011
Programmatic approach: External presentation may 2011Programmatic approach: External presentation may 2011
Programmatic approach: External presentation may 2011
 
Programmatic approach
Programmatic approach Programmatic approach
Programmatic approach
 
Achieving Measurable Collective Impact with Results-Based Accountability - Mu...
Achieving Measurable Collective Impact with Results-Based Accountability - Mu...Achieving Measurable Collective Impact with Results-Based Accountability - Mu...
Achieving Measurable Collective Impact with Results-Based Accountability - Mu...
 
PEG M&E tool: a tool for monitoring and reviewing Progress, Effectiveness & G...
PEG M&E tool: a tool for monitoring and reviewing Progress, Effectiveness & G...PEG M&E tool: a tool for monitoring and reviewing Progress, Effectiveness & G...
PEG M&E tool: a tool for monitoring and reviewing Progress, Effectiveness & G...
 
Introducing the National Adaptation Plans technical guidelines from the UNFCC...
Introducing the National Adaptation Plans technical guidelines from the UNFCC...Introducing the National Adaptation Plans technical guidelines from the UNFCC...
Introducing the National Adaptation Plans technical guidelines from the UNFCC...
 
RiPPLE Communications
RiPPLE CommunicationsRiPPLE Communications
RiPPLE Communications
 
Knowledge and awareness WP7 and Capacity development WP6_steven downey_28 aug
Knowledge and awareness WP7 and Capacity development WP6_steven downey_28 augKnowledge and awareness WP7 and Capacity development WP6_steven downey_28 aug
Knowledge and awareness WP7 and Capacity development WP6_steven downey_28 aug
 
New Directions for the Quality Matters Program
New Directions for the Quality Matters ProgramNew Directions for the Quality Matters Program
New Directions for the Quality Matters Program
 
Monitoring the process using the PEG M&E Tool
Monitoring the process using the PEG M&E ToolMonitoring the process using the PEG M&E Tool
Monitoring the process using the PEG M&E Tool
 

Similar a Final outline plan for webinar evaluation and impact assessment mof 2004

PCM - Project Cycle Management, Training on Evaluation
PCM - Project Cycle Management, Training on EvaluationPCM - Project Cycle Management, Training on Evaluation
PCM - Project Cycle Management, Training on Evaluationrexcris
 
What to Learn? How to Learn? Results from the River Basin Breakout Sessions
What to Learn? How to Learn? Results from the River Basin Breakout SessionsWhat to Learn? How to Learn? Results from the River Basin Breakout Sessions
What to Learn? How to Learn? Results from the River Basin Breakout SessionsIwl Pcu
 
Monitoring evaluation
Monitoring evaluationMonitoring evaluation
Monitoring evaluationCarlo Magno
 
Evaluation of SME and entreprenuership programme - Jonathan Potter & Stuart T...
Evaluation of SME and entreprenuership programme - Jonathan Potter & Stuart T...Evaluation of SME and entreprenuership programme - Jonathan Potter & Stuart T...
Evaluation of SME and entreprenuership programme - Jonathan Potter & Stuart T...OECD CFE
 
Using case-based methods to assess scalability and sustainability: Lessons fr...
Using case-based methods to assess scalability and sustainability: Lessons fr...Using case-based methods to assess scalability and sustainability: Lessons fr...
Using case-based methods to assess scalability and sustainability: Lessons fr...JSI
 
Results Monitoring Reporting 18 1 16
Results Monitoring  Reporting 18 1 16Results Monitoring  Reporting 18 1 16
Results Monitoring Reporting 18 1 16Joseph Banda
 
Organizational Capacity-Building Series - Session 6: Program Evaluation
Organizational Capacity-Building Series - Session 6: Program EvaluationOrganizational Capacity-Building Series - Session 6: Program Evaluation
Organizational Capacity-Building Series - Session 6: Program EvaluationINGENAES
 
USER GUIDE M&E 2014 LENNY HIDAYAT
USER GUIDE M&E 2014 LENNY HIDAYATUSER GUIDE M&E 2014 LENNY HIDAYAT
USER GUIDE M&E 2014 LENNY HIDAYATLenny Hidayat
 
Effectiveness: Funder perspectives (B Dornan, Scottish Government)
Effectiveness: Funder perspectives (B Dornan, Scottish Government)Effectiveness: Funder perspectives (B Dornan, Scottish Government)
Effectiveness: Funder perspectives (B Dornan, Scottish Government)NIDOS
 
Evaluating the performance of OECD Committees -- Kevin Williams, OECD Secreta...
Evaluating the performance of OECD Committees -- Kevin Williams, OECD Secreta...Evaluating the performance of OECD Committees -- Kevin Williams, OECD Secreta...
Evaluating the performance of OECD Committees -- Kevin Williams, OECD Secreta...OECD Governance
 
Monitoring and Evaluation of welfare projects
 Monitoring and Evaluation of welfare projects Monitoring and Evaluation of welfare projects
Monitoring and Evaluation of welfare projectsJayapriya Dhilipkumar
 
Almm monitoring and evaluation tools draft[1]acm sir revised
Almm monitoring and evaluation tools draft[1]acm sir revisedAlmm monitoring and evaluation tools draft[1]acm sir revised
Almm monitoring and evaluation tools draft[1]acm sir revisedAlberto Mico
 
Participatory Monitoring- WG6.ppt
Participatory Monitoring- WG6.pptParticipatory Monitoring- WG6.ppt
Participatory Monitoring- WG6.pptMdFarhanShahriar3
 
The WASH Bottleneck Analysis Tool (BAT)
The WASH Bottleneck Analysis Tool (BAT)The WASH Bottleneck Analysis Tool (BAT)
The WASH Bottleneck Analysis Tool (BAT)IRC
 
February 2019 CoP Webinar - Building Maturity - using the Change Management M...
February 2019 CoP Webinar - Building Maturity - using the Change Management M...February 2019 CoP Webinar - Building Maturity - using the Change Management M...
February 2019 CoP Webinar - Building Maturity - using the Change Management M...Prosci ANZ
 

Similar a Final outline plan for webinar evaluation and impact assessment mof 2004 (20)

PCM - Project Cycle Management, Training on Evaluation
PCM - Project Cycle Management, Training on EvaluationPCM - Project Cycle Management, Training on Evaluation
PCM - Project Cycle Management, Training on Evaluation
 
What to Learn? How to Learn? Results from the River Basin Breakout Sessions
What to Learn? How to Learn? Results from the River Basin Breakout SessionsWhat to Learn? How to Learn? Results from the River Basin Breakout Sessions
What to Learn? How to Learn? Results from the River Basin Breakout Sessions
 
Monitoring evaluation
Monitoring evaluationMonitoring evaluation
Monitoring evaluation
 
Evaluation of SME and entreprenuership programme - Jonathan Potter & Stuart T...
Evaluation of SME and entreprenuership programme - Jonathan Potter & Stuart T...Evaluation of SME and entreprenuership programme - Jonathan Potter & Stuart T...
Evaluation of SME and entreprenuership programme - Jonathan Potter & Stuart T...
 
Nokuzola Mamabolo - FHI360, South Africa
Nokuzola Mamabolo - FHI360, South AfricaNokuzola Mamabolo - FHI360, South Africa
Nokuzola Mamabolo - FHI360, South Africa
 
Labor Markets Core Course 2013: Monitoring and evaluation
Labor Markets Core Course 2013: Monitoring and evaluation Labor Markets Core Course 2013: Monitoring and evaluation
Labor Markets Core Course 2013: Monitoring and evaluation
 
Using case-based methods to assess scalability and sustainability: Lessons fr...
Using case-based methods to assess scalability and sustainability: Lessons fr...Using case-based methods to assess scalability and sustainability: Lessons fr...
Using case-based methods to assess scalability and sustainability: Lessons fr...
 
Evidence On Trial: weighing the value of evidence in academic enquiry, policy...
Evidence On Trial: weighing the value of evidence in academic enquiry, policy...Evidence On Trial: weighing the value of evidence in academic enquiry, policy...
Evidence On Trial: weighing the value of evidence in academic enquiry, policy...
 
Results Monitoring Reporting 18 1 16
Results Monitoring  Reporting 18 1 16Results Monitoring  Reporting 18 1 16
Results Monitoring Reporting 18 1 16
 
Organizational Capacity-Building Series - Session 6: Program Evaluation
Organizational Capacity-Building Series - Session 6: Program EvaluationOrganizational Capacity-Building Series - Session 6: Program Evaluation
Organizational Capacity-Building Series - Session 6: Program Evaluation
 
USER GUIDE M&E 2014 LENNY HIDAYAT
USER GUIDE M&E 2014 LENNY HIDAYATUSER GUIDE M&E 2014 LENNY HIDAYAT
USER GUIDE M&E 2014 LENNY HIDAYAT
 
M&E Concepts.pptx
M&E Concepts.pptxM&E Concepts.pptx
M&E Concepts.pptx
 
Effectiveness: Funder perspectives (B Dornan, Scottish Government)
Effectiveness: Funder perspectives (B Dornan, Scottish Government)Effectiveness: Funder perspectives (B Dornan, Scottish Government)
Effectiveness: Funder perspectives (B Dornan, Scottish Government)
 
Evaluating the performance of OECD Committees -- Kevin Williams, OECD Secreta...
Evaluating the performance of OECD Committees -- Kevin Williams, OECD Secreta...Evaluating the performance of OECD Committees -- Kevin Williams, OECD Secreta...
Evaluating the performance of OECD Committees -- Kevin Williams, OECD Secreta...
 
M&E.ppt
M&E.pptM&E.ppt
M&E.ppt
 
Monitoring and Evaluation of welfare projects
 Monitoring and Evaluation of welfare projects Monitoring and Evaluation of welfare projects
Monitoring and Evaluation of welfare projects
 
Almm monitoring and evaluation tools draft[1]acm sir revised
Almm monitoring and evaluation tools draft[1]acm sir revisedAlmm monitoring and evaluation tools draft[1]acm sir revised
Almm monitoring and evaluation tools draft[1]acm sir revised
 
Participatory Monitoring- WG6.ppt
Participatory Monitoring- WG6.pptParticipatory Monitoring- WG6.ppt
Participatory Monitoring- WG6.ppt
 
The WASH Bottleneck Analysis Tool (BAT)
The WASH Bottleneck Analysis Tool (BAT)The WASH Bottleneck Analysis Tool (BAT)
The WASH Bottleneck Analysis Tool (BAT)
 
February 2019 CoP Webinar - Building Maturity - using the Change Management M...
February 2019 CoP Webinar - Building Maturity - using the Change Management M...February 2019 CoP Webinar - Building Maturity - using the Change Management M...
February 2019 CoP Webinar - Building Maturity - using the Change Management M...
 

Último

Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processorsdebabhi2
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsMaria Levchenko
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonetsnaman860154
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesSinan KOZAK
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountPuma Security, LLC
 
Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Enterprise Knowledge
 
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEarley Information Science
 
Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024The Digital Insurer
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxMalak Abu Hammad
 
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...gurkirankumar98700
 
A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024Results
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdfhans926745
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsEnterprise Knowledge
 
The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024Rafal Los
 
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfThe Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfEnterprise Knowledge
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityPrincipled Technologies
 
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Igalia
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)Gabriella Davis
 
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc
 
Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024The Digital Insurer
 

Último (20)

Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processors
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed texts
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonets
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen Frames
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path Mount
 
Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...Driving Behavioral Change for Information Management through Data-Driven Gree...
Driving Behavioral Change for Information Management through Data-Driven Gree...
 
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptxEIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
EIS-Webinar-Prompt-Knowledge-Eng-2024-04-08.pptx
 
Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptx
 
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
 
A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI Solutions
 
The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024
 
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdfThe Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
The Role of Taxonomy and Ontology in Semantic Layers - Heather Hedden.pdf
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivity
 
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)
 
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law DevelopmentsTrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
TrustArc Webinar - Stay Ahead of US State Data Privacy Law Developments
 
Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024Tata AIG General Insurance Company - Insurer Innovation Award 2024
Tata AIG General Insurance Company - Insurer Innovation Award 2024
 

Final outline plan for webinar evaluation and impact assessment mof 2004

  • 1. www.wateraid.org Webinar on final evaluation and impact assessment of Governance and Transparency Fund Programme Tuesday 23rd April 2013
  • 3. www.wateraid.org Background and purpose of the two exercises • The evaluation is primarily for Accountability and the impact assessment is primarily for Learning • Complementary exercises • CPs and partners are primary users of results • Results for communication, fundraising etc • Progress against the baseline data is critical
  • 4. www.wateraid.org Overview of all evaluation and learning processes and how they link together  Mid Term Review – WHAT so far?  Evaluation – WHAT?  Impact assessment – So WHAT?  Learning review – HOW?  Most significant change analysis – the WHAT about the SO WHAT?
  • 5. www.wateraid.org Key Stakeholders in the process DFID/KPMG
  • 7. www.wateraid.org Different levels and how to deal with this • 7 countries are doing a full scale evaluation • 9 countries are doing small scale evaluation • All countries are doing an impact assessment except from Kenya • Small scale = updating Mid Term Review • Full Scale = in depth assessment based on key areas
  • 8. www.wateraid.org Length of the consultancy and how to use your time • Total number of days = 25 to be shared between two exercises • Rough Guideline: Step 1: Understanding the context - understanding of the problem in country that GTF is addressing • Background reading - 1 day • Working with country prog staff and key informants - reinforcing understanding of programme, stakeholders and intervention design, findings of MTR (if there was one) and conclusions - up to 3 days Step 2: Enquiry – conducting self-assessment, semi- structured interviews, FGD…..8 days Step 3: Analysis - Self-assessment collation of results, coding of qualitative data….2.5 accountability and 2.5 days for learning analysis; Step 4: Write the first draft of the report - 4 days Step 5: Revisions and redraft of the report - 4 days
  • 9. www.wateraid.org Timeline Dates Actions 4th April ToR for Evaluation and Impact Assessment sent out to all countries 19th April Each country to sign contracts with local consultants 23th April Webinar with all consultants May 17th Local consultants to submit draft Impact assessment section of report May 31st Local consultants to submit draft Evaluation reports June 24th Local consultants to submit final Evaluation report including Impact assessment July 11th MoF to submit global consolidated impact assessment Week of July 22nd CM to submit draft Evaluation report and share report to KPMG Week of July 29th Last Annual learning meeting Mid Sept CM to submit Final Global Evaluation report End of Sept /mid October Papa to share report with KPMG End of October Submission of WaterAid PCR to KPMG
  • 10. www.wateraid.org The difference between an evaluation and impact assessment
  • 11. www.wateraid.org In summary.... Questions Monitoring Evaluation Impact Assessment Why do we do it? Measures on-going activities Measures performance against objectives Assesses change in peoples lives What is the main focus? Focus on programme interventions Focus on programme interventions Focus on stakeholders At what level? Outputs Outcomes/impact Impact and change What are the key questions to ask? •What is being done? •Is our programme progressing as planned? •What happened? Did we achieve what we set out to achieve in terms of: •Effectiveness •Efficiency •Relevance •Sustainability •Impact •So what actually changed? •For whom? •How significant is it for them? •Will it last? •What, if anything, did our programme contribute?
  • 13. www.wateraid.org GTF Summative Evaluation • We are conducting a critical analysis of the GTF Programme in order to assess whether or not it achieved its goals  Whether the planned activities occurred  Whether the activities led to achievement of goals;  How effective the project was;  How costly the project was; etc. • This is a summative or end of programme evaluation
  • 14. www.wateraid.org Purpose of evaluation • For accountability- to enable beneficiaries, board members, etc to know how funds have been used; • Our country evaluations will assess: – objectives against logframe targets and milestones; – programme performance by OECD DAC criteria of effectiveness, efficiency, relevance, sustainability, replicability and impact
  • 15. www.wateraid.org A useful Global Evaluation Report• Top tips include: – i) the process of collation, analysis and write up – ii) enhanced rigour and comparability of results and reports…………...so • A consistent stance • Support and advice through online forum • Verification of evidence • Step-by-step validation of evaluation results • Quality assurance processes • Prioritise the use of country systems • Use a set of agreed working definitions for key terms • Use the WaterAid report template
  • 16. www.wateraid.org Evaluation Questions Relevance: • What we expect: Details of the programme’s significance with respect to increasing voice, accountability and responsiveness within the local context. Evaluation Questions: 1. How well did the programme relate to governance priorities at local, national or internal levels? Please demonstrate with examples in relation to: i)increasing voice; ii) accountability; and, iii) responsiveness within the local context. 2. How well did the programme relate to the Country Strategy Paper aims and objectives? Of WaterAid and where applicable of the FAN network – ie regional secretariats and of DFID. 3. How logical is the current theory of change?
  • 17. www.wateraid.org Effectiveness • What we expect: An assessment of how far the intended outcomes were achieved in relation to targets set in the original logical framework. Evaluation Questions: 1. Have interventions achieved the objectives? At country regional and global level. 2. How effective and appropriate was the programme approach? How effective was the MEL system and framework? 3. With hindsight, how could it have been improved?
  • 18. www.wateraid.org Partnership • What we expect: How well did the partnership and management arrangements work and how did they develop over time? Please consider areas such as monitoring, evaluation and learning arrangements. If possible, consider from a regional perspective.
  • 19. www.wateraid.org Advocacy • What we expect: To what extent has GTF contributed to WaterAid influencing targets? Evaluation Questions: 1.How has the programme helped implement successful advocacy strategies? Are there any lessons learned about measuring influencing. 2.How has the programme contributed to the overall in country advocacy strategy?
  • 20. www.wateraid.org Equity • What we expect: Discussion of social differentiation (e.g. by gender, ethnicity, socio economic group, disability, etc) and the extent to which the programme had a positive impact (from an accountability perspective) on the more disadvantaged groups. Evaluation Questions: 1. How did the programme actively promote gender equality? 2. What was the impact of the programme on children, youth and the elderly? 3. What was the impact of the programme on ethnic minorities? 4. If the programme involved work with children, how were child protection issues addressed? 5. How were the needs of excluded groups, including people with disabilities and people living with HIV/AIDS addressed within the programme?
  • 21. www.wateraid.org Value for Money • What we expect: Good value for money is the optimal use of resources to achieve the intended outcome. Evaluation Questions: 1. Has economy been achieved in the implementation of programme activities? 2. Could the same inputs have been purchased for less money? 3. Were salaries and other expenditures appropriate to the context? 4. What are the costs and benefits of this programme? 5. Is there an optimum balance between Economy, Efficiency and Effectiveness? Overall, did the programme represent good value for money?
  • 22. www.wateraid.org Efficiency • What we expect: How far funding, personnel, regulatory, administrative, time, other resources and procedures contributed to or hindered the achievement of outputs. Evaluation Questions: 1. Are there obvious links between significant expenditures and key programme outputs? How well did the partnership and management arrangements work and how did they develop over time? 2. How well did the financial systems work? 3. Were the risks properly identified and well managed? 4. For advice on measuring value for money in governance programmes see DFID’s Briefing Note (July 2011) Indicators and VFM in Governance Programming, available at: www.dfid.gov.uk
  • 23. www.wateraid.org Sustainability • What we expect: Potential for the continuation of the impact achieved and of the delivery mechanisms following the withdrawal of existing funding. Evaluation Questions: 1.What are the prospects for the benefits of the programme being sustained after the funding stops? Did this match the intentions? 2.How have collaboration, networking and influencing of opinion support sustainability?
  • 24. www.wateraid.org Innovation & Replicability • What we expect: How replicable is the process that introduced the changes/impact? Refer especially to innovative aspects which are replicable. Evaluation Questions: 1.What aspects of the programme are replicable elsewhere? 2.Under what circumstances and/or in what contexts would the programme be replicable?
  • 25. www.wateraid.org Expected impact and change • What we expect: Details of the broader economic, social, and political consequences of the programme and how it contributed to the overall objectives of the Governance and Transparency Fund (increased capability, accountability and responsiveness) and to poverty reduction. Evaluation Questions: 1. It is critical to demonstrate the progress in relation to the indicators included in the GTF programme logframe. The focus is on accountability for the impact. 2. What was the programme’s overall impact and how does this compare with what was expected? Please demonstrate from an accountability perspective if the perceived impact was achieved and if not, why not. 3. Did the programme address the intended target group and what was the actual coverage? Again from an accountability perspective, was the coverage reached? If not, why not, if yes, how? 4. Who were the direct and indirect/wider beneficiaries of the programme? Again, the importance here is to set out who these were for accountability purposes. 5. What difference has been made to the lives of those involved in the programme? Describe the impact. 6. As you are aware, the Consultant is also conducting more detailed critical analysis on Impact for learning purposes.
  • 26. www.wateraid.org Vertical Logic of Programme • Impact is the higher-level situation that the project contributes towards achieving • Outcome identifies what will change and who benefits during the lifetime of the project • Outputs are specific deliverables • Human resource and financial inputs • LEARNING: for the GTF Global Consultants this requires evidence of the 'so what?' • ACCOUNTABILITY: for the GTF Global Consultants this requires evidence against programme-specific objectives
  • 28. www.wateraid.org Why conduct the impact assessment component? • To learn and improve: • To enable Country programme staff, stakeholders in country, WaterAid staff and others to really understand what changed as a result of the programme and to apply this to future plans • To test and refine our understanding of how change happens and how successful we have been in supporting positive changes for our stakeholders: • To what extent did we work with the right people? In the right way? How did this all link up? • To what extent did the changes we expected to see along the way support the long-term changes we were aiming to influence? • What does this tell us about the way we think we can influence change? • What should we do differently next time?
  • 29. www.wateraid.org The Impact Assessment – Focus on the "so what" question • What's changed? • For whom? • How significant/lasting are these changes for different stakeholder groups? • In what ways did the programme contribute? – Expect the unexpected – we are looking for evidence of positive/negative, intended and unintended changes – Prioritise analysis over gathering information – need for open and probing questions
  • 30. www.wateraid.org Key questions for the Impact Assessment
  • 31. www.wateraid.org Background and context – what we need to know (Country Programme Theory of Change): • The local and national context, including key social, political and environmental conditions and how they have changed over the lifetime of the programme • Key issues that the programme planned to address • The target groups who would ultimately benefit from the programme and how each would benefit • The process or sequence of changes that would lead to the desired long-term goal • The assumptions that the programme made about the anticipated process of change • The other actors/factors who had the potential to influence the changes sought, either positively or negatively.
  • 32. www.wateraid.org Four Domains of Change 1. Changes in the ways in which CSOs function and network, and their capacity to influence the design, implementation and evaluation of effective WASH policies at all levels 2. Changes in the ways that CSOs, including those representing marginalised groups, are able to engage in decision-making processes affecting the WASH sector. 3. Changes in the ways in which members of local communities demand accountability and responsiveness from governments and service providers in the WASH sector 4. Changes in the ways that Governments and service providers are accountable to citizens and end users in the WASH sector
  • 33. www.wateraid.org Each Domain is broken down further into "areas of enquiry". These are the key questions you need to explore across all Domains: 1. What has actually changed for each of the different stakeholder groups, especially the poorest and most marginalised communities, in relation to WASH (positive, negative, intended and/or unintended changes)? 2. How significant and/or sustainable are these changes for the different target groups? 3. To what extent do these changes compare with baselines and changes that were planned and expected? 4. How do they link together and/or influence each other? 5. To what extent did the GTF programme contribute to these changes? How? 6. Who or what else might have contributed to these changes? How? 7. How confident are you in these findings (levels of evidence)?
  • 34. www.wateraid.org Areas of Enquiry Domain 1 Domain 1: Changes in the ways in which CSOs function and network, and their capacity to influence the design, implementation and evaluation of effective WASH policies at all levels. Key Areas of Enquiry: • Ways in which networks have developed and functioned over time • Shifts in CSO capacity • How this capacity change has influenced policy and practice at local and national levels Note: we will provide further guidelines on how to assess these areas of enquiry in the next week
  • 35. www.wateraid.org Areas of Enquiry Domain 2 Domain 2: Changes in the ways that CSOs, including those representing marginalised groups, are able to engage in decision-making processes affecting the WASH sector. Key Areas of Enquiry: • Shifts in awareness, knowledge and confidence of marginalised groups • Shifts in the ways that people have been able to demand their rights • The extent to which the voices of marginalised people are making a difference to policy and practice • Ways in which different CSO strategies have influenced change (e.g. budget tracking, participation in stakeholder reviews, etc.) Note: we will provide further guidelines on how to assess these areas of enquiry in the next week
  • 36. www.wateraid.org Areas of Enquiry Domain 3 Domain 3: Changes in the ways in which members of local communities demand accountability and responsiveness from governments and service providers in the WASH sector. Key Areas of Enquiry: • Levels of awareness of rights in local communities • Ways in which media coverage supports understanding of rights • Ways in which citizens are influencing policy and practice over time • Changes in community access to WASH • Changes in community influence over natural resources Note: we will provide further guidelines on how to assess these areas of enquiry in the next week
  • 37. www.wateraid.org Areas of Enquiry Domain 4 Domain 4: Changes in the ways that Governments and service providers are accountable to citizens and end users in the WASH sector. Key Areas of Enquiry: • Changing levels of governance, transparency and compliance • Changes in policy and regulation (e.g. new policies, laws, standards, political and institutional framework) – and the consequences of these • Changes in practice relating to WASH (e.g. delivery of new services and systems) and the consequences Note: we will provide further guidelines on how to assess these areas of enquiry in the next week
  • 38. www.wateraid.org Methodology for both components Restrict yourselves to using a few tried and tested tools. We suggest: – Facilitated self-assessment: building on the MTR, which will support the evaluation component • How to do this and who should be involved • Note: we will be adding some more change questions this time – Follow-up workshop to validate findings and focus on the impact assessment element • How to do this and who should be involved – Other in-depth interviews/FGDs with key informants as required • This might enable a deeper understanding of e.g. how changes affected particular target groups
  • 39. www.wateraid.org Guiding Principles for Methodology • Create an atmosphere where informants feel able to be honest and provide critical feedback. Use an appreciative enquiry approach • Ensure a mix of both qualitative and quantitative data is gathered • For the impact assessment – ask open and probing questions for a deeper understanding of change • Findings must be backed up with evidence and be set against the original baseline
  • 40. www.wateraid.org Consideration of sample size Questions: • How many people to interview? • What is a "good enough" sample? Answers: • Be pragmatic (you have limited time but need to be representative). • Plan with the in-country focal point: – Include people/groups/interventions which represent • good/strong • medium • poor/weak • Explain your sampling decisions in the methodology section of the report (with an indication of the level of rigour you believe this provides)
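As a purely illustrative aside (not part of the webinar materials): the Python snippet below sketches one way to document a pragmatic, purposive sample that covers strong, medium and weak interventions as suggested above. The intervention names and the two-per-stratum quota are invented, and recording the random seed makes the selection easy to explain in the methodology section of the report.

```python
import random

# Hypothetical purposive sampling frame: interventions grouped by how well they
# are judged to have performed (strong / medium / weak), so the sample is not
# biased towards success stories.
interventions = {
    "strong": ["District A scorecard", "Network B budget tracking", "Ward C radio show"],
    "medium": ["District D scorecard", "Ward E citizen forum"],
    "weak":   ["District F scorecard", "Network G coalition"],
}

random.seed(1)  # record the seed so the selection can be reproduced and explained
sample = {stratum: random.sample(items, k=min(2, len(items)))
          for stratum, items in interventions.items()}
print(sample)
```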
  • 41. www.wateraid.org The Report We will provide more detailed guidance over the next week, but as a guide the report should be 35-40 pages, to include: • Executive Summaries x 2 (4 pages in total) • Contents + Abbreviations (1 page) • Methodology + challenges and limitations (2 pages) • Country Context and introduction to the programme (3 pages) • Evaluation Report (10 pages) – findings and conclusions under the following headings: – Relevance, Effectiveness, Partnership, Advocacy, Equity, Value for Money, Efficiency, Sustainability, Innovation and Replicability, Expected Impact and Change • Impact Assessment Report (10 pages) – findings and conclusions under the following headings: – Changes under each Domain – Overall analysis of impact for different target groups – What difference the programme has made overall • Overall conclusions and learning for Country Programmes, for the sector and globally (4 pages) • Annexes
  • 42. www.wateraid.org Other ways to present findings • Opportunity for WaterAid to share findings and learning with a wide group of stakeholders • Target groups: – Country programme staff – Networks in country – Partners – WaterAid donors – Sector specialists • Supplementary ways of presenting findings (optional) – Case studies – Video footage – Photos
  • 43. www.wateraid.org Next steps • Maureen and Catherine to send more detailed guidelines by Friday April 26th • In-country evaluation team to take stock of the outcomes of the webinar and prepare and send a proposal to Catherine and Maureen, copied to Marta and Papa, by Friday April 26th, with a brief overview of your plan including: – Timeline – Your methodology – Key informants – Sample size and rationale for this
  • 44. www.wateraid.org Support and assistance • Guidance notes to follow: – Some thematic guidance – Self-assessment format to use – Learning questions – Report format • Online forum: you will be able to post questions, debate issues and findings with other consultants involved in the GTF final evaluation and impact assessment. – Please expect an email in your inbox with a password and user details from either Catherine or her colleague Erica Packington. Erica will manage access to the private forum. – Please action the email immediately to guarantee access. • Email contacts: – Technical advice • Catherine Currie catherine@iodparc.com • Maureen O'Flynn maureen@oflynn.demon.co.uk – Logistics: • Papa Diouf PapaDiouf@wateraid.org • Marta Barcelo MartaBarcelo@wateraid.org Please copy everything to Marta

Editor's notes

  1. Papa to do the welcome and introductions. Explain also that this webinar will be followed up in the next few days with more detailed guidelines
  2. Papa
  3. Papa to lead
  4. Papa
  5. Papa
  6. Papa to lead
  7. Maureen to lead on this slide with Catherine adding in. Explain that all processes support each other but that they have different focus and purpose. The questions we ask are different. Go through the set. Use the social housing example. Will share two brief examples of successful programmes which have had negative impacts ... Tanzania and ??? as illustrations. Point out that we evaluate impact (and they will be doing this) to demonstrate the levels to which we achieved the impact we set out to achieve. But with assessment we go further – instead of starting with the programme logic and plans, we start with changes in governance and transparency and the people this affects ... We explore what has changed for them (good and bad) and then we assess what – if anything – our programme was able to contribute. WE ARE NOT TALKING ABOUT ATTRIBUTION
  8. Catherine to lead on all of this component
  9. A consistent stance in the evaluation that does not assume attribution of results to the Governance and Transparency Fund, but rather takes a critical approach and examines alternative explanations. Both the consultant in charge of the Impact Assessments and the consultant in charge of the Global Evaluation Report are available to support and advise individual national evaluation coordinators and consultants. Verification of evidence emerging through ongoing triangulation between the multiple data sources and methods employed; step-by-step validation of evaluation results by national WaterAid teams (with peer review/discussions as appropriate); quality assurance processes that are built into each national evaluation (as well as the preparation of the final global evaluation report) – should all meet the DAC Evaluation Quality Standards, UNEG Standards, or the comparable national or regional standards where these have been adopted; prioritising the use of country systems to capitalise on existing data/literature including academia, universities, and civil society; and using a set of agreed working definitions for key terms [and the WaterAid Style Guide] to avoid confusion and inconsistent treatment.
  10. Maureen to lead on this component
  11. Build on what was said earlier... In doing the evaluation, we have completed the accountability-to-the-donor section and answered many questions around relevance and effectiveness etc. Now we really want to focus on the learning
  12. Build on what we said earlier... This is really much more of an investigation... Really trying to find out what happened. I treat it as a murder mystery story... We know what’s changed but we don’t know how it happened. Your job is to find out.
  13. As impact assessment is so focussed on change and our ability to influence change successfully, we need to be very clear about how we thought change would come about in different countries and contexts. Each country programme should have developed its Theory of Change. We need to know about context, issues, key stakeholders. We especially need to understand the sequence of change that we thought would work (what short-term changes might lead to longer-term change) and what assumptions we made (e.g. more press coverage will serve to influence both the wider public and eventually policy makers). We need to be clear about other actors and factors who might either support or hinder progress in the areas we are working on
  14. This is based on the specific outcomes that GTF hoped to influence. It is worded in neutral language to allow you to explore negative and/or unintended changes as well as positive and expected changes. Example – a successful capacity-building programme: staff plan and manage well, programmes are focussed and effective, staff are able to source funding etc... All good and lots of ticks. But the impact could be... they are so highly trained that they all leave to get jobs with the UN. We have to ask the "so what" question in order to understand what we should do differently next time
  15. These are key questions... You may have covered the positive and expected changes in the evaluation section, but this is an opportunity to really explore the "so what" question and find out if there have been unexpected and/or negative changes too. Baseline question – might be tricky to find the baseline, so you should build in questions that help you understand... e.g. ask how networks have developed and changed over time. The linking question is really important, as we want to know for example if it is worth investing efforts into capacity building – it's not for itself... it should lead to changes in policy and practice... there might be better ways of making this happen. To what extent did GTF contribute – really important question. The change might have happened but GTF might not have had anything to do with it. Example of girls' education in Ethiopia – figures doubled in 4 years but it was because the big donors gave the Ethiopian government an ultimatum. Smaller organisations also working in advocacy in this area took credit for the change, but it wasn't really as a result of their work. Remember – success has many parents but failure is an orphan! So try to find out who/what else might have been partly responsible for changes that you see. Confidence is an important thing to think about. If you don't have concrete evidence for your claims, then the claims are weak. This is not always bad but you need to be clear about your levels of confidence in reporting change
  16. Explain why we use areas of enquiry rather than indicators – to encourage answers that we don't expect – remember open and probing questions. We use words like shift and trends and levels. Where possible it is good to have checklists/scales to help you make sense of what has changed and for us to compare it across countries and programmes. We'll be sending checklists and ideas to support this. Will provide a handout on characteristics of a functioning network. Shifts in capacity – will be using the 7S framework: aspirations; strategy; organisational skills; human resources; systems and infrastructure; organisational structure; and culture. Each section is broken down into several indicators. Need to be aware that some partners may have stronger capacity than WaterAid, so there may be two-way capacity building. The WaterAid advocacy scrapbook may be able to help with trends in the way that organisations have been able to influence policy and practice
  17. Awareness... Note to self – look at CAFOD voice and accountability tool – what has WaterAid got? See sustainability framework. Need to look at all strategies and explore what difference they have made – if any
  18. Will find notes. Need something on media. Must read learning papers: Sustainability in Governance programmes; Governance and Power analysis tools
  19. Catherine to lead on this slide and following two
  20. Catherine
  21. Catherine
  22. Maureen to lead on this slide and the following
  23. Maureen
  24. Papa to lead. Mention reading lists
  25. Papa to lead