2. PRESENTATION OUTLINE
• What is Monitoring & Evaluation?
• Why do we need Monitoring & Evaluation?
• Guiding principles
• Who to involve
• Key issues and identifying the information we need
• How to collect the information
• Analysing and using the information
• Communicating the data
3. • MONITORING: the continuous, ongoing collection and review of information on programme implementation, coverage and use, for comparison with implementation plans.
• EVALUATION: a systematic process to determine the extent to which service needs and results have been or are being achieved, and to analyse the reasons for any discrepancy.
4. Attribute | Monitoring | Evaluation
Main focus | Collecting data on progress | Assessing data at critical stages of the process
Sense of completion | Sense of progress | Sense of achievement
Time focus | Present | Past and future
Main question | What is happening now to reach our goal? | Have we achieved our goal?
Attention level | Details | Big picture
Inspires | Motivation | Creativity
Periodicity | Continuous throughout the whole process | Intermittent; at the beginning or end of significant milestones
Supports | Implementation of a plan | Designing the next planning cycle
Skills required | Management | Leadership
Output processing | Progress indicators need to be closely monitored by a few people | Evaluation results need to be discussed, processed and interpreted by all stakeholders
8. MONITORING AND EVALUATION (M&E)
• Key functions:
• To improve the performance of those responsible for implementing health services.
• To determine whether a service/program is accomplishing its goals.
• To identify program strengths and weaknesses, areas of the program that need revision, and areas that meet or exceed expectations.
11. WHO NEEDS AND USES M&E INFORMATION?
• Managers: to improve programme implementation
• Donors, governments and technocrats: to inform and improve future programmes
• Donors, governments, communities and beneficiaries: to keep stakeholders informed
12. WHO CONDUCTS M&E?
• Program implementers
• Stakeholders
• Beneficiaries
13. WHAT TO MONITOR? PROCESS
1. Assessment of the process of program delivery
2. Understanding how a program works and how it produces results
3. How participants are recruited and retained
4. How resources are acquired and used
5. What barriers and problems are encountered
Example of a process indicator (NRHM): number of ASHAs selected by due process.
14. WHAT TO MONITOR? PROGRESS
• Plan: is the program proceeding as per the plan?
• Compliance: how well does program implementation comply with the program plan?
• Budget: keeping track of ongoing activities, supplies, equipment and money spent in relation to the budget allocation
• Delivery: assessment of program delivery
Example of a progress indicator (NHM): percentage of anganwadi workers given the 5-day training program at the district level.
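A progress indicator like the one above is a simple coverage percentage. A minimal sketch (function name and figures are illustrative, not from the slide):

```python
def coverage_percentage(numerator: int, denominator: int) -> float:
    """Coverage indicator: share of the target group reached, as a percentage."""
    if denominator <= 0:
        raise ValueError("target group size must be positive")
    return 100.0 * numerator / denominator

# e.g. if 412 of 500 anganwadi workers completed the 5-day training (illustrative figures)
print(round(coverage_percentage(412, 500), 1))  # 82.4
```

The same calculation applies to any numerator/denominator indicator pair tracked during monitoring.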
15. HOW TO CARRY OUT M&E
Key features:
1. Program framework: analyse and systematically lay out the program elements.
2. Identify the key elements to monitor and evaluate.
3. Determine and describe the measures to be used for monitoring and evaluation.
4. Develop the M&E framework and action plans, including data collection and analysis, reporting and dissemination of findings.
16. M & E QUESTIONS
• What is being done?
• By whom?
• Target population?
• When?
• How much?
• How often?
• Additional outputs?
• Resources used?
17. MONITORING AT DIFFERENT LEVELS
1. Top level: mainly concerned with ensuring achievement of impact and provision of inputs; the major concern is to devise strategy and allocate resources.
2. Middle level: concerned with getting the desired output from the inputs utilized; needs to exercise supervision, provide support and take timely corrective action.
3. Lower/operational level: supervises actual operations and ensures that planned activities are being carried out on schedule.
21. REVISED NATIONAL TUBERCULOSIS CONTROL PROGRAM: INPUT INDICATORS
• Availability of trained faculty
• Availability of funds under IEC/training heads
• Availability of printed material for handouts; availability of a venue
22. REVISED NATIONAL TUBERCULOSIS CONTROL PROGRAM: PROCESS INDICATORS
• Number of sensitization meetings
• Trainings conducted
• Number of sputum samples sent to the laboratory from lower centres
• Number of sputum samples examined under microscopy
23. REVISED NATIONAL TUBERCULOSIS CONTROL PROGRAM: OUTPUT INDICATORS
• Number of smear-positive PTB cases diagnosed
• Number of registered TB patients with non-HIV status
25. REVISED NATIONAL TUBERCULOSIS CONTROL PROGRAM: IMPACT INDICATORS
• Reduction in TB prevalence
• Reduction in TB incidence rates
• Reduced number of deaths
26. TYPES OF EVALUATION
• Retrospective evaluation: conducted when a program has been functioning for some time.
• Prospective evaluation: conducted when a new program is being introduced within a service.
28. • Formative evaluation (inputs and processes): evaluation of the components and activities of a program other than their outcomes (structure and process evaluation).
• Summative evaluation (outputs, outcomes and impacts): evaluation of the degree to which a program has achieved its desired outcomes, and the degree to which any other outcomes (positive or negative) have resulted from the program.
29. WHO CONDUCTS EVALUATION?
• Internal evaluation (self-evaluation): people within a program sponsor, conduct and control the evaluation. E.g. DTO, MO-TC, CMU in NIPCCD.
• External evaluation: someone from outside the program acts as the evaluator and controls the evaluation. E.g. WHO, IIPS.
30. INTERNAL EVALUATION (IE) METHODOLOGY
Selection of districts (by state population):
• Up to 30 million: 2 districts per quarter
• 30 million to 100 million: 3 districts per quarter
• >100 million: 3-4 districts per quarter
Selection of TB Units/DMCs:
• The DMC at the DTC
• 2 DMCs that are examining higher numbers of presumptive TB cases
• 2 selected randomly from the remaining DMCs
Selection of patients:
• 2 DMCs with low case load: 4 NSP patients + 1 previously treated case (5 × 2 = 10)
• In the remaining DMCs: 4 NSP + 1 each of relapse, treatment after default and failure + 1 TB/HIV patient + 1 DR-TB patient (27)
• 2 paediatric patients undergoing treatment
• Total: 36-39 patients
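The district-selection rule above is a simple population-band lookup. A minimal sketch (function name and the band interpretation are ours):

```python
def districts_per_quarter(state_population: int) -> tuple[int, int]:
    """(min, max) districts to evaluate per quarter, per the population bands above."""
    if state_population <= 30_000_000:
        return (2, 2)              # up to 30 million: 2 districts
    if state_population <= 100_000_000:
        return (3, 3)              # 30-100 million: 3 districts
    return (3, 4)                  # >100 million: 3-4 districts

print(districts_per_quarter(25_000_000))   # (2, 2)
print(districts_per_quarter(120_000_000))  # (3, 4)
```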
31. INTERNAL EVALUATION
Advantages:
• Knows the implementing organisation, its programme and operations
• Understands and can interpret the behaviour and attitudes of members of the organisation
• May possess important informal information
• Known to staff, so less threat of anxiety or disruption
• More easily accepts and promotes use of evaluation results
• Less costly
• Doesn't require time-consuming recruitment negotiations
• Contributes to strengthening national evaluation capability
Disadvantages:
• May lack objectivity and thus reduce credibility of findings
• Tends to accept the position of the organisation
• Usually too busy to participate fully
• Part of the authority structure and may be constrained by organizational role conflict
• May not be sufficiently knowledgeable or experienced to design and implement an evaluation
• May not have special subject-matter expertise
32. EXTERNAL EVALUATION
Advantages:
• May be more objective and find it easier to formulate recommendations
• May be free from organizational bias
• May offer a new perspective and additional insights
• May have greater evaluation skills and expertise in conducting an evaluation
• May provide greater technical expertise
• Able to dedicate himself/herself full time to the evaluation
• Can serve as an arbitrator or facilitator between parties
• Can bring the organization into contact with additional technical resources
Disadvantages:
• May tend to produce overly theoretical evaluation results
• May be perceived as an adversary, arousing unnecessary anxiety
• May be costly
• Requires more time for contract negotiations, orientation and monitoring
33. GUIDELINES FOR EVALUATION (FIVE PHASES)
A: Planning the Evaluation
B: Selecting Appropriate Evaluation Methods
C: Collecting and Analysing Information
D: Reporting Findings
E: Implementing Evaluation recommendations
34. PHASE A: PLANNING THE EVALUATION
• Determine the purpose of the evaluation.
• Decide on type of evaluation.
• Decide on who conducts evaluation (evaluation team)
• Review existing information in programme documents including
monitoring information.
• List the relevant information sources
• Describe the programme.
35. • E.g. RNTCP:
• Purpose: to find out the inadequacies in programme implementation
• Type: internal evaluation
• Who conducts it: DTO/MO-IC
• Review of previous visit reports
• Relevant information sources: referral slips, laboratory request form, tuberculosis treatment card, TB identity card, referral form for treatment of DR-TB
• Registers: tuberculosis laboratory register, culture and DST laboratory register, tuberculosis notification register, second-line TB treatment register
36. PHASE B: SELECTING APPROPRIATE EVALUATION METHODS
• Identify evaluation goals and objectives.
• Formulate evaluation questions and sub-questions.
• Decide on the appropriate evaluation design.
• Identify measurement standards.
• Identify measurement indicators.
• Develop an evaluation schedule.
• Develop a budget for the evaluation.
37. Sample evaluation questions
Program clients:
• Does this program provide us
with high quality service?
• Are some clients provided with
better services than other clients?
If so, why?
Program Staff:
• Does this program provide our
clients with high quality service?
• Should staff make any changes in
how they perform their work, as
individuals and as a team, to
improve program processes and
outcomes?
Program managers:
• Does this program provide our clients
with high quality service?
• Are there ways managers can
improve or change their activities, to
improve program processes and
outcomes?
Funding bodies:
• Does this program provide its clients
with high quality service?
• Is the program cost-effective?
• Should they make changes in how
they fund this program or in the level
of funding to the program?
38. Evaluation Area (Formative Assessment) | Evaluation Question | Examples of Specific Measurable Indicators
Staff supply | Is staff supply sufficient? | Staff-to-client ratios
Service utilization | What are the program's usage levels? | Percentage of utilization
Accessibility of services | How do members of the target population perceive service availability? | Percentage of the target population who are aware of the program in their area; percentage of the "aware" target population who know how to access the service
Client satisfaction | How satisfied are clients? | Percentage of clients who report being satisfied with the service received
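Each formative indicator in the table is computed from a handful of raw counts. A minimal sketch (all variable names and figures are hypothetical):

```python
# Hypothetical raw figures for the formative indicators above.
staff = 12
clients_enrolled = 480
service_slots = 600   # planned service capacity
surveyed = 200        # target-population members surveyed
aware = 150           # of those surveyed, aware of the program
know_access = 120     # of the "aware", know how to access the service
interviewed = 200     # clients interviewed on satisfaction
satisfied = 170       # of those, report being satisfied

indicators = {
    "staff_to_client_ratio": f"1:{clients_enrolled // staff}",
    "utilization_pct": 100 * clients_enrolled / service_slots,
    "awareness_pct": 100 * aware / surveyed,
    "access_knowledge_pct": 100 * know_access / aware,  # denominator: the "aware" group
    "satisfaction_pct": 100 * satisfied / interviewed,
}
for name, value in indicators.items():
    print(name, value)
```

Note that the access-knowledge indicator uses the "aware" subgroup, not the whole sample, as its denominator, matching the table's definition.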
39. Evaluation Area (Summative Assessment) | Evaluation Question | Examples of Specific Measurable Indicators
Changes in behaviour | Have risk factors for cardiac disease changed? | Compare the proportion of respondents who report increased physical activity
Morbidity/mortality | Has lung cancer mortality decreased by 10%? Has there been a reduction in the rate of low-birth-weight babies? | Age-standardized lung cancer mortality rates for males and females; compare annual rates of low-birth-weight babies over a five-year period
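The age-standardized rates in the last row adjust for differences in age structure so that populations can be compared fairly. A minimal sketch of direct standardization (all figures and age bands are illustrative):

```python
def direct_standardized_rate(age_specific_rates, standard_population):
    """Direct age standardization: weight each age-band rate (per 100,000)
    by that band's share of the standard population."""
    total = sum(standard_population)
    return sum(r * p for r, p in zip(age_specific_rates, standard_population)) / total

# Illustrative lung cancer mortality rates per 100,000 in three age bands
# (0-39, 40-64, 65+), weighted by a hypothetical standard population.
rates = [2.0, 40.0, 300.0]
std_pop = [55_000, 30_000, 15_000]
print(round(direct_standardized_rate(rates, std_pop), 1))  # 58.1
```

Because both populations are weighted by the same standard, the resulting rates are comparable even when the underlying age structures differ.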
40. PHASE C: COLLECTING AND ANALYSING INFORMATION
• Develop data collection instruments
• Pre-test data collection instruments
• Undertake data collection activities
• Analyse the data
• Interpret the data
41. PRETESTING OR PILOTING
A pilot test should involve:
• Testing the instrument
• Testing the way the instrument is administered
• Testing the way the responses are recorded
• Testing supportive documents/procedures
46. PHASE D: REPORTING FINDINGS
• Write the evaluation report.
• Decide on the method of sharing the evaluation results and on communication strategies.
• Share the draft report with stakeholders, revise as needed, and follow up.
• Disseminate the evaluation report.
47. • Checklist, e.g.:
• Resources:
• Is at least one trained Medical Officer (MO) available in the health facility?
• Is a full-time trained Lab Technician (LT) available for sputum microscopy?
• Have provisions been made for sputum collection when the LT is absent?
• Review of forms and registers:
• Are the lab forms for sputum examinations filled in correctly?
• Is the lab register filled in correctly and completely?
• Are results up to date?
• Exit interviews of at least 2 patients undergoing sputum microscopy:
• Do the patients know how to cough out good-quality sputum properly?
49. EVALUATION REPORT: SUGGESTED OUTLINE
• Title page
• Table of contents
• Acknowledgements
• List of acronyms
• Executive summary
• Introduction
• Findings and conclusions
• Lessons learned
• Recommendations
• Annexures
50. RNTCP SUPERVISORY REGISTER
• Name of the health facility visited
• Name and designation of the supervisor filling this form
• Date and time
• Observations on actions taken based on the previous visit
• Key observations:
• Politico-administrative commitment and resource management
• Diagnosis
• Drugs and laboratory consumables
• DOT and follow-up
• TB-HIV activities
• Records and reports
• ACSM activities
• DOTS-Plus
• Findings of home visit
• Recommendations
51. PHASE E: IMPLEMENTING EVALUATION RECOMMENDATIONS
• Develop a new/revised implementation plan in partnership with stakeholders.
• Monitor the implementation of evaluation recommendations and report regularly on implementation progress.
• Plan the next evaluation.
52. NATIONAL HEALTH MISSION
• The National Health Mission (NHM) encompasses its two Sub-Missions, the National Rural Health Mission (NRHM) and the newly launched National Urban Health Mission (NUHM).
• The main programmatic components include health system strengthening in rural and urban areas; Reproductive, Maternal, Newborn, Child and Adolescent Health (RMNCH+A); and communicable and non-communicable diseases.
• The NHM envisages achievement of universal access to equitable, affordable and quality health care services that are accountable and responsive to people's needs.
54. MISSION STEERING GROUP
• The highest policy-making and steering institution under the NHM.
• Provides broad policy direction to the mission.
• Advises the Empowered Programme Committee of the mission on policies and operations.
• Exercises overall programme oversight and governance for the health sector.
• Chairperson: Union Minister of Health & Family Welfare.
• Fully empowered to approve financial norms in respect of all schemes and components that are part of the NHM.
55. COMMON REVIEW MISSION
• Set up as part of the Mission Steering Group's mandate of review and concurrent evaluation.
• The first appraisal was conducted in November 2007.
• The task of the NRHM CRM was to assess the progress of the NRHM on 24 parameters.
• 52 members.
• The ninth CRM was held from 30th October to 6th November 2015.
56. SUGGESTED OUTLINE OF COMMUNITY-BASED MONITORING ACTIVITY
• The monitoring committee at each level reviews and collates the records coming from all the committees dealing with the units immediately below it.
• It also appoints a small sub-team, drawn from its NGO and PRI representatives, which visits a small sample of units under the committee's purview and reviews conditions there.
• This enables the committee not just to rely on reports but to have a first-hand assessment of conditions in its area.
57. • The monitoring committee sends a periodic report (quarterly for the village, PHC, block and district levels; six-monthly for the state level) to the next higher-level committee.
Tools for monitoring:
• Format for the village health register and village health calendar
• Guideline for information to be collected in village group discussions
• Schedule for ASHA interviews
• Interview format for the MO of the PHC/CHC
• Format for exit interviews (PHC/CHC)
• Documentation of testimony of denial of health care
58. Level | Agency | Activity (quarterly at all levels except the state, which is six-monthly)
Village | Village Health and Sanitation Committee (VHSC) | Reviews the village health register and village health calendar; reviews the performance of the ANM, MPW and ASHA; sends a brief 3-monthly report to the PHC committee
PHC | PHC Monitoring and Planning Committee | Reviews and collates reports from all VHSCs; an NGO/PRI sub-team conducts FGDs in 3 sample villages under the PHC; visits the PHC, reviews records and discusses with RKS members; sends a brief 3-monthly report to the block committee
Block (CHC) | Block Monitoring and Planning Committee | Reviews and collates reports from all PHCs; an NGO/PRI sub-team conducts an FGD in one PHC and interviews the MO; visits the CHC, reviews records and discusses with RKS members; sends a brief 3-monthly report to the district committee
District | District Monitoring and Planning Committee | Reviews and collates reports from all block committees; an NGO/PRI sub-team visits sample facilities; sends a brief 3-monthly report to the state committee
State | State Monitoring and Planning Committee | Reviews and collates reports; sends six-monthly reports to the NHM/Union Health Ministry
59. SUGGESTED FRAMEWORK TO ORGANISE INFORMATION
Level | Main issues for monitoring | Reference documents | Who | When | Tools
Village | ANM/MPW services, ASHA activities | Village health plan; citizens' charter; NHM schemes | VHSC (incl. ASHA, ANM) | Quarterly | Standard agenda items; village health register; village health calendar; ANM/MPW records; village FGDs; interviews of beneficiaries
PHC | Overview of village-level monitoring; staffing, supplies and service availability at the PHC; quality of care at the PHC | PHC health plan; charter of citizens' health rights at the PHC | PHC Monitoring and Planning Committee (PRI members etc.); PHC RKS members | Quarterly | Standard agenda items; reports from VHSCs; records of select village FGDs; interview of the MO PHC; exit interviews of PHC patients
60. Level | Main issues for monitoring | Reference documents | Who | When | Tools
Block | Overview of health services in the block; staffing, supplies and service availability at the CHC; quality of care at the CHC from the people's perspective | CHC health plan; charter of citizens' health rights at the CHC | CHC Monitoring and Planning Committee (incl. PRI members etc.); CHC RKS members; facilitation by a nodal NGO/CBO | Quarterly | Standard agenda items for the CHC committee meeting; reports from PHC committees; records of visits to select PHCs; interview of the MO in charge of the CHC; report of the district health mission
District | Overview of all public health services in the district (except services provided by municipal bodies) and state-specific health schemes; quality of care at the district hospital and sub-divisional hospitals | District health plan; charter of citizens' health rights at the district level | District Health Monitoring and Planning Committee; public hearing facilitator team | Six-monthly | Standard agenda items for the district committee meeting; reports from block health committees; records of visits to select sub-divisional hospitals/CHCs
61. Level | Main issues for monitoring | Reference documents | Who | When | Tools
State | All issues of rural public health services/NHM in the state, including state-specific health schemes | State health plan and state PIP; NHRC recommendations and the state government component of the NHRC national action plan; all NHM schemes (ASHA, JSY, untied funds expenditure); IPHS and the functioning of facilities at various levels; national health programmes and the family planning insurance scheme; PPP and related regulations; state health budget and expenditure | State Monitoring and Planning Committee; state people's rural health watch report/citizens' report by civil society groups; public meeting of the state mission with civil society representatives | Six-monthly committee meetings; annual independent reports and public meetings | Reports from district health committees; records of visits to select districts; report of the state health mission; reports of district public hearings; independent reports
62. CONCLUSION: WHY MONITORING AND EVALUATION
M&E should be part of the design of a program. It:
• Ensures systematic reporting
• Communicates results and accountability
• Measures efficiency and effectiveness
• Provides information for improved decision making
• Ensures effective allocation of resources
• Promotes continuous learning and improvement