1. 06/12/13 Prepared by Bongs Lainjo
Program Indicator Screening Matrix (PRISM): A Composite Score Framework
Bongs Lainjo, MASc Engineering
RBM Systems Consultant and
Former UN Senior Program Advisor
Montreal, Canada, Email: bsuiru@bell.net
Canadian Evaluation Society (CES) Conference
Toronto, Canada
June 9 – 12, 2013
2. Presentation Outline
• Introduction;
• Objectives;
• Relevance;
• Target Audience;
• Evaluation Life Cycle (ELC);
• Program Design Framework (PDF);
• Themes;
• PRISM: A Composite Score Framework;
• Lessons Learned.
3. PRISM: Introduction
• Evaluation – Demand driven;
• Participatory;
• Inclusive;
• Bottom Up Strategy (BUS);
• Consensus-Based;
• Random Thematic Sub-Groups;
• Intra-Thematic-Group Concordance;
• Inter-Thematic-Group Concordance;
• Bar (Gold Standard vs. Effective);
• Binary Outcome;
• Delphi Methodology;
• Mapping;
• Scope: Africa, Asia and Pacific Island Countries.
4. OBJECTIVES: General
• To strengthen the knowledge of IPs, PMs and other
key stakeholders, emphasizing sustainable
engagement in program management and
implementation.
• This is an attempt to address existing nuances,
highlighting the synergies among the different
result levels of the SFW and hence facilitating
common ground between potential evaluators and
other interested parties.
5. OBJECTIVES: Specific
• Streamline by improving indicator causal links at all
result levels;
• Mitigate duplication of indicators;
• Establish authentic contributions between different
result levels;
• Establish meaningful synergies among different
result levels: no lower-level result can contribute
to more than one upper-level result;
6. OBJECTIVES: Specific (Cont’d)
• Strengthen the program design;
• Promote a common understanding among key
actors and
• Minimize cost and optimize the number of
indicators included in the program.
7. PRISM: Relevance
• Improves intended and unintended intervention
results and makes foreign aid more focused with
evidence-based results;
• Establishes more effective, continuous and
sustainable synergies among frontline forces, IPs,
Funding Agencies, Stakeholders and Beneficiaries.
8. PRISM: Target Audience
• Funding Agencies;
• IPs;
• Program Managers;
• Relevant Stakeholders;
• Evaluators;
• Development Partners.
9. Evaluation Life Cycle
• Demand Recognition;
• Evaluation Team Identified;
• Inception Report Developed;
• Evaluation Process implemented;
• Draft Report Developed and Presented;
• Final Report Developed and submitted.
10. Program Design Frameworks
• Type: Logic Framework (Logframe)
  Results Levels: Impact, Outcome, Output
  Agencies: UN, CIDA, EU, AusAID, DfID, WB
• Type: Strategic Objective
  Results Levels: Strategic Objective, Program Objectives, Program Sub-objectives
  Agencies: USAID
12. PRISM: Definition
• An R by C Matrix where
• R = Number of Thematic Indicators and
• C = Six Screening Criteria.
• Each Indicator is cross-tabulated with each criterion;
• The intersecting cell is filled with either a “1” or a “0”;
• The former if the indicator satisfies the criterion and the latter if it
doesn’t;
• Exercise continues until ALL indicators are screened;
• A corresponding final score (%) per indicator is established for each
row. These are used in establishing Group Concordance;
• The Thematic Group and Sub-Groups agree on an effective %;
• Each Sub-Group is made up of a Moderator, a Rapporteur and Team members.
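The matrix definition above can be expressed as a short calculation. The sketch below is an illustration in Python (my choice of language, not the presentation's); the example row of cells is hypothetical.

```python
# Composite PRISM score for one indicator: the row holds a "1" where the
# indicator satisfies a screening criterion and a "0" where it does not;
# the final score is the percentage of criteria satisfied.
CRITERIA = ["Specificity", "Reliability", "Sensitivity",
            "Simplicity", "Utility", "Affordability"]

def prism_score(row):
    """Return the composite score (%) for one row of 0/1 cells."""
    if len(row) != len(CRITERIA):
        raise ValueError("expected one cell per criterion")
    return 100.0 * sum(row) / len(CRITERIA)

# Hypothetical indicator failing only Affordability: 5 of 6 criteria met.
score = prism_score([1, 1, 1, 1, 1, 0])  # ≈ 83.3%
```

These row percentages are the inputs to the group-concordance step described above.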
13. PRISM: Themes
• Health;
• Education;
• Environment;
• Governance;
• Poverty;
• Judiciary;
• Agriculture;
• Social Security and Protection.
15. PRISM: Criteria Definition
• Specificity:
• This refers to the likelihood that the indicator measures
the relevant result. In other words, is there a risk
that what the indicator measures is not exactly what
we are looking for?
16. PRISM: Criteria Definition
• Reliability:
• This criterion is synonymous with replication. That is,
does the indicator consistently produce the same
result when measured over a certain period of time?
For example, if two or more people calculated this
indicator independently, would they come up with the
same result? If the answer is yes, then the indicator
has satisfied the condition and a ‘one’ is
entered in that cell; a zero is entered otherwise.
17. PRISM: Criteria Definition
• Sensitivity:
• This is a test that assesses the stability of an
indicator. For example, does the indicator continue to
deliver the same result under a small variation in either
the numerator or the denominator? How does the result
change when assumptions are modified? Does the
indicator actually contribute to the next higher level?
For example, if the same indicator accounts for two or
more higher result levels simultaneously, it is not
stable.
18. PRISM: Criteria Definition
• Simplicity:
• A convoluted indicator presents challenges at many
levels. Here, we are looking for an indicator that
is easy to collect, analyze and disseminate. Any
indicator that satisfies these conditions automatically
qualifies for inclusion.
19. PRISM: Criteria Definition
• Utility:
• This refers to the degree to which the information
generated by this indicator will be used. The objective
of this criterion is to streamline an indicator so as to
help decision makers make informed decisions. This can
occur either during the planning process or during the
re-alignment process, the latter representing occasions
when an organization is evaluating the current status
of its mandate.
20. PRISM: Criteria Definition
• Affordability:
• This is simply a cost-effectiveness perspective on the
indicator in question. Can the program/project afford
to collect and report on the indicator? In general, it
takes at least two comparable indicators to establish
the more efficient and cost-effective one. The one that
qualifies is included at that criterion level.
21. INDICATOR SCREENING
Programme Indicator Screening Matrix (PRISM)
Thematic Area: RH, PDS, GDR, Other
Results Level: Goal, Outcome, Output
Matrix columns (one row per INDICATOR):
1. Specificity | 2. Reliability | 3. Sensitivity | 4. Simplicity | 5. Utility | 6. Affordability | 7. Total Yes | % Score | Implemented (Yes/No)
Specificity - Does it measure the result and contribute to ONLY 1 higher-level indicator?
Reliability - Is it a consistent measure over time?
Sensitivity - When the result changes, will it be sensitive to those changes?
Simplicity - Will it be easy to collect and analyze the data?
Utility - Will the information be useful for decision-making and learning?
Affordability - Can the program/project afford to collect the data?
22. PRISM: Implementation
• Theme Identification;
• Thematic Group Selection;
• Random Thematic Sub-Group Selection;
• Selection of Sub-Group Moderator and Rapporteur;
• Individual Thematic Sub-Group Member Scoring;
• Establish Intra-Thematic-Sub-Group Concordance;
• Establish Inter-Thematic Group Concordance;
• Conduct Thematic group Plenary;
• Establish Group Consensus;
• Select Final Set of Indicators.
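The presentation calls for intra- and inter-group concordance but does not spell out a formula. As one plausible reading (an assumption on my part, not the author's stated method), concordance can be sketched as the share of members whose accept/drop verdict matches the majority verdict at the agreed bar:

```python
# Hypothetical concordance measure (an assumption, not the presentation's
# definition): each member's composite % score for an indicator is turned
# into an accept/drop vote against the agreed bar, and concordance is the
# fraction of members siding with the majority verdict.
def concordance(member_scores, bar):
    """Fraction of members agreeing with the majority accept/drop verdict."""
    votes = [score >= bar for score in member_scores]
    accept = sum(votes)
    return max(accept, len(votes) - accept) / len(votes)

# Four members score the same indicator; three clear an assumed 80% bar.
agreement = concordance([83.3, 100.0, 66.7, 83.3], bar=80.0)  # 0.75
```

For larger groups, an established inter-rater statistic such as Cohen's or Fleiss' kappa would be a more rigorous choice, since it corrects for chance agreement.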
23. PRISM: Algorithm
Select Indicator → Screen Indicator → Composite Score >= Bar?
  Yes: Accept Indicator | No: Drop Indicator
Last Indicator?
  No: return to Select Indicator | Yes: Establish Concordance → End Process
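The flow above can be sketched as a short Python loop. This is a minimal illustration, not part of the presentation; the indicator names ("CPR", "ANC4") and the 80% bar are hypothetical, and the six 0/1 cells follow the criteria order given on the PRISM screening slide.

```python
# Minimal sketch of the PRISM screening loop. Each indicator row holds one
# 0/1 cell per criterion in the order: Specificity, Reliability, Sensitivity,
# Simplicity, Utility, Affordability. An indicator is accepted when its
# composite score (%) meets the bar agreed by the thematic group.
def screen_indicators(matrix, bar):
    """Split indicators into accepted/dropped lists by composite % score."""
    accepted, dropped = [], []
    for name, cells in matrix.items():
        score = 100.0 * sum(cells) / len(cells)  # composite score in %
        (accepted if score >= bar else dropped).append(name)
    return accepted, dropped

# Hypothetical reproductive-health indicators and an assumed 80% bar.
matrix = {
    "CPR":  [1, 1, 1, 1, 1, 1],   # all six criteria met
    "ANC4": [1, 1, 0, 1, 1, 0],   # fails Sensitivity and Affordability
}
accepted, dropped = screen_indicators(matrix, bar=80.0)
```

In the full process, the accepted list then goes to the concordance and plenary steps before the final set of indicators is selected.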
24. PRISM: Lessons Learned
• Team Composition Homogeneity Important;
• Consensus Building Required;
• Not more than ten members per thematic Sub-group;
• Solid knowledge of theme essential;
• Time Management important;
• Framework useful pre-program implementation;
• Also essential during Mid-Term-Review (MTR);
• Active involvement of top Management critical;
• Feedback provided to all active teams required;
• Useful initial contact tool for evaluation team and relevant
program key players.
25. Thank You
Editor’s notes
Testing Internal and External Logic: If the OUTPUTS are delivered through planned ACTIVITIES using relevant INPUTS, and the corresponding ASSUMPTIONS at the OUTPUT, OUTCOME and IMPACT levels remain valid, then the desired OUTCOME will materialise, leading to the intended IMPACT.