Presented by Iddo Dror at the SEARCA Forum-workshop on Platforms, Rural Advisory Services, and Knowledge Management: Towards Inclusive and Sustainable Agricultural and Rural Development, Los Banos, 17-19 May 2016
1. Tools for reflexivity and innovation platforms
Iddo Dror
SEARCA Forum-workshop on Platforms, Rural Advisory Services, and Knowledge Management: Towards Inclusive and Sustainable Agricultural and Rural Development, Los Banos, 17-19 May 2016
2. Reflexive M&E
Reflect – Looking back and thinking about what happened, what it means, and how to proceed.
Reflexive – Challenging the rules, practices, assumptions and modes of thinking of ‘others’ and of oneself.
Collective activity cycle: Observation – Analysis – Reflection – Adaptation
3. Reflexive M&E vs Traditional M&E
REFLEXIVE M&E
• Indicators are not static
• Process indicators predominate in the short term
• Focus is on change
• Design may change based on reflection about original assumptions
• Impact is important
TRADITIONAL M&E
• Static indicators as part of a logframe
• Output and outcome indicators predominate
• Focus is on achievement of predefined goals
• Structure of the project is set in the design and can be modified but not radically changed
• Impact is important
4. Reflexive Monitoring in Action (RMA)
• Monitoring is an integral part of platform activity.
• Flexible selection of tools: the challenges of the moment determine the best monitoring tool to use.
• Every monitoring activity encourages reflection and learning aimed at system innovation.
5. Role of the Monitor
• Role and function distinct from that of the facilitator
• Advise on the dynamic composition of the platform to meet evolving needs
• Retain focus on unresolved problems and long-term ambitions
• Maintain the learning history
• Seize opportunities for reflection
6. Monitoring Tools
Tools available to the RMA monitor…
Diary/Log book, Participant Profile, Google Analytics, SNA
Most Significant Change
Dynamic Learning Agenda
Timeline and Learning History
Farmer Field Days
Causal Analysis
8. Timeline or Learning History
• Monitor collects data to put together a timeline showing which events happened when.
• Participants write down their key moments of learning or change (highs and lows) in key words on post-its.
• Each participant can share three key comments by placing their post-its on the timeline.
9. Data collection and Analysis
data documentation tools
• Event log
• Participant Profile
• Meeting minutes
• Periodic reflections of people engaged in key functions
• Audio-visual records of the major events
10. Data collection and Analysis
data reporting tools
• Google Forms
• Google Drive
• Dropbox
• Wiki
11. Data collection and Analysis
data analysis tools
• Descriptive statistics
• Statistical Analysis
• Text Analysis
• Social Network Maps
• Mind Mapping
• Econometric Analysis
• Spatial modelling
12. Most Significant Change (MSC)
• Identify no more than 3 types of change you would like to document.
• Collect stories from platform members and end users and shortlist the best stories.
• Select story reviewers from amongst your donors and those whom the platform seeks to influence.
• Reviewers decide on the best story and discuss why it is significant.
13. Example MSC Question
‘Looking back over the last month, what do you think was the most significant change in the quality of people’s lives in this community?’
Davies, R. and Dart, J. (2005): The ‘Most Significant Change’ (MSC) Technique. Accessed at http://www.mande.co.uk/docs/MSCGuide.pdf, 28 April 2016.
14. More information
This module is associated with an e-learning module on ‘Understanding, Facilitating and Monitoring Agricultural Innovation Platforms’, available at: http://learning.ilri.org/course/detail/24
15. The presentation has a Creative Commons licence. You are free to re-use or distribute this work, provided credit is given to ILRI.
better lives through livestock
ilri.org
Editor’s notes
In the previous module, we went to great lengths to analyze the problems that confronted us; to identify what we would jointly do to address the problem; to describe what would need to change in terms of stakeholder knowledge, skills, attitudes, and practice in order to effect the changes necessary to institutionalize our proposed solutions; and to determine indicators, milestones and means of data collection. Having done all that, have we designed a rigid framework for action that we need to see through to the end without any possibility for rethinking and redesigning? If a strategy that we thought would change the practice of an actor group proves not to work, can we change our strategy and the related indicators and milestones? The answer is a very definite “Yes”. We not only can, but should, change our plans as needed to work towards our ultimate goal. We need to frequently “reflect” on the results of our endeavours and to “be reflexive” in analyzing the evidence of those results. Being reflexive means being prepared to challenge our assumptions about how things work and being prepared to think outside the box about how to effect change. Reflexive monitoring can be said to involve double-loop learning. This sets it apart from traditional outputs monitoring, which involves adjustment of plans based on the effect of actions taken but no real questioning of the basic assumptions and perceptions that went into making those plans initially.
What implications does being “reflexive” have for monitoring the activities of an innovation platform? If our indicators of success are not static, we can’t monitor the extent to which we managed to achieve against these indicators in comparison to a baseline… or can we? The answer is that we can, but the sorts of indicators we should use are dictated by how change occurs in complex systems. Change occurs through learning, which affects knowledge, skills, attitudes and, ultimately, practice. Hence the indicators we use in the short to medium term tend to be process indicators although, of course, it is the ultimate impact that we are interested in. With reflexive M&E, we need to consider the possibility that we realize that our intervention was ill-conceived and start again with a completely new approach and new indicators. This can make working with donors who are used to the logframe approach a little difficult, so it is necessary to thoroughly document the evidence on which you have based your decision. We will look at documentation a little later in this module.
Researchers at Wageningen University and the VU University Amsterdam have been working together on a type of monitoring that they have called reflexive monitoring in action (RMA). RMA positions monitoring as part of the process of working towards system change and an essential component of the Observation – Analysis – Reflection – Adaptation cycle. RMA posits that platform facilitators need to convene regular reflection sessions on current platform activities, what has already been achieved, the barriers and opportunities in the current system, and how platform actions are contributing to the ultimate goal of changing the system. The tools to be used can be selected to meet the need at hand, and this selection is often a matter of personal preference. What is important is that every monitoring activity encourages reflection and learning aimed at system innovation. For instance, being interviewed about systemic barriers and the relationships between system stakeholders can be expected to cause a potential platform member to reflect on these issues and possible solutions.
Proponents of RMA recommend that monitoring be a dedicated function of a single individual. The monitor needs to maintain a little distance from the activities of the platform, so this is not an appropriate role for the facilitator to take on. He or she needs to keep an eye on the composition of the platform and ensure that new key players are invited to join as they are needed. Where platform members might become absorbed in the day-to-day activities of the platform, the monitor needs to maintain perspective and remind them of their long-term ambitions. He or she should challenge members to maintain those ambitions rather than compromise on them in the face of difficulties. Seemingly unresolvable problems, such as legislation that might stand in the way of an innovation coming to scale, or missing or inefficient infrastructure needed to support newly designed systems, can easily be swept under the carpet and ignored for long periods of time because they are simply too hard. It is the job of the monitor to draw the attention of the members back to these problems. At the same time, the monitor should draw attention to external developments likely to impact on platform activities. On a more mundane level, the monitor needs to be able to help platform members reflect on what they have achieved and whether they are on course to achieve their stated goals. To do that, he or she needs to maintain the learning history for the platform – what decisions were made, what actions were taken, why, what data was collected, what conclusions were drawn, and so on. We will see examples of this sort of activity later in this module.
The main tools for the RMA monitor are participatory observation and interviews. He or she uses these tools to record the project strategy, underlying assumptions and activities on an everyday basis. However, sometimes it is useful to be able to draw on more structured tools, and these tools are the topic of the next section. In many cases the monitor will use these tools to monitor process and effect indicators. In all cases such monitoring activities should encourage reflection and learning aimed at system innovation. Monitoring of outputs is also important but can be done with more traditional monitoring tools. Because of this continuous collection of information, the monitor is able to act as a key informant to the platform facilitator, who might otherwise become so involved in the day-to-day activities of the platform that they lose this perspective.
You have already met Causal Analysis in the form of Problem Tree Analysis. Farmer Field Days were covered in Module 9 on communications. Click on the other images for an introduction to some tools you may not yet be familiar with.
The objective of causal analysis is to be able to identify the causes of a problem, since it is always better to treat the root cause rather than merely alleviate the symptoms. Lead the group to ask “Why?” multiple times as they construct the cause-effect or fishbone diagram. What makes a fishbone a fishbone is the angling of the arrows, which gives the diagram a ‘fishy’ appearance. However, causes are often circular, such as births leading to population increase which in turn leads to more births, or systemic, with multiple related issues being the cause, and this can be difficult to represent with a fishbone diagram.
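Where a fishbone cannot capture such loops, the underlying causal analysis can still be recorded as a directed graph. Below is a minimal sketch, assuming Python with the networkx library; the node names and the cycle shown are invented for illustration, not taken from a real platform.

```python
# A minimal sketch: storing a causal analysis as a directed graph instead of
# a fishbone, so circular and systemic causes can be represented.
# All node names below are illustrative assumptions.
import networkx as nx

causes = nx.DiGraph()
causes.add_edge("births", "population increase")          # births drive population growth
causes.add_edge("population increase", "births")          # ...and growth drives more births (a loop)
causes.add_edge("population increase", "pressure on grazing land")
causes.add_edge("pressure on grazing land", "low milk yield")

# Loops that a fishbone diagram cannot show are easy to find here.
print(list(nx.simple_cycles(causes)))   # e.g. [['births', 'population increase']]
```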
The timeline method provides a working format for expressing the challenges, successes and learning experiences explicitly, together with the project participants. It is the task of the monitor to lead the group in reflecting on events that meant different things to different stakeholders.
When monitoring an Innovation Platform it is necessary to record both factual events and perceptions, since both contribute to group planning. Possible documentation tools include event logs, participant profiles, meeting minutes, periodic reflections, and audio-visual records of the major events. Continuity of documentation is critical for success, since this data is meant to form the basis of the tight feedback loops that keep the activity of the IP on track. A rush to find project documentation just before the project impact evaluation is a clear indicator that the innovation platform has not really been active in finding innovative solutions.
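To illustrate what continuous documentation can look like in practice, here is a minimal sketch of an event log kept as a simple CSV file and appended to immediately after each platform activity. The file name, column names and example entry are assumptions for illustration, not prescribed by the module.

```python
# A minimal sketch of a continuously maintained event log kept in a CSV file.
# File name, columns and the example record are illustrative assumptions.
import csv
from datetime import date
from pathlib import Path

LOG_FILE = Path("platform_event_log.csv")
FIELDS = ["date", "event", "participants", "decisions", "follow_up"]

def log_event(event, participants, decisions, follow_up):
    """Append one record right after a platform activity, not months later."""
    write_header = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if write_header:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "event": event,
            "participants": participants,
            "decisions": decisions,
            "follow_up": follow_up,
        })

# Example entry recorded straight after a meeting.
log_event("Quarterly reflection meeting", 18,
          "Revised milk collection schedule",
          "Invite the feed supplier to the next meeting")
```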
The Innovation Platform monitor needs to find a way to report back to platform members using simple visual representations that make it easy to draw conclusions and reflect on the way ahead. Google Forms is a useful tool here, allowing those with Internet access to enter their data directly and generating visual reports automatically. There also needs to be a way of sharing data and reports. Options include Dropbox and wikis. Wikis were dealt with in the unit on communications. Google Drive allows for the co-creation of knowledge.
Apart from the simple descriptive statistics shown on the previous slide, there are many tools for statistical analysis which can help represent monitoring data in a manner intended to promote reflection. The social network analysis tools shown on the bottom left will already be familiar to you from module 8, which dealt with stakeholder analysis in depth. The curve from Burundi represents the level of stakeholder engagement in the platform. Showing platform members such a curve is a good way to start a group reflection on “Why did this happen?” or “How can we avoid this in future?”. Likewise, the curve from Uganda shows the impact of recruiting a field researcher on platform activity and is a useful aid in planning. For the platform monitor, these and other tools, such as mind mapping and spatial modelling, can be invaluable aids when conducting reflection and analysis meetings.
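As a concrete illustration of turning monitoring records into such an engagement curve, the following minimal sketch (in Python, assuming matplotlib is available) plots attendance over successive meetings. The attendance figures are invented for illustration and are not the Burundi or Uganda data referred to above.

```python
# A minimal sketch of a stakeholder-engagement curve built from meeting
# attendance counts. The figures are invented examples, not real platform data.
import matplotlib.pyplot as plt

meetings = ["M1", "M2", "M3", "M4", "M5", "M6"]
attendance = [24, 19, 11, 9, 21, 26]   # a dip followed by recovery

plt.plot(meetings, attendance, marker="o")
plt.ylabel("Participants attending")
plt.title("Platform meeting attendance over time")
plt.savefig("engagement_curve.png")

# Shown to members, such a curve can open a reflection on
# "Why did attendance drop at M3 and M4, and how do we avoid it in future?"
```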
Link from graphic: http://www.mande.co.uk/docs/MSCGuide.pdf
Click on the graphic to download the guide to implementing the approach, written by the originators of the tool, Rick Davies and Jess Dart.
The MSC approach seeks to find evidence of change in intangibles like attitude and perception. This is important to allow the platform to know whether it is heading in the right direction. However, it is equally important for your donors and/or government officials to know that your actions are having an effect, and one which is likely ultimately to impact on farmer livelihoods. This is the reason that the final review panel should include donors and government, although, of course, you are likely to have several levels of review below that to shortlist the finalists.