1. Closing the Knowledge Gap Between
Evaluators and Stakeholders:
Competencies, methods and technologies to optimise
evaluator learning during the evaluation process
Prepared for the 2013 Canadian Evaluation Society Conference:
Evaluation Across Boundaries
Prepared by Werner Meier
http://www.RBMG.ca
June 10, 2013
2. Presentation Overview
Objective: To share with you some of the
approaches, evaluation methods and
technologies that I have found useful in
bridging the knowledge gap.
Part 1 – useful attitudes, aptitudes and skills;
Part 2 – how to integrate opportunities for
learning into the evaluation methodology;
Part 3 – how CAQDAS can enhance
evaluator learning and evaluation quality.
June 14, 2013 | www.rbmg.ca
4. What Knowledge Gap?
Basic premise: At the outset of an
evaluation the stakeholders are more
knowledgeable about the evaluand than
the evaluators.
If this statement were not true, the
evaluators would likely be in a conflict of
interest and unable to provide an independent
and impartial perspective.
If the statement is true, then what is the
nature of the Knowledge Gap?
5. Where is the Knowledge Gap?
It depends, but to varying degrees it lies in all
components of the realist formula:
Mechanism + Context = Outcomes
(Realistic Evaluation, Pawson & Tilley 2008)
Mechanism: origin, foundational
principles, influential decisions, micro/macro
processes, socio-cultural acceptance, outcome
triggers, etc.
Context: political, institutional, economic, social,
cultural, environmental, etc.
Outcomes: performance
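The realist formula can be made concrete with a small sketch: the same mechanism fires differently in different contexts, producing different outcomes. This is an illustration only, not from the presentation; the mechanisms, contexts, and outcomes named below are hypothetical.

```python
# Toy model of the realist formula Mechanism + Context = Outcomes:
# an outcome is a function of a mechanism firing within a context.

def outcome(mechanism: str, context: str) -> str:
    """Look up which outcome a mechanism triggers in a given context.
    All entries here are hypothetical examples."""
    table = {
        ("peer-mentoring", "high-trust workplace"): "skills transferred",
        ("peer-mentoring", "low-trust workplace"): "mentors avoided",
    }
    return table.get((mechanism, context), "no observable outcome")

# The same mechanism, two contexts, two different outcomes:
print(outcome("peer-mentoring", "high-trust workplace"))  # skills transferred
print(outcome("peer-mentoring", "low-trust workplace"))   # mentors avoided
```

The point of the sketch is that the evaluators' knowledge gap can sit in any of the three terms: the mechanism, the context it operates in, or the outcomes it triggers.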
6. Overcome the Knowledge Gap
Useful approaches, attitudes and skills:
Approach each evaluation as a learning experience;
Identify your information needs and develop a research plan;
Use soft skills (emotional intelligence) to enhance openness and collaboration;
Employ critical thinking techniques (Scriven & Paul 1987);
Use appropriate software applications to systematically
analyse large amounts of quantitative and qualitative data;
Maintain an impartial attitude, let the evidence speak for
itself and be prepared to demonstrate your evidence base.
7. How to integrate opportunities for
learning into the evaluation
methodology
Part 2
8. Small Evaluation Example:
Bridging Political Boundaries
A review of occupational studies in public health care: a two-
person evaluation with a $50K budget, an ample
implementation timeframe, but a complex and contentious
topic with ambitious pan-Canadian outcomes to assess.
The Concentric Circles Methodology was implemented in a
sequential manner “from the outside-in”.
It involved data collection and analysis for each line of
evidence in a predetermined sequence.
The acquired understandings and insights were reinvested
in the design of subsequent data collection instruments and
evaluation techniques.
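The sequencing described above can be sketched as a simple loop: each line of evidence is collected and analysed in a predetermined order, and the insights gained are reinvested in the design of the next instrument. This is an illustrative sketch, not the evaluation's actual tooling; the function names and evidence list are assumptions.

```python
# Sketch of the Concentric Circles sequencing, "from the outside-in":
# insights from each completed line of evidence inform the design of
# the next data collection step.

def run_concentric_circles(lines_of_evidence, collect, analyse):
    insights = []  # the evaluators' accumulated knowledge base
    for line in lines_of_evidence:
        # this step's instrument is designed in light of prior insights
        data = collect(line, prior_insights=insights)
        insights.extend(analyse(line, data))
    return insights

# Usage with stub collect/analyse callables:
evidence = ["document review", "e-survey", "association interviews",
            "committee interviews", "program manager interviews"]
collect = lambda line, prior_insights: f"{line} data ({len(prior_insights)} prior insights)"
analyse = lambda line, data: [f"insight from {line}"]
insights = run_concentric_circles(evidence, collect, analyse)
print(len(insights))  # 5
```

The design choice the loop captures is the ordering constraint: no inner-circle informant is approached until the outer circles have been analysed.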
9. Concentric Circles Methodology
Lines of Evidence Sequence (data collection technique, then data source):
1: Content analysis of 50 reference documents (GoC policy
statements, departmental RPPs/DPRs, program and project
files, performance data and reports)
2: E-survey of 300 organizations with follow-up telephone
interviews (stakeholders: health service delivery providers
across Canada)
3: F2F interviews with 10 senior managers of key health
sector professional associations (key stakeholders: Canadian
Medical Association, Canadian Nurses Association, College of
Physicians and Surgeons, etc.)
4: Telephone interviews with 25 committee members
(advisory committees: Health Delivery and Human
Resources, including P/T government representatives)
5: F2F interviews with 4 program managers (client
programme: Health Canada and HRSDC)
10. Concentric Circles Methodology
Benefits / Outcomes
The lines of evidence implementation sequence was cost-
effective and time-efficient;
The available time with key informants was optimised;
The potential for bias resulting from premature contact with
those most invested in the programme was avoided; and
The knowledge-base of the evaluators was gradually built
up with each successive data gathering/analysis step so
that better informed questions were asked and answers
could be clarified in light of the data already acquired.
11. Large Evaluation Example
A strategic policy evaluation of a $500 million Fund with a
$750K budget, 7 team members, a short data collection
timeframe, and a large scope: 33 projects in 6 sectors and pan-
African outcomes to assess.
13. Snowball Methodology
Data sources, in expanding circles: The Funder; Canadian
Organizations; International Organisations; African Partners
and Beneficiaries
Data Collection Stages
Stage 1 – Document content analysis, interviews, team
discussion and sector work planning
Stage 2 – Data collection, analysis, sector briefing papers,
and team discussion
Stage 3 – Data collection, analysis, sector briefing papers,
team discussion and mission planning
Stage 4 – Data collection, analysis, preliminary findings
briefs, team discussion and report preparation
Implementation challenges, solutions, findings and lessons were
reinvested in sharpening the focus on emerging themes and issues.
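One way to read the expanding circles above is as classic snowball identification: each circle of informants points the team to the next, wider circle. The sketch below is an illustration of that pattern only, not the evaluation's actual procedure; the seed list, nomination graph, and function names are all hypothetical.

```python
# Snowball-style expansion of an informant pool: starting from a seed,
# each informant nominates further contacts, round by round, until no
# new contacts appear or a round limit is reached.
from collections import deque

def snowball(seeds, nominate, max_rounds=4):
    """Breadth-first, round-by-round growth of the informant pool."""
    seen, frontier = set(seeds), deque(seeds)
    rounds = 0
    while frontier and rounds < max_rounds:
        next_frontier = deque()
        for informant in frontier:
            for contact in nominate(informant):
                if contact not in seen:
                    seen.add(contact)
                    next_frontier.append(contact)
        frontier = next_frontier
        rounds += 1
    return seen

# Usage: the Funder nominates Canadian organisations, which point to
# international organisations, which point to African partners.
graph = {
    "Funder": ["Canadian org A", "Canadian org B"],
    "Canadian org A": ["International org X"],
    "International org X": ["African partner P"],
}
pool = snowball(["Funder"], lambda i: graph.get(i, []))
print(sorted(pool))
```

Each round of expansion corresponds to one data collection stage, which is why lessons from a stage can sharpen the focus before the next, wider circle is engaged.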
14. Snowball Methodology
Benefits / Outcomes
The evaluation lines of enquiry were refined and new ones
opened in a process of discovery and learning;
Standardised data analysis and reporting techniques were
developed iteratively as needs were identified and the
evaluation became progressively focussed;
The sector evaluators were better prepared to engage
project stakeholders and beneficiaries on outcomes
achieved and their sustainability; and
Presentations and discussions with the client programme
on Fund-level policy and programme issues were grounded
in the data and findings at the sector level.
16. Use of CAQDAS in Social Sciences
Literature Search Summary
“In an examination of the Sociological Abstracts
database, we found only 31 references to either
Nud*ist, Atlas.ti, NVivo, winMAX, Kwalitan, MAX
qda, Qualrus, or Hyperresearch since
1990, compared to 220 references to
SPSS, SAS, and Stata.”
The Wow Factor: Preconceptions and Expectations for Data
Analysis Software in Qualitative Research, Katie MacMillan and
Thomas Koenig, Social Science Computer Review, Vol. 22, No. 2,
Summer 2004, pp. 179-186.
17. Use of CAQDAS in Evaluations
Literature Search Summary
Scholars Portal Search String: CAQDAS (long &
short) in Anywhere, PubYear 2008 to present.
All Journals = 98
Journals w/Method/Methodology in Title = 12
International Journal of Social Research
Methodology = 8
Journals w/”Evaluation” in title = 2
Studies in Educational Evaluation = 2
AJE/CJPE = 0
18. Some CAQDAS Method Articles
2004 Software and Method, Int'l JRN of Social Research
Methodology
2006 Using CAQDAS to Develop a Grounded Theory
Project, Field Methods
2009 Advances in Qualitative Methods, Int'l JRN of
Qualitative Methods
2009 The use of CAQDAS in educational research, Int'l
JRN of Research & Method in Education
2011 How Technological Developments Change Our
Ways of Data Collection, Transcription and
Analysis, Forum for Qualitative Social Research
2013 Using diagrams to support the research
process, Qualitative Research
19. Some CAQDAS Applied Articles
2009 Qualitative Data Analysis - A Procedural
comparison, JRN of Applied Sport Psychology
2011 Membership categorization and the accomplishment
of 'coding rules' in research team talk, Discourse Studies
2012 Developing midwifery practice through work-based
learning, Nurse Education in Practice
2012 Human vs CAQDA Ratings of Spiritual Content in
Dreams, The Humanistic Psychologist
2012 Progressive Focussing and Trustworthiness in
Qualitative Research, Management Int'l Review
2013 Food safety practices and managers'
perceptions, Int'l JRN of Hospitality Management
20. Use of CAQDAS in Evaluations
Why Not Us?
CES-NCC Annual Learning Event – Thematic
Lunch Roundtable Summary:
Little awareness of what this type of software
can do and how it can be applied;
Perception that statistical software has made
the processing of quantitative data more
reliable, but not so for CAQDAS;
Most evaluators are "comfortable" with their
current qualitative data processing techniques.
21. Use of CAQDAS in Evaluations
Why Not Us?
CES-NCC Annual Learning Event – Thematic
Lunch Roundtable Summary:
Demand: remains soft for more systematic and
rigorous qualitative data analysis that is
demonstrably evidence-based.
Procurement: staff unfamiliarity, a lack of
understanding of its utility, cost of licensing
and frequency of renewals with added costs.
Training: effort required to develop the
technical skills considered disproportionate in
relation to the perceived utility.
22. Use of CAQDAS in Evaluations
The Demonstration Effect – Free Software
QDA Miner Lite – a free, easy-to-use version of the popular
QDA Miner software.
Saturate – a smart solution to memo, code, categorize,
search, and archive your text data, tabular data, audio
data, and Web pages, all in a multi-user environment.
TextStat – produces word frequency lists, concordances.
Dedoose – for analyzing text, video, and spreadsheet data;
web-based, pay only for the months you use the app.
CDC EZ-Text – a software program developed to help
researchers create, manage, and analyze semi-structured
qualitative databases.
CAQDAS Networking Project (http://caqdas.soc.surrey.ac.uk/)
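The simplest capabilities on this list, word frequency lists and concordances as offered by TextStat, are easy to demonstrate in a few lines. The sketch below is an illustration of what such tools compute, not TextStat's actual code; the tokenisation rules and sample text are assumptions.

```python
# Minimal word-frequency list and keyword-in-context concordance,
# the two basic outputs a tool like TextStat produces.
import re
from collections import Counter

def word_frequencies(text):
    """Count lowercase word tokens in the text."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def concordance(text, keyword, width=3):
    """Keyword-in-context lines: `width` words either side of each hit."""
    words = re.findall(r"\S+", text)
    hits = []
    for i, w in enumerate(words):
        if w.lower().strip(".,;") == keyword.lower():
            hits.append(" ".join(words[max(0, i - width): i + width + 1]))
    return hits

text = ("Evaluators learn during the evaluation; "
        "the evaluation also teaches stakeholders.")
freqs = word_frequencies(text)
print(freqs["evaluation"])            # 2
print(concordance(text, "evaluation"))
```

Even this small example shows why such output is useful to an evaluator: the frequency list surfaces recurring terms, and the concordance shows each term in its surrounding context.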
23. Use of CAQDAS in Evaluations
The Demonstration Effect – Not So Free
24. Use of CAQDAS in Evaluations
The Demonstration Effect – Not So Free
25. Use of CAQDAS in Evaluations
The Demonstration Effect
Go to Atlas.ti Demo
26. Closing the Knowledge Gap Between
Evaluators and Stakeholders:
Competencies, methods and technologies to optimise
evaluator learning during the evaluation process
Thank You for Attending
From Werner Meier
http://www.RBMG.ca
June 10, 2013