The document outlines the process undertaken by a panel to develop a Canadian definition of evaluation. It describes conducting a literature review, using social media to gather perspectives, and attempting a survey, each of which provided insights but also challenges. The panel encountered unclear and varied definitions, difficulty accessing intended users, and unanticipated issues with surveys. They invite audience input on next steps. The goal is an inclusive definition that represents diverse views while building consensus around Canadian evaluation.
1. Towards a Canadian Definition of Evaluation
Cheryl Poth, CE, University of Alberta
Mary Kay Lamarche, CE
Cairine Chisamore, CE
Alvin Yapp, Masters student, University of Alberta
With special thanks to Erin Sulla, University of Alberta
http://absoluteadvantage.org/images/contentmgmt/edition_38/evaluation_definition.jpg
2. Outline
Who & Why of this Presentation
How & What did we find in our work?
Tamed Literature Search
Wild West of Social Media
Unanticipated Survey Challenges
Where should we go from here?
3. Who & Why of this presentation?
From each panelist’s perspective:
What were the beginning conversations for this work?
How did we undertake this work?
What pressing needs did we anticipate?
What was surprising about this process?
4. How & What did we find in our work? Tamed Literature Search
Rationale for what a literature search can offer:
Access the perspectives of a wide variety of people
Systematic approach to minimize bias
Process involved in bounding the search (i.e., parameters):
Years: 2008-present
Databases used
Keywords for search
Additional limits: scholarly peer-reviewed, full-text, English
5. How & What did we find in our work? Tamed Literature Search
How are we defining evaluation in Canada?
Government remains summative-focused, driven by an accountability purpose and generalizability.
E.g., “Treasury Board of Canada Secretariat Policy on Evaluation 3.1: In the government of Canada, evaluation is the systematic collection and analysis of evidence on the outcomes of programs to make judgments about their relevance, performance and alternative ways to deliver them or to achieve the same results.”
Others focus on a narrower scope, with the purpose of understanding specificities.
E.g., Evaluation is interested in understanding what is happening in a specific program rather than attempting to generalize findings more broadly; it focuses on naturally occurring groups and events as they present themselves rather than controlling the setting and focusing on isolated variables (Levin-Rosalis, 2003).
The most common definition relates to an approach.
E.g., Developmental evaluation provides right-timed feedback and data necessary for supporting adaptation and reorganization in highly dynamic, multidimensional and interdependent interventions (Patton, 2011).
6. How & What did we find in our work? Tamed Literature Search
What are the strengths of evaluation in Canada?
Profession of evaluation has a greater presence
Professional designation has played a role
Increase in use of evaluations across contexts
Planning for evaluations as part of program planning is more common
Program design is more frequently informed by evaluation
7. How & What did we find in our work? Tamed Literature Search
What are the limitations of evaluation in Canada?
There is a lack of clarity in the definition of evaluation
Few resources lead to post-implementation studies
Rigidity in evaluation within government policies
Little research exists on evaluations and what approaches are being used
8. How & What did we find in our work? Tamed Literature Search
Interesting Considerations
Unclear definitions of evaluations and approaches
Mixed-methods designs seem to be the most common
Move towards online learning modules, webinars, etc.
Summative evaluations dominate in government; only 2 out of 15 were developmental evaluations, and most were formative
Participatory, utilization-focused and community-based empowerment evaluations are the most common approaches
9. Invitation for Audience Input:
What do you think about what we are finding?
Does this jibe with what you expected?
Is there literature that we should be looking at?
What groups make sense to compare?
Government practitioner, other practitioner, academic
10. How & What did we find in our work? Wild West of Social Media
Rationale for what social media can offer:
Access the perspectives of a wide variety of people around the world
User-generated content – unstructured and interactive
Process:
Twitter – used #eval, #evaluation, #CES; 3 responses
LinkedIn – posted in 9 groups; 50 unique responses from 37 unique responders
Facebook – 1 response
Twitter post: “How do you define #evaluation? I'm helping conduct research for #CES on different definitions. Results to be presented @CESToronto2013 #eval”
Facebook post: “How do you define evaluation? Colleagues and I are doing research for the CES, exploring different definitions of evaluation. Results will be presented at the CES Conference in Toronto this June. For those who follow me on Twitter or are connected via LinkedIn, my apologies – my responsibility is to see what results I get through social media. Please feel free to share the post.”
LinkedIn post: “How do you define evaluation? Colleagues and I are doing research for the CES, exploring different definitions of evaluation. Results will be presented at the CES Conference in Toronto this June.”
When prompted in 1 group, we followed up with some background/clarification in all groups.
11. How & What did we find in our work? Wild West of Social Media
Responses ranged from top-of-mind definitions to in-depth discussions of “textbook” definitions – all considerations necessary when trying to develop a definition
Need to consider:
What we are trying to define: activities, process, practices, methods, approaches
Relative to other fields
Relative to user expectations & opinions
The evaluand, its context and purpose
The evaluator
Overall observation: NO ONE SIZE FITS ALL
12. How & What did we find in our work? Wild West of Social Media
Results Overview
13. How & What did we find in our work? Wild West of Social Media
A Slightly Different Perspective of our Results
14. How & What did we find in our work? Wild West of Social Media
Our Interpretation of What it all Means
“Essentially, my job doesn’t have a hat. Some jobs have hats. Just look at those hats. Every child you know can tell you which job goes with which hat. I have no hat for my job.”
Source: Lisa O’Reilly, C.E. - http://lisaoreilly.ca/work/
15. How & What did we find in our work? Wild West of Social Media
Can we have a hat – or maybe multiple hats?
The White Hat: calls for information known or needed
The Red Hat: signifies feelings, hunches and intuition
The Black Hat: is judgment – the devil’s advocate, or why something may not work
The Yellow Hat: symbolizes brightness and optimism
The Green Hat: focuses on creativity: the possibilities, alternatives and new ideas
The Blue Hat: used to manage the thinking process
Source: The Six Thinking Hats (or modes) from de Bono Thinking Systems, http://www.debonothinkingsystems.com/tools/6hats.htm
16. Invitation for Audience Input:
What do you think about what we are finding?
Does this jibe with what you expected?
If we are going to define evaluation, what should we be defining:
What we do?
What we are trying to achieve?
Methods?
Skills?
Other?
What do you think – do we have a hat? Or maybe multiple hats?
17. How & What did we find in our work? Unanticipated Survey Challenges
Rationale:
If our products don’t serve the needs of intended users, they are irrelevant
Some evidence in social media that users’ views are important
Process:
Drafted survey based on social media concepts
8 closed questions with opportunity for comments
Piloted with 2 people; took 5-10 minutes
Not implemented
Survey to cover:
Evaluation – should it make judgments?
Rate importance of aspects on which evaluation could draw conclusions, e.g., outcomes achieved, implemented as planned
Rate importance of including reasons for conducting an evaluation, e.g., learning, informing decisions
Systematic process – yes or no?
Independence/objectivity important?
18. How & What did we find in our work? Unanticipated Survey Challenges
Weekly e-mail to CES members: please pass on the survey link to evaluation users
CES members to evaluation users: please answer our survey
Users answer survey
It’s as simple as that!
19. How & What did we find in our work? Unanticipated Survey Challenges
Weekly e-mail to CES members: please pass on the survey link to evaluation users
CES members to evaluation users: please answer our survey
Users answer survey
Or maybe not
20. Invitation for Audience Input:
How important is the user viewpoint to our work?
What is the best way to access the targeted population of evaluation users?
Do we have the right questions?
21. Take Home Message
Doing this research is hard
Definitions are unclear
The challenge will be creating a definition that reflects all perspectives and around which we can build some degree of consensus – and how do we go about this?
Our intention is to be inclusive and not marginalize the passion we see in the perspectives we have garnered so far
22. Where should we go from here?
Social Network Analysis
Literature Review:
Include French-language literature
Include First Nations, Métis and Inuit perspectives
Deeper look into grey literature and Canadian evaluation theses
Deductive analysis using definitions
Evaluation Definition Survey
23. Invitation for Audience Discussion:
What are the risks associated with choosing one definition?
Is a definition necessary?
Where do you think we should go next?
If time permits: definition activity
What do you notice about your definition?
What would be important to build on?
What could be left out; what is not relevant to the Canadian context?
24. Source A: Patton, M.Q. (1997). Utilization-Focused Evaluation. Sage.
“The systematic collection and analysis of information about program activities, characteristics, and outcomes to make judgements about the program, improve program effectiveness and/or inform decisions about future programming.”
Program evaluation involves:
(1) The systematic collection and analysis of information
(2) A focus on a broad range of topics (accessibility, comprehensiveness, integration, cost, efficiency, effectiveness)
(3) A design for a variety of uses (management, accountability, planning)
25. Source B: Fournier, D. (2005). Encyclopedia of Evaluation. Also cited by Patton in his Utilization-Focused Evaluation text (2008), 4th edition.
“Evaluation is an applied inquiry process for collecting and synthesizing evidence that culminates in conclusions about the state of affairs, value, merit, worth, significance or quality of a program, product, person or plan. Conclusions made in evaluations encompass both an empirical aspect and a normative aspect. It is the value feature that distinguishes evaluation from other types of inquiry such as basic science research, clinical epidemiology, investigative journalism or public polling.”
26. Source C: Yarbrough, Shulha, Hopson & Caruthers (2011). JCSEE Program Evaluation Standards, 3rd edition.
“The systematic investigation of the quality of programs, projects, subprograms, subprojects, and/or any of their components or elements, together or singly, for the purposes of decision making, judgements, conclusions, findings, new knowledge, organizational development, and capacity building in response to the needs of identified stakeholders, leading to improvement and/or accountability in the users’ programs and systems, ultimately contributing to organizational or social value.”
27. Source D: Scriven (1991). Evaluation Thesaurus, 3rd ed.
“Evaluation refers to the process of determining merit, worth, or value of something, or the product of that process. . . . The evaluation process normally involves some identification of relevant standards of merit, worth, or value; some investigation of the performance of evaluands on these standards; and some integration or synthesis of the results to achieve an overall evaluation or set of associated evaluations.”
28. Source E: Preskill & Torres (1999). Evaluative Inquiry for Learning in Organizations. Sage.
“Evaluative inquiry is an ongoing process for investigating and understanding critical organizational issues. It is an approach to learning that is fully integrated with an organization’s work practices, and as such, it engenders (a) organization members’ interest and ability in exploring critical issues using evaluation logic, (b) organization members’ involvement in evaluative processes, and (c) the personal and professional growth of individuals within the organization.”
29. Source F: Rossi, Lipsey & Freeman (2004). Evaluation: A Systematic Approach, 7th ed. Sage.
“Program evaluation is the use of social research methods to systematically investigate the effectiveness of social intervention programs. It draws on the techniques and concepts of social science disciplines and is intended to be useful for improving programs and informing social action aimed at ameliorating social problems.”