1. June 20, 2010 Christina A. Christie, Ph.D. University of California, Los Angeles Michael Harnar, MA Claremont Graduate University
8. Evaluation, like anything else, can be done well and produce benefits, or it can be done poorly, adding little value and sometimes having a negative impact. The following table contrasts characteristics of effective and ineffective evaluation:

Evaluation is…                      Evaluation is not…
Done with you                       Done TO you
Able to provide rich information    Simply program monitoring
Intended to be used                 Intended to sit on a shelf or to check a box
For the program stakeholders        For the evaluator or only for management
Systematic                          Haphazard
FUN!                                Scary (Really, it isn't! You'll see.)
13. The six steps of the CDC Evaluation Model are representative of the components found in most program evaluation models. All six steps are related, and the first steps provide a foundation for the later ones.
21. To identify stakeholders, we should ask the following questions:

Who is…
- Affected by the program?
- Involved in program operations?
- An intended user of evaluation findings?

Who do we need to…
- Enhance credibility?
- Implement program changes?
- Advocate for changes?
- Fund, authorize, or expand the program?
32. Definition of a Logic Model

"The program logic model is defined as a picture of how your organization does its work – the theory and assumptions underlying the program. A program logic model links outcomes (both short- and long-term) with program activities/processes and the theoretical assumptions/principles of the program."
– The W.K. Kellogg Foundation Logic Model Development Guide
36. Components of a Logic Model

Resources/Inputs: Resources needed to achieve the program's objectives
Activities: What the program does with resources to meet objectives
Outputs: Direct products of program activities
Outcomes: Changes that result from the program's activities and outputs (short-term, intermediate, and long-term)
External Factors/Context: Description of the environment in which the program takes place
Assumptions: The underlying assumptions that influence the program's design, implementation, or goals
41. Example Logic Model: PSA "Tough Classes"

Program Activities: PSA "Tough Classes"
Target Audience: 8th–10th grade students nationwide

Short-term Outcomes:
- Knowledge of classes that prepare for college
- Students view taking tough classes as cool, rebellious
- Knowledge of resources related to college access

Intermediate Outcomes:
- Taking of college prep courses
- Utilization of resources (including the campaign website)

Long-term Outcomes:
- College eligibility
- College application and acceptance
- Preparedness
- College achievement

Assumptions: (1) students have access to computers and the internet (to view the PSA and related materials), and (2) attitudes are the primary determinants of course-taking behaviors
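The logic model components can also be expressed as a simple data structure, which some teams find useful for keeping evaluation plans consistent. The sketch below is a hypothetical illustration only: neither the CDC framework nor the Kellogg guide prescribes a data format, and the field names and example inputs/outputs are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Minimal sketch of a program logic model (hypothetical structure)."""
    inputs: list          # Resources needed to achieve objectives
    activities: list      # What the program does with its resources
    outputs: list         # Direct products of activities
    short_term: list      # Short-term outcomes
    intermediate: list    # Intermediate outcomes
    long_term: list       # Long-term outcomes
    assumptions: list = field(default_factory=list)
    external_factors: list = field(default_factory=list)

# The "Tough Classes" PSA example expressed in this structure.
# Inputs and outputs below are illustrative guesses; the slide lists
# only activities, outcomes, and assumptions.
tough_classes = LogicModel(
    inputs=["PSA production budget", "campaign website"],
    activities=['PSA "Tough Classes"'],
    outputs=["PSA airings", "website visits"],
    short_term=["knowledge of classes that prepare for college",
                "students view taking tough classes as cool, rebellious",
                "knowledge of resources related to college access"],
    intermediate=["taking of college prep courses",
                  "utilization of resources (including the campaign website)"],
    long_term=["college eligibility", "college application and acceptance",
               "preparedness", "college achievement"],
    assumptions=["students have access to computers and the internet",
                 "attitudes are the primary determinants of course-taking behaviors"],
)
```

Writing a model out this way makes gaps visible: if an outcome has no activity or output that plausibly produces it, the chain of logic is incomplete.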
42. Sample Logic Model Framework source: http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
46. What Type of Question Will Be Answered?

Formative Evaluation (Improve):
- Periodic reports, shared quickly
- Monitor progress
- Mid-course corrections
- Helps to bring suggestions for improvement to the attention of the staff

Summative Evaluation (Prove):
- Demonstrate results to stakeholders
- Intermediate outcomes and impact
- Determine value and worth based on results
- Describes quality and effectiveness by documenting impact
48. Evaluation Planning Matrix

For each focus area (Resources, Activities, Outputs, Outcomes & Impacts), identify:
- Indicators
- How to evaluate
- Influential factors
61. Evaluation designs can be divided into three categories or types: randomized (true) experimental, quasi-experimental, and non-experimental. Decisions about whether or not to use a control group, and about how individuals are assigned to intervention and control groups, determine the category or type of evaluation design.

Is random assignment used?
- Yes → Randomized or true experiment
- No → Is there a control group or multiple measures?
  - Yes → Quasi-experiment
  - No → Non-experiment

Obtained from the Research Methods Knowledge Base: http://www.socialresearchmethods.net/kb/destypes.php
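The design-selection decision tree is small enough to state as code. This is a sketch of the Research Methods Knowledge Base flowchart described above; the function name and return strings are our own labels, not terminology mandated by that source.

```python
def classify_design(random_assignment: bool,
                    control_or_multiple_measures: bool) -> str:
    """Classify an evaluation design using the two-question decision tree.

    First question: is random assignment used? If yes, the design is a
    randomized (true) experiment regardless of the second answer.
    Second question: is there a control group or multiple measures?
    """
    if random_assignment:
        return "randomized (true) experiment"
    if control_or_multiple_measures:
        return "quasi-experiment"
    return "non-experiment"
```

For example, a program that compares intervention schools against matched comparison schools without randomizing which schools receive the program would be classified as a quasi-experiment: `classify_design(False, True)`.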
Editor's Notes
06/19/10 A logic model can be used to help clarify the activities of a program, how those activities lead to certain outcomes, and how those outcomes link to the program's ultimate goal. Kellogg defines a logic model as: a picture of how your organization does its work – the theory and assumptions underlying the program. A program logic model links outcomes (both short- and long-term) with program activities/processes and the theoretical assumptions/principles of the program.
06/19/10 YOUR PLANNED WORK describes what resources you think you need to implement your program and what you intend to do. YOUR INTENDED RESULTS include all of the program's desired results (outputs, outcomes, and impact).
06/19/10 From Kellogg, Ch. 4, pp. 36–37: The logic model can be divided into three categories: context, implementation, and outcomes. Context: influences and resources. Implementation: activities and outputs. Outcomes: short-term, intermediate, and long-term. "Context is how the program functions within the economic, social, and political environment of its community." "Implementation assesses the extent to which activities were executed as planned." "Outcomes determine the extent to which progress is being made toward the desired changes in individuals, organizations, communities, or systems."