WHAT IS PROGRAM EVALUATION?

 JENNIFER ANN MORROW, Ph.D.
  UNIVERSITY OF TENNESSEE
Summary of Current Projects

• Program Evaluation Projects
  – Project Weed & Seed
  – Project Fairchild Tropical Gardens
• Student Development Projects
  – Project Social Networking
  – Project Writing
• Teaching Research Methods/Statistics
  Projects
  – Project RLE
Program Evaluation Philosophy

• Utilization-focused evaluation (Patton, 1986)
  – Evaluations are situation specific
• Comprehensive evaluation designs
  – Formative and summative evaluation
  – Qualitative and quantitative data
• Useful and meaningful data
  – Simple versus sophisticated analyses
• Faculty-student evaluation teams
  – Students as evaluation apprentices
What is Program Evaluation?

• “Program evaluation is the systematic collection of
  information about the activities, characteristics, and
  outcomes of programs for use by specific people to
  reduce uncertainties, improve effectiveness, and
  make decisions with regard to what those programs
  are doing and affecting” (Patton, 1986).
• “Evaluation research is the systematic application
  of social research procedures for assessing the
  conceptualization, design, implementation, and
  utility of social intervention programs” (Rossi &
  Freeman, 1993).
Types of Evaluation
• Formative Evaluation: focuses on identifying the
  strengths and weaknesses of a program or
  intervention.
  – Comprised of implementation (process) and progress
    evaluation
  – Occurs during the entire life of the program/intervention
  – Is performed to monitor and improve the
    program/intervention
• Summative Evaluation: focuses on determining
  the overall effectiveness or impact of a program or
  intervention.
  – Also called impact evaluation
  – Assesses whether the project/intervention met its stated goals
Preparing to Conduct an Evaluation
• Identify the program and its stakeholders
  – Get a description of the program (i.e.,
    curriculum)
  – Meet with all stakeholders (survey or interview
    them)
• Become familiar with information needs
  – Who wants the evaluation?
  – What is the focus of the evaluation? What
    resources do I have available to me?
  – Why is an evaluation wanted?
  – When is the evaluation wanted?
Program Provider/Staff Issues
• Expectation of a “slam-bang effect”.
• Fear that evaluation will inhibit
  creativity/innovation in the program.
• Fear that the program will be terminated.
• Fear that information will be misused.
• Fear that evaluation will drain resources.
• Fear of losing control of the program.
• Fear among program staff that they are being
  monitored.
What is Program Theory?

• Program theory identifies key program
  elements and how they relate to each other.

• Program theory helps us decide what data
  we should collect and how we should
  analyze it.

• It is important to develop an evaluation plan
  that measures the extent and nature of each
  individual element.
Inadequate Program Evaluation Models
• Social Science Research Model: forming two
  random groups, providing one with the
  service and using the other as a control
  group.

• Black-Box Evaluation: an evaluation that
  only looks at the outputs and not the internal
  operations of the program.

• Naturalistic Model: utilizing only qualitative
  methods to gather lots of data.
Theory Driven Model

• Theory-driven evaluations are more likely than
  methods-driven evaluations to discover program
  effects because they identify and examine a larger
  set of potential program outcomes (Chen & Rossi, 1980).
• Theory-driven evaluations are not limited to one
  method (e.g., quantitative or qualitative), one data
  source (e.g., program participants, artifacts,
  community indexes, program staff), or one type of
  analysis (e.g., descriptive statistics, correlational
  analyses, group difference statistics).
• Theory-driven evaluations utilize mixed-methods
  and derive their data from multiple sources.
Improvement-Focused Model

• Program improvement is the focus.
• Utilizing this type of model, evaluators can
  help program staff to discover discrepancies
  between program objectives and the needs
  of the target population, between program
  implementation and program plans, between
  expectations of the target population and the
  services actually delivered, or between
  outcomes achieved and outcomes projected
  (Posavac & Carey, 1997).
Goals of an Evaluation

• Implementation Goals
  – Equipment needs, staff hiring and training


• Intermediate Goals
  – Program is delivered as planned


• Outcome Goals
  – Is the program effective?
Questions to Ask in an Evaluation
• (1) Does the program match the values of the
  stakeholders/needs of the people being served?
• (2) Does the program as implemented fulfill the
  plans?
• (3) Do the outcomes achieved match the goals?
• (4) Is there support for program theory?
• (5) Is the program accepted?
• (6) Are the resources devoted to the program being
  expended appropriately?
Creating an Evaluation Plan

• Step 1: Creating a Logic Model

• Step 2: Reviewing the Literature

• Step 3: Determining the Methodology

• Step 4: Present a Written Proposal
Step 1: Creating a Logic Model
• Review program descriptions
  – Is there a program theory?
  – Who do they serve?
  – What do they do?
• Meet with stakeholders
  – Program personnel
  – Program sponsors
  – Clients of program
  – Other individuals/organizations impacted by the
    program
Step 1: Creating a Logic Model
• Logic models depict assumptions about the
  resources needed to support program
  activities and produce outputs, and the
  activities and outputs needed to realize the
  intended outcomes of a program (United
  Way of America, 1996; Wholey, 1994).

• The assumptions depicted in the model are
  called program theory.
Sample Logic Model

INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES

• INPUTS: resources dedicated to or consumed by the program
• ACTIVITIES: what the program does with the inputs to fulfill its mission
• OUTPUTS: the direct products of program activities
• OUTCOMES: benefits for participants during and after program activities
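The inputs → activities → outputs → outcomes chain can be sketched as a plain data structure. Every entry below is a hypothetical example, not from an actual program:

```python
# A logic model captured as an ordered mapping; all entries are
# invented examples for illustration.
logic_model = {
    "inputs": ["2 trained facilitators", "workbooks", "meeting space"],
    "activities": ["weekly 90-minute workshops", "one-on-one mentoring"],
    "outputs": ["12 workshops delivered", "40 participants served"],
    "outcomes": ["improved study skills", "higher first-year retention"],
}

# Walk the chain in order, from resources to benefits.
for component, entries in logic_model.items():
    print(f"{component.upper()}: {'; '.join(entries)}")
```

Since dicts preserve insertion order in Python 3.7+, the chain prints from inputs through outcomes, mirroring the model's left-to-right flow.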
Step 2: Reviewing the Literature
• Important things to consider:
  – In what ways is your program similar to other
    programs?
  – What research designs were utilized?
  – How were participants sampled?
  – Can previous measures be adopted?
  – What statistical analyses were performed?
  – What were their conclusions/interpretations?
• Creating Hypotheses/Research Questions
Step 3: Determining the Methodology
• Sampling Method
  – Probability vs. Non-probability
• Research Design
  – Experimental, Quasi-experimental, Non-
    experimental
• Data Collection
  – Ethics, Implementation, Observations, Surveys,
    Existing Records, Interviews/Focus Groups
• Statistical Analysis
  – Descriptive, Correlational, Group Differences
Step 4: Present a Written Proposal
• Describe the specific purpose of the
  evaluation
  – Specific goals, objectives and/or aims of the
    evaluation
• Describe the evaluation design
  – Include theories/support for design
  – Methodology – participants, measures,
    procedure
• Describe the evaluation questions
  – Hypotheses and proposed analyses
• Present a detailed work plan and budget
Ethics in Program Evaluation

• Sometimes evaluators will have to deal with
  ethical dilemmas during the evaluation
  process
• Some potential dilemmas
  – Programs that can’t be done well
  – Presenting all findings (negative and positive)
– Proper ethical considerations (e.g., informed
    consent)
  – Maintaining confidentiality of clients
  – Competent data collectors
• AEA ethical principles
Data Collection: Implementation Checklists
• Implementation checklists are used to
  ascertain if the program is being delivered as
  planned.
• You should have questions after each
  program chapter/section that the program
  deliverer fills out.
• This can then be used to create a new
  variable: Level of implementation (none, low,
  high).
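One way to derive that level-of-implementation variable is to collapse the deliverer's per-section checklist into none/low/high. This is a sketch; the section names and the 2/3 cutoff are assumptions for illustration, not part of the lecture:

```python
# Hypothetical checklist: one yes/no item per program section,
# filled out by the program deliverer after each session.
checklist = {
    "section_1_icebreaker": True,
    "section_2_core_lesson": True,
    "section_3_practice": False,
    "section_4_wrapup": True,
}

def implementation_level(responses, high_cutoff=2 / 3):
    """Collapse checklist responses into none/low/high (cutoff is arbitrary)."""
    done = sum(responses.values()) / len(responses)
    if done == 0:
        return "none"
    return "high" if done >= high_cutoff else "low"

print(implementation_level(checklist))  # 3 of 4 sections delivered
```

The resulting category can then be used as a grouping variable in the outcome analyses.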
Data Collection: Observations

• What should be observed?
  – Participants, Program staff


• Utilize trained observers.
  – Your observers should be trained on how to
    observe staff and/or clients.


• Use standardized behavioral checklists.
  – You should have a standard checklist that
    observers can use while observing.
Data Collection: Surveys

• To create or not to create?
  – Finding a valid/reliable instrument
• What can surveys measure?
  – Facts and past behavioral experiences
  – Attitudes and preferences
  – Beliefs and predictions
  – Current/future behaviors
• Types of questions
  – Closed versus Open-ended questions
• How to administer?
Data Collection: Existing Records
• School records
  – GPA, absences, disciplinary problems


• Health records
  – Relevant health information


• National surveys
  – National and state indices (e.g., census data)
Data Collection: Focus Groups/Interviews
• Focus groups or individual interviews can be
  conducted with program staff and/or clients.

• Can be used to obtain information on
  program effectiveness and satisfaction.

• Can also show if client needs are not being
  met.
Statistical Analysis: Quantitative
• Descriptive Statistics
  – What are the characteristics of our clients?
  – What % attended our program?
• Correlational Statistics
  – What variables are related to our outcomes?
  – How is implementation related to our outcomes?
• Group Difference Statistics
  – Is our program group different from our
    comparison group?
  – Are there group differences on outcomes?
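The three analysis families can be sketched with the standard library alone; the scores and attendance counts below are invented, and in practice you would reach for scipy or statsmodels rather than hand-rolled formulas:

```python
from math import sqrt
from statistics import mean, stdev

# Invented data: outcome scores and session attendance for a program
# group, plus outcome scores for a comparison group.
program = [78, 85, 74, 90, 88, 81]
attendance = [5, 8, 4, 10, 9, 6]
comparison = [70, 75, 72, 80, 77, 69]

# Descriptive: characteristics of each group.
print(f"program mean={mean(program):.1f}  comparison mean={mean(comparison):.1f}")

# Correlational: is attendance related to the outcome?
def pearson(x, y):
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

print(f"r(attendance, outcome) = {pearson(attendance, program):.2f}")

# Group differences: Welch's t statistic, computed by hand.
def welch_t(a, b):
    se = sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
    return (mean(a) - mean(b)) / se

print(f"t = {welch_t(program, comparison):.2f}")
```

Each print line answers one of the bullet questions above: who the clients are, what relates to outcomes, and whether the groups differ.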
Statistical Analysis: Qualitative
• Transcribe interviews/focus groups/observations
   – Should be done verbatim in an organized
     fashion
• Summarizing all open-ended questions
   – Summarize and keep a tally of how many
     participants give each response
• Coding and analyzing all qualitative data
   – Utilize a theoretical framework for coding (e.g.,
     Grounded Theory)
– Use a qualitative software package to organize
     data (e.g., NUD*IST, NVivo)
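The summarize-and-tally step can be as simple as a `Counter` over coded responses. The codes below are hypothetical examples of themes an evaluator might assign to one open-ended question:

```python
from collections import Counter

# Hypothetical codes assigned to responses to one open-ended
# survey question, one code per participant.
coded_responses = [
    "more sessions", "scheduling conflicts", "more sessions",
    "better materials", "scheduling conflicts", "more sessions",
]

# Tally how many participants gave each response, most common first.
tally = Counter(coded_responses)
for code, n in tally.most_common():
    print(f"{code}: {n} participant(s)")
```

A dedicated qualitative package does far more (memos, hierarchical codes, queries), but a simple tally like this is often enough for the report's frequency tables.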
Writing the Evaluation Report

• This should be as detailed as possible. It
  should include both the formative and
  summative evaluation findings as well as an
  action plan for improving the
  design/program.
• Should be written in an easy-to-understand
  format (don’t be too technical). More
  technical information can go in an
  appendix.
• Include plenty of graphical displays of the
  data.
Presenting the Results

• You should present the results of the
  evaluation to all key stakeholders.
• Professional presentation (PowerPoint,
  handouts).

• Don’t just present findings (both positive and
  negative); explain them.
• Present an action plan for possible changes.
Working as a Program Evaluator
• Get more experience
  – Take classes
• EP 533 (basic intro), EP 651/652 (Seminar), EP 670
      (internship, can take up to 9 hours), EP 693
      (Independent Study) as well as many others
    • Evaluation Certificate (12 hours)
  – Workshops
• Get Involved
• Working as a Program Evaluator
References

•   Chen, H.T., & Rossi, P.H. (1980). The multi-goal, theory-driven
    approach to evaluation: A model linking basic and applied social
    science. Social Forces, 59, 106-122.
•   Julian, D.A. (1997). The utilization of the logic model as a system level
    planning and evaluation device. Evaluation and Program Planning, 20,
    251-257.
•   Patton, M.Q. (1986). Utilization-focused evaluation (2nd ed.). Newbury
    Park, CA: Sage Publications.
•   Posavac, E.J., & Carey, R.G. (2003). Program evaluation methods and
    case studies (6th ed.). New Jersey: Prentice Hall.
•   Rossi, P.H., & Freeman, H.E. (1993). Evaluation: A systematic
    approach (5th ed.). Newbury Park, CA: Sage Publications.
•   United Way of America. (1996). Measuring program outcomes: A
    practical approach (Item No. 0989). Author.
Websites

• Evaluators’ Institute
   – http://www.evaluatorsinstitute.com/
• Guide to program evaluation
   – http://www.mapnp.org/library/evaluatn/fnl_eval.htm
• Evaluating community programs
   – http://ctb.lsi.ukans.edu/tools/EN/part_1010.htm
• Evaluation bibliography
   – http://www.ed.gov/about/offices/list/ope/fipse/biblio.html
• Higher Ed center evaluation resources
   – http://www.edc.org/hec/eval/links.html
• The Evaluation Center
   – http://www.wmich.edu/evalctr/
Websites

• American Evaluation Association (AEA)
   – http://www.eval.org/
• Southeast Evaluation Association (SEA)
   – http://www.bitbrothers.com/sea/
• Using logic models
   – http://edis.ifas.ufl.edu/WC041
• Resources for evaluators
   – http://www.luc.edu/faculty/eposava/resource.htm
• Various program evaluation publications (all pdf)
   – http://www.uwex.edu/ces/pdande/evaluation/evaldocs.html
• Evaluation toolkit – Kellogg Foundation
   – http://www.wkkf.org/Programming/Overview.aspx?CID=281
My Contact Information

          Jennifer Ann Morrow, Ph.D.
Assistant Professor of Evaluation, Statistics, and
                  Measurement

  Department of Educational Psychology and
                  Counseling
        The University of Tennessee
            Knoxville, TN 37996
         Email: jamorrow@utk.edu
        Office Phone: 865-974-6117

Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)lakshayb543
 
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATION
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATIONTHEORIES OF ORGANIZATION-PUBLIC ADMINISTRATION
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATIONHumphrey A Beña
 
Roles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceRoles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceSamikshaHamane
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPCeline George
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Celine George
 
ENGLISH6-Q4-W3.pptxqurter our high choom
ENGLISH6-Q4-W3.pptxqurter our high choomENGLISH6-Q4-W3.pptxqurter our high choom
ENGLISH6-Q4-W3.pptxqurter our high choomnelietumpap1
 
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITYISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITYKayeClaireEstoconing
 
Q4 English4 Week3 PPT Melcnmg-based.pptx
Q4 English4 Week3 PPT Melcnmg-based.pptxQ4 English4 Week3 PPT Melcnmg-based.pptx
Q4 English4 Week3 PPT Melcnmg-based.pptxnelietumpap1
 
Choosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for ParentsChoosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for Parentsnavabharathschool99
 
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxMULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxAnupkumar Sharma
 
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTSGRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTSJoshuaGantuangco2
 
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...Postal Advocate Inc.
 
Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Jisc
 

Último (20)

Raw materials used in Herbal Cosmetics.pptx
Raw materials used in Herbal Cosmetics.pptxRaw materials used in Herbal Cosmetics.pptx
Raw materials used in Herbal Cosmetics.pptx
 
Science 7 Quarter 4 Module 2: Natural Resources.pptx
Science 7 Quarter 4 Module 2: Natural Resources.pptxScience 7 Quarter 4 Module 2: Natural Resources.pptx
Science 7 Quarter 4 Module 2: Natural Resources.pptx
 
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptxFINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
 
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdfAMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
 
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
 
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptxLEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
 
Gas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptxGas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptx
 
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
 
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATION
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATIONTHEORIES OF ORGANIZATION-PUBLIC ADMINISTRATION
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATION
 
Roles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceRoles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in Pharmacovigilance
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERP
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17
 
ENGLISH6-Q4-W3.pptxqurter our high choom
ENGLISH6-Q4-W3.pptxqurter our high choomENGLISH6-Q4-W3.pptxqurter our high choom
ENGLISH6-Q4-W3.pptxqurter our high choom
 
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITYISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
 
Q4 English4 Week3 PPT Melcnmg-based.pptx
Q4 English4 Week3 PPT Melcnmg-based.pptxQ4 English4 Week3 PPT Melcnmg-based.pptx
Q4 English4 Week3 PPT Melcnmg-based.pptx
 
Choosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for ParentsChoosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for Parents
 
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxMULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
 
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTSGRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
 
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
USPS® Forced Meter Migration - How to Know if Your Postage Meter Will Soon be...
 
Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...
 

  – Get a description of the program (i.e., curriculum)
  – Meet with all stakeholders (survey or interview them)
• Become familiar with information needs
  – Who wants the evaluation?
  – What is the focus of the evaluation?
  – What resources do I have available to me?
  – Why is an evaluation wanted?
  – When is the evaluation wanted?

Program Provider/Staff Issues
• Expectation of a “slam-bang effect”.
• Fear that evaluation will inhibit creativity/innovation regarding the program.
• Fear that the program will be terminated.
• Fear that information will be misused.
• Fear that evaluation will drain resources.
• Fear of losing control of the program.
• Fear among program staff that they are being monitored.

What is Program Theory?
• Program theory identifies key program elements and how they relate to each other.
• Program theory helps us decide what data we should collect and how we should analyze it.
• It is important to develop an evaluation plan that measures the extent and nature of each individual element.

Inadequate Program Evaluation Models
• Social Science Research Model: form two random groups, providing one with the service and using the other as a control group.
• Black-Box Evaluation: an evaluation that only looks at the outputs and not the internal operations of the program.
• Naturalistic Model: utilizing only qualitative methods to gather lots of data.

Theory-Driven Model
• Theory-driven evaluations are more likely than methods-driven evaluations to discover program effects: they identify and examine a larger set of potential program outcomes (Chen & Rossi, 1980).
• Theory-driven evaluations are not limited to one method (i.e., quantitative or qualitative), one data source (i.e., program participants, artifacts, community indexes, program staff), or one type of analysis (i.e., descriptive statistics, correlational analyses, group difference statistics).
• Theory-driven evaluations utilize mixed methods and derive their data from multiple sources.

Improvement-Focused Model
• Program improvement is the focus.
• Utilizing this type of model, evaluators can help program staff to discover discrepancies between program objectives and the needs of the target population, between program implementation and program plans, between expectations of the target population and the services actually delivered, or between outcomes achieved and outcomes projected (Posavac & Carey, 1997).

Goals of an Evaluation
• Implementation Goals
  – Equipment needs, staff hiring and training
• Intermediate Goals
  – Program is delivered as planned
• Outcome Goals
  – Is the program effective?

Questions to Ask in an Evaluation
• (1) Does the program match the values of the stakeholders/needs of the people being served?
• (2) Does the program as implemented fulfill the plans?
• (3) Do the outcomes achieved match the goals?
• (4) Is there support for program theory?
• (5) Is the program accepted?
• (6) Are the resources devoted to the program being expended appropriately?

Creating an Evaluation Plan
• Step 1: Creating a Logic Model
• Step 2: Reviewing the Literature
• Step 3: Determining the Methodology
• Step 4: Present a Written Proposal

Step 1: Creating a Logic Model
• Review program descriptions
  – Is there a program theory?
  – Who do they serve?
  – What do they do?
• Meet with stakeholders
  – Program personnel
  – Program sponsors
  – Clients of program
  – Other individuals/organizations impacted by the program

Step 1: Creating a Logic Model
• Logic models depict assumptions about the resources needed to support program activities and produce outputs, and the activities and outputs needed to realize the intended outcomes of a program (United Way of America, 1996; Wholey, 1994).
• The assumptions depicted in the model are called program theory.

Sample Logic Model
• INPUTS: Resources dedicated to or consumed by the program
• ACTIVITIES: What the program does with the inputs to fulfill its mission
• OUTPUTS: The direct products of program activities
• OUTCOMES: Benefits for participants during and after program activities
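
The four logic-model components can be sketched as a simple data structure. This is only an illustration: every entry below is a hypothetical example, and the "chain" it prints is just the model's assumed path from resources to benefits.

```python
# A logic model as a plain dictionary (all entries are hypothetical examples).
logic_model = {
    "inputs": ["staff time", "curriculum materials", "grant funding"],
    "activities": ["weekly workshops", "one-on-one mentoring"],
    "outputs": ["number of sessions delivered", "participants served"],
    "outcomes": ["improved writing skills", "higher retention"],
}

def describe(model):
    """Trace the model's assumed chain from inputs through to outcomes."""
    order = ("inputs", "activities", "outputs", "outcomes")
    return " -> ".join(model[key][0] for key in order)

print(describe(logic_model))
```

Writing the model down this explicitly makes it easy to check that every activity has an input supporting it and an output that can actually be measured.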

Step 2: Reviewing the Literature
• Important things to consider:
  – In what ways is your program similar to other programs?
  – What research designs were utilized?
  – How were participants sampled?
  – Can previous measures be adopted?
  – What statistical analyses were performed?
  – What were their conclusions/interpretations?
• Creating hypotheses/research questions

Step 3: Determining the Methodology
• Sampling Method
  – Probability vs. Non-probability
• Research Design
  – Experimental, Quasi-experimental, Non-experimental
• Data Collection
  – Ethics, Implementation, Observations, Surveys, Existing Records, Interviews/Focus Groups
• Statistical Analysis
  – Descriptive, Correlational, Group Differences

Step 4: Present a Written Proposal
• Describe the specific purpose of the evaluation
  – Specific goals, objectives, and/or aims of the evaluation
• Describe the evaluation design
  – Include theories/support for design
  – Methodology: participants, measures, procedure
• Describe the evaluation questions
  – Hypotheses and proposed analyses
• Present a detailed work plan and budget

Ethics in Program Evaluation
• Sometimes evaluators will have to deal with ethical dilemmas during the evaluation process.
• Some potential dilemmas:
  – Programs that can’t be done well
  – Presenting all findings (negative and positive)
  – Proper ethical considerations (i.e., informed consent)
  – Maintaining confidentiality of clients
  – Competent data collectors
• AEA ethical principles

Data Collection: Implementation Checklists
• Implementation checklists are used to ascertain if the program is being delivered as planned.
• Include questions after each program chapter/section for the program deliverer to fill out.
• This can then be used to create a new variable: level of implementation (none, low, high).
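
Deriving that level-of-implementation variable can be sketched in a few lines. This is a hedged illustration only: the True/False-per-section checklist format and the 0.5 cutoff between "low" and "high" are assumptions, not taken from the slides.

```python
def implementation_level(checklist, cutoff=0.5):
    """Collapse a per-section delivered/not-delivered checklist into none/low/high.

    The 0.5 cutoff is an assumed threshold; set it to fit your own program.
    """
    if not checklist or not any(checklist):
        return "none"  # no sections were delivered (or no data)
    rate = sum(checklist) / len(checklist)  # proportion of sections delivered
    return "high" if rate >= cutoff else "low"

# One True/False entry per program section the deliverer checked off.
print(implementation_level([True, True, False, True]))    # high
print(implementation_level([True, False, False, False]))  # low
print(implementation_level([False, False, False, False])) # none
```

The resulting variable can then be carried into the outcome analyses, e.g., to ask whether higher-implementation sites show stronger effects.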

Data Collection: Observations
• What should be observed?
  – Participants, Program staff
• Utilize trained observers.
  – Your observers should be trained on how to observe staff and/or clients.
• Use standardized behavioral checklists.
  – You should have a standard checklist that observers can use while observing.

Data Collection: Surveys
• To create or not to create?
  – Finding a valid/reliable instrument
• What can surveys measure?
  – Facts and past behavioral experiences
  – Attitudes and preferences
  – Beliefs and predictions
  – Current/future behaviors
• Types of questions
  – Closed versus open-ended questions
• How to administer?

Data Collection: Existing Records
• School records
  – GPA, absences, disciplinary problems
• Health records
  – Relevant health information
• National surveys
  – National and state indices (e.g., census data)

Data Collection: Focus Groups/Interviews
• Focus groups or individual interviews can be conducted with program staff and/or clients.
• Can be used to obtain information on program effectiveness and satisfaction.
• Can also show if client needs are not being met.

Statistical Analysis: Quantitative
• Descriptive Statistics
  – What are the characteristics of our clients?
  – What % attended our program?
• Correlational Statistics
  – What variables are related to our outcomes?
  – How is implementation related to our outcomes?
• Group Difference Statistics
  – Is our program group different from our comparison group?
  – Are there group differences on outcomes?

Statistical Analysis: Qualitative
• Transcribe interviews/focus groups/observations
  – Should be done verbatim in an organized fashion
• Summarize all open-ended questions
  – Summarize and keep a tally of how many participants give each response
• Code and analyze all qualitative data
  – Utilize a theoretical framework for coding (e.g., Grounded Theory)
  – Use a qualitative software package to organize data (e.g., NUD*IST, NVivo)
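
The tallying step described above is a one-liner with `collections.Counter`. The responses here are invented and already reduced to short codes, as they would be after a first coding pass.

```python
from collections import Counter

# Hypothetical open-ended responses, already reduced to short codes.
responses = [
    "more sessions", "liked the mentors", "more sessions",
    "better scheduling", "liked the mentors", "more sessions",
]

tally = Counter(responses)
for answer, count in tally.most_common():
    print(f"{count} participants said: {answer}")
```

Sorting by frequency with `most_common()` puts the themes mentioned by the most participants first, which is usually what the report leads with.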

Writing the Evaluation Report
• The report should be as detailed as possible. It should include both the formative and summative evaluation findings as well as an action plan for improving the design/program.
• It should be written in an easy-to-understand format (don’t be too technical). For more technical information you can create an appendix.
• Include plenty of graphical displays of the data.

Presenting the Results
• Present the results of the evaluation to all key stakeholders.
• Give a professional presentation (PowerPoint, handouts).
• Don’t just present findings (both positive and negative); explain them.
• Present an action plan for possible changes.

Working as a Program Evaluator
• Get more experience
  – Take classes
    • EP 533 (basic intro), EP 651/652 (Seminar), EP 670 (internship, can take up to 9 hours), EP 693 (Independent Study), as well as many others
    • Evaluation Certificate (12 hrs)
  – Workshops
• Get involved

References
• Chen, H. T., & Rossi, P. H. (1980). The multi-goal, theory-driven approach to evaluation: A model linking basic and applied social science. Social Forces, 59, 106-122.
• Julian, D. A. (1997). The utilization of the logic model as a system level planning and evaluation device. Evaluation and Program Planning, 20, 251-257.
• Patton, M. Q. (1986). Utilization-focused evaluation (2nd ed.). Newbury Park, CA: Sage Publications.
• Posavac, E. J., & Carey, R. G. (2003). Program evaluation methods and case studies (6th ed.). New Jersey: Prentice Hall.
• Rossi, P. H., & Freeman, H. E. (1993). Evaluation: A systematic approach (5th ed.). Newbury Park, CA: Sage Publications.
• United Way of America. (1996). Measuring program outcomes: A practical approach (Item No. 0989). Author.

Websites
• Evaluators’ Institute
  – http://www.evaluatorsinstitute.com/
• Guide to program evaluation
  – http://www.mapnp.org/library/evaluatn/fnl_eval.htm
• Evaluating community programs
  – http://ctb.lsi.ukans.edu/tools/EN/part_1010.htm
• Evaluation bibliography
  – http://www.ed.gov/about/offices/list/ope/fipse/biblio.html
• Higher Ed center evaluation resources
  – http://www.edc.org/hec/eval/links.html
• The Evaluation Center
  – http://www.wmich.edu/evalctr/

Websites
• American Evaluation Association (AEA)
  – http://www.eval.org/
• Southeast Evaluation Association (SEA)
  – http://www.bitbrothers.com/sea/
• Using logic models
  – http://edis.ifas.ufl.edu/WC041
• Resources for evaluators
  – http://www.luc.edu/faculty/eposava/resource.htm
• Various program evaluation publications (all pdf)
  – http://www.uwex.edu/ces/pdande/evaluation/evaldocs.html
• Evaluation toolkit – Kellogg Foundation
  – http://www.wkkf.org/Programming/Overview.aspx?CID=281

My Contact Information
Jennifer Ann Morrow, Ph.D.
Assistant Professor of Evaluation, Statistics, and Measurement
Department of Educational Psychology and Counseling
The University of Tennessee
Knoxville, TN 37996
Email: jamorrow@utk.edu
Office Phone: 865-974-6117