
Meta Evaluation

Meta Evaluation = Evaluation of Evaluation (Validity of Evaluation)
Includes the standards of the Joint Committee

Published in: Education


  1. Meta-evaluation. Prepared for the educational evaluation course by Mohsen Sharifirad. Course instructor: Prof. M. Chizari.
  2. Meta-evaluation: Some Tips. Note the variant spellings "metaevaluation," "meta evaluation," and "meta-evaluation." A synonym for metaevaluation is "review evaluation." Be careful not to confuse search results for "meta-evaluation" with those for "meta-analysis," which is a different technique. Evaluation itself needs to be evaluated.
  3. What is Meta-evaluation? Michael Scriven (1991) described meta-evaluation in his Evaluation Thesaurus: "Meta-evaluation is the evaluation of evaluations — indirectly, the evaluation of evaluators — and represents an ethical as well as a scientific obligation when the welfare of others is involved" (p. 228). He goes on to add that meta-evaluation should be conducted by the evaluator as well as by an external entity.
  4. Types of Meta-evaluation (Stufflebeam and Shinkfield, 2007). Proactive meta-evaluation is designed to help evaluators before conducting an evaluation. Concurrent meta-evaluation is designed to take place alongside an evaluation, rather than before or after it. Retroactive meta-evaluation is designed to help audiences judge completed evaluations. (Carl E. Hanssen et al., 2008)
  5. Types of Meta-evaluation. Proactive metaevaluations are needed to help evaluators focus, design, budget, contract, and carry out sound evaluations. Retroactive metaevaluations are required to help audiences judge completed evaluations. In the evaluation literature, these two kinds of metaevaluation are labeled formative metaevaluation and summative metaevaluation. (Stufflebeam and Shinkfield, 2007)
  6. Meta-evaluation. Concurrent meta-evaluation differs from both formative and summative meta-evaluations (as defined by Stufflebeam & Shinkfield, 2007) because concurrent meta-evaluation (a) is conducted simultaneously with the development and implementation of a new evaluation method; (b) has both formative and summative components; (c) is comprehensive in nature; and (d) includes multiple, original data collection methods.
  7. Meta-evaluation Standards. Patton (1997) suggested that questions to focus a meta-evaluation should include: Was the evaluation well done? Is it worth using? Did the evaluation meet professional standards and principles? Guidance for conducting meta-evaluation using evaluation standards is found throughout the evaluation literature.
  8. Meta-evaluation Standards. Scriven (1991) argued that meta-evaluation can be either formative or summative and can be aided through the use of checklists or standards such as The Program Evaluation Standards (The Joint Committee on Standards for Educational Evaluation, 1994).
  9. Meta-evaluation. The Joint Committee on Standards for Educational Evaluation (1994) prescribed: "The evaluation itself should be formatively and summatively evaluated against these and other pertinent standards, so that its conduct is appropriately guided and, on completion, stakeholders can closely examine its strengths and weaknesses."
  10. Meta-evaluation Example: Concurrent Meta-Evaluation, A Critique
  12. Meta-evaluation Example: META-EVALUATION OF QUALITY AND COVERAGE OF USAID EVALUATIONS 2009-2012, EXECUTIVE SUMMARY. Definition of 'U.S. Agency for International Development (USAID)': an independent federal agency of the United States that provides aid to citizens of foreign countries. Types of aid provided by USAID include disaster relief, technical assistance, poverty alleviation, and economic development.
  13. Meta-evaluation Example: Executive Summary, Context and Purpose. This evaluation of evaluations, or meta-evaluation, was undertaken to assess the quality of USAID's evaluation reports. The study builds on USAID's practice of periodically examining evaluation quality to identify opportunities for improvement. It covers USAID evaluations completed between January 2009 and December 2012.
  14. Meta-evaluation Example: Meta-Evaluation Questions. The meta-evaluation on which this volume reports systematically examined 340 randomly selected evaluations and gathered qualitative data from USAID staff and evaluators to address three questions:
  15. Meta-evaluation Example: Meta-Evaluation Questions. 1. To what degree have quality aspects of USAID's evaluation reports, and underlying practices, changed over time? 2. At this point in time, on which evaluation quality aspects or factors do USAID's evaluation reports excel and where are they falling short? 3. What can be determined about the overall quality of USAID evaluation reports and where do the greatest opportunities for improvement lie?
  16. Meta-evaluation Example: Meta-Evaluation Methodology. The framework for this study recognizes that undertaking an evaluation involves a partnership between the client for an evaluation (USAID) and the evaluation team. Each party plays an important role in ensuring overall quality. • Information on basic characteristics and quality aspects of 340 randomly selected USAID evaluation reports was a primary source for this study. • Quality aspects of these evaluations were assessed using a 37-element checklist.
  17. Meta-evaluation Example: Evaluation Quality Findings. Question 1. To what degree have quality aspects of USAID's evaluation reports, and underlying practices, changed over time? Over the four years covered by the meta-evaluation, there were clear improvements in the quality of USAID evaluation reports. On 25 of 37 (68 percent) evaluation quality factors rated, evaluations completed in 2012 showed a positive net increase over 2009 evaluations in the number that met USAID quality standards on those factors.
  18. Meta-evaluation Example: Evaluation Quality Findings. Question 1. To what degree have quality aspects of USAID's evaluation reports, and underlying practices, changed over time? Ratings on several factors improved by more than 10 percentage points, including whether findings were supported by data from a range of methods, study limitations were identified, and clear distinctions were made between findings, conclusions, and recommendations. Improvements in evaluation quality factor ratings did not generally rise in a linear fashion, but instead fluctuated from year to year.
  19. Meta-evaluation Example: Evaluation Quality Findings. Question 1. To what degree have quality aspects of USAID's evaluation reports, and underlying practices, changed over time? Not all evaluation quality factor ratings improved over the study period. MSI, in addition to examining changes over time for the study sample as a whole, assessed changes between 2009 and 2012 on a regional basis, by sector, and for a subset of USAID Forward evaluations to which the Agency, after July 2011, paid special attention from a quality perspective. A t-test was used to compare USAID Forward evaluations with other evaluations; its results were not significant.
  20. Meta-evaluation Example: Evaluation Quality Findings. Question 2. At this point in time, on which evaluation quality aspects or factors do USAID's evaluation reports excel and where are they falling short? Four clusters of evaluation ratings were used to determine where USAID excels on evaluation quality and where improvements are warranted. Evaluation quality factors on which 80 percent or more of USAID evaluations met USAID standards were coded as "good." Of the 37 evaluation quality factors examined, 24 percent merited the designation "good."
  21. Meta-evaluation Example: Evaluation Quality Findings. Question 2. At this point in time, on which evaluation quality aspects or factors do USAID's evaluation reports excel and where are they falling short? Quality standards for which 50 percent to 79 percent of evaluations were rated positively were designated "fair." USAID performance was either "good" or "fair" on half of the factors rated. On the remaining evaluation quality factors, USAID performance was deemed "marginal" on 20 percent of those factors and "weak" on 32 percent.
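The four rating clusters described above amount to a simple threshold rule. A minimal sketch in Python, assuming the summary's stated cutoffs for "good" (80 percent or more) and "fair" (50 to 79 percent); the boundary between "marginal" and "weak" is not given in the executive summary, so a hypothetical 30 percent cutoff is used here, and the factor names are illustrative only:

```python
def classify_factor(pct_meeting_standard: float) -> str:
    """Classify an evaluation quality factor by the percentage of
    evaluations that met the USAID standard on that factor."""
    if pct_meeting_standard >= 80:
        return "good"
    if pct_meeting_standard >= 50:
        return "fair"
    if pct_meeting_standard >= 30:  # hypothetical cutoff; not stated in the summary
        return "marginal"
    return "weak"

# Hypothetical compliance rates (percent) for a few illustrative factors
rates = {
    "findings supported by data": 85,
    "study limitations identified": 62,
    "evaluation specialist on team": 20,
}
print({name: classify_factor(pct) for name, pct in rates.items()})
```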
  22. Meta-evaluation Example: Evaluation Quality Findings. Question 2. At this point in time, on which evaluation quality aspects or factors do USAID's evaluation reports excel and where are they falling short? Among evaluation quality factors on which compliance was "weak," MSI found that half addressed quality standards that had recently been introduced in USAID Evaluation Policy. Performance on these factors is likely to improve as familiarity with the new standards grows. Among factors rated weak, the most significant involve low levels of compliance with USAID's requirement for the participation of an evaluation specialist on every evaluation team and its expectation that, wherever relevant, data on the results of USAID evaluations will be documented on a sex-disaggregated basis.
  23. Meta-evaluation Example: Evaluation Quality Findings. Question 3. What can be determined about the overall quality of USAID evaluation reports and where do the greatest opportunities for improvement lie? On an overall evaluation quality "score" based on 11 of the meta-evaluation's quality rating factors, USAID evaluations averaged 5.93 on a 10-point scale, with a mode of 7 points and a relatively normal distribution.
  24. Meta-evaluation Example: Evaluation Quality Findings. Question 3. What can be determined about the overall quality of USAID evaluation reports and where do the greatest opportunities for improvement lie? Statistical tests conducted using this overall score showed that USAID evaluations completed in 2012 were of significantly higher quality than those completed in 2009.
  25. Meta-evaluation Example: Evaluation Quality Findings. Question 3. What can be determined about the overall quality of USAID evaluation reports and where do the greatest opportunities for improvement lie? MSI also found that evaluations reporting an evaluation specialist as a team member had higher overall quality scores than evaluations where an evaluation specialist was not reported to be involved. This finding was statistically significant at the .05, .01, and .001 levels. Other comparisons were not found to be statistically significant.
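The specialist-versus-no-specialist comparison reported above is a two-sample significance test. A minimal sketch of how such a comparison could be run, using Welch's t statistic on entirely synthetic 10-point quality scores (the report's underlying data are not reproduced here), with large-sample normal critical values standing in for the exact t distribution:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic (does not assume equal variances)."""
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / (variance(a) / na + variance(b) / nb) ** 0.5

# Synthetic overall quality scores on a 10-point scale (illustrative only)
with_specialist = [7, 8, 6, 7, 9, 8, 7, 6, 8, 7]
without_specialist = [5, 6, 4, 6, 5, 5, 6, 4, 5, 6]

t = welch_t(with_specialist, without_specialist)
# Two-sided large-sample critical values; small samples like these would
# properly need the exact t distribution (e.g. scipy.stats.ttest_ind).
for alpha, crit in [(0.05, 1.960), (0.01, 2.576), (0.001, 3.291)]:
    print(f"significant at {alpha}: {abs(t) > crit}")
```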
  26. Meta-evaluation Example: Conclusions. The overall picture of evaluation quality at USAID from this study is one of improvement over the study period, with strong gains emerging on key factors between 2010 and 2012. The number of evaluations per year increased, and the quality of evaluation reports has improved. While this portrait is largely positive, the study also identified evaluation quality factors, or standards, that USAID evaluation reports do not yet meet.
  27. Meta-evaluation Example: Conclusions. On several core evaluation quality standards, such as clear distinctions among evaluation findings, conclusions, and recommendations, performance was found to be below USAID standards. Other significant deficiencies included the small percentage of evaluations that indicated that an evaluation specialist was a member of the evaluation team, which USAID has required for the better part of a decade, and low ratings on the presence of sex-disaggregated data at all results levels, not simply for input-level activities. Low ratings were also found for several evaluation standards introduced in the 2011 Evaluation Policy, but this may simply reflect slow uptake or lack of awareness of the standards.
  28. Meta-evaluation Example: Recommendations. The study offers three recommendations for improving evaluation reports in those areas that offer opportunities for improvement. • Recommendation 1. Increase the percentage of USAID evaluations that have an evaluation specialist as a full-time team member with defined responsibilities for ensuring that USAID evaluation report standards are met, from roughly 20 percent as of 2012 to 80 percent or more.
  29. Meta-evaluation Example: Recommendations. • Recommendation 2. Intervene with appropriate guidance, tools, and self-training materials to dramatically increase the effectiveness of existing USAID evaluation management and quality control processes.
  30. Meta-evaluation Example: Recommendations. • Recommendation 3. As a special effort, in collaboration with USAID's Office of Gender Equality and Women's Empowerment, invest in the development of practitioner guidance materials specific to evaluation. Of these three recommendations, the first is considered the most important for systematically raising the quality of evaluations across all sectors and regions. MSI's second recommendation is intended to complement its first and encourage USAID to scale up evaluation management "good practices" already known within the Agency.
  31. Meta-evaluation Example: 1. To what degree have quality aspects of USAID's evaluation reports, and underlying practices, changed over time?
  32. Meta-evaluation Example: 2. At this point in time, on which evaluation quality aspects or factors do USAID's evaluation reports excel and where are they falling short?
  33. Meta-evaluation Example: 3. What can be determined about the overall quality of USAID evaluation reports and where do the greatest opportunities for improvement lie?
