1. Ohio Principal Evaluation System
A Snapshot of Implementation in 2011-2012
Jill Lindsey, PhD
Suzanne Franco, EdD
Ted Zigler, EdD
Wright State University’s OERC Team
Funded by the Ohio Education Research Center
2. Purpose
To gather feedback from principals, evaluators, and superintendents regarding their first-use experiences with OPES
3. Design
• Criterion sampling
• Phone and in-person interviews with superintendents
• Focus group interview with evaluators
• Focus group and individual interviews with principals evaluated using OPES
4. Guiding Questions Related To
• Implementation
• Training
• 50% student performance measures
• Challenges
• Perceptions
• Comparison with past practice
• Advice
5. Sample
• Five superintendents, from the six districts in the region with three or more staff trained in OPES, agreed to be interviewed
• All RttT (Race to the Top) districts
• All Typology 6 or 7 (urban/suburban with high median income)
• Varied enrollment (2,000-5,000) with 5-13 principals
• Varied free/reduced-lunch population (4-40%)
• One district partially implementing OPES as a pilot
• Second-largest district in the sample
• Highest percentage of free/reduced-lunch students
• Eight principals evaluated with OPES
6. Superintendent Findings
• All completed four sessions of training and found the training very helpful
• Not implementing or not fully implementing OPES
• Greater focus on changes to teacher evaluation
• All currently using the Danielson model for teacher evaluation and satisfied with it, though adapting it
• OPES viewed as an improvement over past practice
7. Evaluator Findings
• All completed four sessions of training and found the scenarios especially helpful, but noted a lack of clarity about evaluator latitude
• Modified rubric and forms
• Prompted great conversations
• Too much variation among evaluators
• Time concerns
• Equity concerns about use of student growth measures
8. Principal Findings
• Did not receive training; training is needed
• Pilot process: no student performance/growth measures used; not part of the official record
• Varied experiences in the number of meetings and artifact expectations
• Best evaluation experience: affirming, very collaborative, with lots of conversation and input
• Fit well with other district and building processes/goals
• Concerns about use of student performance measures
• Created empathy in teachers
9. Common Findings Across Groups
• Training is helpful and needed; greater clarity is required
• A very positive, collaborative process
• Need more consistency in process
• Piloting time is essential for best use & buy-in
• A student performance/growth component must be established
• OTES and OPES are intertwined
10. Questions
If you wish to contact us for more information:
Jill.Lindsey@wright.edu
Suzanne.Franco@wright.edu
Ted.Zigler@wright.edu