Silvana Richardson: Weighing the Pig Doesn't Make it Fatter or Does it
A learning partner throughout your career
Weighing the pig doesn’t make it fatter
…or does it?
CPD evaluation under the microscope
Silvana Richardson
The plan
1. Why evaluate CPD
2. Evaluating CPD – a useful model
3. Take-aways for your organisation
Warming up
With your group, discuss the following questions:
a) Is it important to regularly evaluate the Continuing
Professional Development (CPD) programme in place
in your organisation? If so, why? If not, why not?
b) Do you regularly evaluate the quality and impact
of the CPD programme in your organisation? If so, how?
If not, why not?
CPD evaluation in your organisation
Why CPD evaluation matters
How do you know that your CPD offer is having the greatest
possible impact on teacher and student learning?
A lot of good things are done
in the name of professional development.
But so are a lot of rotten things.
Evaluation provides the key to making
the distinction between the two.
Guskey (2000)
The plan
1. Why evaluate CPD
2. Evaluating CPD – a useful model
3. Take-aways for your organisation
Guskey’s five levels of information (2000)
1. Participants’ reactions
2. Participants’ learning
3. Institution’s capacity to support change
4. Participants’ use of the new knowledge
5. Students’ learning outcomes
Evaluating CPD – a useful model
Level 1: Participants’ reactions
Reactions to the professional development experience:
• Have they enjoyed the experience?
• Was the content useful? Did it make sense?
• Were the activities effective?
• Was the trainer knowledgeable?
May include environmental factors:
• The venue – e.g. Was the room at the right temperature?
• The facilities – e.g. Were the chairs comfortable?
• Refreshments – e.g. Was the coffee hot and ready on time?
Tool
• End-of-session questionnaire – ‘the happy sheet’
Level 1: Participants’ reactions
‘Happy sheet’ – example
A combination of statements to rate and open questions.
Participants rate each statement from 1 to 10:
1. The content of the workshop was good.
2. The professional challenge of this workshop was suitable for me.
3. This workshop was useful for me.
4. I would recommend this workshop to others.
Participants answer open questions:
1. What did you find useful/like about the workshop?
2. What could have been better?
3. What might you do differently as a result of this workshop?
Evaluating evaluation tools
With your group, discuss:
a) How useful is the information elicited by participant
reaction feedback forms?
b) (How) can they lead to an improved CPD offer?
c) What are the limitations of ‘happy sheets’?
The value of ‘happy sheets’
Level 1: Participants’ reactions
‘Happy sheets’
• The most common form of professional development evaluation
• The easiest type of information to gather and analyse
• They reveal the entertainment value of an activity, not its quality or worth.
• Yet they can help improve the future design and delivery of programmes or activities.
Level 2: Participants’ learning
What have participants learnt as a result of their engagement in CPD activities?
What knowledge and skills have they gained?
Possible tools
• Assessments – e.g. assignments in assessed programmes
• Quizzes
• Exit tickets
• Simulation or full-scale skill demonstrations – e.g. microteaching, assessed teaching practice
• Self-evaluations
• Portfolios assembled by participants documenting learning
• Participants’ entries in reflective journals, etc.
Exit tickets
Know? Used?
Exit tickets
• Elicit short, swift, straight answers to a few simple questions
at the end of a CPD activity.
• Provide limited space for writing to help participants respond
in a focused way.
Exit tickets - example
1. What I didn’t know before this session
_________________________________
2. What I might need support with
_________________________________
3. How I feel I have progressed as a result of this session
Level 3: Institution’s capacity to support change
Are the organisation’s policies, procedures and practices aligned with the changes proposed by the CPD activities?
Has the organisation made the necessary resources for change and learning available?
Were changes at the individual level encouraged and supported at all levels? (Does the organisation encourage individual changes to become embedded in the system?)
Were successes recognised and shared?
Possible tools
• Minutes from follow-up meetings
• Questionnaires
• Interviews and focus groups with teachers and managers
Alignment of policies and procedures with the changes proposed by CPD – example 1
Introduced through CPD vs. school policy
Alignment of policies and procedures with the changes proposed by CPD – example 2
From banning to principled use
From ‘no reason given’ to ‘reason given’
Making the necessary resources for change and learning available
A change of focus
Discuss the following question with your partner:
Why do you think the focus shifts from the individual participant in the CPD activity to the organisation at this level?
From the participant to the organisation
Organisational variables can be key
to the success of any professional
development effort. They can also hinder
or prevent success, even when
the individual aspects of professional
development are done right.
Sparks (1996)
Organisational variables
Problems at Level 3 can cancel gains made at Levels 1 and 2.
Gathering information at Level 3 is generally more complicated
than at previous levels.
Organisational variables and inspection
Given how important it is that there is alignment
between organisational policies and practices,
and the changes advocated by the CPD programme,
should this not feature somehow
in inspection schemes’ quality standards?
Organisational variables and inspection
9 Staff Profile and Development
Indicators of excellence
The institution is supportive towards staff undertaking additional
training or professional development by, for instance:
- allowing flexible working hours and/or covering staff during absence for training, attendance at conferences etc.
- providing a degree of financial support for training, attendance
at conferences etc.
The Eaquals Inspection Scheme Manual Version 7.1 (2016)
Given how important it is that schools make the necessary
resources for change and learning available, should this be
an indicator of excellence or of compliance with the standards?
Level 4: Participants’ use of the new knowledge and skills
Focus: transferability to practice
Did the new knowledge and skills that participants learnt make a difference in their professional practice?
Possible tools
• Lesson observation – live, video or audio
• Learning walks
• Questionnaires
• Focus groups
• Structured interviews with teachers and their supervisors
• Examination of written personal reflections (participants’ journals or portfolios)
• Line management meetings
• Student feedback
Timing issues
• This information cannot be gathered at the end
of a professional development session.
• Time must pass to allow participants to adapt
the new ideas and practices to their settings.
• Because implementation is often a gradual and uneven
process, you may need to measure progress at several
time intervals.
Learning walks
Know? Done?
Learning walks
Looking back: To what extent is all the CPD work done on X actually embedded in practice?
Looking forward: What CPD work will we need to do next to improve or develop further?
• Developmental and constructive whole-school observations with a clearly articulated, specific focus linked to professional learning and school priorities.
• The aim is to collect evidence about teaching and learning, evidence of progress and areas for school development.
One-to-one student tutorials (2017)
Why focus on tutorials?
• Effective tutorials enable teachers to empower learners
to reflect, self-assess, understand their progress more
effectively, and set relevant individual learning goals.
• The learning walk would be an opportunity to gain insights
into the normal experience of students and teachers during
tutorial meetings.
Learning walks – an example
One-to-one student tutorials (2017)
Methods for gathering data
• No live physical observers (unlike most learning walks)
• Each teacher audio recorded at least one tutorial
Added value
It gave all the teachers the opportunity to listen to their own
tutorial(s), notice and critically reflect on how they conduct
tutorials.
One-to-one student tutorials (2017)
Procedure for teachers
1. Invite student participants
2. Seek consent from students
3. Record the tutorial
4. Self-assess the tutorial against criteria
5. Submit the audio-recording and self-assessment for feedback
One-to-one student tutorials (2017)
Procedure for critical friends
1. Listen to the recording and take notes of key points.
2. Read the teacher’s self-assessment.
3. Comment on the teacher’s self-assessment and give feedback on key points that the teacher missed.
4. Collect key strengths and issues; anonymise data.
Tutorial learning walk
1. Read a teacher’s self-assessment after listening to her own tutorial (in the second column) and the comments made by the teacher’s critical friend.
2. With your partner, comment on the learning
value of this learning walk for the individual
teacher.
Recording sample
A2 Arabic speaker
One-to-one student tutorials (2017)
Procedure for academic managers
1. Collate all the strengths and issues.
2. Analyse and thematise.
3. Write a report with recommendations.
Tutorial learning walk
1. Quickly scan the report (from page 2)
2. With your partner, discuss the value
of the whole process for the organisation.
Report sample
Level 5: Students’ learning outcomes
Did the CPD programme benefit students? If so, how?
What was the impact on students?
Did it affect student performance or achievement?
Are students more confident as learners?
Did it influence students’ physical or emotional well-being?
Is student attendance improving? Are dropouts decreasing?
Possible tools
School, teacher and student records:
• Formative assessment data – e.g. portfolio evaluations
• Test scores
• Samples of students’ work, homework completion rates
• Changes in study habits
• Attendance, retention, completion, achievement and dropout rates
• Data collected during teacher research projects
• Interviews with students, parents, teachers, and managers
A word of caution
The relationship between
professional development and
improvements in student learning
in real-world settings is far too complex
and includes too many intervening
variables to permit causal inferences.
Guskey (1997)
The plan
1. Why evaluate CPD
2. Evaluating CPD – a useful model
3. Take-aways for your organisation
CPD evaluation in your organisation
In pairs, discuss the following questions:
1. What level(s) does the CPD evaluation in your organisation address?
2. What could you do to make CPD evaluation (even more) robust?
Guskey’s five levels
Thanks!
silvana.richardson@bellenglish.com
www.bellenglish.com
Follow me on Twitter @laioli
@bellteachers
Speaker notes
You may think it’s important to evaluate the CPD programme in principle, and yet not do it regularly or at all.
Take feedback on b) how
Evaluating the quality and impact of their CPD programmes on a regular basis enables organisations to diagnose what has and has not worked, capitalise on those strategies that best promote teaching and learning, provide evidence of progress (thus becoming more accountable), and base future designs on sound information.
Effective professional development evaluations require the collection and analysis of the five critical levels of information (Guskey, 2000a).
With each succeeding level, the process of gathering evaluation information gets a bit more complex.
And because each level builds on those that come before, success at one level is usually necessary for success at higher levels.
Each level builds on previous levels and is important both to establish the impact of professional development programmes (feedback) and to inform the design of future programmes (feedforward).
Level 1: Participants' Reactions
The first level of evaluation looks at participants' reactions to the professional development experience.
This is the most common form of professional development evaluations, and the easiest type of information to gather and analyze.
At Level 1, you address questions focusing on whether or not participants liked the experience. Did they feel their time was well spent? Did the material make sense to them? Were the activities well planned and meaningful? Was the leader knowledgeable and helpful? Did the participants find the information useful?
Important questions for professional development workshops and seminars also include, Was the coffee hot and ready on time? Was the room at the right temperature? Were the chairs comfortable? To some, questions such as these may seem silly and inconsequential. But experienced professional developers know the importance of attending to these basic human needs.
Information on participants' reactions is generally gathered through questionnaires handed out at the end of a session or activity. These questionnaires typically include a combination of rating-scale items and open-ended response questions that allow participants to make personal comments. Because of the general nature of this information, many organizations use the same questionnaire for all their professional development activities.
Some educators refer to these measures of participants' reactions as “happiness quotients,” insisting that they reveal only the entertainment value of an activity, not its quality or worth. But measuring participants' initial satisfaction with the experience can help you improve the design and delivery of programs or activities in valid ways.
In addition to liking their professional development experience,
we also hope that participants learn something from it.
Level 2 focuses on measuring the knowledge and skills that participants gained.
Depending on the goals of the programme or activity, this can involve anything from …
Although you can usually gather Level 2 evaluation information at the completion of a professional development activity, it requires more than a standardized form. Measures must show attainment of specific learning goals. This means that indicators of successful learning need to be outlined before activities begin. You can use this information as a basis for improving the content, format, and organization of the program or activities.
Elicit:
Raise your hands if…
• you know what exit tickets are
• you use exit tickets regularly in CPD activities
Level 3 – This level of evaluation focuses on the institution’s capacity to support change. It seeks to establish whether the institution’s policies, procedures and practices are aligned with the changes proposed by the CPD activities (I’ll give an example in a minute), whether it has made the necessary resources for change and learning available – e.g. time for collaborative work – and whether it encourages individual changes to become embedded in the system. Gathering information about this – albeit difficult – is important, as lack of institutional support can undermine the impact of professional development initiatives. Tools that can be used include the minutes of meetings, questionnaires, and interviews and focus groups with teachers and managers.
Guskey here focuses on whether CPD is consistent with the institution’s values, principles, policies , etc. which puts CPD in a subservient role.
However, in my organisation I’ve seen CPD – particularly when it’s evidence-informed and sound- drive positive changes in policies and even values. So I think it’s both ways.
An example – the use of the students’ L1. Evidence-informed CPD on this, using the best available evidence, will promote a change of mindset: from seeing the use of the students’ L1 as problematic to using it as the most valuable resource for learning additional languages. This is often at odds with school policies which promote English only.
Another example: banning the use of mobile phones in lessons as a school-wide policy while encouraging m-learning in CPD (this happened in my organisation and led to a change of policy).
E.g. time for collaborative work (such as peer coaching and mentoring, and supporting teachers), including time for sharing and reflection.
Suppose, for example, that several teachers in a language school participate in a professional development event on collaborative action research, which is conducted by teacher teams and aims to overcome the isolation commonly experienced by classroom teachers by promoting collegial relationships.
They gain a thorough understanding of the theory and develop a plan for a collaborative action research project together, which requires release from teaching for peer observation, and to get together to analyse and interpret the results.
Following their training, they try to implement the project in the school, but the DoS (who missed the training, is primarily focused on staffing, and sees any changes to the staffing rota that are not caused by sickness or annual leave as unwelcome disruptions) does not give the group the release time for peer observation and does not allow them to break away from the weekly INSETT programme for the duration of the project to give them time to analyse the results together. This thwarts these teachers’ efforts to implement what they have learnt in their practice. It also hinders learning which could benefit the whole school.
The lack of positive results in this case doesn't reflect poor training or inadequate learning, but rather organization policies that undermine implementation efforts.
Lack of organization support and change can sabotage any professional development effort.
Issues such as alignment and support can play a large part in determining the success of any professional development effort.
Problems at Level 3 can cancel the gains made at Levels 1 and 2. That's why professional development evaluations must include information on organization support and change.
Gathering information at Level 3 is generally more complicated than at previous levels.
Procedures may involve analysing school records,
examining the minutes from follow-up meetings, administering questionnaires, and interviewing participants and school managers.
This information can be used to document and improve organization support, also to inform future change initiatives.
In the latest version of the inspection manual, these are the references to the evidence that EAQUALs inspectors seek regarding PD:
Level 4: Participants' Use of New Knowledge and Skills
The fourth level involves evaluating participants’ use of the new knowledge and skills that they have developed by taking part in CPD activities.
At Level 4 we ask, Did the new knowledge and skills that participants learned make a difference in their professional practice?
You may gather this information through questionnaires or structured interviews with participants and their supervisors, oral or written personal reflections, or examination of participants' journals or portfolios.
The most accurate information typically comes from direct observations.
Unlike Levels 1 and 2, this information cannot be gathered at the end of a professional development session. Enough time must pass to allow participants to adapt the new ideas and practices to their settings. Because implementation is often a gradual and uneven process, you may also need to measure progress at several time intervals.
Because the focus here is on transferability to practice, this information cannot be collected at the end of a CPD activity; it can only be gathered once teachers have had time to implement and adapt the new strategies to their context.
You can analyze this information to help restructure future programs and activities to facilitate better and more consistent implementation.
At Bell we have been conducting learning walks since 2014. The first few were rather top-down and ‘done to’ the teachers, with academic managers deciding the assessment criteria, conducting the observations, analysing and interpreting the data, and writing the report.
So, last year we decided to take learning walks a step further to ensure teacher buy-in, but more importantly, to ensure that we all learnt and improved along the way.
Our focus last year was on one-to-one tutorials between teachers and students.
Invite participants
First, teachers invited their students to take part in the learning walk –this meant students allowing teachers to record one full tutorial session with a mobile device.
Teachers were told that they would only need one recording of a student for self-assessment and feedback purposes, but that they might wish to invite more students to take part, record them all, and then decide which one they wanted to use for the learning walk.
Seek written consent
Teachers were asked to seek written consent from the student(s) they would like to record before the tutorial. A consent form was provided for this.
Level 5 addresses “the bottom line”: How did the professional development activity affect students? Did it benefit them in any way?
In addition to the stated goals, the activity may result in important unintended outcomes.
For this reason, evaluations should always include multiple measures of student learning (Joyce, 1993).
Consider, for example, elementary school educators who participate in study groups dedicated to finding ways to improve the quality of students' writing and devise a series of strategies that they believe will work for their students. In gathering Level 5 information, they find that their students' scores on measures of writing ability over the school year increased significantly compared with those of comparable students whose teachers did not use these strategies. On further analysis, however, they discover that their students' scores on mathematics achievement declined compared with those of the other students. This unintended outcome apparently occurred because the teachers inadvertently sacrificed instructional time in mathematics to provide more time for writing. Had information at Level 5 been restricted to the single measure of students' writing, this important unintended result might have gone unnoticed.
Measures of student learning typically include cognitive indicators of student performance and achievement.
In addition, you may want to measure affective outcomes (attitudes and dispositions) and psychomotor outcomes (skills and behaviors). Examples include students' self-concepts, study habits, school attendance, homework completion rates, and classroom behaviors. You can also consider such schoolwide indicators as enrollment in advanced classes, memberships in honor societies, participation in school-related activities, disciplinary actions, and retention or drop-out rates. Student and school records provide the majority of such information. You can also include results from questionnaires and structured interviews with students, parents, teachers, and administrators.
Level 5 information about a program's overall impact can guide improvements in all aspects of professional development, including program design, implementation, and follow-up.
In some cases, information on student learning outcomes is used to estimate the cost effectiveness of professional development, sometimes referred to as “return on investment” or “ROI evaluation” (Parry, 1996; Todnem & Warner, 1993).
Using these five levels of information in professional development evaluations, are we ready to “prove” that professional development programs make a difference?
Can we demonstrate that a particular professional development programme, and nothing else, is solely responsible for the school's 10 percent increase in student achievement scores?
Of course not. Professional development takes place in real-world settings.
But in the absence of proof, we can collect good evidence about whether a professional development programme has contributed to specific gains in student learning.
For example, use anecdotes and testimonials – while they should never form the basis of an entire evaluation, they offer the kind of personalized evidence that most people believe, and they should not be ignored as a source of information.
Using appropriate pre- and post-measures provides valuable information.
We have now looked at the five levels.
But how about your organisation?
What data do we draw on to plan the direction of our CPD?
Strategic priorities: the Bell Way and Education plan
Feedback from inspections (BC; ISI)
QA observation data analysis (quantitative and qualitative data)
Buzz observation data
Teacher voice – evaluations and comments