2. The changing face of assessment & feedback 05/12/2013
Overview
» Assessment and feedback challenges
» Influencing change through principles
» Feedback and feed forward
» Assessment and employability
» Electronic assessment management
The full story
Programme Report: www.jisc.ac.uk/assessmentandfeedback
3. Context
» Jisc Assessment and Feedback
Programme (2011 – 2014)
» 20 projects and 40 institutions
involved across the UK
» 3 strands focused on institutional
change, evaluation of technologies
and technology transfer
» Directly involved 2,200 staff and
6,000 students
www.jisc.ac.uk/assessmentandfeedback
4. A challenging landscape
» highly devolved & inconsistent (A)
» traditional practices dominate (B)
» lack of developmental focus (C)
» relevance to world of work (D)
» learner in passive role (E)
CC BY-NC-ND 2.0 Whirling Phoenix
5. From challenge to change 17/10/2013
The Vision: a principled approach
» need to articulate principles
» principles should demand action
» implementation can be teacher-centric or learning-centric
CC BY 2.0 Mind_scratch
Resources
Why principles? http://www.reap.ac.uk/TheoryPractice/Principles.aspx
Overview of principles:
http://jiscdesignstudio.pbworks.com/w/page/40343419/Assessment-and-feedback-principles
6. Putting principles into practice
QUB assessment & feedback principles
7. Adapted from QUB model
8. Curriculum design: implementing principles
University of Ulster ‘Viewpoints’ Cards
http://jiscdesignstudio.pbworks.com/w/file/70476581/Viewpoints%20A%26F%20Cards.pdf
9. Aligning technology with principles
University of Exeter ‘Tech Trumps’
http://jiscdesignstudio.pbworks.com/w/page/63225947/Technology%20Top%20Trumps
10. Promoting employability
» Emphasis on summative assessment does not reflect working life
» Skills in giving feedback, and in using feedback from multiple sources, are essential
» Sometimes understanding the brief is the hard part…
» Students cannot always articulate the skills they have
CC BY-NC-ND 2.0 Webmaster Birmingham City University
11. Dimensions of work-integrated assessment
University of Exeter Work Integrated Assessment Model
12. University of Exeter student evaluation of learning from applying the model
13. Peer review
» Most significant shift towards assessment for learning
» Students need to be convinced of the benefits
» Open source tools, e.g. PeerWise
CC BY-SA 2.0 AJC1
14. Longitudinal development: feeding forward
» Feed forward
» Ipsative approaches
» Technology needed to support information sharing
CC BY-NC-ND 2.0 Waveneyavenue
15. Curriculum design: scheduling
University of Dundee course redesign using the University of Hertfordshire assessment timelines tool
16. Analysing feedback
» Feedback is a 'black box'
» Programme teams don't discuss feedback
» Useful analytical tools available
CC BY-SA 2.0 xdxd_vs_xdxd
17. What kind of feedback do you give?
P1 - Praise. Motivating, but can appear insincere if used indiscriminately.
P2 - Recognising progress (ipsative feedback). Can be motivating and informs students about their learning; a lack of progress serves as an early warning.
C - Critique. Shows how work falls short of expectations or criteria; can be discouraging if not accompanied by information on how to improve.
A - Advice. Helps students take future action to improve.
Q - Clarification requests. Ask learners to think more deeply about their work and generate actions themselves.
O - Unclassified statements. Neutral comments, for example those that describe the piece of work but do not make any judgement.
(adapted from IOE feedback profile)
18. IOE 2012 feedback profile
Summative assessments for 5 programmes (N = 165)

Category of feedback                           | Average comments per script         | Rank
P1 Praise                                      | 4.4                                 | 1
P2 Ipsative (progress)                         | 0 (negligible)                      | 5
C1-C3 Critique                                 | 2.7                                 | 2
A1-A3 Advice for current or future assignments | 1.9 (mostly for current assignment) | 3
Q Questions and clarification requests         | 0.1                                 | 4
19. IOE 2013 feedback profile
Summative assessments for 5 programmes (N = 113)

Category of feedback                           | Average comments per script | Rank
P1 Praise                                      | 6.6                         | 1
P2 Ipsative (progress)                         | 0.3                         | 5
C1-C3 Critique                                 | 3.6                         | 3
A1-A3 Advice for current or future assignments | 3.7                         | 2
Q Questions and clarification requests         | 0.35                        | 4
20. New approaches
Feedback is….
CC BY 2.0 Khalid albaih
21. Electronic Assessment Management
» Academics less keen than students or administrators
» Technology is coming of age
» Clear evidence of workload savings
» Prerequisite for analytics
» Electronic marking isn't that bad!
CC BY-NC-SA 2.0 jackhynes
22. Process improvement
Manchester Metropolitan University Assessment Lifecycle
23. What do you want to know more about?
A Defining & applying principles
B Assessment & employability
C Feed forward/longitudinal development
D Electronic Assessment Management
E Other (say what in the chat box)
24. http://bit.ly/jiscdsaf
25. Briefings
» Changing assessment and feedback practice with the help of technology: http://bit.ly/afguide-change
» Electronic Assessment Management: http://bit.ly/afguide-eam
» Enhancing student employability through technology-supported assessment and feedback: http://bit.ly/afguide-employability
» Feedback and feed forward: using technology to support learner longitudinal development: http://bit.ly/afguide-feedforward
26. Any questions?
Editor's notes
Influencing change in assessment and feedback practices through a principle-led approach.
Assessment and employability: the role of technology in supporting the development of skills and competences to enhance employment prospects.
Feedback and feed forward: the role of technology in supporting learner engagement with feedback and improving progression.
Electronic assessment management: how technology can support assessment lifecycle processes to make more effective use of resources.
Jisc, a UK body supporting the use of digital technologies in UK higher education, further education and skills, funded this work involving over 40 UK higher education institutions. There were three strands focusing on institutional change, evaluation of technologies in use, and technology transfer. Overall there was a strong emphasis on the educational rationale for enhancing learning and teaching through technology while delivering efficiencies and quality improvements.
Importance of the baseline review: it showed pockets of good practice but overall a consistent picture of the challenges, and highlighted the problems that exist with resistance to change and with scaling up good practice and innovation.

A&F strategy and policy: strategy documents tend to be quite procedural in focus and don't reflect current thinking around effective assessment practice and the value that assessment can bring to learning. Another key issue is that devolved responsibility for assessment and feedback across faculties and service departments results in considerable inconsistencies, making it difficult to achieve parity of experience for learners.

When it comes to academic practice the issues are varied and complex, but they include the emphasis on summative assessment and the persistence of traditional forms such as essays and exams, which increase the workload burden. Timeliness, along with quality and consistency of feedback, was an issue across the board: even where clear deadlines are set, there isn't always time for feedback to inform the next assignment. Curriculum design (the modular approach) can also present barriers to an ongoing developmental approach to feedback at programme level. The assessment and feedback process, particularly the emphasis on high-stakes assessment and the value placed on marks and grades, is very different from the formative ways professionals develop during their working life.

There is a perception that learners don't engage with the feedback they receive: tutors may feel they have given a lot of feedback and support but that it hasn't been acted upon. Learners are seen as passive, waiting for feedback to be delivered to them, but the reality is less clear cut: the value of acting on feedback is not always well communicated, and was notably absent from most induction processes. Learning design often puts the learner in a passive role.

Quick poll on which of these participants see as the biggest issue (mainly just to get them used to the system).
A key message from the programme was the need to make sure that the pedagogy was driving the technology and not the other way round. There was a collective view that what is considered good educational practice in each institution needed to be surfaced and articulated as the foundation for moving forward (not least because of the devolved nature of strategy and policy making already mentioned).

One tried and tested way of doing this is by defining a set of principles that describe what the educational experience should be in your institution. The links on this slide point to some well-established sets of principles, and many of the projects benchmarked themselves against these as a starting point. Others had already done considerable groundwork in defining their own principles; the University of Hertfordshire's assessment-for-learning principles are one example that crops up throughout this presentation.

Principles can be a kind of shorthand summing up a lot of evidence-based research, but they aren't a lazy way out and they don't stand still. Many projects found that the dialogue around principles was as important in changing culture and practice as the end result. They also found that their thinking evolved over time; in particular, evidence relating to the value of peer review appears to corroborate much of the newer research literature.

It isn't an easy thing to get right: the principles institutions came up with had to gain widespread support yet still challenge current thinking and demand action rather than passive acceptance. Actually implementing the principles was also an interesting voyage of discovery, because an apparently sound principle such as 'clarify what constitutes good performance' can be implemented on a spectrum that runs from very teacher-led to much more learning-centric, and we will see more of this later.

Ask: are they aware of a defined set of A&F principles for their institution?
Yes/No response.
The programme saw some excellent examples of how institutions have gone about putting principles into practice and the lessons they learned. Queen's University Belfast produced this diagram to show how each of its assessment and feedback principles supports an overarching aim: to 'Encourage positive motivational beliefs and self-esteem'. The diagram is colour coded to help with mapping a range of supporting resources to the application of particular principles. A decision tree offers guidance on ways of implementing each of the principles and suggests technologies that can support this.
We talked earlier about how the same educational principles can be implemented in ways that range from teacher-centric to learning-centric. The University of Ulster Viewpoints project (which was part of the Jisc curriculum design programme rather than this one) has produced some very useful staff development resources based on a simple workshop approach, using cards to stimulate thinking at the right point in the design process. The materials are freely available for others to use, and the many institutions that have used them have been very positive about their value.
Finally we get round to mentioning technology. There are some lovely examples of resources projects have used to help people make technology choices based on the type of learning and teaching approach they are trying to implement. These Tech Trumps cards are one example from the University of Exeter, but Queen's, Winchester and others have done similar things, and you can find them all in the Design Studio.
Employability was a major theme of many projects particularly those at Cornwall College, the University of Exeter and MMU. Whatever debates we might have about the true value and purpose of education, the notion that study leads to improved employability prospects is currently of major importance to students and to senior managers.The projects tried to address the fact that traditional assessment methods such as essays and an emphasis on summative assessment do not prepare students well for the world of work. In the 'real world' acquiring and making sense of feedback from a range of sources and giving feedback are essential skills.
Discuss issues around settling on a term. An issue noted at Exeter is that, although students may have many of the vocational competencies required by employers, they are often unable to articulate and demonstrate their abilities in job interviews. The University of Exeter brought together the concepts of educational principles and employability in developing a model to help ensure assessment and feedback practice was developing appropriate competencies in its learners (and helping the learners to evidence those competencies). The model uses dimensions such as peer review, collaborative working and audience to help ensure that assessments are structured in such a way as to evidence these skills. The model is flexible enough to be adapted to different contexts; the different colours show the design evolving through review and QA processes.

An example of how the use of these approaches has changed actual practice is a Psychology module where the original assessment was planned as a written exam and which was redesigned so that a collaboratively designed flyer became the main assessment. The assessment involved working with two external 'audiences': a Lived Experience Group (consisting of people with mental health issues) and the University of Exeter's own Wellbeing service for students.
This pie chart shows the outcomes of a student evaluation where they were asked to identify what they had learned from this module. What this shows quite clearly is the link between the revised module structure and the development of some of the key professional skills that were the intended outcomes of the module.
Peer review is a key means of developing some of the skills required in the world of work. Peer review activities weren't a major feature of the programme overall, but they do appear to be one of the success stories. There are a number of good examples: it was a major focus of the University of Edinburgh's project, which looked at use of the PeerWise tool. Piloting the University of Westminster's Making Assessment Count (MAC) approach across a number of different institutions seems to show that a combination of self-reflection and peer review gave the best results in terms of enhancing learning, and the report from this project is very emphatic that peer review was the element of the approach with the best alignment with assessment-for-learning principles. This corroborates much of David Nicol's recent research, in which he suggests that self-reflection needs to be supported by an external element and that evaluating other pieces of work completed to the same assignment brief is an immensely powerful way of achieving this.

The big note of caution, especially from the MAC implementations, is the importance of good induction to ensure that students understand the purpose and value of peer review activities. The idea that you go to university or college to learn from an expert seems to be deeply ingrained in the student psyche. It seems that we as educators aren't helping students make the connection between activities such as peer review, which may be unfamiliar and uncomfortable for some, and the competencies they will need in employment.
One of the biggest shifts we saw throughout the programme was in the balance from summative to formative assessment, and hence from assessment of learning to assessment for learning. This was a key aim of the many projects that wanted to focus on learner longitudinal development, feeding forward, and ipsative approaches whereby feedback acknowledges progress against the learner's previous performance regardless of absolute achievement. Although this is primarily a learning design issue, technology has a vital role to play in storing feedback across a programme and making it easily accessible to both staff and students in order to build a long-term picture. However, most of the VLEs in common use record both marks and feedback at module level, so it is not easy to gain this kind of overview at programme level.
The design of the curriculum does of course need to permit this type of development; common problems include a lack of formative opportunities or a lack of time for feedback to inform the next assignment. This diagram shows an immensely useful tool, known as assessment timelines, that was developed by the University of Hertfordshire. Many other institutions have used this approach of mapping existing practice against a timeline to see where improvements can be made. This is an example of before and after timelines from the University of Dundee.

The programme produced evidence that over-assessment and assessment bunching have an adverse effect on student attainment. It might seem obvious, but the programme has produced hard evidence for something that has, to date, been viewed as gut feeling or common sense. MMU undertook some large-scale research into this topic and has presented good data.

Actually collecting data about assessment and feedback practice has been revealing in many ways. In MMU's case they found that at peak times they could be dealing with 13,000 assignment submissions in a week, which has all sorts of implications for their administrative and IT support systems as well as for their learners. A reduction in summative assessment has helped alleviate this. The University of Glamorgan (now part of the University of South Wales) used personalised assessment diaries to help avoid assignment bunching and help students with their time management.
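The timeline-mapping idea in this note can be sketched in a few lines of code. This is a minimal illustration, not the Hertfordshire tool itself; the module names, week numbers, and the threshold of two deadlines per week are all hypothetical.

```python
from collections import Counter

# Hypothetical deadline data (module -> weeks in which summative work is due);
# a real timeline would be drawn from the institution's assessment records.
deadlines = {
    "Psychology 101": [6, 12],
    "Statistics 102": [6, 12],
    "Research Methods 103": [5, 12],
}

def bunched_weeks(deadlines, threshold=2):
    """Return the weeks in which more assignments fall due than the threshold allows."""
    load = Counter(week for weeks in deadlines.values() for week in weeks)
    return sorted(week for week, count in load.items() if count > threshold)

hotspots = bunched_weeks(deadlines)  # week 12 carries three deadlines at once
```

Plotting each module's deadlines against such a shared timeline is what makes bunching visible at programme level rather than module level, which is exactly the overview the note says module-centric systems fail to give.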
One of the most astonishing things the programme revealed was the lack of discussion that seems to take place around approaches to feedback. The IOE, in looking to implement a more longitudinal approach, described feedback as taking place in a 'black box', and its experience seems to be borne out elsewhere. The focus of quality assurance activities seems to be exclusively on comparing and moderating marks, and there appears to be little, if any, discussion within programme teams about individual approaches to feedback. It is therefore unsurprising that students talk about inconsistency in the feedback they receive.

Another really valuable piece of the evidence base from this programme is a range of audits, in different institutions and using different methods, of the types of feedback given. A number of the projects have produced tools for profiling and analysing feedback, and many of these are readily available for others to use.
This is an example of the feedback profile used by the Institute of Education. What is interesting overall is that all of the institutions that undertook analyses found feedback to be skewed in particular ways rather than balanced (precisely how it was skewed varied according to subject and institutional context), and that much of it was short term rather than truly developmental. Rather than any specific differences, the value of this work overall was really in the eye-opening that occurred and the dialogue that was stimulated.
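The kind of profiling behind these tables can be sketched very simply. This is a hypothetical illustration, not one of the projects' actual tools: it assumes each script's comments have already been hand-coded with the P1/P2/C/A/Q/O labels from the profile slide, and simply averages the counts per script.

```python
from collections import Counter

# Hypothetical coded feedback: one list of category labels per marked script.
scripts = [
    ["P1", "P1", "C", "A"],
    ["P1", "C", "C", "Q"],
    ["P1", "A", "O"],
]

def feedback_profile(scripts):
    """Average number of comments per script in each category."""
    totals = Counter()
    for comments in scripts:
        totals.update(comments)
    n = len(scripts)
    return {category: round(count / n, 2) for category, count in totals.items()}

profile = feedback_profile(scripts)  # P1 averages 4 comments over 3 scripts
```

Ranking the categories in such a profile is what exposes the skew the audits found, for example praise dominating while ipsative comments are negligible.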
Formative assessment was, unsurprisingly, dominated by advice for the current assignment, but the summative profile was more unexpected: note the balance of praise and critique.
Please also take a look at the work of the University of Dundee, which had a couple of projects in this area. One showed how a much more open approach facilitated team teaching in a distance learning environment and improved learning, as well as making for much greater staff satisfaction on a course that had had its staff FTE halved over the last few years. The other used a scaffolded approach of assignment cover sheets, reflection and dialogue via a wiki on another postgraduate course; it is interesting that this approach has been more effective in closing the feedback loop than similar approaches without all of these steps, where demands for student self-reflection have been seen as a bureaucratic exercise.
One of the clearest messages from the programme is that electronic assessment management is both effective and efficient, and is the best way to meet the needs of a very significant proportion of learners. In using the term we are referring to a number of different processes which can often be conflated in general discussion about the value of the approach.

The perspectives of the various stakeholders on certain types of changes to assessment and feedback practice are a huge and very interesting topic that the programme addressed, and there is a significant body of evidence around perceptions and what can change them. Possibly the main point to make here is that, while you all know how polarised the for and against camps can be, there are some really interesting examples from the programme of winning over dissenters, who have gone on to champion approaches that they have become convinced represent a significant improvement for learners and teachers.

This is not to dismiss the concerns of those who have tried new approaches and found they did not live up to the hype. We are at an interesting point in relation to the technology that supports assessment and feedback. Unlike some of the cool and whizzy things people are innovating with in other aspects of learning and teaching, many of the available technologies have appeared clunky and poorly integrated up till now. It appears, however, that we are reaching a point where the technologies are coming of age and the new breed of products represents a significant improvement on the old.

The University of Huddersfield undertook the most extensive study into electronic assessment management, and their report is strongly recommended to anybody with an interest in this topic.
Having said that the suite of available technologies may not be ideal, it is clear that business processes represent a bigger challenge. Alongside the many success stories of electronic assessment management there are some cautionary tales, including a very honest report from the University of Exeter about why its attempt to establish end-to-end electronic processes was unable to fulfil all of its objectives at the time. In large part this was due to the variability of business processes across the institution. This issue came up time and time again in the programme, and it gives rise to concerns not just around inefficiency but also around the parity of experience afforded to learners in different parts of the institution.

There is a lot of interest in learning analytics at the moment. You need to get business processes right to implement electronic assessment management, and you need electronic assessment management to get the kind of data that can turn into meaningful analytics. It is too big a topic to fit in now, but I will again refer you to the work that Huddersfield (and also Hertfordshire) have been doing on the impact on students of showing them where their performance sits within a cohort.

On the subject of assessment management, making appropriate choices between different types of activity is an important factor in resource management. There is a very useful assessment resource calculator produced by the University of Hertfordshire that enables judgements about appropriate assessment methods to be viewed in terms of both staff and student effort. They make the important point that such a tool must be used in the context of a bigger discussion about what constitutes good assessment design, which for them includes reference to their assessment-for-learning principles.
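The cohort comparison mentioned in this note can be illustrated with a simple percentile calculation. This is a hedged sketch under stated assumptions, not the Huddersfield or Hertfordshire systems: it assumes marks are available as a plain list and reports the share of the cohort scoring strictly below a given mark.

```python
def cohort_percentile(mark, cohort_marks):
    """Percentage of the cohort scoring strictly below the given mark."""
    below = sum(1 for m in cohort_marks if m < mark)
    return 100.0 * below / len(cohort_marks)

# Hypothetical cohort of ten module marks
cohort = [42, 55, 58, 61, 64, 67, 70, 74, 78, 85]
position = cohort_percentile(64, cohort)  # 40.0: four of the ten marks are lower
```

This is exactly the kind of figure that only becomes available once marks flow through an electronic assessment management system rather than sitting in separate module records.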