Student & learner evaluation during and post COVID-19
These are the slides from a webinar I gave for the EDEN NAP series (European Distance Education Network). The session focuses on proctoring tools for online exams, the use of open book exams, and online group exams as a way to handle multiple online evaluations at once.
COVID-19 has pushed us into the online learning world with a vengeance. But there is still one huge transformation to be made: moving from face-to-face evaluations to online evaluations. In my view, the best way to move forward is by exchanging experiences and visions. So let me start by sharing my journey of becoming an expert and being involved with formal and non-formal evaluations. I got my master's in education and my Ph.D. by taking only two exams: one was online, the other was an oral defence. So, in terms of avoiding formal exams, I feel like the King of the Hill.
But in the past I have worked in two sectors that each demand mandatory certification for health and safety standards: the health sector, for the Institute of Tropical Medicine, and the renewable energy sector, for EIT InnoEnergy.
So my experience sits somewhere between formal and non-formal evaluation. These past experiences have shaped my vision of evaluation: the best evaluation is a mix of formal and non-formal evaluation that supports the self-directed learning of students and their capacity to self-evaluate responsibly, always looking for the best possible evaluation option.
Before we start, we should all reflect on this:
“What is the purpose of an exam or any learning evaluation?” Are we scared that our students will misuse their freedom when taking exams or assignments; are we scared they will cheat? Or do we want them to become ethical problem solvers, and do we evaluate them as such?
In this webinar we will look at three options that can be used for online evaluations during and after COVID-19. The webinar is therefore divided into three sections: one on proctoring tools, one on moving from closed book exams to open book exams, and a last section on team or group exams. As you can see at the bottom of this slide, each section is immediately followed by a Q&A moment. And as Tim mentioned, feel free to write down any questions you might have in the chat of this webinar.
A proctoring tool is an educational technology that allows you to organize exams for students located anywhere with a steady internet connection. Proctoring software monitors each student or learner and provides an environment in which cheating is made difficult. There are a number of proctoring tools out there, and the tool set grows each year as interest in proctoring rises rapidly. Proctoring tools are used in universities, but also in companies where health and safety standards need to be met and professional certification is therefore subject to strict, precise policies. Basically, a proctoring tool is software that is put into a network and allows students or learners to be assessed at a distance while being proctored, or monitored, by software that captures the immediate environment of the student as well as the screens they open or use. The proctoring tool takes over the monitoring task of a teacher or trainer at an exam.
Proctoring tools are of interest to all of us who work with remote students and learners, but also to those of us who have really large numbers of students. Some proctoring tools even lend themselves to proctoring exams in a real-life auditorium: the tool is set up on each learner's computer to monitor their online behaviour, while the teachers keep an eye on the "close distancing" through which some learners might try to get answers to the test from their neighbours.
Most proctoring tools sell themselves as being cheat-proof. But let's be honest: is cheating always something to avoid? Or can some cheating be considered strong problem-solving, high-capacity curation of content, or even solid collaboration between peers who collaboratively find out what the best answer to a question is? With the whole professional world buzzing about "21st century skills", which include creativity, communication, problem-solving and such, it is odd that we would keep only the classic, old-school focus on "not cheating".
EIT InnoEnergy works with different universities and different corporate partners to deliver or collaborate on courses and coaching in the renewable energy sector. Some of our university partners already use proctoring tools: TU Delft uses ProctorExam for up to 2,500 students per exam, and KTH, the Royal Institute of Technology in Sweden, uses DigiExam to organize exams on campus while a proctoring tool monitors the students' computers. We have some expertise in this area, but with COVID-19 we want to expand and roll out one particular proctoring tool so that we can all benefit from it across universities and corporations. Proctoring tools are not exam delivery platforms; they are add-ons that limit the exam environment to sites allowed by the teacher.
After looking at different proctoring tools, it is safe to say that most of the steps they perform are similar. First, the tool establishes the identity of the learner or student, most of the time by asking the student to show their ID card, which the system then copies. In the next step a picture is taken of the student's face, and then the student's environment is screened and monitored. This step might differ slightly between systems. One of the tools I compared, ProctorExam, has a mobile app that monitors the full environment as well as the computer screen, including potential virtual machines. Although proctoring tools emphasize safety, one must never underestimate the creativity of students. There are cheat sheets and hacks for many of these software packages, and students are eager to share them. A student can also try to use a virtual machine (an emulation of a computer). A virtual machine is difficult to detect as it does not seem 'real', but by recording the learner from a certain corner using a mobile device, one can easily see what they are looking at and whether a virtual machine is set up.
Once these steps are taken, the learner can start the exam, and all of their actions on the computer are monitored, recorded and screened for possible cheating. If the student opens a screen, it is logged by the proctoring tool. If someone walks into the room, it is logged by the proctoring tool. Each new action the student takes is a log entry for the tool and will subsequently be proctored. This recording and reviewing process can be done live (making use of certified, English-speaking proctors from dedicated proctoring centres), or the exam can be reviewed afterwards on the basis of the recording. In the latter case, a report on how the learner did is returned within 24 to 48 hours after the exam.
As a teacher, the exam preparation time is just the same as before, but of course you need to prepare the exam so it can be delivered through the proctoring tool. Each of the tools has a fairly similar way of setting up an exam. I will focus on the most commonly used option, where you use your own Learning Management System as the platform to deliver the exam. This means the weighing of your exam questions, i.e. the points earned by each correct or partially correct answer, is organized inside of the LMS.
This enables you to pull the exam from an existing question database, or to use tools that are already part of the curriculum (like a dictionary the students are allowed to use). As a teacher you list which tools the students can see. This can be limited to the exam itself and nothing else, but you can also allow the students to access specific tools (e.g. a 3D engineering design tool) or specific websites (e.g. a library of documentation they can use). All of these accepted sites are entered into the proctoring tool in advance. As a teacher you also plan which exam will be opened to which students at what time. And the teacher gets the reports from the proctoring tool in order to judge whether any 'cheating' picked up by the tool or the proctors is in fact real cheating or something that is allowed for this particular exam (e.g. if a parent comes in and asks the student in their mother tongue whether they want soup for lunch, it might trigger an alert for the proctor, but you as the teacher might understand that it is merely a short, harmless interruption).
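To make the allowlisting idea concrete, here is a minimal sketch of such an exam configuration. All field names, URLs and tool names are hypothetical; no real proctoring tool's API is being described.

```python
# Hypothetical exam configuration: which sites and tools students may access.
# Everything here (names, URLs) is illustrative, not any real tool's format.
exam_config = {
    "exam_id": "wind-energy-final",
    "allowed_sites": [
        "https://lms.example.edu/exam/wind-energy-final",  # the exam itself
        "https://dictionary.example.org",                  # permitted dictionary
    ],
    "allowed_tools": ["3d-design-suite"],
}

def is_allowed(url: str) -> bool:
    """Return True if the student may open this URL during the exam."""
    return any(url.startswith(site) for site in exam_config["allowed_sites"])

print(is_allowed("https://dictionary.example.org/lookup?w=turbine"))  # allowed
print(is_allowed("https://search.example.com"))                       # blocked
```

In practice the teacher enters these allowed sites through the proctoring tool's own interface; the sketch only shows the logic of limiting the exam environment to a fixed allowlist.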
One element to take into consideration during COVID times is spreading out the number of students taking the exam at the same time. You can imagine that with COVID-19 the demand for proctoring tools has risen quite steeply, putting their server and proctoring capacity in a difficult place. One of the COVID solutions is to stagger the start of the exam between groups of students, e.g. 150 students starting the same exam each hour. All proctoring tools are eagerly enhancing their server capacity at present.
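The staggering described above is simple to sketch: batch the students into slots of 150 and shift each batch's start time by an hour. This is an illustrative sketch, not any tool's actual scheduling logic.

```python
# Sketch: stagger exam start times so server and proctor capacity is not
# exceeded. The 150-students-per-hour cap mirrors the example in the text.
from datetime import datetime, timedelta

def stagger_starts(students, first_start, per_slot=150, gap=timedelta(hours=1)):
    """Assign each student a start time, at most `per_slot` per slot."""
    return {
        student: first_start + (i // per_slot) * gap
        for i, student in enumerate(students)
    }

students = [f"student{n}" for n in range(400)]
schedule = stagger_starts(students, datetime(2020, 6, 15, 9, 0))
print(schedule["student0"])    # first batch starts at 09:00
print(schedule["student150"])  # second batch one hour later
```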
So there are some benefits and some downsides to using proctoring tools.
Organising online exams does have some advantages. You have a higher flexibility in terms of location, which is very important as we don't know the exit strategy of each country faced with corona. There is a higher flexibility in terms of timing, as the remote proctors are not limited to specific hours. For some students, anxiety might be reduced because they can take their exams in a familiar environment. When a lot of students need to take a specific knowledge-reproducing exam, proctoring tools can provide a more secure setup for those examinations (e.g. health and security guidelines). And even post-COVID, these digital examination options can be used for specific groups, where the distance exam is offered as an extra exam facility:
• Athletes on remote training camps or competitions
• Disabled students who cannot make it to class
• Learners facing long-term illness
All of these benefits make it worthwhile to consider purchasing and using proctoring tools.
There are downsides too. Proctoring tools are reviewed by English-speaking proctors, which makes native-language exams difficult for non-English-speaking countries. Teachers need time to trust these new systems. And there are some practical limitations: there is a problem if a student needs to share sketches or drawings, as these must be seen individually by the teacher; it demands a lot of preparatory work from IT and admin to link everything together (LTI and API compliance is essential!); and a specific helpdesk needs to be put in place by central administration.
For an institution, several considerations come into play:
• Constant re-evaluation of the best systems compared to the type of exam tools each department uses
• LTI and API compliance is very important
• Setting up support (helpdesk, integration or migration)
• Purchasing the right tool while considering the type of exams that are preferred
• Considering whether you want in-house proctors (teachers on campus) or external proctors (at a distance); internal proctors might allow you to support native-language exams, while the tool is then used to monitor accessed online tools
• Does university or company policy allow these types of exams? Do student contracts cover online exams to earn their degree?
• Do you have an LMS or LXP installed at your university or company?
In the end, the usefulness of a proctoring tool is decided upon by the institution and the teachers given the current challenges we face with COVID19.
Whether you want to start using proctored exams or not, the question I raised at the beginning of this presentation stays the same: what is the purpose of an exam or any learning evaluation? What do we want our learners to learn? Do we want them to answer parrot-like to questions we offer them in an almost laboratory-confined environment? Or do we want them to use knowledge to solve problems under real-life conditions?
An option to consider for online student evaluations is the open book exam. If you as a teacher offer an open book exam, you do not necessarily need a proctoring tool. With an open book exam having an audio and video link between you and the student is enough (well, student or students, but we will look at group exams later in this presentation).
A closed book exam has been tested over a long period of time and is still the most frequently used form of student evaluation. The process consists of an assessment of knowledge acquired over a fixed period of time, delivered in the form of answers to questions to show understanding and comprehension. During the process of answering the questions, no additional sources (no documents, no peers, no experts) may be consulted in order to arrive at the answer requested.
Although the closed book exam is the most frequently used means of student evaluation, it is not the longest-living form of evaluation. The Socratic method (https://en.wikipedia.org/wiki/Socratic_method) relies on testing knowledge, capacity and experience by digging deeper and deeper into a certain subject by means of questions. This is a cooperative, argumentative dialogue between people, which means it can be one-on-one or among a team or group of people to assess the knowledge available, which is closer to the open book exam.
The opposite of the closed book exam is the open book exam. With an open book exam, the students or learners can find answers to the proposed questions by using one specific document or multiple documents, books or websites, or in some cases they can look for answers anywhere they like, as long as they provide an answer by the time the exam closes.
This is mostly a pedagogically based choice, although it can also be a philosophical choice to opt for one or the other, or both. One can compare it to the discussion on using Open Educational Resources versus closed, ad-hoc provided resources.
An open book exam is useful if:
• you want to organize an evaluation online;
• you want to test the students' ability to find and apply information they have processed previously;
• the answers sought rely on the capacity to apply, analyze, synthesize, compare/contrast or evaluate information.
An open book exam tests whether the learner understands the big picture and how course concepts work together. The argumentation leading to a solution becomes more important, as well as the quality of its delivery, and possibly the creativity of combining the right concepts that have been taught.
What is required from the teacher's side? (https://www.lib.sfu.ca/about/branches-depts/slc/learning/exam-types/open-book-exams) As a teacher, the transformation from a closed to an open book exam will be perceived as easy or difficult depending on the teacher's preference. But if the choice is made to build an open book exam (or an exam that is open in part), the teacher will need to think of the following:
• Clearly indicate the time slot available for the exam
• Clearly state what you want the students to use (allowed sources)
• Decide whether sources used need to be cited
• Check whether the exam covers both easy-to-find and hard-to-find scenarios
• Provide students with pointers on how to prepare for an open book exam (see link)
A variation on the open book exam is the take-home exam, which can also be done online: the assignment is provided on one date, and the answer must be returned the day after.
Benefits of open book exams:
• Students can combine different concepts to build towards a solution
• Prior knowledge needs to be used to make sense of an overall big picture, proven by the different steps leading up to the solution
• It works really well in this data-driven age, where interpreting data and weighing correlation against causality is important
• It provides a broader array of ways to evaluate a student's knowledge in addition to the closed book exam
• 21st century skills can be part of the overall evaluation: communication skills, digital skills, self-directed learning (and finding), …
Downsides of open book exams:
• Grading open book exams takes a lot more time, as it cannot be automated (despite some progress on AI-based assessment tools)
• It lacks in-depth fact responses; e.g. for many of the health and safety standards you need to really know and be able to reproduce set answers, which are better evaluated using closed book exams
• It demands prior knowledge; basic knowledge of a new discipline is frequently fact-based, and this is not something you can easily evaluate in open exams
• It is less rigorous in terms of grading; there is a lot of grey area: interpretation of the evaluation by the teacher, assumptions about prior knowledge, …
There are some frequent misconceptions about open book exams. "Open book exams are easy": in fact, students need to consider which sources to use; they will preferably rely on previously obtained knowledge to address the questions within the allotted time; they need to keep track of time and prioritize; and the path to the answer needs to be well structured and well argued, using concepts that were seen in class or that are common to that particular sector.
"No prior knowledge is needed": the timing is crucial, and being able to distinguish good, solid information from shady resources is important. In order to do this, the learners must have active prior knowledge to make the right decisions and work with the best sources. You also need to know the basic concepts in order to move towards an answer.
Copying material is not answering open book questions: this must be emphasized at the beginning; it is not enough to copy material. The overall concept and its use must be delivered in a personal voice that also supports personal understanding. Too many sources can divert attention and crowd out a clear, succinct argumentation in the student's own words.
Each of these misconceptions must be made clear to the student, so they understand what to expect, how to perform during the exam and how to prepare (learning indeed!).
Example open book exam EIT InnoEnergy: energy data cases
For InnoEnergy's students, who are all learning in different areas of renewable energy, we have built an energy data case: the students are given a data set to solve a particular challenge (e.g. wind turbine efficiency), and they need to analyze all of the data and produce a specific data-based solution for a particular section within wind energy. This combines all of their previously acquired knowledge and offers them a chance to show their ability to come up with the right answers given any data set relevant to the sector.
To build this case we asked two people from different backgrounds to build a data case specifically for academic students (master's and Ph.D. students) and for professional learners. These data cases were then developed for two different audiences each: people who already knew a bit of coding, and people who were new to coding, where the non-coding-savvy audience would use visualization tools to interpret outcomes from the available data sets.
What we basically did is present relevant qualitative or quantitative data and then ask the students interpretative and application questions:
• What does the data show?
• What relevance does this data, or the scenario, have in terms of [component of current topic]?
• What other factors could potentially affect this data? How would you test for these?
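As a toy illustration of the kind of analysis such a data case invites, a first "what does the data show?" question could be answered in a few lines. The numbers below are invented for illustration; they are not InnoEnergy's actual data set.

```python
# Toy wind-turbine data case: hourly power output (kW) of a 2,000 kW turbine.
# Readings are invented; a real case would load a full data set from file.
readings_kw = [0, 350, 1200, 1800, 2000, 1650, 900, 400]
rated_kw = 2000

mean_output = sum(readings_kw) / len(readings_kw)
# Capacity factor: the fraction of rated power the turbine actually produced.
capacity_factor = mean_output / rated_kw

print(f"mean output: {mean_output} kW")
print(f"capacity factor: {capacity_factor:.2%}")
```

The interpretative step, e.g. explaining why the capacity factor is well below 100% and which weather or maintenance factors might affect it, is exactly what the open-ended questions above ask the students to argue.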
Within InnoEnergy we mostly have engineering students. This means they are frequently asked to use a design thinking approach and tackle real-life business challenges in teams of students, coming up with prototypes and testing them in order to provide a working solution at the end. This means that group exams are important. But how can we organize group exams at a distance? And is this beneficial to the learner and the teacher? Let's have a look at group exams in this last section of the webinar.
Group or team exams are evaluations where a whole team of students is evaluated in the same session. These groups of students work on a shared project, each using their own specific capacities.
Compare a team exam to a hackathon, where different people work in teams and have to come up with a workable solution within 24 hours. Each of the team members takes up a role that fits their capacity; this role provides guidance on which type of tasks each member is responsible for, and the final product is a culmination of the group work. The final product is then assessed by a jury of experts to decide which team has won the hackathon.
By the way, if you have never organized a hackathon, you might consider enrolling in the DigiEduHack that is organized here (https://digieduhack.com/en/).
A team exam is a good fit if there are different roles to be taken up in coming to a solution of a problem. A team exam might also be a good choice if teams of students have developed specific maturities or capacities in one area or another. If teams of students have worked on a project, they can be evaluated as a group, each for their own specific roles and tasks.
The group examiners normally consist of the project supervisors, plus external experts or juries that are allowed to pose questions throughout the examination.
A typical online team exam runs as follows:
• The external experts are introduced.
• The timing is clarified (this was also part of the preparation documents provided to the student group).
• Each member of the team takes up one part of the overall presentation of the group work (this is taken into consideration in the overall assessment).
• Each student clarifies the specific area they worked on, emphasizing its strengths and the considerations made during the project process.
• After the presentation, feedback is given to the team (overall feedback and personal feedback); this feedback also includes clarifying questions.
• A joint open discussion then starts, going over the problem analysis, the theories used, and specific aspects of the group work. On open questions, each of the team members is allowed to answer (using the raise-hand symbol), but the examiners choose who answers. This allows all the team members to provide answers to the questions.
• An individual Q&A session follows, so that the examiners can pose questions to each individual student on the necessary theories and concepts they should all understand.
Team exams tend to last longer than individual exams, so planning a break is a good idea for all.
Once the team exam is finished, the examiners discuss the grading among themselves (team and individual grading), and the results are made public soon after that.
Benefits of group exams:
• Great for evaluating multidisciplinary teams
• This type of evaluation resembles real-life situations of evaluating teams on the work floor when engaged in a mutual project
• It fits innovation-driven curricula (e.g. using challenges and problems)
• The teachers can provide grades immediately after the exam (cf. oral examinations)
Downsides of group exams:
• It takes time, as multiple people are involved
• The grading is based on the interpretation of the examiners
• The personalities of the team members can differ, so that some might come across as more anxious or timid than others
What needs to be prepared?
Teacher side:
• Preparing the project questions
• Preparing a rubric that enables weighing each team member during the exam
• Attracting an external expert
ICT side:
• A durable, consistent online meeting tool
• Recording, for possible later litigation questions
Student side:
• A clear understanding of the overall bigger picture
• Understanding where the concepts seen in class come in
• Indicating where novel solutions were built, and how
• Clearly indicating each team member's responsibility during the project
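A rubric that weighs each team member can be as simple as combining a shared team score with an individual score. The weights, scale and names below are hypothetical, purely to illustrate the idea:

```python
# Hypothetical group-exam rubric: a shared team score plus an individual
# score per member, combined with fixed weights (both on a 0-20 scale).
TEAM_WEIGHT = 0.6        # weight of the shared project result
INDIVIDUAL_WEIGHT = 0.4  # weight of each member's own presentation and Q&A

def final_grade(team_score: float, individual_score: float) -> float:
    """Weighted combination of the team score and one member's score."""
    return TEAM_WEIGHT * team_score + INDIVIDUAL_WEIGHT * individual_score

team_score = 16
individual_scores = {"Ana": 18, "Ben": 13, "Chloe": 15}
for member, score in individual_scores.items():
    print(member, round(final_grade(team_score, score), 1))
```

The split mirrors the exam structure described above: the group presentation feeds the team score, while the individual Q&A feeds each member's own score.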
“What is the purpose of an exam or any learning evaluation during and post COVID?” We should not be scared that our learners or students will cheat during exams or assignments, because we can ensure that our ways of evaluating students are directed towards making them ethical problem solvers, and towards providing them with the right tools and support to move into the future and take their self-directed learning and their professional and academic evaluation into their own hands. If we want to address and include 21st century skills in our curricula, we need to invest in student evaluations that strengthen these skills, using the best pedagogical options out there to train them. For that reason, I am in favor of using a mix of closed and open book exams, and of including group exams to enable a more real-life, professional environment in which to work and be evaluated. This combination seems to me the best way forward in combining formal and non-formal evaluations built on self-directed learning skills, self-evaluation and solving real-life challenges.
Inge Ignatia de Waard, EIT InnoEnergy Sr. Learning Strategy & Innovation
InnoEnergy Smart Cities Innovation Journey example
Feel free to reach out:
• Twitter: @Ignatia