Using methodological triangulation to analyse students’ use of recorded lectures
1. Using methodological triangulation
to analyse students’ use
of recorded lectures
July 7, 2012
Pierre Gorissen
Fontys University of Applied Sciences
The Netherlands
2. • PhD Research:
“Facilitating the use of recorded lectures”
• Pierre Gorissen (Fontys University of Applied Sciences)
• Dr. Jan van Bruggen (Open University of the Netherlands)
• Prof. Dr. Wim Jochems (Eindhoven University of Technology)
7. Multi-step approach
#1 Let’s ask them:
- Survey
- Interviews
#2 Measure their use:
- Mediasite reports
- Log files
#3 Compare the results
- Triangulation
8. #1 Let’s ask them
• 1,122 students (203 Fontys / 919 TU/e)
• 7 courses (1 Fontys / 6 TU/e)
• Response rate 46.1% (517 students)
• Follow-up interviews (30 minutes) with 14 students
9. Main results
• Most students watch recordings at home;
• No major technical problems;
• They know where to find the recordings;
• They would prefer for all courses to be
recorded.
10. Main results
• Reasons to watch:
- Missed one or more lectures;
- Prepare for exams / improving test scores;
- Improve retention of lecture materials;
• Reasons not to watch:
- Already attended the live lecture;
- No time;
- Didn’t feel they missed anything.
11. Further reading on step #1
Gorissen, P., van Bruggen, J.M. and Jochems, W. (in press) ‘Students
and recorded lectures: survey on current use and demands for
higher education’, Research in Learning Technology.
12. #2 Measure the use
[Flowchart: log files + Mediasite database → combine data sources → combined dataset → data cleaning (remove missing data, remove irrelevant data, identify users, identify sessions, remove outliers) → filtered log dataset → analyse total dataset → results]
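The cleaning stages on this slide could be sketched as follows. This is a minimal illustration, not the study's actual code: the field names (`user`, `recording`, `seconds`) and the four-hour outlier cut-off are assumptions, not the real Mediasite log schema or the thresholds used in the research.

```python
# Hypothetical sketch of the slide-12 cleaning pipeline.
# Field names and the outlier threshold are illustrative assumptions.

def clean_log(rows, known_recordings, max_seconds=4 * 3600):
    """Filter combined log rows: drop rows with missing data, rows
    referencing recordings unknown to the Mediasite database
    (irrelevant data), and implausible durations (outliers)."""
    filtered = []
    for row in rows:
        if not row.get("user") or not row.get("recording"):
            continue  # missing data
        if row["recording"] not in known_recordings:
            continue  # irrelevant data: not in the Mediasite database
        if row["seconds"] <= 0 or row["seconds"] > max_seconds:
            continue  # remove outliers
        filtered.append(row)
    return filtered

raw = [
    {"user": "s1", "recording": "r1", "seconds": 600},
    {"user": "",   "recording": "r1", "seconds": 300},    # missing user
    {"user": "s2", "recording": "rX", "seconds": 300},    # unknown recording
    {"user": "s2", "recording": "r2", "seconds": 999999}, # outlier
]
print(len(clean_log(raw, {"r1", "r2"})))  # -> 1
```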
13. The log data
• 1,500,000 lines of log data
• 8,000 recordings
• 5,000 hours of video
• 263 different courses
• 4,927 unique student users
• 48,539 learner sessions
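Deriving 48,539 learner sessions from roughly 1.5 million log lines implies some form of sessionization. One common approach, sketched below, splits a user's sorted event stream on inactivity gaps; the 30-minute gap threshold is an assumption for illustration, not necessarily the rule used in the study.

```python
# Illustrative sessionization: split one user's clickstream on
# inactivity gaps. The 30-minute threshold is an assumption.

SESSION_GAP = 30 * 60  # seconds of inactivity that ends a session

def count_sessions(timestamps):
    """Count sessions in one user's sorted list of event timestamps
    (seconds since some epoch)."""
    if not timestamps:
        return 0
    sessions = 1
    for prev, cur in zip(timestamps, timestamps[1:]):
        if cur - prev > SESSION_GAP:
            sessions += 1
    return sessions

# Three events close together, then one two hours later -> 2 sessions
print(count_sessions([0, 60, 120, 7320]))  # -> 2
```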
15. Further reading on step #2
Gorissen, P., van Bruggen, J. and Jochems, W. (2012) ‘Usage
reporting on recorded lectures using educational data mining’, Int. J.
Learning Technology, Vol. 7, No. 1, pp.23–40.
16. #3 Compare the results
[Flowchart: the step #2 pipeline (log files + Mediasite database → combine data sources → data cleaning → filtered log dataset → analyse total dataset), extended with a final stage that compares the survey results against the analysed log results]
17. Reported versus actual use
             Reported        Actual
             n      %        n      %
Never        13     9.1      6      4.2
< 5 times    22     15.4     35     24.5
5-10 times   51     35.7     43     30.1
> 10 times   57     39.9     59     41.3
Number of times respondents used recorded lectures for the C01 course.
18. Reported versus actual use
             Reported        Actual
             n      %        n      %
0% - 10%     2      1.5      27     9.3
10% - 25%    4      3.1      203    69.8
25% - 50%    7      5.4      40     13.7
50% - 75%    26     20.0     13     4.5
75% - 100%   91     70.0     8      2.7
Average percentage of a recording viewed.
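The comparison on this slide requires placing each student's average viewing percentage into the same buckets the survey used. A minimal sketch of that bucketing, with hypothetical input values (the boundaries follow the table; the sample data is invented):

```python
# Bucket per-student average viewing percentages into the slide-18
# ranges. Sample input values are hypothetical.

BUCKETS = [(0, 10), (10, 25), (25, 50), (50, 75), (75, 100)]

def bucket(pct):
    """Return the label of the bucket containing pct (0..100)."""
    for lo, hi in BUCKETS:
        if lo <= pct < hi or (hi == 100 and pct == 100):
            return f"{lo}% - {hi}%"
    raise ValueError(f"percentage out of range: {pct}")

def distribution(percentages):
    """Count how many values fall in each bucket."""
    counts = {f"{lo}% - {hi}%": 0 for lo, hi in BUCKETS}
    for p in percentages:
        counts[bucket(p)] += 1
    return counts

actual = [8, 12, 40, 60, 90, 15]  # hypothetical per-student averages
print(distribution(actual))
```

The same function applied to the survey answers and to the logged percentages yields the two columns of the table, making the reported/actual gap directly comparable.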
19. Conclusions
• Methodological triangulation is necessary
• Using the logged data, some reports by students:
• can be confirmed
• can be refuted
• can only be confirmed/refuted indirectly
• cannot be confirmed/refuted
• …so we need both data sources!
20. Conclusions
• Don’t simply rely on what students tell you.
It might be just what they think you want to
hear!
• “Views” is a bad measure for use.
• Different recordings or different students
mean different use.
And we had more than enough to work with if we just focused on recorded lectures. The Eindhoven University has five Mediasite recording sets that produce about 2,000 recordings per year, each about 45 minutes long. They use student operators for the cameras, and although that is cheaper than regular staff, creating those 2,000 recordings per year isn't cheap. Another important issue for them was that, because they now record at the maximum of their capacity, they want to be sure that they record the right lectures.
We came up with a two-step approach: ask the students, and if that doesn't work, see if we can measure their use. I'll start with step 1 and come back to step 2 later.
We did a survey among 1,122 students from both universities. Why the difference in numbers? Unlike many surveys I've seen, we didn't just send the survey to any and all students who may or may not have viewed a recording. We selected seven courses that had been recorded during the semester that had just ended when the survey was scheduled. Because Fontys had only just started recording full lecture series at that point, we selected six courses from the Eindhoven University and only one from Fontys. We sent the students a personalized mail asking them specifically about that single course. I think that greatly improved our response rate, and it also makes it possible to compare their answers with the results from step 2. We also did follow-up interviews: on the last page of the online survey we asked students whether we could contact them with follow-up questions. A total of 120 checked YES there. When we afterwards mailed them to ask if they still wanted to participate, only 14 said YES again.
Probably not that surprising. Bandwidth is not a problem, and neither is travel distance, so students watch from home; there is no need to do that at the university.
[Course timeline: assignment due; test for the first part of the course; test for the second part of the course; resit of the test for both parts.]