Learning analytics and Moodle: So much we could measure, but what do we want to measure? A presentation to the USQ Math and Sciences Community of Practice, May 2013
1. Learning analytics and Moodle: So much we could measure, but what do we want to measure?
Associate Professor Michael Sankey, EdD
Director, Learning Environments and Media
2. SAF
Embarking on a project that involves:
Establishing a common codebase across all three of our Moodle environments, fully aligned with Mahara
Extending the functionality of eAssessment within Moodle (replacing EASE, CMA and EMS)
Establishing a suite of repositories in Equella
Creating a new digital rights management workflow
Enhancing discoverability
Establishing learning analytics across L&T systems
Aligning help resources to the new regime and providing PD
3. Learning analytics for our systems
Which systems?
What tools?
How big do we want the data?
Is it just our USQ systems?
What do we want to know?
How do we want to use this data?
Who gets involved?
Who makes the decisions?
I'll come back to these questions at the end
But first some background…
4. Siemens, G. 2013. Structure and logic of analytics. Available from http://www.learninganalytics.net/
5. Three levels within the institution
Learning analytics
Educational data mining
Educational analytics
All focus on the learner to some degree, either as an individual or in the context of the institution
6. Academic and Learning Analytics
http://www.educause.edu/ero/article/penetrating-fog-analytics-learning-and-education
7. Siemens, G. 2013. Structure and logic of analytics. Available from http://www.learninganalytics.net/
8.
9. Developing an analytics framework
Identify Tier 1 data: data that exists (technical/analytical)
Form Tier 2 data: issues intelligence
Create Tier 3 data: context intelligence
Adapted from Terenzini, 2013; Padró & Frederiks, 2013
SBMI and PeopleSoft:
• AUSSE/UES
• Grades
• Graduation rate
• Persistence
• Retention
• Student demographics
• Student satisfaction data
• Transfer rates
L&T systems and RightNow:
• Co-curricular student engagement activities data
• Course interactions
• Systems data
• Learning Centre data
• Other student learning support activities data
Institutional emphasis for data collection & analysis: customer service (transactional) or student development
National policy preference
10. Learning analytics is the
measurement, collection, analysis and reporting
of data about learners and their contexts, for
purposes of understanding and optimizing
learning and the environments in which it
occurs.
12. Where did it originate?
SoLAR exists to ensure that there
is an expansive, transformative
vision for what analytics might
mean for the future of learning and
to promote a very critical discourse
that is non-partisan, and grounded
as far as possible in practice-based
research. SoLAR is a non-profit
organization. Incorporation is
currently underway.
13. Scope
At what level do we pitch?
LMS data analytics
Easier to implement
Limited data so not the whole picture
Logs in Moodle are good, but not comprehensive
Learning ecosystem analytics
Complex – needs an open-standards model (see the sketch below) and
potentially access to external repositories
Much more holistic picture
In our case, Mahara, Equella, BB
Collaborate, EASE, Library, lecture capture, etc
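One candidate for such an open-standards model is the Experience API (xAPI), released in 2013: each system emits activity records in a common actor-verb-object format that a single analytics store can consume. A minimal sketch follows; the learner, endpoint and activity IDs are hypothetical.

```python
# Illustrative sketch only: an xAPI-style "actor-verb-object" statement,
# the kind of open-standard event record that would let Moodle, Mahara,
# Equella, etc. feed one analytics store. Names and IDs are hypothetical.
import json

statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Jane Student",
        "mbox": "mailto:jane.student@example.edu",  # hypothetical learner
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/experienced",
        "display": {"en-US": "experienced"},
    },
    "object": {
        "id": "http://moodle.example.edu/mod/resource/view.php?id=42",
        "definition": {"name": {"en-US": "Week 1 study guide"}},
    },
}

print(json.dumps(statement, indent=2))
```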
14. The LMS is one of the primary providers of data, since it preserves
the digital footprints of student interactions. These can be mined for
patterns of learning behaviour and teaching practice, allowing for
benchmarking and the monitoring of institutional quality initiatives.
15. Predictive analysis indicates that some
students are at higher risk than others; for those
who are first in family or from a low socio-
economic background, the risk of failure
increases. There is a question as to what
constitutes quality learning. Analytics is a key
player in this field, given that it provides a
vast amount of data and techniques for its
analysis. Learners and their context are vitally
important in this discussion.
16. How big is the data?
Typically our Moodle generates between 50 and 100 million log records per year (sized by a query like the sketch below)
What is the aim?
Finding out we have a problem before it fully manifests
If we accept this, it gives us a framework to consider our options
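To ground that figure, a rough sketch of sizing a year of log data directly from the Moodle 2.x log table (mdl_log and its time/userid/course columns are the stock Moodle 2.x schema; the connection string and credentials are placeholders):

```python
# Rough sketch: sizing a year of Moodle log data from the mdl_log table.
import time
import psycopg2  # assumes a PostgreSQL-backed Moodle

conn = psycopg2.connect("dbname=moodle user=report")  # hypothetical credentials
cur = conn.cursor()

year_ago = int(time.time()) - 365 * 24 * 60 * 60
cur.execute(
    "SELECT COUNT(*), COUNT(DISTINCT userid), COUNT(DISTINCT course) "
    "FROM mdl_log WHERE time > %s",
    (year_ago,),
)
rows, users, courses = cur.fetchone()
print(f"{rows} log records across {users} users in {courses} courses")
```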
17. An OUA perspective on big data
As with other fields, the key questions to ask are:
what do you want to know; why do you want to know it; and what are
you going to do next?
Analytics makes sound educational and financial sense: it
increases retention and encourages students to enrol again.
The biggest factor in student retention is intent of purpose: why
are they doing what they want to do? There are things that can
be done to help them achieve their intent of purpose.
Previous education is the single biggest predictor of success. So
the question becomes: what supports can we put in place for
those without it? E.g.:
Invigilated exams in a student's first unit decrease their chance of
success. This raises learning design issues for our introductory units.
Other data shows that older students and female students are more likely
to succeed in their first course.
Coaching and contact are also predictors of retention and success, along
with preparatory units.
18. QUT study – Wendy Harper
Overall, the conclusions she drew were:
The key predictor of success in a unit is GPA.
The number of hits and days visiting a QUT Blackboard unit
site also predicts unit success.
Students who are likely to fail a unit often do not engage early
enough with their online environment.
Students who fail a unit often have alternating high peaks of
engagement and total disengagement.
Factors such as gender, international or domestic
enrolment, and age make very little difference to student
behaviour in online units.
'Narrowly failing' students often perform a much greater
amount of online activity in the unit they are struggling in.
'Narrowly failing' students often show high engagement
around early assessment pieces, but this drops off as the
semester progresses.
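A minimal sketch of how two of these predictors (total hits and distinct active days), plus the week-to-week volatility behind the peaks-and-disengagement pattern, might be derived from a raw clickstream export. The file and column names are assumptions, not the QUT study's actual method:

```python
# Sketch: deriving engagement predictors from a clickstream export.
import pandas as pd

logs = pd.read_csv("unit_site_clicks.csv", parse_dates=["timestamp"])

features = logs.groupby("userid").agg(
    hits=("timestamp", "size"),
    active_days=("timestamp", lambda t: t.dt.normalize().nunique()),
)
print(features.sort_values("hits").head())  # least engaged students first

# Week-to-week volatility of hit counts hints at the alternation between
# high peaks of engagement and total disengagement noted in the study.
weekly = logs.set_index("timestamp").groupby("userid").resample("W").size()
print(weekly.groupby(level="userid").std().sort_values(ascending=False).head())
```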
19. “Learning and knowledge creation is often
distributed across multiple media and sites in
networked environments. Traces of such activity
may be fragmented across multiple logs and
may not match analytic needs. As a result, the
coherence of distributed interaction and
emergent phenomena are analytically
cloaked”
Suthers & Rosen, 2011
20. Athabasca's approach
Strategy: data inventory; role of data (problem or opportunity); stakeholders (IR, academic, admin); access; governance; compliance
Planning & resource allocation: data/analytics team; data sources; budget; priorities; stages of deployment; policy development
Metrics & tools: analytics goals & target areas; educator-controlled tools; enterprise tools; iterative development of algorithms; visualization
Capacity development: faculty/staff PD; student access; learning design; process mapping and evaluation
Systemic change: course models?; self-directed learning; automated discovery; student models; intelligent curriculum
Siemens, G. 2013. Structure and logic of analytics. Available from http://www.learninganalytics.net/
21. Ethics
The ethical professional
Respecting the rights of students
Stepping in to
provide pastoral support
advise about risks of failure
advise about increasing chances of success
Research ethics
Risk minimisation
Needed for publication
Issues with:
accessing 'databanks'
anonymity
22. Privacy Issues
The Greater Good vs Big Brother
Teacher:
"It's unethical not to tell a student they are at risk
of failing”
Student:
"I don't want you to be looking over my shoulder. I
can make my own choices about my study.”
Reports to staff vs dashboards for students
23. Usefulness of analytics
What is the question to which analytics is the
answer?
Don't just buy a product
Learning analytics are just indicators of behaviour
They don‟t explain behaviour
A single source of analytic data is probably
insufficient
Combine data into a data warehouse
Time to look at some different options
24.
25. The Engagement Analytics block
http://docs.moodle.org/22/en/report/analytics/index
It provides information about student progress against a range of
indicators, giving feedback on the level of "engagement" of a
student. "Engagement" refers to activities which have been
identified by current research to have an impact on student
success in an online course.
The plugin was developed as part of a NetSpot Innovation Fund
project by Monash University (Dr Phillip Dawson), with code by
NetSpot developers (Ashley Holman & Adam Olley).
It is a block that teachers can add to their Moodle course that will
provide them with a quick graphical snapshot of which students
are at risk.
It is important to note that the purpose of the plugin is to provide
teachers with information only; it does not automatically take any
action based on the indicators, e.g. no email or notification is sent
to students automatically.
If desired, the teacher can follow up on the information
themselves, based on what they know about the student and their
other communications.
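The block combines several research-backed indicators into that overall risk snapshot. The toy calculation below only illustrates the weighted-indicator idea; the indicator names, scores and weights are invented, and the real block derives its indicators from Moodle log and activity data.

```python
# Toy illustration of a weighted-indicator risk snapshot. All values
# here are invented; the actual block computes indicators from Moodle.

weights = {"login": 0.3, "forum": 0.3, "assessment": 0.4}  # sum to 1.0

def risk_score(indicator_risks):
    """Combine per-indicator risk fractions (0.0 = no risk, 1.0 = max
    risk) into a single weighted score for the teacher-facing snapshot."""
    return sum(weights[name] * risk for name, risk in indicator_risks.items())

student = {"login": 0.8, "forum": 0.6, "assessment": 0.2}
print(f"overall risk: {risk_score(student):.0%}")  # -> overall risk: 50%
```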
26.
27. GISMO
It is a visualization tool for Moodle that obtains tracking
data, transforms the data into a form convenient for
processing, and generates graphical representations that
can be explored and manipulated by course instructors to
examine social, cognitive, and behavioral aspects of
distance students.
It can be included in any Moodle course as a side block.
Since it is aimed at helping instructors, the block is
visible only to users who have the instructor role
(students don't see it).
Each time the Moodle cron job runs, GISMO fetches
students' data from the Moodle logs and performs some
statistical calculations (a rough analogue is sketched after
the graph list below). The lifetime of GISMO data
corresponds to the length of time covered by your Moodle logs.
28. It has:
Accesses overview
A graph reporting the student's accesses to the course.
Accesses to the course
A graph reporting accesses for each student in a timeline.
Accesses overview on resources
A graph reporting the number of accesses made by the students to the resources
of the course
Assignments overview
A graph reporting the submission of assignments. Color is mapped to the grade
assigned by the teacher.
Quizzes overview
A graph reporting the submission of quizzes. Color is mapped to the grade.
Resources accesses overview
A graph reporting an overview of the number of accesses to resources of the
course.
Resources accessed by a particular student
A graph reporting an overview of the student's accesses to resources on a timeline.
Students' accesses to resources
A graph reporting, for each student, the number of accesses to resources of the
course.
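GISMO's access timelines boil down to a student-by-day matrix of access counts. A rough pandas analogue, with the export format and column names assumed (these are not GISMO's actual internals):

```python
# Rough pandas analogue of GISMO's "accesses to the course" timeline.
import pandas as pd

logs = pd.read_csv("course_log_export.csv", parse_dates=["time"])
logs["day"] = logs["time"].dt.date

# Rows: students; columns: days; cells: how often that student
# accessed the course that day (0 where there was no visit).
matrix = logs.pivot_table(index="userid", columns="day",
                          values="time", aggfunc="count", fill_value=0)
print(matrix.head())
```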
36. SNAPP
The Social Networks Adapting Pedagogical
Practice (SNAPP) tool performs real-time social
network analysis and visualization of
discussion forum activity within popular
commercial and open source Learning
Management Systems (LMS).
It essentially serves as a diagnostic
instrument, allowing teaching staff to evaluate
student behavioural patterns against learning
activity design objectives and intervene as
required in a timely manner.
Dawson, S., Macfadyen, L., Lockyer, L., & Mazzochi-Jones, D. (2011).
Using Social Network Metrics to Assess the Effectiveness of Broad-Based Admission Practices.
Australasian Journal of Educational Technology, 27(1), 16-27.
Also available from: http://www.snappvis.org/?page_id=4
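In the same spirit as SNAPP's visualisations, a small sketch of building a who-replied-to-whom network from forum posts and ranking participants by degree centrality. The networkx library stands in for SNAPP's own engine, and the reply data is invented:

```python
# SNAPP-style sketch: a who-replied-to-whom forum network, used to
# surface isolated students and facilitator-centred patterns.
import networkx as nx

# (replier, original poster) pairs harvested from discussion threads
replies = [("ann", "bob"), ("cam", "bob"), ("bob", "ann"),
           ("dee", "cam"), ("ann", "cam")]

g = nx.DiGraph()
g.add_edges_from(replies)

# Degree centrality flags overly central or peripheral participants.
for student, score in sorted(nx.degree_centrality(g).items(),
                             key=lambda kv: -kv[1]):
    print(f"{student}: {score:.2f}")
```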
37. Info from Shane Dawson (UniSA)
Social interaction is one of the most important
student behaviours and predictors of success.
Student networks are the “single most potent source
of influence.”
The tool provides a visualisation of social
networking. Different patterns are available to the
individual, and mechanics which allow the data to be
manipulated for different purposes.
It demonstrates that with students, like responds to
like; they form self-regulating structures.
It may be possible to manipulate group structures so that high-
performing students can assist low-performing ones.
It may also be possible to direct teachers' time to areas of need.
Dawson, S., Macfadyen, L., Lockyer, L., & Mazzochi-Jones, D. (2011).
Using Social Network Metrics to Assess the Effectiveness of Broad-Based Admission Practices.
Australasian Journal of Educational Technology, 27(1), 16-27.
41. Refining the signals from the
Twitter feed
http://mashe.hawksey.info/2012/11/cfhe12-analysis-summary-of-twitter-activity/
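Hawksey's CFHE12 analysis works from a spreadsheet archive of tweets (e.g. one collected with his TAGS tool). A minimal "refining the signal" sketch, with the file and column names assumed:

```python
# Minimal sketch: filter a tweet archive down to the course hashtag and
# count the most active accounts. Column names are assumptions.
import pandas as pd

tweets = pd.read_csv("cfhe12_archive.csv")  # hypothetical export
on_topic = tweets[tweets["text"].str.contains("#cfhe12", case=False, na=False)]
print(on_topic["from_user"].value_counts().head(10))
```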
42. BIM
David Jones FoE
BIM (BAM into Moodle). BAM = Blog Aggregation Management.
BIM is a Moodle module that supports an activity where:
Each student registers an individual external web feed. The feed might
be generated by a blog, Twitter or any other tool that produces a web
feed. It's the student's choice what they use.
Each student uses that external feed to respond to a set of
questions. Currently, those questions usually encourage the
student in reflecting on their learning, often in the form of a
reflective journal.
There is no need to have a set of questions.
It maintains a copy of each student's web feed, and attempts to
allocate student posts to the questions.
It allows different teachers to track, manage and mark posts for
different groups of students.
It allows a coordinating teacher to allocate teaching staff to
different groups, and to track their marking progress and all
student activity.
Student results can be sent to the Moodle gradebook.
Jones, D. 2013. BIM – Feed Aggregation. Available from http://davidtjones.wordpress.com/research/bam-blog-aggregation-management/
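A minimal sketch of the aggregation step: pull a student's registered feed and file posts against reflection questions by keyword. feedparser is a stand-in, and the feed URL, questions and matching rule are all illustrative assumptions, not BIM's actual algorithm:

```python
# Sketch of the BIM idea: fetch a registered web feed and allocate
# posts to reflection questions. All specifics here are invented.
import feedparser

questions = {
    "q1": ["first impressions", "expectations"],
    "q2": ["assessment", "feedback"],
}

feed = feedparser.parse("http://student.example.edu/blog/feed")  # hypothetical
for entry in feed.entries:
    text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
    matched = [q for q, keywords in questions.items()
               if any(k in text for k in keywords)]
    print(entry.get("title", "(untitled)"), "->", matched or ["unallocated"])
```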
43. ACODE prepared a literature review containing
165 categorised references in an EndNote library
44. Learning analytics for our systems
The big Q – Your big answer:
Which systems?
What tools?
How big do we want the data?
Is it just our USQ systems?
What do we want to know?
How do we want to use this data?
Who gets involved?
Who makes the decisions?
Editor's notes
Source: McKinsey Report: Big Data: The Next Frontier for Innovation, Competition, and Productivity
Suthers, D. D., & Rosen, D. (2011). A unified framework for multi-level analysis of distributed learning. Proceedings of the First International Conference on Learning Analytics & Knowledge, Banff, Alberta, February 27-March 1, 2011.