1. Learning Analytics: Research Informed Institutional Practice
Yi-Shan Tsai
Anne-Marie Scott
London School of Economics and Political Science
06 December 2017
5. Learning analytics is…
“the measurement, collection, analysis and
reporting of data about learners and their
contexts, for purposes of understanding and
optimising learning and the environments in
which it occurs.” (Long et al., 2011)
Long, P. D., Siemens, G., Conole, G., & Gašević, D. (Eds.). (2011). Proceedings of the 1st International Conference on Learning Analytics and Knowledge (LAK’11). Banff, AB, Canada: ACM.
6. The emergence of learning analytics
• The need to understand how students learn
• The maturity of data technology
• The rise of MOOCs
• Political concerns for educational institutions
Ferguson, Rebecca (2012). Learning analytics: drivers, developments and challenges. International Journal of Technology Enhanced
Learning, 4(5/6) pp. 304–317.
7. • “[D]ata trails offer an opportunity to explore
learning from new and multiple angles.”
(Siemens, 2013)
Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist, 57(10), 1380-1400.
Image credit: https://cdn.business2community.com/wp-content/uploads/2011/07/databasemining-300x199.jpg
8. Educational Data Mining: How can we extract value from these big sets of learning-related data?
Learning analytics: How can we optimise opportunities for online learning?
Ferguson, Rebecca (2012). Learning analytics: drivers, developments and challenges. International Journal of Technology Enhanced
Learning, 4(5/6) pp. 304–317.
9. Are my students learning?
https://www.linkedin.com/pulse/d2i-diversity-inclusion-integration-step-2-peter-bryttne
11. Course Signals
• Goal: produce “actionable
intelligence”.
• Predictive algorithm (see the sketch after this slide):
- Performance
- Effort
- Prior academic history
- Student characteristics
Arnold, K. E., & Pistilli, M. D. (2012, April). Course Signals at Purdue: Using learning analytics to increase
student success. In Proceedings of the 2nd International Conference on Learning Analytics and
Knowledge (pp. 267-270).
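Arnold and Pistilli name the four input categories, but the algorithm itself (Purdue's Student Success Algorithm) is not public. The sketch below is therefore only a minimal illustration of the idea: combine the four inputs into a weighted risk score and map it onto the traffic light that drives the interventions described in the notes. Every field name, weight, and threshold here is a hypothetical assumption.

```python
# Minimal, illustrative sketch of a Course Signals-style "traffic light".
# The actual Purdue algorithm is not public; the four inputs mirror the
# categories on the slide, but every field name, weight, and threshold
# here is a hypothetical assumption.

from dataclasses import dataclass

@dataclass
class StudentRecord:
    performance: float       # e.g. current grade as a fraction, 0.0-1.0
    effort: float            # e.g. LMS activity relative to the course median
    prior_academic: float    # e.g. normalised entry qualifications
    characteristics: float   # e.g. background composite, 0.0-1.0

def risk_signal(s: StudentRecord) -> str:
    """Combine the four inputs into a weighted score and map it to a light."""
    score = (0.4 * s.performance
             + 0.3 * s.effort
             + 0.2 * s.prior_academic
             + 0.1 * s.characteristics)
    if score >= 0.7:
        return "green"   # on track
    if score >= 0.4:
        return "amber"   # potential problems: prompt an email or reminder
    return "red"         # high risk: refer to an advisor or instructor

print(risk_signal(StudentRecord(0.55, 0.3, 0.6, 0.5)))  # -> "amber"
```

In practice the weights would be estimated per course from historical data rather than fixed by hand, and the light would be recomputed as new activity data arrive.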
22. Objectives
• The state of the art
• Direct engagement with key stakeholders
• A comprehensive policy framework
http://sheilaproject.eu/
23. Slide credit: Dragan Gašević (2017) Let’s get there! Towards policy for adoption of learning analytics. LSAC, Amsterdam, The Netherlands.
http://sheilaproject.eu/
24. The state of the art
Challenges, adoption and strategy
http://sheilaproject.eu/
25. Adoption challenges
1. Leadership for strategic implementation & monitoring
2. Equal engagement with stakeholders
3. Pedagogy-based approaches to removing learning barriers
4. Training to cultivate data literacy among primary
stakeholders
5. Evidence of impact
6. Context-based policies to address privacy & ethics issues
and other challenges
Tsai, Y. S., & Gašević, D. (2017). Learning analytics in higher education – challenges and policies: a review of eight learning analytics policies. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 233-242).
http://sheilaproject.eu/
26. LA adoption in Europe
• Institutional interviews: 16 countries, 51 HEIs, 64 interviews, 78 participants
[Chart: the adoption of learning analytics (interviews) – implemented: 21 (9 institution-wide, 7 partial/pilots, 5 data exploration/cleaning); in preparation: 12; no plans: 18]
http://sheilaproject.eu/
27. LA adoption in Europe
• Institutional survey: 22 countries
[Chart: the adoption of LA (survey) – implemented: 15 (2 institution-wide, 13 small scale); in preparation: 15; no plans: 16]
http://sheilaproject.eu/
28. LA strategy
[Diagram: LA typically sits under wider digitalisation strategies or teaching & learning strategies; many institutions have no defined LA strategy and immature plans for monitoring & evaluation]
http://sheilaproject.eu/
30. Interests – senior managers
• To improve student learning performance (16%)
• To improve student satisfaction (13%)
• To improve teaching excellence (13%)
• To improve student retention (11%)
• To explore what learning analytics can do for our institution/staff/students (10%)
[Diagram: LA interests grouped into three internal drivers – a learner driver, a teaching driver, and an institutional driver]
http://sheilaproject.eu/
31. Concerns – senior managers
No one-size-fits-all solutions
http://sheilaproject.eu/
32. Interests – teaching staff
• Get an overview of student learning engagement and performance.
• Inform course design.
• Manage a big class.
http://sheilaproject.eu/
34. Interests – students
Personalised support
• Inform teaching support and curriculum design.
• Support a widening access policy.
• Support students at all achievement levels to improve learning.
• Assist with transitions from pre-tertiary education to higher education, and from higher education to employment.
http://sheilaproject.eu/
40. Learning Analytics Map of Activities,
Research and Roll-out (LAMARR)
http://www.ed.ac.uk/information-services/learning-technology/learning-analytics
41. Early MOOC Analytics
• 6 courses:
– Artificial Intelligence Planning
– Astrobiology
– Critical Thinking in Global Challenges
– E-Learning and Digital Cultures
– Equine Nutrition
– Introduction to Philosophy
• 2 iterations analysed (more or less)
• August 2013 – April 2014
• Team drawn from UoE and CETIS
Detail on course design: MOOCs @ Edinburgh 2013: Report #1
(http://hdl.handle.net/1842/6683)
43. Aims & Research Questions
• Who are our participants?
• What data do we have?
• Can we identify patterns of participant behaviours such that participants could be categorised?
• How ‘social’ are participants?
• Are there patterns of in-platform behaviour which would predict retention/persistence in the course?
Techniques & Tools Used
• Standard course extract (MySQL)
• Survival analysis (SPSS) – see the sketch after this list
• Social network analysis (R, Gephi, TAGS Explorer)
• Analysis of survey results (SPSS)
• Visualisations for exploring data (R, Gephi, Google Charts, Excel)
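As a minimal sketch of the survival analysis step, the snippet below estimates a Kaplan-Meier retention curve with the Python lifelines library (the project itself used SPSS). The toy data frame and its column names (`weeks_active`, `completed`) are assumptions, not the team's actual extract.

```python
# A minimal sketch of the survival analysis, assuming a per-participant
# extract with two hypothetical columns: `weeks_active` (last week in which
# activity was observed) and `completed` (1 if still active at course end).
# The project used SPSS; this uses the Python `lifelines` library instead.

import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.DataFrame({
    "weeks_active": [1, 2, 2, 3, 5, 5, 6, 7, 7, 7],   # toy data
    "completed":    [0, 0, 0, 0, 0, 1, 0, 1, 1, 1],
})

kmf = KaplanMeierFitter()
# Dropping out is the "event"; participants still active at the end of the
# course are right-censored rather than counted as drop-outs.
kmf.fit(durations=df["weeks_active"],
        event_observed=1 - df["completed"],
        label="MOOC participants")

print(kmf.survival_function_)       # estimated proportion retained per week
ax = kmf.plot_survival_function()   # the drop-off curve across the course
ax.figure.savefig("retention.png")
```

This is the kind of curve behind the later note that drop-off is relatively slow across the course and accelerates over the final assessment period.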
51. Lessons Learned
• Usability of data is low
– Data is very ‘raw’ - requires a lot of processing
– Invest up-front in quantifying and describing the data - use staff with some educational
background
– Make institutional reporting requirements turnkey
– Consider whether a standardised data extract would work for a large number of purposes (see the sketch after this list)
• Effort and skills required can be significant
– Define your questions
– Make pragmatic decisions – quite a bit can be done with simple tools / existing skills
– Foster an open / sharing culture to bring diverse skills together and pool resources
• Platforms are still maturing
– Platforms are still evolving - be prepared for change and re-work
– Not all platforms will give you the same data – comparisons could be hard
• Experience can be re-used
– Experience / approaches may be useful when considering work with on-campus platforms
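To illustrate the standardised-extract idea flagged in the list above, here is a small sketch that maps a raw, platform-specific export onto one common event schema so the same downstream analyses can be reused across platforms. The schema, the "Platform A" field names, and the mapping are all illustrative assumptions, not the team's actual format.

```python
# A sketch of the "standardised data extract" idea: map raw,
# platform-specific exports onto one common event schema so the same
# downstream analyses can be reused. The schema, the 'Platform A' field
# names, and the mapping are illustrative assumptions only.

import csv
from datetime import datetime, timezone

STANDARD_FIELDS = ["user_id", "course_id", "event_type", "timestamp"]

def from_platform_a(row: dict) -> dict:
    """Map one raw row from a hypothetical 'Platform A' clickstream CSV."""
    return {
        "user_id": row["student"],
        "course_id": row["course"],
        "event_type": row["action"].lower(),   # e.g. "video_play"
        "timestamp": datetime.fromtimestamp(
            int(row["unix_time"]), tz=timezone.utc).isoformat(),
    }

def write_standard_extract(rows, mapper, path):
    """Write rows from any platform to the common CSV layout."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=STANDARD_FIELDS)
        writer.writeheader()
        for row in rows:
            writer.writerow(mapper(row))

raw = [{"student": "s1", "course": "astrobiology",
        "action": "VIDEO_PLAY", "unix_time": "1380000000"}]
write_standard_extract(raw, from_platform_a, "standard_extract.csv")
```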
54. • How can University teaching teams develop critical and
participatory approaches to educational data analysis?
• How can we develop ways of involving students as
research partners and active participants in their own data
collection and analysis, as well as foster critical
understanding of the use of computational analysis in
education?
Learning Analytics Report Card (LARC)
http://larc-project.com
Knox, J. (2017). Data Power in Education: exploring critical awareness with the ‘Learning Analytics Report Card’ (LARC). Special Issue: Data Power in Material Contexts, Television & New Media.
http://journals.sagepub.com/doi/full/10.1177/1527476417690029
Isard, A. and Knox, J. 2016. Automatic Generation of Student Report Cards. 9th
International Natural Language Generation conference. Edinburgh, Sept 5-8
http://www.macs.hw.ac.uk/InteractionLab/INLG2016/proceedings/pdf/INLG33.pdf
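LARC generates short textual feedback from student activity data using natural language generation (Isard & Knox, 2016). As a rough illustration of the underlying idea only, the sketch below fills sentence templates from activity counts for the categories a student chooses to include, echoing the project's emphasis on student control; the categories, thresholds, and wording here are hypothetical, and the real system's NLG pipeline is considerably more sophisticated.

```python
# A rough, template-based illustration of the LARC idea: turn weekly
# activity counts into short textual feedback for the categories a student
# opts in to. The real system uses a full NLG pipeline (Isard & Knox, 2016);
# these categories, thresholds, and sentences are hypothetical.

TEMPLATES = {
    "engagement": {
        "high": "You accessed course materials {n} times this week, well above the threshold.",
        "low":  "You accessed course materials {n} times this week; you may be falling behind.",
    },
    "social": {
        "high": "You posted {n} discussion messages; you are contributing actively.",
        "low":  "You posted {n} discussion messages; consider joining the discussion.",
    },
}

def report_card(counts: dict, chosen: list, threshold: int = 5) -> str:
    """Render one sentence per category the student has chosen to include."""
    lines = []
    for category in chosen:              # the student controls the categories
        n = counts.get(category, 0)
        band = "high" if n >= threshold else "low"
        lines.append(TEMPLATES[category][band].format(n=n))
    return "\n".join(lines)

print(report_card({"engagement": 12, "social": 2}, ["engagement", "social"]))
```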
61. Lessons Learned
• Built capacity and understanding
• No one size fits all
• Retention focus is of limited value
• Civitas are most credible in the market
• The market does not provide what we need
• Data protection, security, FOI all take more time
• Data validation takes a lot of time
• Learning analytics does not fit neatly into the organisation
• Our data are not always easy to work with
62. Learning Analytics Policy and Governance
• Task Group (reporting to the Senate Learning and Teaching Committee and the Knowledge Strategy Committee)
• Governance group:
- Convenor: a senior academic member of staff with expertise in learning analytics
- The Assistant Principal with strategic responsibility for learning analytics
- A student representative
- The University’s Data Protection Officer
- Representatives from relevant service units (University Secretary’s Group and Information Services Group)
- A member of academic staff with expertise in research ethics
63. Statement of Principles
1. LA will not be used to inform significant action at an individual level
without human intervention.
2. We will use LA to benefit all students in reaching their full academic
potential.
3. We will be transparent about data collection, sharing, consent and
responsibilities.
4. We will actively work to recognise and minimise any potential negative
impacts from LA.
5. We will abide by ethical principles and align with organisational strategy, policy and values.
6. LA will be supported by focused staff and student development activities.
7. LA will not be used to monitor staff performance.
https://www.ed.ac.uk/files/atoms/files/learninganalyticsprinciples.pdf
65. Edinburgh: Purposes
• Skills – Interactions with analytics as part of the University
learning experience can help our students build 'digital
savviness' and prompt more critical reflection on how data
about them is being used more generally, what consent
might actually mean and how algorithms work across
datasets to define and profile individuals. Learning analytics
approaches can also be used to promote the development
of key employability skills. Supporting staff to develop skills
in working with learning analytics applications is also an
investment in institutional capacity and leadership.
http://www.ed.ac.uk/academic-services/projects/learning-analytics-policy
66. 1. Co-responsibility in an
asymmetrical power and contractual
relationship
…obligation to act is a co-responsibility of students and
institution, tempered by the asymmetrical power and
contractual relationship in which the institution has
very specific moral and legal duties to respond
Image credit: https://pixabay.com/en/michelangelo-abstract-boy-child-71282/
Prinsloo, P & Slade S (2017) An elephant in the learning analytics room – the obligation to act, LAK17 presentation, https://www.slideshare.net/prinsp/an-elephant-in-the-learning-analytics-room-the-obligation-to-act
69. Next steps
• GDPR challenges – established governance group
• Scottish Sector level focus via QAA
– Evidence for Enhancement: Improving the Student
Experience
• Capacity building
– Project manager, service manager, data analyst, PhD
intern
• Course design / feedback at scale
73. In what way might learning analytics
be useful to you or your students?
74. Would you have any concerns about
using learning analytics in your daily
teaching practice?
75. Should we give students access to
their analytics if it could potentially
demotivate them?
76. What are the pros and cons with
predictive modeling?
77. Data is not students
http://www.tate.org.uk/context-comment/blogs/treachery-images-rene-magritte
Editor's Notes
Opening survey:
Has anyone had any experience with learning analytics?
How would you describe your experience with learning analytics?
What do you expect from a learning analytics tool?
Especially when teaching a big class, it becomes difficult to know how each student is doing. Learning analytics is meant to provide a solution to this.
Data: academic data, background data, and engagement data
Purposes: Inform the provision of educational services for students and encourage self-regulated learning.
If predictive modelling is used, the traffic lights may be applied to indicate the likelihood of failing a course
LA is meant to give us data-based evidence for a better understanding of student engagement and performance.
The maturity of data technology: the ability to collect massive amounts of data, process them, and generate useful information about individuals’ behavioural patterns – data gold mining (e.g., marketing strategies, educational data mining)
Political concerns for educational institutions: to measure, demonstrate and improve performance.
Drop-out rates create economic pressure on universities.
We want to know if students do access the coursework and materials that we have prepared for them. Have they accessed them? When? Any catch-up behaviour?
Assignment? Log-in?
Interventions may be: posting of a traffic signal indicator on a student’s LMS home page; e-mail messages or reminders; text messages; referral to an academic advisor or academic resource centre; or face-to-face meetings with the instructor.
Partner organisations:
The University of Edinburgh, UK
Universidad Carlos III de Madrid, Spain
Open University of the Netherlands, Netherlands
Tallinn University, Estonia
Erasmus Student Network aisbl (ESN), international
European Association for Quality Assurance in Higher Education, international
Brussels Educational Services, international
Challenge 5 has also been identified in Ferguson, R., & Clow, D. (2017). Where is the evidence? A call to action for learning analytics.
16 countries – UK (21), Spain (11), Estonia (3), Ireland (2), Italy (2), Portugal (2), Austria (1), Croatia (1), Czech Republic (1), Finland (1), France (1), Latvia (1), Netherlands (1), Norway (1), Romania (1), and Switzerland (1)
21 out of 51 institutions were already implementing centrally-supported learning analytics projects.
25 institutions have established formal working groups, but not all institutions have planned to provide analytics data to students.
In many cases where LA was supported centrally, LA was usually initiated under the wider digitalisation strategies or teaching and learning strategies. However, there were also a great number of institutions that had not defined clear strategies for learning analytics and were still at the ‘experimental’ or ‘exploratory’ stage.
The interviews identified three common aspects of internal drivers for the adoption of learning analytics:
Learner driver: to encourage students to take responsibility for their own studies by providing data-based information or guidance.
Teaching-driver: to identify learning problems, improve teaching delivery, and allow timely, evidence-based support.
Institution-driver: to inform strategic plans, manage resources, and improve institutional performances, such as retention rate and student satisfaction.
An equivalent multiple-choice question in the survey provided 11 options for motivations specific to learning and teaching. The results identified the five top drivers.
How can the institution as a whole benefit from LA?
No one-size-fits-all solutions:
Needs vary by institution, but existing solutions focus on addressing retention problems. LA should not be used as a deficit model. There are also differences among subjects and faculties.
Other concerns:
Uncertainty about the benefits of LA: fear of failing to meet expectations
Pressure to adopt LA
The strictness of existing data protection regulations makes adoption more difficult.
Student engagement data: when, how long, etc.
Inform course design: reflect on places where students fail.
Know ‘why’ students struggle: it’s not good enough to just know that students fail certain questions.
Other concerns:
Not all learning is digital
No one size-fits-all solution
Correlation does not suggest causation
Surveillance of students
Inform teaching support and curriculum design so that no one is falling behind or having to learn the same materials repetitively.
Support a widening access policy – at a class level.
Support students at all achievement levels to improve learning by providing them a better overview of their own learning progress.
Other concerns:
Limitations in quantifying learning
Worries about human contact and teaching professionalism being replaced by machines
We adopted the Rapid Outcome Mapping Approach (ROMA) to develop this policy framework. The ROMA model was originally designed to support policy and strategy processes in the field of international development. The model begins with defining an overarching policy objective, followed by six steps designed to provide policy makers with context-based information. It allows decision makers to identify key factors that enable or impede the implementation of learning analytics. Moreover, the reflective process allows refinement and adaptation of policy goals as contexts change over time.
Drop-off is relatively slow across the course, with the probability of dropping out increasing rapidly over the final assessment period.
Blue – end of teaching; Red – end of assessment
Tutors are very central in one network; participants are more central in the other.
Some clustering by locations can be observed – potentially driven by time difference / language family
“Trying to use the platform data to answer research questions is currently, according to Whitmer, like “trying to build a spaceship at the atomic level.” While it is possible to tease out aggregate data on participant activities in MOOCs, more useful inquiries such as looking for patterns in learner behavior over time and over multiple courses, or relationships between activity and measures of achievement, are not routinely feasible.”
René Magritte – This is not a pipe
Data is not students. It’s about students. Learning analytics provides a new perspective from which we can get a glimpse of a learning process, reflect upon it, and take further action.