Australian university teacher’s engagement with learning analytics: Still early days. - Professor Carol A. Miles University of Newcastle and A/Professor Deborah West Charles Darwin University. ANZTLC15
This session reports the results of a recent OLT-funded national exploratory study addressing the relevant factors, and their impact, when implementing learning analytics for student retention purposes. The project used a mixed-methods research design and yielded a series of outputs. These included a non-technical overview of learning analytics linking the fields of student retention and learning analytics; an institution-level survey focusing on sector readiness and decision making relating to utilising learning analytics for retention purposes; an academic-level survey administered to academic staff exploring their progress, aspirations and support needs relating to learning analytics; and follow-up interviews that expanded on their experiences with learning analytics to date. An evidence-based framework was developed, mapping the important factors affecting learning analytics decision making and implementation. This was illustrated by a suite of five case studies, one developed by each of the research partner institutions, detailing their experiences with learning analytics and demonstrating why the elements in the framework are important. These findings were shared and tested at a National Forum in April 2015.
Delivered at Innovate and Educate: Teaching and Learning Conference by Blackboard. 24 -27 August 2015 in Adelaide, Australia.
2. Australian university teachers’ engagement with learning
analytics: Still early days
Professor Carol A. Miles
University of Newcastle, Australia
carol.miles@newcastle.edu.au
A/Professor Deborah West
Charles Darwin University
Deborah.west@cdu.edu.au
3. Project Outputs
• Provide a non-technical overview of the use/potential use
and limitations of learning analytics for retention
• Collect data on LMS and SIS usage, and the use of analytics
in the sector at various levels
• Develop a framework for critically evaluating the use of
analytics for student retention
• Develop case studies on the use of the framework to
evaluate analytics in the five project institutions
4. Institution Level Survey
• Intended to provide a basic overview of institutional
infrastructure, progress and broad planning around
learning analytics.
• Designed for one survey completion per institution
• Anonymous, online survey using Qualtrics
• Disseminated to all DVCA’s in Universities in
Australia
• Completed by 22 Australian institutions (plus 2 from
NZ)
• July to August 2014
• 55% response rate
5. Academic Level Survey
• Asked people about their involvement in learning
analytics, what they think the key issues are, what
their aspirations are and how their institution is
tracking/supporting them, and what they could do
better.
• Purposive snowball sample
• Anonymous and online, using Qualtrics
• September to November 2014
• 353 respondents
6. Interviews
• 23 people from 15 different Australian universities
• Held a variety of roles (Including teacher, educational
developer, student support, library, learning analytics
project leader, tutor and L&T leader) and were at
different levels.
• Self-selected
• Between 15 and 30 minutes in length
• December 2014 to February 2015
• Semi-structured
7. Headline Findings
• We are at an early stage of development, implementation, and
understanding around learning analytics.
• Context is important
• There is great variability across the sector – in relation to preparedness,
and how institutions are thinking about and implementing learning
analytics due to context.
• There is a tension around the extent to which learning analytics can drive actions and behaviours or take over functions currently performed by people
• There is a tension between business needs, wants and limitations (costs) versus academic staff needs and wants, academic freedom, and innovation in learning and teaching
• People across the institution have a role to play in leveraging the opportunities of learning analytics, which must take account of the relationships between strategy, planning, policy and action.
8. Interests of University Teachers
• Being able to better understand who is in their class
(demographics, prior academic history etc.)
• Being able to have consolidated information about their individual
students at the touch of a button (e.g. seeing how their students
are doing in other units, what their demographic data is, whether
they are using resources etc. all in one place)
• Learning analytics being used by people centrally to better justify
or evidence directives relating to their teaching (e.g. when
academics are told to respond in 24 hours to students is there
evidence for this being useful?)
• Improving BOTH student (e.g. resource access patterns,
socialisation) and teacher (e.g. teaching style, unit design)
behaviour with respect to learning
9. What do the Early Days Look Like?
• Actively seeking to understand what learning analytics is and how it
might be leveraged at their institution
• Institutions are aware that getting their data infrastructure better
integrated is probably going to be helpful, even if they aren’t
completely sure how, yet
• Most institutions have developed some analytics capacity and
experience through centralised Business Intelligence projects
• Many universities are currently running pilot projects and/or testing
different tools
• The technical and multi-faceted nature of working with integrated
data and advanced analytics means that planned projects often take
longer than anticipated or hit roadblocks
• The sector has learnt a lot about learning analytics and some of the
issues with implementing it, but in terms of generalizable, scalable
successes, it is still working towards that.
10. Participant Views – Data Needs
“In an ideal world, once the course enrolment is done in the first week of semester the lecturer should be able to get a summary, like a quick snapshot - it could be just a few graphs and a table showing the make-up of your class this year. This could give info about:
• International vs domestic
• Previous courses
• Previous credit
The class is different every year and I personally believe that this kind of information should be very easy to generate through the LMS. This lets the teacher know at the start of the course what things potentially to tweak. I think useful…”
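The “quick snapshot” this participant describes could be sketched as a simple aggregation over enrolment records. The following is a hypothetical illustration only (it is not from the study, and the record keys `residency`, `previous_courses` and `credit_granted` are invented for the example):

```python
from collections import Counter

def class_snapshot(enrolments):
    """Summarise the make-up of a class from a list of enrolment records.

    Each record is a dict with hypothetical keys: 'residency'
    ('international' or 'domestic'), 'previous_courses' (list of course
    codes) and 'credit_granted' (bool).
    """
    residency = Counter(r["residency"] for r in enrolments)
    prior_courses = Counter(c for r in enrolments for c in r["previous_courses"])
    with_credit = sum(1 for r in enrolments if r["credit_granted"])
    return {
        "total": len(enrolments),
        "residency": dict(residency),
        "most_common_prior_courses": prior_courses.most_common(3),
        "students_with_prior_credit": with_credit,
    }

# Made-up records for illustration:
records = [
    {"residency": "domestic", "previous_courses": ["STAT101"], "credit_granted": False},
    {"residency": "international", "previous_courses": ["STAT101", "MATH102"], "credit_granted": True},
    {"residency": "domestic", "previous_courses": [], "credit_granted": False},
]
print(class_snapshot(records))
```

A real LMS integration would draw these fields from the student information system; the point of the sketch is only that the summary itself is a cheap computation once the data is accessible.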
11. Academic Level Survey – Ethical Issues
[Chart (scale 0–100): level of concern (No concern / Small–Medium / High–Very High) for ten ethical issues: use for performance management; transparency about how and why learning analytics are being used; profiling of students; data security; duty to respond to risks identified by the data; consent to access data; workload changes; accreditation and authorisation of users; training and professional development; ownership of data.]
Notes: n varies between 145 and 154 per ethical issue due to missing data. Excludes those people who indicated ‘not sure’ (approx. 30 per variable) to better illustrate…
12. Learning Analytics – Activity Participation
[Chart, from the Academic Level Survey (axis 0–150, n per activity): reading about learning analytics for my own…; using learning analytics to help with analysis…; none of the listed choices*; advocating for the use of learning analytics…; attending…; being part of the group that is leading…; conducting formal research and/or…; delivering training on the use of learning…; other.]
Notes: * denotes mutually exclusive response. n = 346, missing = 7.
13. Impacts on Motivation
“We actually have a lot [of data]. But people will only use it if they actually have to. The main people who have to use it are people like executives and then others who might have to do things like staircase model reporting and that kind of thing.”
“The data does take so long to analyse. For example, our Deputy Head has spent hours sifting through surveys - like we have one which is how students are settling in and we have done another about whether students are planning to leave - and manually having to compute that data and interpret it takes forever. Whereas, if you have big data sets that you can run algorithms on, that can save a lot of time. The algorithms take a long…”
14. Reflections on Time and Workload
“There just isn’t time to do it. So the reporting mechanism or how you actually take
these opportunities for recording data and presenting it back in a way that is actually
useful, navigable and doesn’t require learning a whole heap of data analysis to
understand it, the more the take up will be universal. I think as we move towards
better dashboards and all those sorts of things it will likely work better more
generally.”
“Now, your average academic, for whom a lot of this [is] actually quite interesting, just
doesn’t have the time to learn a whole new system. I’ve got the time to do it. This is
my interest and my hobby. I have the time to invest in it because I enjoy it. Full time
academics with their commitment to research to get promotions and University
grading and things like that. Most of them don’t have the sheer time to be involved
with teaching one subject.”
“Time is short. I do start early. And it is not because I have slackened off during the day - there is just so much to be done. The last thing you want to do is sift through a whole heap of stuff that is not relevant to you. You just want to be able to say ‘that is what I need, that is where it is’ and then it opens up and you can [see] the students and…”
15. Participant Views – Pilot Mode
“There is quite a lot happening at the
institutional level. The majority of academic staff
probably wouldn’t be aware of it but because we
are currently working in a pilot mode, we have
only involved say two dozen academics across
the university so we have piloted various
iterations of our attempt at analytics on about
two dozen units.”
16. Rating of Institutional Provisions
[Chart (axis 0–160, n per category): respondents’ ratings (Poor or Very Poor / Fair / Good or Very Good) of seven learning analytics provisions: ease of learning analytics data access; relevance and comprehensiveness of data that I can access; ease of visualisation and interpretation of data; opportunities to provide feedback about learning analytics implementation; professional development about learning analytics; information about how learning analytics is being used; provision of information about how learning analytics use will affect me.]
Notes: n varies between 187 and 204 per category due to missing data. Excludes those people who indicated ‘not sure’ (between 85 and 104 per category) to better…
17. Data that Academic Staff Want
[Diagram linking sources of data to use cases/applications and outcomes.]
Use cases/applications:
• Curriculum/course design and teaching improvement
• Resource planning at an institutional level
• Improving retention
Sources of data:
• Student behaviours
• Academic actions and behaviours
• Student background and demographics
• Student performance
18. Improve Retention / Student Performance
[Diagram of open questions:]
• Which students are most at risk?
• What are the indicators of risk?
• What interventions are more likely to be effective?
• What effect does intervention with students (based on low results observed through analytics) have?
• What kind of follow-up helps?
• What trends can we see in terms of at-risk students?
• Are there relationships between progression, enrolment status and performance?
19. Student Behaviour
[Diagram of open questions:]
• How active are students in course content, overall and at an individual content level?
• How does the information related to engagement fit with student feedback?
• How does this correlate with what content the academic delivers?
• If my students are clicking/accessing something a lot, does it mean: (a) they love it? (b) they don’t understand it and find it hard? (c) they hate it and have to keep coming back to it because it is too difficult or unpalatable to manage?
• What do students most value? Most like?
• Have they accessed recordings or some specific tool (e.g. wikis, blogs etc.)?
• What is the best way to structure content to improve learning performance?
• What is the best way to structure the content to engage students?
20. Academic Actions/Behaviour
[Diagram of open questions:]
• How do academics do their marking?
• Do they use rubrics?
• How consistent is the marking?
• What interactions take place?
• How much time are instructors spending, and where are the peaks of activity?
• Which academics and which “schools” need support to become better/fuller users of Bb Learn?
21. Demographics and Student Background
• International and domestic indicators
• Requests for special consideration
• Tertiary Entrance Rank (TER)
• Grade Point Average
• English language preparation
• Students’ resilience, emotional attributes and interpersonal attributes at selection
• Student academic preparation
• Declaration of disability
• Access to support services
• Access to equity services
• Academic progression, including previously attempted subjects or courses, length of enrolment, nature of enrolment (part time/full time), internal/external
• Standard retention-linked variables such as first in family, ATSI, gender, age, basis of entry etc.
22. Indicators Used to Identify Risk
[Chart (axis 0–240, n per indicator): not sure*; socio-economic status; Aboriginal and Torres Strait Islander status; international student status; materials or resources access patterns…; time spent in LMS; use of communication tools (e.g.…); attainment of certain grades.]
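Several of the indicators above (time in LMS, resource access, grade attainment) lend themselves to simple threshold rules. As a hypothetical sketch only (the thresholds and field names are invented, not drawn from the study, and real early-warning systems typically combine indicators statistically rather than with fixed cut-offs):

```python
def flag_at_risk(student, thresholds):
    """Return the risk indicators a student trips, under fixed cut-offs.

    'student' and 'thresholds' use invented keys mirroring three of the
    survey's indicators: time spent in LMS, resource access, and grades.
    """
    reasons = []
    if student["lms_minutes_per_week"] < thresholds["min_lms_minutes"]:
        reasons.append("low time spent in LMS")
    if student["resources_accessed"] < thresholds["min_resources"]:
        reasons.append("low materials/resources access")
    if student["current_grade"] < thresholds["min_grade"]:
        reasons.append("grade below threshold")
    return reasons

# Illustrative thresholds and one made-up student record:
thresholds = {"min_lms_minutes": 30, "min_resources": 5, "min_grade": 50}
student = {"lms_minutes_per_week": 10, "resources_accessed": 2, "current_grade": 65}
print(flag_at_risk(student, thresholds))
# → ['low time spent in LMS', 'low materials/resources access']
```

The demographic indicators in the chart (socio-economic status, international status) raise the profiling concerns flagged in the ethics slide, which is one reason behavioural indicators like the three above are often preferred as triggers.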
23. Elements of Systematic Risk Response
[Chart, from the Academic Level Survey (axis 0–120, n per element): automated referrals to specific resources or…; others; automated e-mails to at-risk students; formal request to interview at-risk students…; acknowledgement of positive progress (e.g.…); telephone calls to at-risk students; manual referrals to specific resources or…; offer of consultation with at-risk students…; manual emails to at-risk students.]
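One way to make such a response systematic is to tier the elements by risk level, escalating from automated to personal contact. This tiering is purely hypothetical (the survey lists the elements but does not prescribe an ordering):

```python
# Hypothetical tiers built from the survey's response elements.
RESPONSE_TIERS = {
    "low": ["automated e-mail to student", "automated referral to resources"],
    "medium": ["manual e-mail to student", "manual referral to resources"],
    "high": ["telephone call to student", "offer of consultation"],
}

def plan_response(risk_level):
    """Return the response elements for a risk level (empty list if unknown)."""
    return RESPONSE_TIERS.get(risk_level, [])

print(plan_response("high"))
# → ['telephone call to student', 'offer of consultation']
```

The design choice worth noting is the fallback to an empty list: an unrecognised risk level triggers no action rather than a wrong one, which matters when the response includes contacting students.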
24. What Academic Staff Need
“Tell me what data is available, give me access to it, give me the time to use it and give me guidance in using it.”
25. Making Learning Analytics Personal
“But until we can show academics and make it personal - tell them it can help give them a better teaching score because students are going to be engaged and feel valued. It will save time and make your results better. The ‘what’s in it for me’ question is important. It can’t just be about making the institution look better; it needs to be personalised so that we give teachers things that save time and create better teaching outcomes, because that is what we are judged on now.”
26. Another Perspective on Students
“I think some of this needs to be almost seamless for students. There is a level at which some of the information from learning analytics may be useful, potentially recommender-based systems or providing some useful prompts or feedback to students in a fairly automated and seamless way, but they wouldn’t necessarily know that is learning analytics per se.”
27. Headline Findings
• We are at an early stage of development, implementation, and
understanding around learning analytics.
• Context is important
• There is great variability across the sector – in relation to
preparedness, and how institutions are thinking about and
implementing learning analytics due to context.
• There is a tension around the extent to which learning analytics can drive actions and behaviours or take over functions currently performed by people
• There is a tension between business needs, wants and limitations (costs) versus academic staff needs and wants, academic freedom, and innovation in learning and teaching
• People across the institution have a role to play in leveraging the
opportunities of learning analytics which must take account of the
relationships between strategy, planning, policy and action.
28. Transitional Institutional Elements
Key questions to consider are:
• What is the strategic positioning of learning analytics within your
institution?
• To what degree is there executive sponsorship (academic and IT)?
• Do you have a specific learning analytics plan and if so, how is it
aligned with your institutional budget and academic plan?
• To what extent is there congruence between the vision, strategy
and values of the institution that relate to learning analytics?
• How clearly is the role of learning analytics in your institution
identified and communicated?
29. Strategic Positioning
What is the strategic positioning of learning analytics within your institution?
“We have our governance group, we have reference groups, we have
operational forums. There has been a year and a half of talk and of
dialogue and getting ready and now we are at a position of starting to
get organised”
“People are busy. It is potentially just another administrative layer in their eyes. There are a whole range of issues that need to be worked out, such as: what is the governance model around this? Who has the role and responsibility to respond or react to what the data is starting to reveal? What is the appropriate way of engaging with students, or a student? Then you have things like data visualisation, and you have questions around just how much information should you share and…”
30. Communication
How clearly is the role of learning analytics in your institution identified and communicated?
“I am also involved in a CRM project here at the University looking at a whole-of-university CRM approach and it is a good opportunity to see where analytics can fit into how the CRM is built and how it can support student lifecycle information management.”
“I think at my institution there are all sorts of programs - particularly where first year students are involved and in the transition period. They have programs and people in place, so one would assume that there is an interest in general around retention. Certainly, if numbers do start to drop off in our units we do get questions about why.”
31. Transitional Retention Elements
• Do you have an institutional retention plan or
strategy?
• What are the governance arrangements that support
the retention plan or strategy?
• How are learning analytics positioned within the
retention plans and associated governance
arrangements?
32. Retention Plans and Structures
“At the moment we kind of go on hunches. We do look at the statistics, but then we also talk with people, so, for example, we might hear from the student association that there is a particular issue. Similarly, a lecturer might contact us and let us know that there is some issue occurring that they are not sure what to do about. It is very reactive, rather than proactive.”
“I think it is the institution’s responsibility to make sure that easy-to-use tools are available and then staff can have a play and isolate aspects that they are interested in. I think that is very important and essential impact that can go into the teaching and learning cycle and what lessons you learned this year and how you can improve the teaching and learning outcomes next year.”
“…At the moment my own experience is sort of ad-hoc. It happens…”
33. Learning Analytics for Retention
• What business and educational questions do your
stakeholders have, and how are they prioritised?
• To what extent can your system address those
questions?
• Where gaps are identified, what resources are
available for follow up?
• What ethical issues have been identified and what is
the resolution?
34. Business and Educational Questions
“It is all very well to collect all of the data, but what answers do you want to get from our data? Apart from retention-type things, at this point we are not getting a lot of feedback from academics. We know that we want to see if a student is active in their course - whether they are getting to the forums or getting to the lecture captures and things like that. They haven’t really got much past that at this point.”
35. Ethical Issues
“…one of the key questions would be around whether it was going to be used in a constructive way or in a punitive way.”
“There is a perception that privacy exists. If people just gave up that concept they would be fine.”
“Having said that, there are all sorts of ethical caveats around that. …The awareness of students as to how much information is actually collected on them is…”
36. Intervention and Reflection
• How are potential interventions developed, managed
and enabled?
• What training is provided, and what training is
required for staff and students?
• What support is provided, and what support is
required for staff and students?
• How are the interventions evaluated and improved?
• How is the whole system evaluated and improved?
37. Responding to What Data Tells You
“We can detect a lack of engagement through the student engagement with the LMS. Now, where we do detect a lack of engagement our student services people are quite good and they have a kind of call centre set up where students are given a friendly call - you know, first of all, do you realise you should be logging in? Do you intend to continue with this course of study? Do you have any problems that we can help you with? You know, the normal sort of counselling-cum-progression approach.”
38. Methods of Data Gathering
“Even with a well-designed survey it is very hard to know exactly what people are talking about in that survey. That is one of the real promises of learning analytics - that eventually you can start leaving that kind of blunt-tool methodology behind, or at least augmenting it. Plus, the students won’t have to answer surveys all the time, which will make them happy as well.”