Learning Analytics – Challenges arising from a current review of LA use
1. The European Commission’s
science and knowledge service
Joint Research Centre
Learning Analytics –
Challenges arising from a current
review of LA use
Seville, 23 May 2017
For the Danish Accreditation Institution (by Skype)
Dr. Riina Vuorikari
DG JRC – Directorate Innovation and Growth
Unit B4 Human Capital and Employment
2. The European Commission’s
science and knowledge service
Joint Research Centre
. Part 1: Introduction to the
Report: aims, inventory of LA
(5 min)
. Part 2: Opportunities
(10 min)
. Part 3: Challenges arising from
the study
(10 min)
Outline:
4. 4
Focus on the priorities of the
European Commission:
working for more than
20 policy DGs
Policy neutral:
has no policy agenda
of its own
Independent:
no private, commercial or
national interests
The Joint Research Centre (JRC)
Directorate
Growth &
Innovation
Seville
5. DigComp
(DG EMPL)
EntreComp
(DG EMPL)
DigCompConsumers
(DG JUST)
OpenEdu Policies (HE)
(DG EAC)
MOOCKnowledge
(DG EAC)
Blockchain
(DG JRC)
OPTEV
(DG JRC)
MOOCs4inclusion
(DG EAC)
Learning Analytics
(DG JRC)
Anticipatory studies | Policy & society
Organisations | Individuals
DigCompEdu
(DG EAC)
DigPolEdu
(DG EAC)
CPDmodels
(DG EAC)
ICTinPISA
(DG EAC)
CompuThink
(DG JRC)
DigCompOrg4Schools
(DG EAC)
OpenEdu (HE)
(DG EAC)
DigCompOrg
(DG EAC)
Current JRC research on Digital Age Learning
and 21st Century Skills
6. 6
The Study behind the JRC Report
• Goal: Provide research evidence on the use of
learning analytics and discuss their implications
for education policy
• Study conducted between September 2015 and June
2016
• Design of the study: the JRC in Seville,
Unit of "Human Capital and Employment”
• Research: The Open University, UK under the
contract and supervision of the JRC
7. 7
Learning analytics involve
the measurement, collection, analysis and reporting
of data about learners and their contexts, for
purposes of understanding and optimizing
learning and the environments in which it occurs.
(source)
Learning analytics have their roots in many fields of
educational and technical research, including assessment,
personal learning and social learning, but also in business
intelligence and data mining.
The field draws on theory and methodologies from
disciplines such as statistics, artificial intelligence and
computer science (Dawson et al., 2014).
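The definition above centres on "data about learners and their contexts". As an illustrative sketch (not drawn from the report; all names are invented), one such trace of learner activity can be modelled on the actor-verb-object shape popularised by the xAPI specification:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LearningEvent:
    """One trace of learner activity, loosely modelled on the
    actor-verb-object shape of the xAPI specification.
    Field names here are illustrative, not from the report."""
    actor: str            # pseudonymous learner ID, not a real name
    verb: str             # e.g. "viewed", "submitted", "completed"
    obj: str              # the learning resource acted upon
    context: dict = field(default_factory=dict)  # course, device, ...
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

# Collecting such events over time is the "measurement" step;
# analysis and reporting then operate on the accumulated log.
event = LearningEvent(actor="student-42", verb="submitted",
                      obj="quiz-3", context={"course": "maths-101"})
```

Accumulating such records is the "measurement and collection" half of the definition; the "analysis and reporting" half operates on the resulting log.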
8. 8
To access the Inventory
• Google “leap inventory learning analytics”
• http://cloudworks.ac.uk/cloudscape/view/2959
To access the Report
• Google “learning analytics JRC science hub”
9. 9
What does the Study contain?
• An inventory of recent implementations of learning
analytics:
• Tools, practices and policies (60 examples)
• 5 case studies
• Review of research literature on implementation
• To critically reflect on the impact, potential and limits
of using learning analytics in education
• To consider the implication for education policy:
“The Action List for Learning Analytics”
10. The European Commission’s
science and knowledge service
Joint Research Centre
. Part 1: Introduction to the
Report: aims, inventory of LA
(5 min)
. Part 2: Opportunities of tools
(10 min)
. Part 3: Challenges arising from
the study
(10 min)
Outline:
11. 12
Example 1: Inventory of Tools
• Based on specific models of domain knowledge (in maths) and of learner responses (cognitive models)
• Stand-alone application that generates its own data
• What does this data tell us about learning? Is it actionable?
14. 15
• Tools in Inventory target
• compulsory education (13); HE (8);
• workplace (2); any (6)
• “stand-alone” tools; add-ons to an existing VLE;
custom-made solutions
• Different target beneficiaries of analytics:
• Learners, teachers, tutors, advisors, counsellors, school
heads/managers, decision-makers, policy-makers,..
The Inventory: Tools (1)
15. 16
The Inventory: Tools (2)
• Different data sources:
• Student digital traces from the platform or
outside of it, e.g. interaction data, social media, libraries
• Data from offline sources, e.g. evaluations by the learner,
demographic data, nation-wide test data/evaluations
• Methods to use data: descriptive data and statistics, visualisation,
statistical inference, data mining, machine learning,..
• Actions on data: modelling of the data, on the basis of which actions
are taken, e.g. adaptive systems to scaffold and support learning; to
recommend, predict, alert..
• Actions based on: past behaviour, similarity in grades,
domain knowledge, right answers, statistics, …
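As a minimal sketch of the "descriptive statistics … predict, alert" pipeline listed above (not from the report; the data, names and threshold are invented), an at-risk flag could compare each student's activity against the cohort:

```python
from statistics import mean, stdev

def flag_at_risk(logins_per_student, threshold=-1.0):
    """Descriptive-statistics sketch: flag students whose weekly
    login count sits more than one standard deviation below the
    cohort mean -- a crude stand-in for the 'alert' actions
    that real learning-analytics tools implement."""
    counts = list(logins_per_student.values())
    mu, sigma = mean(counts), stdev(counts)
    return {student: (count - mu) / sigma
            for student, count in logins_per_student.items()
            if sigma > 0 and (count - mu) / sigma < threshold}

# Hypothetical cohort data: weekly platform logins per student.
cohort = {"anna": 12, "ben": 11, "cara": 10, "dan": 2}
at_risk = flag_at_risk(cohort)   # only "dan" falls below the threshold
```

Real tools in the inventory act on far richer models (past behaviour, grades, domain knowledge), but the shape is the same: a statistic over the data, a threshold, and an action.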
16. 17
Narrowing the attainment gap: Georgia State
University. At the university, predictive analytics
have been used to tackle the achievement gap for
low-income and first-generation students. GSU's
graduation rate rose from 32% in 2003 to 54% in
2014.
In the process, the university claims to have
removed the achievement gap between
students from minority backgrounds or lower
socioeconomic status and their peers.
Inclusive
education,
equality!
Example 4: Inventory of Practices
17. 18
Do we see real improvements in
learning outcomes for learners? (37 examples)
18. The European Commission’s
science and knowledge service
Joint Research Centre
. Part 1: Introduction to the
Report: aims, inventory of LA
(5 min)
. Part 2: Opportunities
(10 min)
. Part 3: Challenges arising from
the study
(10 min)
Outline:
20. 21
Challenge (1)
• Evidence of Impact: The research evidence
documented in this study shows that there is little
formal validation of tools
• e.g. whether the tools fulfil their intended purpose
such as having a positive impact on learning;
encouraging more efficient learning; or more
effective learning,..
21. 22
Challenge (2)
• Impact: The research evidence documented in
this study shows that, currently, most of the impact of
learning analytics in education and training takes
place around issues under discussion; there is little
impact on changing practices yet:
• E.g. Sensitive issues of personal data and privacy are
at the centre of discussion
22. 23
Examples of Open University, UK
• Tools: Open Essayist and OU Analyse
• Practices: Ethical use of student data policy
• Case study on “The process of developing an institutional ethics
policy ” (part of the Report)
Example 6: Inventory
23. 24
Europe’s General Data Protection Regulation
(GDPR)
• Europe has taken the position that individual privacy is important and
that changes to current practices in general analytics are needed
• Institutions will need to understand their responsibilities and
obligations with regard to data privacy and data protection and will
have to put procedures in place to ensure that they are compliant with
the legislation.
http://ec.europa.eu/justice/data-protection/reform/
25. 26
Challenge (3)
• Impact: the implementation of learning analytics
seems to be a long-term process requiring a vision
and a strategy, policy and structure, but also
knowledge and skills in technology and pedagogy
• E.g.
• Case study at UTS (Australia): vision of becoming a
data-intensive university in 2011; strategy and a new centre
in 2014; tools are being developed and piloted now
• Kennisnet working w/schools and vendors since 2014
(products that have useful features!), now focus also on
standardisation of student data, etc.
26. 31
Challenge (4)
• The majority of current learning analytics
work is not strongly aligned with the
European Union’s priority areas for
education and training
• E.g. Strategic objectives for European
cooperation in education and training
(ET2020)
27. 32
Challenge (4): Create a European
vision for Learning Analytics
Strategic objectives for European cooperation in education and
training (ET2020)
1. Relevant and high-quality skills and competences for employability,
innovation, active citizenship
2. Inclusive education, equality, non-discrimination, civic competences
Indicator: reducing school drop-out rates to less than 10%
3. Open and innovative education and training, including by fully embracing
the digital era
4. Strong support for educators
5. Transparency and recognition of skills and qualifications
6. Sustainable investment, performance and efficiency of education and
training systems
28. 33
Do learning analytics
with dashboards and
half-hearted visions of
learning remain the
low-hanging fruit of
digital technology? What about the visions
for empowering learning?
29. 34
Action List for Learning Analytics
• Action 1: Create a common vision for LA in
European education and training
• Use the ET2020 Priority areas to make policy hooks
• Action 2: Build LA tools that help teachers and
learners
• Now too much focus on the supply side!
• Help generate demand: Talk to teachers and
learners to understand what they want
• Action 3: Conduct research that ends up in the
LACE Evidence Hub
• Validation of tools and their promises should lead
the research
30. The European Commission’s
science and knowledge service
Joint Research Centre
Check the research of our team
at the JRC Science Hub:
https://ec.europa.eu/jrc/
New skills agenda:
https://ec.europa.eu/education/
news/20160610-education-skills-
factsheet_en
Thank you!
Editor's notes
Ferguson, R., Brasher, A., Clow, D., Cooper, A., Hillaire, G., Mittelmeier, J., Rienties, B., Ullmann, T., Vuorikari, R., Research Evidence on the Use of Learning Analytics and Their Implications for Education Policy. (2016), Joint Research Centre Science for Policy Report.
JRC-IPTS: One of the key knowledge providers for DG EAC
Sourced from web analytics in the early 2000s
Even if the field is new, it has long roots in existing research such as adaptive learning; personalised learning and intelligent tutoring systems; recommender systems to support learning, etc.
Techniques and methods are also borrowed from statistics, artificial intelligence, computer science
It is like a catch-all term, an umbrella under which many existing old things have been re-dressed with some new spice
It’s also a topic with lots of hype around it, so thinking of the mission of the JRC and the demand for evidence driven policy-making in Europe, it was clear that the policy-makers in Europe had a real need for better data and evidence of what is actually happening. People in the Ministries, educational boards at the national and local level do take many decisions…
Focuses only on one area (mathematics) and provides personalised learning activities and feedback
Only certain courses which are based on specific models of domain knowledge in math and on the learner responses (cognitive models )
Information on student progress and mastery of each achievable skill
Teachers get several reports on individual’s engagement and learning of skills, but also on underperformance.
Class assessment
Works directly with each institution and makes use of available student data. Is individually tailored to each institution to fit their analytics needs
Data sources include VLE; social media, “card swipes” (e.g. using student card to go to library?), libraries, housing - Aggregates student data for analysis and visualisation
Historic and predictive data for institutional leaders and student service providers
Visualises student performance and success across modules and predicts programme completion
Integrates different tools
A learning environment tool that has a module (a feature) to make analytics available
Provides analysis and reporting at individual and group level. Tools to support evaluation and improvement of pedagogical practices
Data is gathered from different sources: works with several school book publishers whose modules can be used for analytics, but also student surveys and statistical data & data from national tests
Also uses adaptive tools, see Knewton in tools inventory for more information
Teaching: 32 examples
A programme that provides content
Students can complete lessons, the programme analyses performance and detects gaps
Teachers have dedicated tools available
Runs easily in a browser
HOWEVER
5. Little information about privacy
6. No information about the impact of the tools, whether the use actually guarantees any learning outcomes or effectiveness, etc.
The GDPR entered into force in May 2016 and will affect the learning analytics field in many ways. Europe has taken the position that individual privacy is important and that changes to current practices in general analytics are needed. Moving forward, the definition of personal data will be larger and more complex, and these legal changes will mean universities become data controllers rather than data processors, with new responsibilities for control of data.
Institutions will need to understand their responsibilities and obligations with regard to data privacy and data protection and will have to put procedures in place to ensure that they are compliant with the legislation.
Cormack (2016) has proposed a Data protection framework for learning analytics that reduces the significance of the boundary between protected personal data and unprotected, non-personal data ensuring that all processing includes appropriate safeguards. The proposed framework appears in a special issue of the Journal of Learning Analytics that deals with issues of ethics, privacy and data protection (Ferguson, 2016).
Challenge 1: What does it mean for educational policy?
Work is needed to make links between learning analytics, the beliefs and values that underpin the area in Europe, and the European priority areas for education and training 2020. See also Vuorikari, R. (2017): Can Learning Analytics help the EU achieve its strategic objectives for education and training by 2020? Learning Analytics and Policy workshop at LAK
Challenge 2:
- Much of the current work on learning analytics concentrates on the supply side – the development of tools, data, models and prototypes. There is considerably less work on the demand side –
i.e. on how analytics connect with education and the changes that school administrators, teachers and students want these tools to make in order to support their everyday learning, teaching and assessment work. More attention needs to be paid to the demand side – like, for example, the work carried out by Kennisnet in the Netherlands. This sought to help schools articulate what they want from ICT vendors, mediating requirements and exploring possible solutions, thus ensuring that learning analytics products have useful features for their end users.