Digital Skills Gap Peer Learning Activity - Two examples from the UK
1. ET2020 Working Group: Digital Skills and Competences
Two examples from the UK
Helen Beetham for Jisc
07 Dec 17
#digitalcapability #digitaltracker
2. Our challenges in UK HE
»Teaching staff are doing the basics
› …but feel starved of time and expertise to innovate
»Students have high expectations
»Funding constraints and economic uncertainty
› Employability the focus
»High dependence on external metrics…
› Staff fear of monitoring
› Student cynicism about surveys
› Strong demand from organisations for benchmarking
»… but a lack of quality information, local or national
»Many initiatives, little coherence
08/11/17 #digitalcapability http://ji.sc/building-digicap
3. A framework for digital capability
All available from: http://ji.sc/building-digicap
Outer layer (identity) = high-level attributes
Middle layer = digital practices:
• learning and teaching
• research and problem solving
• communication and collaboration
• handling information, media and data
Inner layer = functional access & skills
Designed to be adapted to: role, subject specialism, local culture
4. A range of resources
»Role-specific profiles
»Strategic model and guidance
»Curriculum development materials
»Case studies
»Further resources and reports
5. Discovery
• Mainly benefits individuals
• Explore concept of digital capabilities
• Identify personal digital capabilities
• Understand more about own strengths and weaknesses
• Access helpful resources
• Delivered in Potentially
• Add custom resources
• For staff and students
Tracker
• Mainly benefits organisations
• Collect data
• Assess the digital experience
• Actionable data about access, experience, support and attitudes
• Start conversations
• Delivered in BOS
• Add custom questions
• For staff and students
6. Student experience tracker
1. A tried and tested student survey, made up of:
› Closed questions that can be benchmarked
› Open questions for local analysis
› The opportunity to add customised questions
2. A student engagement process, governed by our guidance:
› Students play a part in planning, communicating, participating, analysing
and responding to findings
3. Triangulate with survey data from organisational leads and
from staff
4. A Community of Practice around the process and findings
7. Development and piloting
»Initial questions based on research
findings and sector consultations
(2016)
»Closed pilot with 24 selected
institutions (2016)
»Open pilot with 85 self-selecting
institutions, including 10 non-UK
HEIs (2016-17)
»Multi-strand evaluation – data
analysis, surveys, interviews
8. Evaluation findings 2017
1. Survey instrument is robust
2. Universities value the process:
» Actionable evidence about student digital experience
» Benchmark against other universities, monitor change
over time, and compare groups of students
» Better informed decisions about investments in
the digital environment and digital CPD
» Support for student engagement
» Bring stakeholders together to have a coherent
conversation, including with other universities
» Demonstrate quality enhancement & student
engagement
27/06/2017 https://www.jisc.ac.uk/rd/projects/student-digital-experience-tracker #digitalstudent
9. Evaluation findings 2017
3. Interest in our aggregated findings for the UK HE sector
»Range of briefings for stakeholders in universities and among policy-makers
(c. 2,000 views)
»Interest from external agencies
»Opportunities for further research and foresight
»Lessons for further development of the tracker
»Reports available from:
http://bit.ly/jisctracker17
http://bit.ly/tracker17brief
10. Open pilot 2017-18
»Almost 200 signed up (registration still open for non-UK HE)
»Refinements to student questions
»Contextualising data collected from organisational leads
»Optional teaching staff survey (for early 2018)
»Review panels (UK)
»Possibly more benchmarking groups, including non-UK
»Updated guidance, webinars, FAQs, jiscmail community
18/10/17 #digitalstudent http://bit.ly/trackerguide
11. Timeline
› Non-UK universities can still sign up before 30th April 2018
› Run the survey in any two- or three-week period after this.
› All surveys closed by 31st May 2018
› You have access to your data until 31st July 2018
12. Discovery
• Mainly benefits individuals
• Explore concept of digital capabilities
• Identify personal digital capabilities
• Understand more about own strengths and weaknesses
• Access helpful resources
• Delivered in Potentially
• Add custom resources
• For staff and students
Tracker
• Mainly benefits organisations
• Collect data
• Assess the digital experience
• Actionable data about access, experience, support and attitudes
• Start conversations
• Delivered in BOS
• Add custom questions
• For staff and students
13. Digital discovery
» Digital discovery supports individuals
(staff and students) to reflect on their
digital capabilities
» A series of quiz-type questions and
feedback linked to the digital
capabilities framework
» Participants are made aware of digital
skills they have and new ones to try
» Results show a visual summary, ‘next
steps’ and links to further resources
http://bit.ly/digcapdiscovery
14. First pilot evaluation: what we learned
Users like finding out about digital capabilities and exploring what they
can do already (and might try next)
»60% link through to further resources
»80% of those found the resources relevant to their needs
»Overall, 83% found the discovery tool ‘somewhat useful’, ‘useful’,
or ‘very useful’
»30% said they were planning to do something new as a result of
using the discovery tool
»Need for significant improvements in platform and user experience;
questions needed to focus on work-related digital issues
15. 2018 pilot
»Over 100 education providers
signed up; working closely with 10
»New platform: Potentially
»New questions and question types
»Generic questions for staff +
teaching questions (HE and
FE/Skills)
»Coming soon: generic + specialist
questions for students
»New pilot and evaluation process
Dec 2017-May 2018
17. What have we learned?
» Framework provides coherence and common language while
letting people develop local solutions
» Tension between collecting organisational data e.g. for planning,
and supporting self-development
» Tension between describing digital practice broadly or in detail:
› Broad capabilities and mindset require examples to clarify
› Detail/examples date quickly and exclude some people
» The actual framework matters less than the process of
engagement (local context) and sharing across contexts
Editor's notes
From a curriculum point of view, the outer layer of the framework (digital identity) corresponds to high level outcomes, such as graduate attributes, or end-of-course assessment criteria. What kind of learner or practitioner do you aspire to become? Our first activity will take place in this layer.
The four interlocking circles in the middle layer of the diagram correspond to four areas of digital practice in the curriculum. They will be our focus when it comes to designing authentic learning activities.
Finally, the inner layer corresponds to the digital tools used, and the functional skills required to use them. These may be very specialist or quite generic (the Learner Profile suggests examples of both). But the design process or workflow we will follow does not look at the tools and skills to be used until those higher-level decisions have been made about the overall learning activity and how it can authentically be carried out using digital technologies.
As an outcome of the digital student work, and to meet the need to gather quantitative data on students' digital experience at an organisational and a sector level, we developed the student digital experience tracker: a survey tool with a robust set of student-tested questions, delivered in BOS. See http://bit.ly/jiscdigidataservice
This evidence supports discussions with senior managers
The report containing the summary findings from the 2017 surveys will be available from 20th June via the web link on this slide.
We asked questions through a variety of formats, including a short pop-up survey after staff had completed the tool (907 responses) and a fuller survey a couple of weeks later. We also asked institutional leads for their feedback.
There are also benefits for organisations – the anonymous data returns that an organisational lead can access show the number of completions by department….
Benefits at both staff and institutional level
Staff – we have seen it can be a powerful tool for widening perceptions and understanding of what digital literacies are, and it provides a useful prompt for reflection
For organisations – an aggregated view of anonymised staff data provides insight to support identification of future support needs.