Blackboard’s data science team conducts large-scale analysis of the relationship between the use of our academic technologies and student impact, in order to inform product design, disseminate effective practices, and advance the base of empirical research in educational technologies.
In this presentation, John Whitmer, Director of Analytics & Research, will discuss findings from 2016. Some findings challenge our conventional knowledge, while others confirm what we believed to be true.
Archived presentation made to JISC Learning Analytics workgroup on Feb 22, 2017
Educational Technology Assessment Hierarchy
• Does it impact learning?
• How many people use it?
• Does it work? (SLAs)
What is Learning Analytics?
“...measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.”
— Learning and Knowledge Analytics Conference, 2011
Meta-questions driving our Learning Analytics research @ Blackboard
1. How is student/faculty use of Bb platforms (e.g. Learn, Collab, etc.) related to student achievement? [or satisfaction, or risk, or …]
2. Do these findings apply equally to students ‘at promise’ due to their academic achievement or background characteristics? (e.g. race, class, family education, geography)
3. What data elements, feature sets, and functionality can we create to integrate these findings into Bb products to help faculty improve student achievement?
• Simulation: if X, what Y? (“With this Ultra Learning Analytics trigger rule, how many students would be notified?”)
• Hypothesis testing: investigate whether a specific relationship holds (“What’s the relationship between time spent in a course and student achievement?”)
• Data mining: analyze underlying latent patterns in data (“What typical patterns in tool use characterize Bb Learn courses?”)
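The simulation question above (“how many students would a trigger rule notify?”) amounts to a threshold count over activity data. A minimal sketch; the rule, field names, and numbers are invented for illustration:

```python
# Hypothetical "trigger rule" simulation: given per-student activity
# minutes, count how many students a rule like "notify if activity
# < 30 minutes" would flag. All data here are made up.

def students_notified(activity_minutes, threshold=30):
    """Return how many students fall below the activity threshold."""
    return sum(1 for minutes in activity_minutes.values() if minutes < threshold)

activity = {"s1": 5, "s2": 45, "s3": 12, "s4": 90, "s5": 0}
print(students_notified(activity))  # 3 students would be notified
```

Running the same count across candidate thresholds is how one would tune a rule before enabling it.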
Key Data Sources
• Learn Managed Hosting
• Learn SaaS
• Collaborate Ultra
Main Big Data Sources & Techniques
Commitment to Privacy & Openness
• Analyze data records that are not only stripped of PII, but de-identified
• Share results and open them for analysis to inform the field
• Respect territorial jurisdictions and safe-harbor rules
Bb Study: Relationship Between Time in Learn & Grade
• Distribution of time spent is highly skewed toward low values
• Transforming the data (log transform) can produce normal curves for analysis
• Of course, there is huge variation in quality within that time spent (of course materials, of student effort)
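The log-transform point can be shown on synthetic time-on-task data with a plain moment-based skewness measure (the data and measure are illustrative, not from the study):

```python
import math

# Sketch with synthetic data: time-on-task distributions are typically
# right-skewed, so a log transform pulls in the long tail before analysis.
minutes = [1, 2, 2, 3, 5, 8, 15, 40, 120, 600]  # skewed toward low values
log_minutes = [math.log1p(m) for m in minutes]   # log(1 + x) handles zeros

def skewness(xs):
    """Population skewness: third standardized moment."""
    n = len(xs)
    mean = sum(xs) / n
    sd = (sum((x - mean) ** 2 for x in xs) / n) ** 0.5
    return sum(((x - mean) / sd) ** 3 for x in xs) / n

print(round(skewness(minutes), 2))      # strongly positive (long right tail)
print(round(skewness(log_minutes), 2))  # much smaller after the transform
```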
Findings: Relationship Between Time in Learn & Grade
• Question: what is the relationship between student use of Learn and their course grade?
• Investigated at the student-course level (one student, one course)
• 1.2M students, 34,519 courses
• Significant, but effect size < 1%
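“Significant, but effect size < 1%” reads naturally as r² (variance in grade explained). A small illustration on made-up data; at 1.2M students, even a tiny r² can be statistically significant:

```python
# Illustration with synthetic data: a correlation can be "real" while
# explaining only a small share of grade variance (r^2).
def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

hours = [1, 2, 3, 4, 5, 6, 7, 8]
grades = [72, 60, 95, 65, 88, 70, 74, 81]  # noisy: little linear signal
r = pearson_r(hours, grades)
print(round(r * r, 3))  # variance explained; small here
```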
Finding: Tool Use & Grade
Tool use and final grade do not have a linear relationship; there is a diminishing marginal effect of tool use on final grade.
• Students absent from course activity are at greatest risk of low achievement.
• The first time you read/see a PowerPoint presentation, you learn a lot, but the second time you read/see it, you learn less.
• Getting from a 90% to a 95% requires more effort than getting from a 60% to a 65%.
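A diminishing marginal effect of this kind is well captured by a logarithmic model; a sketch with invented coefficients:

```python
import math

# Illustration of a diminishing marginal effect: under a log model
# grade ≈ a + b * ln(1 + minutes) (coefficients invented), each extra
# hour of tool use adds less to the predicted grade than the last.
def predicted_grade(minutes, a=50.0, b=8.0):
    return a + b * math.log1p(minutes)

gains = [predicted_grade(h * 60) - predicted_grade((h - 1) * 60)
         for h in range(1, 5)]  # grade gain from each successive hour
print([round(g, 2) for g in gains])  # each gain smaller than the previous
```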
Log transformation shows a weak overall effect, but a strong effect in some courses.
Investigation: Grade by Specific Tools Used
Question: what is the relationship between use of Learn and student grade, based on the tool used?
1. Filter data for courses with potentially meaningful use (>60 min average, enrollment >10 and <500)
2. Identify the most frequently used tools
3. Separate tool use into no use & quartiles
4. Divide students into 3 groups by course grade
• High (80+)
• Passing (60–79)
• Low/Failing (0–59)
At every level, probability of higher grade increases with increased use.
Causal? Probably not. Good indicator? Absolutely.
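Steps 3–4 of the method above (no-use vs. quartiles of use, and three grade bands) can be sketched like this; the record format and numbers are invented:

```python
# Hypothetical sketch of the binning used in the tool-use analysis.
# Assumed record format: (minutes_in_tool, final_grade_percent).
def grade_group(grade):
    """Three grade bands, matching the slide's cut points."""
    if grade >= 80:
        return "High"
    if grade >= 60:
        return "Passing"
    return "Low/Failing"

def usage_bin(minutes, sorted_nonzero):
    """No-use vs. quartile of use among students who used the tool."""
    if minutes == 0:
        return "No use"
    n = len(sorted_nonzero)
    rank = sum(1 for m in sorted_nonzero if m <= minutes)
    return f"Q{min(4, 1 + (rank - 1) * 4 // n)}"

records = [(0, 55), (10, 62), (25, 70), (40, 78), (90, 85), (200, 91)]
nonzero = sorted(m for m, _ in records if m > 0)
for minutes, grade in records:
    print(usage_bin(minutes, nonzero), grade_group(grade))
```

Cross-tabulating these two labels per tool is what yields the “probability of higher grade increases with increased use” pattern.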
Finding: Course Contents
More is not always better. There is a large jump from none to some; then no relationship.
Students above the mean have a lower likelihood of achieving a high grade than students below the mean.
• Move beyond LMS use as a proxy for effort (where more is always better), and get at finer-grained learning behaviors that are more useful (e.g. students who are struggling to understand material, students who are not prepared).
• Major missing elements from research:
– fine-grained understanding of activity over time (e.g. cramming vs. consistent hard work)
– quality of course materials and course design
Research Questions
1. Are there systematic ways that instructors use LMS tools in their courses that span instructors and institutions?
2. What recommendations can be drawn for faculty, instructional designers, and other academic technology leaders seeking to increase the impact of LMS use at their institution?

1. Use the same filtered data sample of student-course data
2. Calculate relative student time per tool (as % of total course time), for comparison between courses
3. Cluster by patterns in the balance of time spent in each tool (unsupervised machine learning; k-means cluster analysis)
4. Add data as relevant to patterns about enrollment, total time, etc.
5. Make up cool names for each cluster and interpret meaning
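The clustering step can be sketched with a toy k-means over per-tool time shares; the tool set, course data, and k are invented for illustration:

```python
import random

# Toy sketch of the clustering step: each course is a vector of the
# share of student time spent per tool (fractions summing to 1),
# clustered with a minimal k-means. Data are invented.
def kmeans(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # assign each point to its nearest center (squared distance)
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            groups[i].append(p)
        # recompute each center as the mean of its group
        centers = [
            tuple(sum(x) / len(g) for x in zip(*g)) if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers, groups

# share of time in (content, forum, assessment) per course
courses = [(0.90, 0.05, 0.05), (0.85, 0.10, 0.05),  # "content-heavy"
           (0.20, 0.70, 0.10), (0.25, 0.65, 0.10)]  # "discussion-driven"
centers, groups = kmeans(courses, k=2)
print([len(g) for g in groups])
```

Step 5 (“make up cool names”) is exactly the human interpretation of the resulting centers.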
Finding: Discussions with Low/High Average Use
Compare courses with low forum use to courses with forum use >1 hour per student on average
Summary & Future Directions for DS Research
• Tremendous variation in use of Learn; most use skewed toward low/very low use.
• Importance of time spent in Learn for learning is also tremendously varied (“necessary” vs. “effective” use)
• Critical to account for this variation to understand the potential importance of Learn activity
• Analyze quality of activity in greater depth (e.g. content of assignments, words in forum posts) to get insights into quality of interactions
• Conduct time-series analysis (quantitative methods; design also needed); when someone accesses is more important than if they do.
• Create proxies/derived values for behavior (above average, at average, etc.) by tool
Blackboard Analytics – Product Naming
Data warehouse products
Suite of analytics products
• Analytics for Learn – LMS data
• Student Management – SIS data
• Finance, HR, Advancement – ERP data
• Predictive analytics and early alerts for retention
• Provides data for faculty and advisors about at-risk students
• Formerly Blue Canary
X-Ray Learning Analytics
• Classroom engagement data for faculty
• Activity aggregated into 30+ visualizations
• Currently available for Moodlerooms & self-hosted Moodle only
Past view Current view Future view
Blackboard Analytics – Our Approach & Philosophy
Products that provide insight into
the teaching and learning process
and academic data
A team of experts
in the analytics field
John Whitmer, Ed.D.