Nottingham Trent University Elevates Big Data’s Role in Improving Student Retention in Higher Education
Transcript of a sponsored discussion on how one university uses big-data analysis to track
student performance and identify at-risk students.
Listen to the podcast. Find it on iTunes. Get the mobile app. Download the
transcript. Sponsor: Hewlett Packard Enterprise.
Dana Gardner: Hello, and welcome to the next edition of the HPE Discover Podcast Series.
I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this
ongoing discussion on IT innovation and how it’s making an impact on people’s
lives.
Our next big-data case-study interview examines how Nottingham Trent
University in England has devised and implemented an information-driven way to
encourage higher student retention.
By gathering diverse data and information and analyzing it rapidly, Nottingham Trent is able to quickly identify those students having difficulties. The university has thereby achieved significant reductions in dropout rates while learning more about what works best to usher students into successful academic careers.
What’s more, the analysis of student metrics is also setting up
the ability to measure more aspects of university life and
quality of teaching, and to make valuable evidence-based
correlations that may very well describe what the next decades of
successful higher education will look like.
To learn more about taking a new course and the use of data science in education, we're pleased
to welcome our guest. We're here with Mike Day, Director of Information Systems at
Nottingham Trent University in Nottingham, UK. Welcome, Mike.
Mike Day: Thank you.
Gardner: First, tell us a bit about Nottingham Trent University. It’s a unique institution, with a very large student body, many of whom are attending university for the first time in their families.
Day: That’s right. We've had around 28,000 students over the last few years, and that’s probably
going to increase this year to around 30,000 students. We have, as you say, many, many students
who come from poor backgrounds -- what we call "widening participation" students. Many of them are the first generation in their families to go to university.
Sometimes, those students are a little bit under-confident about going to university. We’ve come to call them "doubter students," and those doubters are the kinds of people who, when they struggle, believe it’s their fault, and so they typically don't ask for help.
Gardner: So it's incumbent upon you to help them know better where to look for help, rather than internalize that. What do you use to identify students who are struggling?
Low dropout rate
Day: We’ve always done very well at Nottingham Trent. We had a relatively low dropout rate, about seven percent or so, which is better than the sector average. Nevertheless, it was really hard
for us to keep students on track throughout their studies, especially those who
were struggling early in their university career. We tended to find that we had to put a lot of effort into supporting students after they had failed exams, which, for us, was too late.
We needed to try to find a way where we could support our students as early as
possible. To do that, we had to identify those students who were finding it a bit
harder than the average student and were finding it quite difficult to put their hand
up and say so.
So we started to look at the data footprint that a student left across the university, whether that
was a smart card swipe to get them in and out of buildings or to use printers, or their use of the
library, in particular taking library books out, or accessing learning materials through our
learning management system. We wanted to see whether those things would give us some
indication as to how well students were engaged in their studies and, therefore, whether they were struggling or not.
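The signals Day lists -- smart-card swipes, library loans, and learning-management-system activity -- could be rolled into a single engagement score along the following lines. This is a hypothetical sketch only: the field names, weights, and caps are invented for illustration, and the university's actual model was derived from the data with the HPE IDOL engine rather than from hand-tuned weights.

```python
# Hypothetical sketch: combining a student's weekly "data footprint"
# signals into one engagement score in [0, 1]. All names and numbers
# here are illustrative assumptions, not the university's real model.

from dataclasses import dataclass

@dataclass
class WeeklyFootprint:
    card_swipes: int    # building/printer smart-card swipes
    library_loans: int  # library books taken out
    lms_accesses: int   # learning-management-system sessions

def engagement_score(fp: WeeklyFootprint,
                     weights=(0.3, 0.3, 0.4),
                     caps=(20, 5, 30)) -> float:
    """Normalise each signal against a cap, then take a weighted sum."""
    signals = (fp.card_swipes, fp.library_loans, fp.lms_accesses)
    return sum(w * min(s, cap) / cap
               for w, s, cap in zip(weights, signals, caps))

week = WeeklyFootprint(card_swipes=10, library_loans=1, lms_accesses=15)
print(round(engagement_score(week), 2))
```

Capping each signal keeps one unusually active channel from dominating the score; the useful comparison, as Day explains later, is between a student's score and the cohort's.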
Gardner: So this is not really structured information, not something you would go to a relational
database for, part of a structured packaged application, for example. It's information that we
might think of as breadcrumbs around the organization that you need to gather. So what was the
challenge for dealing with such a wide diversity of information types?
Day: We had a very wide variety of information types. Some of it was structured, and we had put a lot of effort into getting good-quality data over the years, but some of it was unstructured.
Trying to bring those different and disparate datasets together was proving very difficult to do in
very traditional business intelligence (BI) ways.
We needed to know, in about 600 terabytes of data, what really mattered: which factors, in combination, told us something about how successful students behave, and therefore how students who were not having such an easy time at the university compared with those who were succeeding.
Gardner: It sounds as if the challenges were not only in gathering good information but in how to then use it effectively in drawing correlations that would rapidly point out where students were struggling. Tell us about both the technology side and the methodologies that you use to actually create those correlations.
Day: You're absolutely right. It was very difficult to find out what matters and to get the right
data for that. We needed ultimately to get to a position where we could create great relationships
between people, particularly between tutors or academic counselors and individual students.
On the technology side, we engaged a partner, a company called DTP SolutionPath, which brought with it the HP IDOL engine. That allowed us to submit about five years' worth of back data into the IDOL engine to try to create a model of engagement -- in other words, to pick out which factors within that data, in combination, gave us high confidence about student engagement.
Our partners did that. They worked very closely with us in a very collaborative way, with our
academic staff, with our students, importantly -- because we have to be really clear and
transparent about what we are doing in all of this, from an ethical point of view -- and with my
IT technical team. And that collaboration really helped us to boil down what sorts of things really
mattered.
Anonymizing details
Gardner: When you look at this ethically, you have to anonymize a great deal of this data in order to adhere to privacy and other compliance requirements. Is that the case?
Day: Actually, we needed to be able to identify individual students in all of this, and so there
were very real privacy issues in all of this. We had to check our legal position quite carefully to make sure that we complied with the UK Data Protection Act, but that’s only a part of it.
What’s acceptable to the organization and ultimately to individual students is perhaps even more
important than the strict legal position in all of this. We worked very hard to explain to students
and staff what we were trying to do and to get them on board early, at the beginning of this
project, before we had gone too far down the track, to understand what would be acceptable and
what wouldn’t.
Gardner: I suppose it’s important to come off as a friendly big brother and not the Orwellian "Big Brother" in this?
Day: Absolutely. Friendly big brother is exactly what we needed to be. In fact, we found that
how we engaged with our student body was really important in all of this. If we tried to explain this in a technical way, then it was very much Big Brother. But when we started to say, "We're trying to give you the very best possible support, such that you are most likely to succeed in your time in higher education and reap the rewards of your investment in it," then it became a very different story.
In particular, when we were able to demonstrate those visualizations of engagement to students, the tone shifted completely, and we've had very few, if any, problems with ethical concerns among students.
Gardner: It also seems to me that the stakes here are rather high. It's hard to put a number on it, but for a student who might struggle and drop out in their first months at university, it could mean diminished potential over their lifetime in terms of career, income, and contribution to society.
So for thousands of students, this could impact them over the course of a generation. This could
be a substantial return on investment (ROI), to be a bit crass and commercial about it.
Day: If you take all of this from the student’s perspective, clearly students are investing
significant amounts of money in their education.
In the UK, that’s £9,000 (USD $13,760) a year at the moment, plus the accommodation costs,
and the cost of not getting a job early -- all of those things that students invest in their early university career. To lose that means their university experience is less positive than it could have been, and that their earning potential over a lifetime is much, much lower.
That also has an impact on UK PLC, in that the country isn’t generating as many skilled individuals as it might, which has implications for tax revenues. It matters from a university point of view, too. Clearly, if our students drop out, they aren’t paying their fees, and those slots sit empty, so in terms of university efficiency there was also a problem. Everybody wins if we can keep students on course.
On the journey
Gardner: Certainly a worthy goal. Tell us a little bit about where you are now. I think we have the vision, we understand the stakes, and we understand some of the technologies you’ve employed. Where are you on this journey? Then we can talk about what some of the results have been so far.
Day: It was very quick to get to a point where the technology was giving us the right kinds of
answers. In about two to three months, we got into a position where the technology was pretty
much done, but that was only really part of the story. We really needed to look at how that
impacted our practice in the university.
So we started to run a series of pilots across a set of courses. We did that over the course of a year, about 18 months ago, and we looked at every aspect of academic support for students and how this might change it. If we see that a student is disengaging from their studies -- and we can now see that about a month or two earlier than we otherwise would have been able to -- we can have a very early conversation about what the problem might be.
In more than 90 percent of the cases that we have seen so far, those early conversations result in
an immediate upturn in student engagement. We’ve seen some very real tangible results and we
saw those very early on.
We expected that it would take us a considerable amount of time to demonstrate that the system would give us value at an institutional level, but actually it didn't. About six months into the pilot period, for which we had set a year aside, we were convinced, as an institution, that we should roll it out across the whole university. We did that at the beginning of this academic year, about six months earlier than we thought we might even start thinking about it.
We've now had another year to think about what good practice is, and we're seeing academic tutors starting to share good practice amongst themselves. So there is a good conversation going on there. There is also a much, much more positive relationship between academic tutors and students, reported by both the students and the tutors, which we see as very positive.
Importantly, there is also a dialogue going on between students themselves. We've started to see
students competing with each other to be the best engaged in their course. That’s got to be a good
thing.
Gardner: And how would they measure that? Is there some sort of a dashboard or visualization
that you can provide to the students, as well as perhaps other vested interests in the ecosystem, so
that they can better know where they are, where they stand?
Day: There absolutely is. The system provides a dashboard that gives a very simple
visualization. It’s two lines on a chart. One line is the average engagement of the cohort, on a course-by-course basis. The other line is the individual student’s engagement compared to that course average; in other words, comparing them with their peers.
We worked very hard to make that visualization simple, because we wanted that to be consistent.
It needed to be something that prompted a conversation between tutors and students, and tutors
sharing best practice with other tutors. It's a very simple visualization.
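The two-line dashboard Day describes could be computed along these lines. A minimal sketch, assuming weekly per-student engagement scores are already available; the course data, student names, and numbers are invented for illustration:

```python
# Illustrative sketch of the two-line engagement dashboard: one line for
# the cohort's average engagement per week, one for an individual student.

def cohort_average(scores_by_student: dict[str, list[float]]) -> list[float]:
    """Average engagement score per week across all students on a course."""
    weeks = zip(*scores_by_student.values())  # transpose to per-week tuples
    return [sum(week) / len(scores_by_student) for week in weeks]

course = {
    "student_a": [0.8, 0.7, 0.9, 0.6],
    "student_b": [0.4, 0.3, 0.2, 0.1],  # disengaging: prompts a tutor conversation
    "student_c": [0.6, 0.8, 0.7, 0.8],
}

average_line = cohort_average(course)
individual_line = course["student_b"]
for week, (avg, ind) in enumerate(zip(average_line, individual_line), start=1):
    flag = "  <-- below average" if ind < avg else ""
    print(f"week {week}: cohort {avg:.2f} vs student {ind:.2f}{flag}")
```

Plotting just these two lines keeps the visualization simple enough to prompt a conversation between tutor and student, which is the consistency Day says the university worked hard for.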
Sharing the vision
Gardner: Mike, it strikes me that other institutions of higher learning might want to take a
page from what you've done. Is there some way of you sharing this or packaging it in some way,
maybe even putting your stamp and name and brand on it? Have you been in discussions with
other universities or higher education organizations that might want to replicate what you’ve
done?
Day: Yes, we have. We're working with our supplier SolutionPath, which has now created a model that can be replicated at other universities. It starts with a readiness exercise, because this is mostly not about technology. It's about how ready you are, as an organization, to address things like privacy and ethics in all of this. We've worked very closely with them on that.
We’ve spoken to two dozen universities already about how they might adopt something similar, not necessarily exactly the same solution. We've also done some work across the sector in the UK with the Joint Information Systems Committee, which looks at technology across all 150 universities in the UK.
Gardner: Before we close out, I'm curious. Now that you’ve got the apparatus and the culture in the organization to look more closely at data and draw correlations about things like student attainment and activities, it seems to me that we're only in the opening stages of what could be a much more data-driven approach to higher education. Where might this go next?
Day: There’s no doubt at all that this solution has worked in its own right, but what it has actually formed is a kind of bridgehead, which will allow us to take the principles and the approach we developed around this specific solution and apply them to other aspects of the university's business.
For example, we might be able to start to look at which students might succeed on different
courses across the university, perhaps regardless of the traditional way of recruiting students through their secondary-school qualifications -- looking at what other information might be a good indicator of success in a course.
We could also start looking at the other end of the spectrum. How do students make their way into the world of work? What kinds of jobs do they get? Could we link those, right at the beginning of a student's university career, perhaps even at the application stage, to the kinds of careers they might succeed in, and advise them early on the sorts of things they might want to get involved and engaged with? There's a whole raft of things we can start to think about.
Research is another area where we might be able to think about how data helps us -- what kind of research might we best be able to engage in, and so on and so forth.
Gardner: Very interesting. Clearly educating a student is an art as well as a science, but it
certainly sounds as if you're blending them in a creative and constructive way. So congratulations
on that.
I'm afraid we’ll have to leave it there. We’ve been exploring how Nottingham Trent University in
England has devised and implemented an information-driven way to encourage higher student
retention, and we’ve heard how, by gathering diverse data and information and analyzing it rapidly, Nottingham Trent has been able to quickly identify students having difficulties.
So please join me in thanking our guest. We’ve been here with Mike Day, Director of
Information Systems at Nottingham Trent University. Thank you, sir.
Day: Thank you.
Gardner: I’d like to thank our audience as well for joining us for this big data innovation case
study discussion.
I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of HPE-sponsored discussions. Thanks again for listening, and come back next time.
Copyright Interarbor Solutions, LLC, 2005-2015. All rights reserved.
You may also be interested in:
• Forrester Analyst Kurt Bittner on the Inevitability of DevOps
• Agile on fire: IT enters the new era of 'continuous' everything
• Big data enables top user experiences and extreme personalization for Intuit TurboTax
• Feedback loops: The confluence of DevOps and big data
• Spirent leverages big data to keep user experience quality a winning factor for telcos
• Powerful reporting from YP's data warehouse helps SMBs deliver the best ad campaigns
• IoT brings on development demands that DevOps manages best, say experts
• Big data generates new insights into what’s happening in the world's tropical ecosystems
• DevOps and security, a match made in heaven
• How Sprint employs orchestration and automation to bring IT into DevOps readiness
• How fast analytics changes the game and expands the market for big data value
• How HTC centralizes storage management to gain visibility and IT disaster avoidance
• Big data, risk, and predictive analysis drive use of cloud-based ITSM, says panel