Are Schools Equipped to Address Online Safety in the Curriculum and Beyond?

eLearning Papers • ISSN: 1887-1542 • www.elearningpapers.eu • n.º 28 • April 2012 • In-depth

Authors
Andy Phippen, School of Management, Plymouth University, UK — andy.phippen@plymouth.ac.uk
David Wright, South West Grid for Learning Trust, UK — David.Wright@swgfl.org.uk
Dr Tanya Ovenden-Hope, School of Social Science and Social Work, Plymouth University, UK — Tanya.ovenden-hope@plymouth.ac.uk

Tags
media education, online safety, school self-assessment

This paper explores the data provided by over 1000 schools in the UK related to their online safety policy and practice. By comparing with data from the previous year, we consider the current state of practice among UK schools and analyse progress over a 12-month period.

What is clear from this analysis is that the aspects that use either technological intervention (i.e. filtering) or policy development are generally performing better than those that require long-term resource investment (such as training) or whole school involvement (such as parental education or community understanding). Monitoring and reporting also perform badly. It is interesting to note that even with almost double the number of participating establishments, the strongest and weakest performing aspects remain almost constant across 2010 and 2011, with only slight improvement.

The analytical tool used to gather this data is now being used in pilot projects in the US and Australia. Once it is in full use in these regions, detailed analysis of international performance will be available for the first time. This presents some exciting opportunities to understand, at an international level, how schools engage with online safety and ensure protection of their pupils, staff and wider community.

1. Introduction

The issue of online safety is never out of the media and is a constant concern for schools, who have a duty of care to both staff and pupils, as well as a responsibility to ensure policy is in place to show due diligence related to different aspects of online safety. However, while the focus of much media coverage is on the sensational aspects of the issues (for example, predatory behaviour, cyberbullying), the reality of online safety in schools is far broader, ranging from technical countermeasures such as effective password strategy and content filtering, to ensuring policy is in place to deal with incidents if they arise.

In the UK, a lack of national strategy on online safety has meant that many schools have adopted their own approaches, which the institutions themselves have identified as, in many cases, incomplete and inconsistent. A review of online issues by Tanya Byron (http://media.education.gov.uk/assets/files/pdf/s/safer%20children%20in%20a%20digital%20world%20the%202008%20byron%20review.pdf) proposed a holistic approach to online safety, comprising a broad range of issues from technical measures to wider parental and community education. It called for a “whole school” approach where all staff are involved and engaged in all aspects of online safety and are provided with regular training to ensure their knowledge and practice remain up to date with the ever-changing field.
OFSTED’s Safe Use of New Technologies report (http://www.ofsted.gov.uk/resources/safe-use-of-new-technologies) built on the recommendations of the Byron review, concluding
that outstanding online safety had to have a whole school approach, including pupils, staff, governors, parents and the community in policy and practice, and did not use technology in a locked-down manner.

However, while these important policy documents were welcomed, they also presented schools with a challenge: how to transform this strategic vision of what online safety should be into operational terms.

360 degree safe [www.360safe.org.uk] was launched by South West Grid for Learning Trust [www.swgfl.org.uk] in November 2009 as a means to allow schools to self-evaluate their own online safety provision; benchmark that provision against others; identify and prioritise areas for improvement; and find advice and support to move forward. It provided a tool for schools to firstly understand the breadth of issues associated with online safety and then review their own performance and identify how to improve. It provided summary reports of progression, which helped all staff (not just those charged with the job of implementing an online safety policy) to understand the scope of online safety and what the school is doing about the issue.

In operationalising an online safety “vision”, the tool provided a prioritised action plan, suggesting not just what needs to be done, but also in what order it needs to be done. This is a vital bonus for teachers and managers who approach the issue of online safety for the first time, in a school which has no (or only a very rudimentary) policy.

Understanding Online Safety Policy and Practice with 360 Degree Safe Data

As well as providing a tool for schools to understand and develop their own online safety policy and practice, the tool also collects all submissions into a central database. In building a picture of practice across the UK, this resource is unique in holding data on every school that has engaged with the tool. In September 2010, the first analysis of the 360 degree safe database was published by the South West Grid for Learning (http://www.swgfl.org.uk/Staying-Safe/Content/News-Articles/Largest-ever-survey-of-E-Safety-in-schools-reveals), based upon data returned from 547 establishments across England. The tool has been adopted by many more organisations since this publication, and the data presented here is based upon returns from 1055 educational establishments.

In this paper we present analysis “a year on” from this first report, comparing development over the 12-month period and, for the first time, allowing a comparison of progress to understand how institutions, and online safety policy and practice, have developed in the UK up to September 2011.

2. Methodology

An overview of the 360 structure, detailing the aspects covered, can be found at http://360safe.org.uk/Files/Documents/360-degree-safe-Structure-Map. In total, 28 aspects are detailed by the tool, from technical measures such as filtering and technical security, through policy development, to training and community education. For each aspect a school can give themselves a rating from 5 to 1 (5 being worst, 1 being best). For each rating in each aspect, clear definitions are provided for each level to help the self review process. Establishments carry out the self review via a web interface, and submitted data is stored in a relational database structure which holds the information in a collection of related “tables”, each table related to a specific data element within the system. The three data tables which provide the core for analysis relate to establishments, 360 degree safe aspects, and individual ratings, which detail an entry that an establishment has made against a specific aspect.

Figure 1: 360 data structure (Establishments, Aspects and Rating tables)

Each establishment’s “profile” comprises a number of entries in the rating table, each related to a specific aspect. It is possible for an establishment to have more than one entry in the rating table associated with a specific aspect, which would reflect that they are using the tool for school improvement around online
safety practice. An establishment’s profile will also reflect their current stage.

Given the relational structure of the 360 degree safe data, the primary approach to analysis is through the querying of this data structure using SQL. In addition, summary data was loaded into Microsoft Excel for further statistical analysis and graphing.

Analysis of the data focuses on establishments’ self review of their online safety policy and practice, exploring their ratings against the 28 aspects of 360 degree safe. Aspect exploration allows the measurement of degrees of progression and improvement in the self review, and of those areas where, in general, policy and practice among UK educational establishments requires support to deliver further progress.

It should be noted that the international data (US and Australia) has a slightly different, extended structure for the review aspects, and this will be discussed in more detail later in this report.

It is acknowledged that the data being explored is self reviewed – the establishments give themselves ratings against the aspects and level definitions. However, self review is well established practice within the UK school system, and the level descriptors are very clearly defined. In addition, accreditation visits to date have demonstrated that, in the instances of inspection that have occurred, self review ratings have been generally accurate. Indeed, many schools are generally conservative with their assessments. We also now have a sufficiently large database that “anomalous” returns are very apparent and can be followed up with the school or its local authority.

3. Details of the Establishments Analysed

The vast majority of the data is drawn from English schools, although there are a few from Wales. There are almost three times as many schools now registered to use the tool as there were last year. However, we should acknowledge that not all schools who have registered have embarked on their self review.

Based upon the local authority specified by each establishment, figure 2 details the proportion of establishments from different regions. As we can see, while there is a large proportion from the south west, over half are from other regions. The Midlands also has a strong representation, and there are good spreads across other regions. SCE refers to Service Children’s Education, an agency of the UK’s Ministry of Defence¹, which provides education for MoD employees’ children overseas.

Figure 2: Establishment geography

The “phase” of the establishment responses shows the breakdown between primary, secondary, post-16 and nursery. As can be seen from figure 3, the majority of those registered are from primary schools. It is encouraging, given last year’s analysis showing that primary schools consistently underperform against secondary schools in online safety policy and practice, that the largest area of growth is registrations from that phase of school. While the number of secondary schools has more than doubled, the number of primaries has more than trebled in 12 months.

Figure 3: Establishment “phase”

¹ http://www.sceschools.com/home.php
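The methodology above describes a three-table relational structure (establishments, aspects and dated rating entries) queried with SQL. As a rough sketch only — the table and column names below are invented for illustration and are not the actual 360 degree safe schema — the “best rating per establishment per aspect” extraction that underpins the later analysis might look like:

```python
import sqlite3

# Hypothetical minimal schema mirroring the three core tables described in the
# methodology. Names are illustrative, not the real 360 degree safe schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE establishment (id INTEGER PRIMARY KEY, name TEXT, phase TEXT, region TEXT);
CREATE TABLE aspect        (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE rating        (establishment_id INTEGER REFERENCES establishment(id),
                            aspect_id        INTEGER REFERENCES aspect(id),
                            level            INTEGER CHECK (level BETWEEN 1 AND 5),
                            entered_on       TEXT);
-- level: 1 = best, 5 = worst; entered_on records when the entry was made.
""")
conn.executemany("INSERT INTO establishment VALUES (?,?,?,?)",
                 [(1, "School A", "primary", "South West"),
                  (2, "School B", "secondary", "Midlands")])
conn.executemany("INSERT INTO aspect VALUES (?,?)",
                 [(1, "Filtering"), (2, "Staff training")])
# School A re-rates Filtering over time; multiple dated entries per aspect are allowed.
conn.executemany("INSERT INTO rating VALUES (?,?,?,?)",
                 [(1, 1, 4, "2010-09-01"), (1, 1, 2, "2011-06-01"),
                  (2, 1, 3, "2011-05-01")])

# "Best" (lowest) rating per establishment per aspect, as used for top-level analysis.
best = conn.execute("""
    SELECT e.name, a.name, MIN(r.level)
    FROM rating r
    JOIN establishment e ON e.id = r.establishment_id
    JOIN aspect a        ON a.id = r.aspect_id
    GROUP BY e.id, a.id
    ORDER BY e.id, a.id
""").fetchall()
print(best)  # [('School A', 'Filtering', 2), ('School B', 'Filtering', 3)]
```

Keeping every dated entry, rather than overwriting, is what later allows improvement over time to be measured.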
4. Analysis of Aspect Performance

Top level analysis of practice and policy performance explores responses to the different aspects given by each establishment. The initial analysis explores the “best” rating any establishment has provided, given that this reflects where establishments currently stand in their self review. However, given that 360 degree safe is intended for use to improve as well as evaluate practice, a feature of the 360 degree safe database is that it records any evaluation on a particular aspect made by an establishment, at the time and date of entry. This data can be used to explore which areas are showing improvement in schools.

It should also be noted that it is not necessary for an establishment to have completed the full self review to have its data logged in the tool. Therefore, different aspects have been rated by different numbers of establishments. In total, 559 establishments from our population have carried out the full self review, and 496 additional schools have reviewed at least one aspect. For those establishments that have not completed a full review, figure 4 illustrates the variety of levels of completion to date. It details the number of establishments that have completed each given number of aspects, to show the range of completion.

Figure 4: The number of aspects completed by any establishment that has not completed the full review

This breakdown shows a spread of responses, from those still in the early stages of self review to those nearing completion of the full set of aspects. It is interesting to note that, as with last year, there is a large concentration of establishments who have completed 15 aspects. We would observe that, if the tool was being used in a linear manner, the 16th aspect is Password Security, arguably the first of the aspects being reviewed that might require specialist technical input to make a judgement on the levels. We might hypothesise (but cannot test at the present time) that this may be a reason why some reviews seem to stall at this stage.

In further exploring which aspects are more “popular” with establishments, we can examine each aspect and the number of establishments who have completed a self review of that element. This is detailed in figure 5 and again supports our hypothesis that aspects requiring technical input (or those following aspects requiring technical input) are less “popular” than other aspects. We can see that the two largest drops in aspect completions are around Password Security and Technical Security.

Figure 5: Aspect frequency

The aspects are ordered as they appear in the self review tool, and the pattern presented shows that most establishments take a linear approach to completing the self review. It should be noted that the tool can be used in a non-linear manner, but figure 5 suggests that this is not the case in the majority of establishments.

We will now move from the top level quantity-based returns to look in more detail at each aspect presented, in order to explore areas of strength and weakness across our establishments. We present this data as an approximate “state of the nation” report related to online safety policy and practice in the UK. However, we acknowledge that respondents who have embarked on an online safety self review are likely to be more engaged, as “early adopters”, than those who have not. Therefore, we might make the assumption that the data presented may be better than the average that would result if it were possible to analyse performance in all educational establishments in the country. However, in comparing the results from this year’s analysis with those from the previous year, we will highlight a fairly consistent pattern, even with the addition of a significant number of new establishments. Therefore, this year we can say with higher confidence than last that this does represent a national picture.

Each aspect can be rated by the self reviewing establishments on a progressive maturity scale from 5 (lowest rating) to 1 (highest). In all cases, analysis of the aspect ratings shows an across-establishment maximum rating of 1 and minimum of 5. Therefore, in order to determine cross-establishment performance, average scores for each rating are used to measure areas of strength and weakness in online safety policy and practice. Figure 6 illustrates overall averages across aspects.

Figure 6: Average ratings per aspect
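The cross-establishment averages plotted in figure 6 are, as described, simple means of each establishment’s rating for a given aspect. A minimal sketch of that calculation, using invented ratings rather than the survey data:

```python
from collections import defaultdict
from statistics import mean

# Illustrative (aspect, establishment, level) triples only; 1 = best, 5 = worst.
best_ratings = [
    ("Filtering", "School A", 2), ("Filtering", "School B", 3),
    ("Staff training", "School A", 4), ("Staff training", "School B", 4),
]

# Group the ratings by aspect.
by_aspect = defaultdict(list)
for aspect, _school, level in best_ratings:
    by_aspect[aspect].append(level)

# Lower average = stronger cross-establishment performance on that aspect.
averages = {aspect: mean(levels) for aspect, levels in by_aspect.items()}
print(sorted(averages.items(), key=lambda kv: kv[1]))
```

Sorting by the average directly yields the “strongest aspects” ordering used in the next section.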
6. In-depth
The top 5 aspects across establishments are exactly the same as • Monitoring the impact of policy and practice (3.96)
last year. In 2010 the strongest aspects were: • E-Safety Committee (3.94)
• Filtering (2.57) • Staff training (3.84)
• Acceptable Use Policies (2.78)
And this is the same in 2011:
• Policy Scope (2.8)
• Community understanding (4)
• Digital and video images (2.93)
• Governor training (3.93)
• Policy development (3.02)
• Monitoring the impact of the e-safety policy and practice
In 2011 they are: (3.9)
• Filtering (2.5) • E-Safety Committee (3.82)
• Policy Scope (2.65) • Staff training (3.76)
• Acceptable Use Policies (2.71) All of these aspects require long term development and commit-
• Digital and video images (2.83) ment of resources (for example, regular and up to date training
• Policy development (2.88) or monitoring). As with the strongest aspects, all have improved
to some degree, which is encouraging to see. It is interesting to
There are two points to note in comparing the two sets of as- note that even with more than double the population size, the
pects. Firstly, there has been a slight change in that Policy Scope strongest and weakest aspects have remained very similar. This,
is now ahead of Acceptable Usage Policies. More significantly, again, gives us confidence in the representative nature of our
all of the aspects have improved on last year’s scores. While population data and the consistency of the self review process.
increases are not huge, all aspects have improved by some de-
gree. And while that is encouraging as we remarked upon last Standard deviation is also used to explore the “spread” of rat-
year, the strongest aspects all have either a documentary (i.e. ings in the self review process. This is a useful measure to con-
putting a policy in place, possibly derived from a local authority sider whether an aspect is consistently strong or weak across all
or regional broadband consortia) or technical in nature (which schools, or whether there is variance in the evaluation. A high
again is generally
provided by an out-
side agency or off
the shelf solution).
We see a similar
established trend
with the five low-
est rated aspects.
As we identified last
year these all focus
on education and
long term resource
commitment and
the 2011 weakest
aspects are exact-
ly the same as in
2010:
• Community
understanding
(4.03)
• Governor Figure 7: Standard deviation of aspects
training (4.03)
ing
earn
eLearning Papers • ISSN: 1887-1542 • www.elearningpapers.eu
eL ers
28
u
ers.e
gpap
www
.elea
rnin n.º 28 • April 2012
Pap
6
7. In-depth
standard deviation would mean that different establishments establishments than the previous year. Figure 8 shows the com-
were using a broad range of scores for self review. Figure 7 parison between the two sets of averages, and shows a very
shows the standard deviations across the aspects. similar pattern but an improvement across all aspects.
As with last year, “Filtering” is a high average and low standard We can compare this and last year’s scores and see there is vari-
deviation, which shows that filtering is consistently highly rated ation in the level of change. The “most improved” aspects are
across establishments. Also similarly to last year, other “strong” as follows:
aspects have a broader standard deviation. For example, Digital • Governors (0.16)
images and video and policy scope show that these practices • E-Safety Committee (0.14)
have a greater variance around schools.
• Policy development (0.13)
In considering the weakest aspects, we can see that both Staff • Policy Scope (0.13)
Training and Monitoring and Reporting Incidents have com- • The contribution of young people (0.12)
paratively narrow standard deviations, which would suggest This is positive to see improvement in some areas that are out-
that these aspects have consistently poor performance across side of the “policy or technical” areas. In particular the role of
schools. Another one of the weaker areas of practice – Informa- Governors in the online safety context is particularly encourag-
tion Literacy – also has a low deviation again reflecting consist- ing, given the stewardship of the school strategy and the poten-
ently poor performance. tial for more aware governors to ask questions of senior man-
agement around these issues.
5. 2010/2011 comparison
However, of least improved areas:
While we have used some comparison to last year’s data to
• Information literacy (0.01)
show that there is a consistency and robustness to our data set,
• Parental education (0.01)
we now consider the comparison in more detail. As has been
• Community understanding (0.03)
discussed above, the 2011 data set included considerably more
Figure 8: 2010/2011 average rating comparison
ing
earn
eLearning Papers • ISSN: 1887-1542 • www.elearningpapers.eu
eL ers
28
u
ers.e
gpap
www
.elea
rnin n.º 28 • April 2012
Pap
7
• M&R Incidents (0.04)
• Personal data (0.04)

The majority, again, are those that require long term investment. Recent research around the abuse of professionals by students and other members of the school community (http://www.swgfl.org.uk/Staying-Safe/Content/News-Articles/The-Online-Abuse-of-Professionals) highlights how important strong community and parental engagement are in matters of online safety. However, our data would suggest these are still weak areas showing little sign of improvement. We would also observe that personal data is a key area of concern for those working with schools, where establishments might be opening themselves up to potential data protection prosecution. Our data would show that this is an area of weakness that is not improving.

Figure 9 shows a comparison between 2010 and 2011 standard deviations. Again we see a consistent shape to the spread of data, and this time greater variance in increases and decreases in scoring. A change in standard deviation does not mean something has become “better” or “worse”, but it can show whether something has become more dispersed in terms of practice.

For example, we can see a slight increase in spread for filtering, personal data and information literacy, while observing a reduction for staff training and parental education, both areas of concern from the broad exploration of online safety. Community understanding, again highlighted as an area of concern, has also experienced a narrowing of standard deviation (therefore another area of consistently poor practice).

Figure 9: 2010/2011 standard deviation comparison

6. Primary Improvement, Secondary Stationary

As previously, the comparison of performance for primary and secondary establishments presents us with some very interesting comparisons. Figure 10 shows the difference between average ratings in the primary and secondary populations in 2011.

We can see that, in general, primary establishments still report their performance as consistently weaker than their secondary counterparts. This is not surprising given the difference in resources available in a lot of primary settings.

However, one of the most interesting things to draw from this comparison is that primary schools are “catching up” in terms of their policy and practice.
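The spread analysis behind figures 7 and 9 rests on the per-aspect standard deviation of ratings, compared across the two years. A small sketch with invented numbers (not the survey’s ratings) shows the calculation and the narrowing/widening comparison:

```python
from statistics import pstdev  # population standard deviation

# Invented per-establishment ratings for two aspects in each year; 1 = best, 5 = worst.
ratings_2010 = {"Filtering": [2, 2, 3, 3], "Staff training": [4, 4, 4, 3]}
ratings_2011 = {"Filtering": [2, 3, 2, 3], "Staff training": [4, 4, 4, 4]}

for aspect in ratings_2010:
    sd10 = pstdev(ratings_2010[aspect])
    sd11 = pstdev(ratings_2011[aspect])
    # A narrowing spread means practice has become more uniform across schools,
    # regardless of whether the average itself improved or worsened.
    trend = "narrower" if sd11 < sd10 else "wider or equal"
    print(f"{aspect}: {sd10:.2f} -> {sd11:.2f} ({trend})")
```

Whether the population or sample standard deviation (`pstdev` vs `stdev`) was used is not stated in the paper; the comparison logic is the same either way.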
Figure 10: Primary/secondary comparison 2011
Figure 11: Primary/secondary comparison 2010
If we consider the 2010 data, in some aspects the average rating differed by more than a whole level:

• Whole School (1.5 difference)
• Community understanding (1.23)
• Mobile phones and personal hand held devices (0.96)
• Password security (0.93)
• Technical Security (0.81)

However, with the 2011 data these differences have greatly reduced:

• Mobile phones and personal hand held devices (0.78)
• Password security (0.64)
• Email, chat, social networking, instant messaging (0.54)
• E-safety education (0.46)
• Technical Security (0.42)

If we break the 2010/2011 comparison down between primary and secondary schools, as detailed in figures 12 and 13, we can see clearly that there is a far more dramatic increase in performance in primary schools. In the strongest areas of improvement, almost a quarter of a level has been gained over the last year:

• Whole School (0.28)
• Technical Security (0.26)
• Professional standards (0.26)
• Governors (0.23)
• Password security (0.22)

In contrast, secondary schools, when the data is isolated, show little, if any, improvement. In some cases, there has been a reduction in performance:

• Technical Security (-0.13)
• Professional standards (-0.11)
• Governor training (-0.09)
• Password security (-0.07)
• Information literacy (-0.07)
• Community understanding (-0.07)

Figure 12: Comparison of primary school averages 2010-2011
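The phase-level improvement figures above are differences between each phase’s 2010 and 2011 average ratings, where a positive difference means movement towards the best rating of 1. A sketch of that calculation — the underlying averages here are invented, chosen only so the Technical Security differences match the 0.26 and -0.13 reported above:

```python
from statistics import mean  # imported for the general case of raw rating lists

# Invented average ratings per (phase, year); 1 = best, 5 = worst. These are NOT
# the survey's actual figures, only illustrative values.
averages = {
    ("primary", 2010):   {"Technical Security": 3.9,  "Password security": 3.8},
    ("primary", 2011):   {"Technical Security": 3.64, "Password security": 3.58},
    ("secondary", 2010): {"Technical Security": 3.1,  "Password security": 3.1},
    ("secondary", 2011): {"Technical Security": 3.23, "Password security": 3.17},
}

def improvement(phase: str, aspect: str) -> float:
    """Positive = improvement: the average moved towards 1 between 2010 and 2011."""
    return round(averages[(phase, 2010)][aspect] - averages[(phase, 2011)][aspect], 2)

print(improvement("primary", "Technical Security"))    # 0.26
print(improvement("secondary", "Technical Security"))  # -0.13
```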
Figure 13: Comparison of secondary school averages 2010-2011

7. Conclusions

This paper has explored a number of aspects of the data provided by over 1000 schools in the UK related to their online safety policy and practice. By comparing with similar analysis from the previous year, we were able both to consider the current state of practice among UK schools and to analyse progress over a 12-month period.

What is clear from this analysis is that the aspects that use either technological intervention (i.e. filtering) or policy development are generally better performing than those aspects that require long term resource investment (such as training) or whole school involvement (such as parental education or community understanding). Monitoring and reporting also perform badly. It is interesting to note that, even with an almost doubling in the number of participating establishments, the strongest and weakest performing aspects remain almost constant across 2010 and 2011, with only slight improvement.

However, more in-depth analysis of the data shows a more interesting picture, which presents evidence that primary schools are clearly improving in their performance, while secondaries are remaining stationary or, in some cases, degrading slightly in performance.

This analysis does, however, only present the very high level overview of what is possible with this unique resource. Further analysis is possible at any level of comparison, from a national picture to regional analysis, and even consideration of different institutions in the same area. Since this analysis was performed, significantly more schools have engaged with the tool, almost 1,000 having now carried out a full profile. In addition, the tool is now being piloted in the US and Australia, through the Generation Safe project [http://generationsafe.ikeepsafe.org]. Once this tool is in full use in these regions, detailed analysis of international performance will be possible for the first time. This presents some exciting opportunities for understanding how schools internationally engage with online safety and ensure protection of their pupils, staff and wider community.
Edition and production
Name of the publication: eLearning Papers
ISSN: 1887-1542
Publisher: elearningeuropa.info
Edited by: P.A.U. Education, S.L.
Postal address: c/Muntaner 262, 3r, 08021 Barcelona (Spain)
Phone: +34 933 670 400
Email: editorial@elearningeuropa.info
Internet: www.elearningpapers.eu

Copyrights
The texts published in this journal, unless otherwise indicated, are subject to a Creative Commons Attribution-Noncommercial-NoDerivativeWorks 3.0 Unported licence. They may be copied, distributed and broadcast provided that the author and the e-journal that publishes them, eLearning Papers, are cited. Commercial use and derivative works are not permitted. The full licence can be consulted at http://creativecommons.org/licenses/by-nc-nd/3.0/