This document discusses how research assessment and funding criteria are changing to focus more on real-world impact. It provides examples of initiatives that emphasize engaging with non-academic audiences and applying research to benefit society. The document also offers suggestions for researchers to demonstrate impact, such as publishing practitioner commentaries alongside papers, participating in research learning communities, and co-creating articles with industry professionals. Overall, it encourages researchers to consider how to communicate their work to relevant end-users and incorporate impact planning from the beginning of the research process.
3. Research assessment and funding criteria are changing.
• Policymakers recognize that research outputs can drive economic
growth and sustainable development
• Individual career progression and institutional funding increasingly
rely on evidence of impact
• Standard Evaluation Protocol, Netherlands
• Excellence in Research for Australia (ERA)
• Research Excellence Framework (REF) and Research Councils UK (RCUK)
• San Francisco Declaration on Research Assessment (DORA)
• Government, industry, private, and charitable funders look for
engagement outside academia for societal benefit
6. We can build bridges between research and practice.
• The university’s central research office can help
• Open science and innovation in outputs
• Impact literacy
• Appoint practitioners to editorial boards
• More bite-sized content
• Publish more case studies
7. Impact literacy can be taught and encouraged.
• Impact Literacy Workbook
• Researcher toolkits
• Institutional health checks
• Mentoring workshops
• Impact Awards
• Real World Impact Blog
8. Practitioner commentaries alongside papers.
• Forthcoming 2018 Special Section:
Marketing as an Integrator in Integrated Care
• Papers published alongside practitioner commentaries
—Dr. Áine Carroll
National Director, Clinical Strategy and Programmes
Health Service Executive, Ireland
“The challenge for social scientists and clinicians like me is
how do we co-create relevant, worthwhile research that
contributes to our shared goal of integrated care? I believe
that integrating quantitative and qualitative research
methods will lend depth and clarity to integrated care in a
way that is synergistic and most importantly, gives the
voice of the people we serve primacy.”
9. Research Learning Communities.
• Describes Research Learning Communities: groups of teachers,
facilitated by a university researcher, who engage with research
evidence in order to enhance practice
• Research-informed interventions improved writing outcomes of
children by 26%
—Professor Chris Brown (author)
School of Education and Childhood Studies
University of Portsmouth
“This RLC case study provides an example of how research
activity, and the understanding that emerges from it, can
be developed into an intervention (the RLC model) which
itself can successfully drive improvements in classrooms
and schools.”
10. Co-created articles.
• Features 14 contributors from academia, associations, consultancy and
industry on an economic development issue of regional significance
—Profs Ibrahim Ajagunna, Fritz Pinnock and Tom M. Amonde
Caribbean Maritime University
“[The theme issue] enabled us to work with a team with varying
academic backgrounds and specialisms and to share in their
professional experience… We know that new learning emerges
from WHATT’s approach to asking questions that are important
to practitioners.”
11. Supporting the conversation.
• Spark debate
• Engage with the research management community
• Impact case studies
• Connections
#realworldimpact
Emeraldgrouppublishing.com/realworldimpact
12. Impact is a series of small steps.
The ‘who’ of impact.
“Who will be most interested in this research? Who is the main
audience? Who has the decision-making power in this area
and can mobilise change? Having conversations with the right
people will help to inform your impact plan and tailor it to the
‘end-user’ or ‘beneficiary’: the individual, group or
organisation that will ultimately benefit from your research.”
—Harriet Barker
Academic Engagement Manager
The National Archives
Two for the price of one.
“Research Councils ask that you provide an “acceptable”
Pathways to Impact statement before they release any
funding. If you don’t consider impact now, you may well
find yourself applying for a separate “impact” grant later.
A whole new proposal when you could have just written it
into the original!”
—Stephen Kemp
EPSRC portfolio manager and university impact officer
It’s almost exactly 3 years since the Leiden Manifesto was published, in which the authors called for research evaluators to recognise that
“Scientists have diverse research missions. Research that advances the frontiers of academic knowledge differs from research that is focused on delivering solutions to societal problems.”
Research assessment and funding criteria are increasingly changing to reflect those different missions and to better gauge the real-world impact of research.
Just two months ago, the seven UK research councils signed the DORA principles, stating that they “consider the journal impact factor and metrics such as the H-index… not appropriate measures for assessing the quality of publications or the contribution of individual researchers”.
And HEFCE has confirmed that case studies that document impact will count for 25% of a university’s score in the 2021 REF.
But many of the impact metrics we use, even responsible ones, fall short of evidencing real-world change. Worse, some say that they can encourage short-term thinking and can inadvertently mask the real value of the research.
Demonstrating this impact, though, is not an easy task, and in many disciplines it is very difficult to do. Measures of what some call ‘influence’, such as citations, can be documented. Social and mainstream media, consultancy and speaking opportunities can be tracked. But linking real change in practice or public opinion back to research is often referred to as a ‘leaky pipeline’.
For many researchers, real change is difficult to prove and hard to dedicate time to. Last year, Emerald conducted research with 1,600 of our authors. Although we publish in applied fields, only 36% of our authors felt incentivised to engage with non-academic collaborators and audiences. Many cited structural disincentives, such as common academic recruitment and promotion practices, which in some countries are still based primarily on numbers of high-impact-factor publications.
There is a role for many of us in the scholarly ecosystem to help bridge the research–practice gap.
Collaborative research between academia and industry is often very high quality but those networks can be difficult to create. A few days ago I read on a blog about co-produced research that ‘knowledge is not plug and play. Relationships matter.’ The central research office has a role to play here. There are also initiatives like the Royal Society’s pairing scheme which gives policymakers and research scientists an opportunity to experience each other’s worlds.
Open science and dissemination of a wider range of research outputs through more varied channels is a joint responsibility of researchers, funders and publishers.
Publishers can appoint practitioners and policymakers to editorial boards, ask for ‘implications for practice’ sections in accepted papers, and invest in smart technologies to make content more accessible, such as the single-sentence summaries which Emerald have just started publishing.
One way to reach beyond other researchers is to publish more case studies and teach the outcomes of research in the classroom, particularly in applied courses.
Impact literacy can be taught and encouraged through Researcher toolkits, Institutional health checks and Mentoring workshops.
At Emerald, we have recently set up a formal partnership with Dr Julie Bayley, Director of Research Impact Development at University of Lincoln to set up standards, tools and awards on teaching and creating impact.
We have some copies on our stand of an impact literacy workbook we have recently produced if anybody would like to see one.
Here are a few examples of very REF-able work that publishers can pretty easily support. One is to publish practitioner commentaries alongside papers in a special issue.
Another is to commission the publication of the outcomes of research learning communities, where practitioners and academics work together to put research evidence into practice. This recent book we published shows that children's writing outcomes improved by 26% when teachers engaged with research evidence, facilitated by an academic researcher.
Relevance to industry can be lacking in some academic papers. We have sought to tackle that in this journal via a question-based publishing format that creates engaging articles with a step-by-step response to a key strategic question. Each issue tackles a particular real-world theme and features a range of contributors from different backgrounds.
In addition to co-created publications, publishers can play an important role in sparking debate around impact and supporting the conversation. At Emerald we’ve launched a Real World Impact campaign, together with an advisory board and a range of contributors across academia, practice and the research management community. We’re giving a platform to different voices and showcasing grassroots examples of impact demonstration.
There isn’t a gold standard of impact, but there are sources of help. Many say that impact is a series of small steps. Impact officers can help guide researchers to engage with stakeholders early in the funding bid and to write a good Pathways to Impact statement or knowledge transition plan, as most funders will assess this before releasing any money. I’d be interested to hear from librarians at this conference who are also engaged in impact demonstration on behalf of their institution.
And this goes for publishers too. We should all make a commitment to ensuring the research we publish is as impactful as possible. The way we do that will differ from publisher to publisher and by discipline. But as researchers and institutions look beyond usage metrics, citations and media activity as indicators of research impact, so must publishers.
What can we do to transform content, to make it more accessible, to engage with stakeholders outside of academia with whom we may not be as familiar, and to develop products for these audiences?
Can we support researchers with impact literacy services? Impact out in the real world can look quite different from the metrics we have engineered our businesses around, although I’m sure we would all agree that our belief in the value of research to shape and influence positive societal outcomes hasn’t changed for centuries.