A tour of Kudos to show the context in which it was developed (competition for funding, growing impact agenda, huge growth in outputs, fight for visibility and usage, “off-grid” sharing), our vision (more impact for research, more recognition for researchers), the platform through which we do this (a central system for explaining publications in plain language, managing sharing across multiple channels, and measuring effect across multiple metrics), the extent to which it works (use of the Kudos toolkit correlated with 23% higher downloads of full text on publisher websites) and how this data is made available to institutions (libraries, research offices and communications teams).
ICOLC 2016: Boosting visibility and impact of published research
1. Boosting visibility and impact of published research
Charlie Rapple
Kudos Co-Founder
@charlierapple @growkudos www.growkudos.com
2. KUDOS helps institutions mobilize researchers to undertake more outreach around their work and thus increase visibility and impact
3. Why?
competition for funding
growing impact agenda
huge growth in outputs
fight for visibility and usage
“off-grid” sharing
4. What is Kudos working to achieve?
More impact for research • More recognition for researchers
Better evidence to help researchers and institutions use communications more effectively to drive impact
Better collaboration between these groups to maximize results of each other’s efforts to create impact
5. How do we do it?
Centralize how researchers manage communications around their work
Map results of these efforts to a range of metrics
Give institutions “actionable insights” and longer-term intelligence from the evidence and patterns that emerge
6. Does it work?
Nanyang Technological University study, 2015: explaining and sharing via Kudos correlated with 23% higher downloads of full text on publisher sites
7. Use of Kudos tools is correlated with 23% more full-text downloads. Increase the impact of your publications, and map actions to results, with the simple toolkit from growkudos.com:
1. REGISTER (one-off)
2. FIND A WORK
3. EXPLAIN
4. SHARE
5. MEASURE
17.
Gather resources for impact case studies, repurpose plain language content for press or discovery
Look at a range of metrics in one place – learn what works and shape other activities and guidance
Centrally track, measure, interact with and amplify all researchers’ comms about their work
Learn about, profile and reward researchers who act for impact
19. Kudos helps you get the most out of other tools
You can get more insight into communications via Twitter, Facebook, ResearchGate etc. by using Kudos’ trackable links
You can connect publications from ORCID with related resources, e.g. Figshare, and use Kudos to share the overall story
You can compare performance across a range of metrics, and map communications to your publications’ results
You can increase visibility & measure effectiveness of content posted in institutional systems or public engagement services
20.
Kudos for Institutions
Annual subscription
Rolling start
Tiered pricing from £3k
Jisc agreement – discounts for UK
Contact me to discuss an agreement and
discount for your consortia!
charlie@growkudos.com
Editor’s notes
We’re a relatively young company set up 3 years ago, to help institutions mobilize researchers to undertake more outreach around their work and thus increase visibility and impact
Our web-based platform uses metrics to show researchers that small communications efforts can have a big effect in terms of the visibility, usage and impact of their work.
We provide them with evidence to help justify spending 10-15 minutes, each time they publish, to ensure that work finds its audiences.
Last year, Kudos worked with pilot partners around the world to develop an institutional service that draws on data from the more than 100,000 researchers signed up so far for our free researcher service.
Before I explain the services, I thought it would be useful to set the context by looking at the challenges we’re trying to help researchers and institutions to address.
There are many pressures in our environment affecting both researchers and their institutions.
In most countries, funding is at an all-time low, so winning grants is increasingly competitive and requires more and more evidence of past performance.
We’re bringing multiple metrics together in one place to make it easier for researchers and institutions to summarise and learn from past performance.
These funding challenges are then combining with a drive for greater public engagement and accountability, and creating a growing expectation of maximum research impact.
So we’re focused not only on measuring performance, but also providing the tools, guidance and motivation to improve it.
In the meantime, more and more research is taking place and therefore the literature is growing at an ever faster pace.
As a reader it’s hard to keep up with all the material in your field; as an author, it’s hard to ensure that your work finds its audience.
So we’re motivating researchers to add high-level summaries of their work, by showing them clearly how that is increasing readership – as readers, this helps them digest more of the literature; as authors, that in turn increases the likelihood of their work being found, read and applied.
Finally we’re seeing growth in “off-grid” sharing, with researchers posting their work in sites like ResearchGate. As an institution, this can be a concern for a number of reasons – for example, concerns about copyright compliance, or the fact that the institution has no view onto usage in those channels as it would, say, with repository content or content on publisher sites. “Off-grid” sharing cuts the institution out of the picture in terms of being able to support or derive intelligence from that activity.
So we’re working to provide an alternative – better – approach that still encourages researchers to share their work, but enables them to do so outside of the “walled gardens” that sites like ResearchGate and Academia.edu effectively are. It’s our wider range of metrics, and the boost to performance that comes from collaboration with and support from the institution, publisher, society and funder, that attracts researchers to begin using Kudos to manage their outreach.
In that context, our vision is “More impact for research, more recognition for researchers” and our tools are specifically focussed on creating impact, rather than just tracking or measuring it.
From an institutional perspective, we want to provide better evidence to help you and your researchers use communications more effectively to drive impact.
We also want to be the nexus for better collaboration between researchers, institutions, publishers, societies and funders, to maximize the results of each other’s efforts to create impact by managing those efforts through the same system rather than in separate silos.
How do we do this?
Firstly, we provide a central platform for researchers to manage communications around their work. This includes a “profile” page for each publication where the researcher can add a plain language explanation – this helps with the discoverability of the work, and they can also add links to related materials which further help people to find or understand the work. The second part of the communications piece is that we provide trackable links for researchers to use when they share their work – whether that is via email, social media or academic networks, or even offline in presentations and reading lists. This is really unique as it means Kudos is logging a much wider range of the ways people share their work – not just public channels like social media, but the other kinds of communications that make up the bulk of how researchers interact!
We then map the results of these efforts to a range of metrics – attention data from Altmetric, Times Cited counts from Web of Science, full text download counts from publishers, and communications metrics including clicks and views.
This provides a rich dataset from which the researchers themselves, but also their institutions and publishers, can derive insight – which types of communication are most effectively increasing the readership and impact of our work? Where should efforts be focused in future?
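The trackable-link mechanism described above – a unique short link per share, so that clicks can be attributed back to a specific channel and publication – can be sketched in a few lines. This is a hypothetical, simplified model for illustration only (the class name, URLs and DOI are invented; it is not Kudos’ actual implementation):

```python
import secrets
from collections import Counter

class TrackableLinks:
    """Toy model of channel-attributed share links: each share of a
    publication gets its own short URL, so clicks can be credited to
    the channel (email, Twitter, a slide deck, ...) it was shared on."""

    def __init__(self, base_url="https://example.org/r/"):
        self.base_url = base_url   # hypothetical redirect service
        self.targets = {}          # token -> (publication DOI, channel)
        self.clicks = Counter()    # token -> click count

    def generate_link(self, doi, channel):
        """Mint a unique short link for one share of one publication."""
        token = secrets.token_urlsafe(6)
        self.targets[token] = (doi, channel)
        return self.base_url + token

    def record_click(self, token):
        """Resolve a short link: log the click, return the redirect target."""
        doi, _channel = self.targets[token]
        self.clicks[token] += 1
        return "https://doi.org/" + doi

    def clicks_by_channel(self, doi):
        """Aggregate clicks per channel for one publication - the raw
        material for 'which channel drove the most readers?'."""
        out = Counter()
        for token, (d, channel) in self.targets.items():
            if d == doi:
                out[channel] += self.clicks[token]
        return dict(out)
```

Because every share goes through its own link, the resulting click counts can then be laid alongside download and citation metrics on a timeline, which is the correlation the metrics pages visualize.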
The obvious question to ask at this point is whether these efforts by researchers to explain and share their work do actually improve the performance of those publications.
We were approached about this time last year by the Altmetric team at Nanyang Technological University in Singapore, who asked if they could take an export of our data and analyse the effectiveness of what we’re doing. There were about 5,000 articles in a test group, which had been shared through Kudos, and a similarly sized control group of articles for which the Kudos tools had not been used.
The team looked at average downloads of these articles on the various publishers’ websites, and found that articles for which the Kudos tools had been used had 23% higher downloads on average than those articles for which the tools weren’t used. This study is just being submitted for publication in PLOS – and you can imagine those results were a fantastically exciting moment for us in terms of validating the last few years of our lives!
So let’s take a look at the workflow for researchers using Kudos to explain and share their work.
It is a simple process – it takes perhaps 20 minutes the first time a user does this, and then perhaps 10 minutes each time they have a new publication.
The one-off registration process is typical – name, title, institution, academic specialty – and it’s possible to sign up using Facebook, Twitter or LinkedIn if you want to streamline that process.
You then search for publications to add to your account – or connect your Kudos and ORCID accounts to automatically import anything you’ve already added in ORCID, now and ongoing.
We recommend that people start by just adding and explaining one publication, so they can test whether Kudos works for them before beginning to use it for everything they’ve ever published.
In a moment I’ll look in detail at the explain, share and measure steps but note that I’ve put them in a circle here because people do this process once, typically for the most recent thing they’ve published, and then when they see their results, they will come back and do it again and again.
Drilling down then, this is an example of a publication profile page in Kudos. You can see the original metadata from Crossref alongside the journal cover, and the big blue button linking to the full text on the publisher site on the right hand side.
Here I’ve highlighted the explanatory content added by the researcher – so at the top, there’s a short, or alternative, title for the work – perhaps simpler than the original title, or more descriptive if (for example) the work is a book chapter that doesn’t really have a title of its own.
Further down the page, the researcher answers two simple questions: what is this about, and why is it important? Our research way back at the beginning of Kudos showed that these were the two key questions that people want to answer when they are scanning the literature.
Just below that you can see the “perspective” field – here, each contributor to the work can add their own statement, perhaps a different view on “why this is important” based on their specific role in the project – or often, some interesting thoughts about why they undertook the work, or who inspired them, or how they are now moving forward. This is rich stuff – opening up the story behind the research – giving people in institutional press offices and communications roles some useful hooks on which to hang any media stories they want to create around the work.
Finally on the right you can see some “related links” that the researcher has added – in this case, to a press release, and some BBC coverage – people also use this field to link to data, slides, video, figures, code, and often the repository version of the work, for which we’ve recently introduced a new label “open access version” so that we can play our part in helping to increase visibility and usage of content in repositories – the plain language explanations can be a new metadata set for that content, making it more discoverable, and then the resources area can provide the link.
I’m often asked whether being able to explain research in plain language isn’t a specialist skill. I can tell you that 99% of the plain language summaries I read in Kudos are very accessibly written, and our timestamps show people typically put these together in 5-10 minutes. So in our experience people are able to do this, given the right context – in all the steps they take to get to this point, we remind them that Kudos aims to help them reach a broader audience.
None of those fields are mandatory, and the researcher could choose to jump straight to the “Share” tab that I’ve shown here, and either connect their social media accounts to share directly from here to Facebook, Twitter or LinkedIn, or simply use a “generate link” button to copy a trackable link that can then be pasted into email, academic networks, blog posts, presentations and so on.
The trackable link is what enables Kudos to help researchers measure the success of their communications efforts, compare across them, and map these directly to a range of metrics.
These snapshots from the metrics pages in Kudos show this unique “table” (top right), which is the only place researchers can see “how many times did I share my work? How many people clicked on those links? How many of them went on to read the publication? What has that done to my Altmetric score, or my Times Cited count?” – and I think this is gold dust! Because what is the point of communicating if you can’t drill down to whether it’s actually encouraging more people to look at your work?
There are then “detailed metrics” graphs which actually map the specific actions against each other and against the metrics, on a timeline, to really show the correlations between explaining and sharing, and improved results.
On the left here you can see the graph which shows full text downloads on the publisher site (in orange) increasing shortly after the author’s first action in Kudos (marked by the purple “A”), and continuing to increase bit by bit as the author takes follow-up actions (when you mouse over the letters, you see the action taken – for example “shared via LinkedIn”). When the publisher has taken an action, that is also marked on this graph with a “P” so everyone can see how different efforts are coming together to improve results.
On the right is a similar graph showing how the actions have affected the Altmetric score for the work.
I just want to pause there and really stress the uniqueness, the novelty, and the importance of being able to map actions to results in this way.
Before we started Kudos I did a lot of work around research communications, from speaking skills to help people present their work at conferences, to social media training.
One of the things that became clear to me is that researchers are at the centre of the most relevant network in terms of spreading the word about their work – but often don’t really use that network.
Sometimes this is because they feel uncomfortable blowing their own trumpet. Sometimes it’s because they think the journals in which they have published will be read by anyone who might be interested.
Mostly it’s because people don’t have much time, and didn’t see any direct benefit of communications efforts.
Metrics related to their work were either not available – most publishers still don’t provide article-level usage to authors, for example – or the metrics were too disconnected from communications efforts – how do you know whether a bit of blogging and tweeting one year has anything to do with a rise in citations of your work the following year?
I meet people who are great at doing outreach – and interested in their metrics – but struggling to put those two things together because the actions and results happen in different websites.
It became really obvious to me that we need to put those datasets together – you need to map communications efforts against publication performance metrics so that it is much clearer when your efforts are correlated to improved results.
This is what we set the Kudos platform up to do and so I still get very excited showing people how we’re managing to do that.
I’m often asked how Kudos is different to some of the services shown here, so let me provide some quick answers around that:
Kudos is not a networking site. We’re the place where you can measure and compare the effectiveness of all the sites you already use for communication and networking, such as Twitter or Facebook
Kudos is not a metric; we bring together a range of metrics to help you determine which of your communications efforts are most effective, and then you can focus future efforts accordingly and save time as a consequence.
Kudos basically puts the researcher in control of the performance of their work – it’s not about measuring impact, so much as creating that impact in the first place.
And here’s a snapshot of the dashboard that each researcher has which shows them all of their publications, regardless of where they’ve published – so they don’t have to learn different ways to do this depending on where they’ve published, and they can compare results across different publishers. And similarly they can use Kudos as the central system for managing any kind of communication, so again they can compare across those channels – and look at their results across a range of metrics. All in one dashboard!
And just to look at some examples of the successes people have had: Martin, here on the left, explained his work, shared it in 3 places, and got 18 new readers – while Tony and Sean, on the right, collaborated in their use of Kudos, shared by email as well as social media, and not only ended up with 67 new readers, but were able to see that Twitter led to more readers than the other channels – so they might want to focus on continuing to build their Twitter network, and invest less time in Facebook in future.
Meanwhile all of this data – these actions, and their results – is being compiled to give the same kind of intelligence, at an aggregate level, to organizations with which the researchers are affiliated.
Here is the dashboard for one of our institutional subscribers, University of Huddersfield.
It includes a high-level summary of how many of their researchers are active, how many publications they have explained and shared, and how many page views in Kudos they have achieved so far. It also includes a quick view of the publications that have most recently been explained, the researchers who have recently been active, and this awesome Twitter widget that makes it really easy to track people’s tweets about their work, and to interact with those right here with just one click – rather than having to find and follow those researchers manually, and filter out all the other things they tweet about. We know which researchers are affiliated with you because they tell us during registration, so any posts they share about their work we surface for you here, while ignoring all their posts on other topics. You just have to log in to your dashboard and with one click it’s “like” this, “retweet” that, “reply” to the other.
In the middle of the page you can see some examples of the reports that Kudos provides – this is your “favourites” area where you can put shortcuts to the reports you use most often from the …
… 30 something reports that we provide!
And these reports really help you drill down into our data, so you can start to answer questions like:
Which kinds of researchers are more or less likely to be making an effort to increase the visibility and impact of their work?
How and where are they communicating?
How do they describe their work, and what kinds of resources do they connect it to that might further help to set it in context, or demonstrate its impact?
What effect does all of this have? What actions, resources or channels are more or less likely to be increasing impact?
The answers to these kinds of questions can be really powerful, in terms of:
enabling institutions to recognize and reward researchers for a broader range of impact actions and results
Increasing the effectiveness of the institution’s communications, or its guidance to researchers, based on this unique ability to understand which actions and channels are effective
Increasing the discoverability of the institution’s work, whether on Kudos or in your own website or repository
Because you can use our “widgets” to pull the content that your researchers create in Kudos back into institutional websites, or your repository – and this is a key point, in that Kudos isn’t really setting out to be a destination site. We’re a toolkit that motivates authors to do more outreach around their work, helps them manage that process, and brings together metrics to reward them for doing it – but for the content that they create, we’d much rather get that out into existing discovery workflows than try to replace them and become a discovery platform ourselves! So our widget helps with that, and we distribute this content as widely as possible – it’s indexed by Europe PubMed Central, for example, and indeed these widgets are used by PaperHive, which you’ll hear about shortly (or have just heard about!)
So in conclusion: Kudos is a unique system for collaboration between researchers and institutions in pursuit of greater impact.
It helps you get the most out of other tools you might be using –
So if you or your researchers are using social media or academic networks, we help you connect the dots and determine whether those kinds of activities actually do help to increase publication performance.
If you’re using ORCID, we can help researchers get extra value from that, and also be a hub for connecting their publications from ORCID with other materials they may have put online – figures, videos and so on.
If you license metrics such as citations or Altmetrics, we can help you look at those figures in one place and see whether and how they are driven by communications efforts.
And finally if you have a repository, we can help you increase the visibility and usage of that content.
Our institutional service is based on an annual subscription with an any-time start. We have price tiers of £3,000, £6,000 and £9,000, or approximate local equivalents – if you’re in the US, or indeed in any non-UK currency, the conversion rates at the moment make it a very good time to work with us! Our current subscribers include the University of Huddersfield, the London School of Economics and Political Science, Carnegie Mellon University, the University of Kent, and the University of Technology Sydney – so we’re building up customers around the world. We’ve also signed a Jisc agreement recently, which means there is a standard pre-approved agreement and discount for UK institutions, and we’d be happy to negotiate something similar with any other big consortia if you think your members will be interested in our service – please just get in touch!