This document discusses the need for new metrics to measure the efficiency and quality of research funding. It argues that current metrics like impact factor and citations do not adequately measure these factors. It presents problems with the current system, including a lack of efficiency due to specialized research silos and reproducibility issues. The document proposes a marketplace called Science Exchange that would provide researchers centralized access to expert service providers to validate results, improving efficiency and reproducibility. It discusses short-term goals like tracking these new "altmetrics" and long-term goals of promoting a cultural shift towards more efficient and reproducible research.
2. Why/how do we measure impact?
Need to assess the impact of research funding
➡ traditionally measured by impact factor, more recently by article-level metrics like citations and downloads
These metrics are not reliable indicators of quality
➡ retraction rates correlate strongly with impact factor
➡ citations don’t stop even after a retraction
Metrics to measure the impact of research funding (both sketched below):
➡ quality: high-quality, reproducible research
➡ efficiency: research output per unit of funding spent
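A minimal sketch of how these two altmetrics might be computed. The function names, formulas, and numbers are illustrative assumptions, not metrics defined in this deck:

```python
# Illustrative sketch only: quality_altmetric and efficiency_altmetric are
# hypothetical names; the deck does not define exact formulas.

def quality_altmetric(attempted: int, successful: int) -> float:
    """Quality: fraction of replication attempts that succeeded."""
    return successful / attempted if attempted else 0.0

def efficiency_altmetric(validated_findings: int, funding_usd: float) -> float:
    """Efficiency: validated findings produced per $1M of funding spent."""
    return validated_findings / (funding_usd / 1_000_000)

# Example: a grant with 4 of 5 replications succeeding,
# and 3 validated findings on $2.5M of spend.
print(f"quality    = {quality_altmetric(5, 4):.2f}")                    # 0.80
print(f"efficiency = {efficiency_altmetric(3, 2_500_000):.2f} per $1M") # 1.20
```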
4. Problem 1: Efficiency
Research is not efficient due to individuality and competition:
➡ Research increasingly specialized, not possible for researchers to do everything themselves ➭ enter core facilities
➡ not possible or efficient for every research institute to have every core facility
Implications for efficiency:
➡ Need a central open system to enable easy access
➡ Need a community-driven platform, not a listing directory
5. Problem 2: Quality
Quality of academic research is under scrutiny:
➡ 47 of 53 “landmark” oncology studies not reproduced (Amgen)¹
➡ 43 of 67 cardiovascular/oncology publications were contradictory (Bayer)²
➡ 431 of 432 claimed sex-difference genetic associations not reproduced (Ioannidis)³
Implications for quality:
➡ Lack of academic reproducibility results in a lack of new therapies
➡ Bayer halted 65% of target validation projects in 2011
➡ Drug development cost estimated at ~$4B per new molecular entity (NME)
1. Drug development: Raise standards for preclinical cancer research. Begley CG, Ellis LM. Nature. 2012; 483(7391):531-3.
2. Reliability of 'new drug target' claims called into question. Mullard A. Nat Rev Drug Discov. 2011; 10(9):643-4.
3. Claims of sex differences: an empirical assessment in genetic associations. Patsopoulos NA, Tatsioni A, Ioannidis JP. JAMA. 2007; 298(8):880-93.
7. Cause: Academic Incentives
Academic incentives not aligned with efficient, high-quality research
➡ Incentives to compete over publications, not collaborate / share
➡ Incentives to acquire publications, not validate results / findings
Need: A platform to track altmetrics of efficiency and quality in academic research
9. Science Exchange
Mission: “Improve the efficiency of scientific research, by making it easy for researchers to access a global network of scientific resources & expertise.”
COMPANY VITALS
➡ Inception: May 2011
➡ Headquarters: Palo Alto, CA
➡ Investors: Andreessen Horowitz and others
COMPANY METRICS
➡ +800 experiment types
➡ +1000 providers
➡ +250,000 visits since launch
10. An Online Marketplace
Science Exchange provides access to a network of experts who operate outside the current academic incentive structure
➡ fee-for-service model with a reputation/rating system: the only incentive is to produce high-quality data
➡ 1000+ expert providers, including academic core facilities & commercial vendors
➡ 800+ experiment-types accessible from expert providers
Key Value:
A central portal for efficient access to specialized core facilities that can be used to validate academic research
11. The Reproducibility Fund
$1M Pilot Fund to support reproducibility
➡ Researchers submit studies for validation through Science Exchange’s expert providers
➡ 40 studies funded in initial pilot
➡ Shows proof of principle to NIH & funding agencies
12. The Reproducibility Fund
Improved incentives for academic researchers
➡ No costs: academic studies funded by Reproducibility Fund
➡ Fast-track publication: original results expedited for publication & ‘badged’ as reproducible
➡ 2-for-1 publication: replicated results published
➡ Awareness: top quality research is rewarded
[Diagram: funding & publishing partners provide free credits for research; academic researchers use the SciEx platform to validate research studies; outcome: increased reproducibility and reduced drug failure]
14. Short-Term: Altmetric of Efficiency
Promote open access to expert providers
Improves collaboration with experts at core facilities & contract research organizations (CROs):
➡ No requirement to purchase duplicate equipment
➡ No need to learn highly specialized one-off techniques
Provides one central, efficient solution:
➡ Ensures researchers know where to go
➡ Tracks usage & efficiency of research spending as an important altmetric (a sketch follows below)
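A minimal sketch of what tracking usage and spend through a central portal could look like. The order records and field names are assumptions for illustration, not the actual Science Exchange data model:

```python
from collections import defaultdict

# Hypothetical order log; labs, experiments, and costs are invented examples.
orders = [
    {"lab": "Smith Lab", "experiment": "RNA-seq",   "cost_usd": 4200, "completed": True},
    {"lab": "Smith Lab", "experiment": "Histology", "cost_usd": 1100, "completed": True},
    {"lab": "Chen Lab",  "experiment": "RNA-seq",   "cost_usd": 3900, "completed": False},
]

spend = defaultdict(float)  # total spend per lab
done = defaultdict(int)     # completed experiments per lab
for o in orders:
    spend[o["lab"]] += o["cost_usd"]
    done[o["lab"]] += o["completed"]

# Spend per completed experiment as a simple efficiency altmetric.
for lab in spend:
    per_result = spend[lab] / done[lab] if done[lab] else float("inf")
    print(f"{lab}: ${spend[lab]:,.0f} spent, {done[lab]} completed, "
          f"${per_result:,.0f} per completed experiment")
```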
15. Short-Term: Altmetric of Quality
Promote high-quality reproducible research
Leverages expert providers on Science Exchange:
➡ Improves independent validation of results
➡ Improves discovery of robust translatable drug targets
➡ Improves tracking / feedback to ensure quality
Provides incentive system for quality:
➡ Reproducibility Fund
➡ Tracks reproducibility as an important altmetric (a sketch follows below)
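A minimal sketch of tracking reproducibility per publication. The DOIs, records, and the two-attempt badge threshold are assumptions for illustration, not a rule stated in the deck:

```python
# Hypothetical validation log; DOIs and outcomes are invented examples.
validations = [
    {"doi": "10.1000/aaa", "replicated": True},
    {"doi": "10.1000/aaa", "replicated": True},
    {"doi": "10.1000/bbb", "replicated": False},
]

attempts, successes = {}, {}
for v in validations:
    attempts[v["doi"]] = attempts.get(v["doi"], 0) + 1
    successes[v["doi"]] = successes.get(v["doi"], 0) + v["replicated"]

# Assumed badge rule: at least 2 attempts with a majority succeeding.
for doi in attempts:
    rate = successes[doi] / attempts[doi]
    badge = "reproducible" if attempts[doi] >= 2 and rate >= 0.5 else "unverified"
    print(f"{doi}: {successes[doi]}/{attempts[doi]} replications -> {badge}")
```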
16. Long-Term Objectives
Promote a cultural change towards efficient and high-quality academic research
Advocate for funding agencies to:
➡ Promote efficiency through open access to shared resources
➡ Reward efficient use of grant money by researchers
➡ Fund validation as well as novel studies to promote reproducibility
End Goal:
Cultural shift towards altmetrics of efficiency & quality over novelty