
The Scientific and Technical Foundation for Altmetrics in the United States


  1. Scientific and technical foundation for altmetrics in the US. William Gunn, Ph.D., Head of Academic Outreach, Mendeley. @mrgunn https://orcid.org/0000-0002-3555-2054
  2. Why altmetrics? New forms of scholarship need new metrics: •978 data repositories •19 funder policies •16 data journals http://www.stm-assoc.org/2009_10_13_MWC_STM_Report.pdf
  3. Problems with Impact Factor
  4. Problems with Impact Factor

     Country        | Documents | Citable documents | Citations   | Self-citations | Citations per document | H index
     United States  | 7,063,329 | 6,672,307         | 129,540,193 | 62,480,425     | 20.45                  | 1,380
     China          | 2,680,395 | 2,655,272         | 11,253,119  | 6,127,507      | 6.17                   | 385
     United Kingdom | 1,918,650 | 1,763,766         | 31,393,290  | 7,513,112      | 18.29                  | 851
     Germany        | 1,782,920 | 1,704,566         | 25,848,738  | 6,852,785      | 16.16                  | 740
     Japan          | 1,776,473 | 1,734,289         | 20,347,377  | 6,073,934      | 12.11                  | 635
     France         | 1,283,370 | 1,229,376         | 17,870,597  | 4,151,730      | 15.6                   | 681
     Canada         | 993,461   | 946,493           | 15,696,168  | 3,050,504      | 18.5                   | 658
     Italy          | 959,688   | 909,701           | 12,719,572  | 2,976,533      | 15.26                  | 588
     Spain          | 759,811   | 715,452           | 8,688,942   | 2,212,008      | 13.89                  | 476
     India          | 750,777   | 716,232           | 4,528,302   | 1,585,248      | 7.99                   | 301
  5. Problems with Impact Factor: "During discussions with Thomson Scientific over which article types in PLoS Medicine the company deems as 'citable,' it became clear that the process of determining a journal's impact factor is unscientific and arbitrary." http://www.plosmedicine.org/article/info:doi/10.1371/journal.pmed.0030291
  6. The higher the impact factor, the more likely the research is to be retracted, partly due to intense competition (http://bjoern.brembs.net/news766.html). There is no correlation between the number of citations an article receives and the impact factor of the journal (http://www.bmj.com/content/314/7079/497.1.full). What matters is who is reading your work!
  7. Adams, Jonathan. "Collaborations: the fourth age of research." Nature 497.7451 (2013): 557-560.
  8. King, Christopher (2012). Thomson Reuters Annual Report. http://ar.thomsonreuters.com/_files/pdf/MultiauthorPapers_ChrisKing.pdf
  9. What are altmetrics? Research impacts more than authors.
  10. http://dx.doi.org/10.3789/isqv25no2.2013.04
  11. Citations are slow
  12. Mendeley is fast
  13. Professors on Mendeley tend to be in applied math, stats, and physics. Graduate students on Mendeley tend to be in engineering disciplines. http://dx.doi.org/10.6084/m9.figshare.1041819
  14. Cell biology and neuroscience are highly active disciplines relative to their output. Social sciences are highly active relative to their citations per paper. http://dx.doi.org/10.6084/m9.figshare.1041819
  15. Readership vs. citations: •it comes with a payload of metadata •it accrues faster •it illuminates previously hidden impact
  16. Altmetrics show broader impact. http://arxiv.org/html/1203.4745v1
  17. Altmetrics show broader impact. http://www.asis.org/Bulletin/Apr-13/AprMay13_Lin_Fenner.html
  18. Consistency is key. http://www.niso.org/publications/isq/2013/v25no2/chamberlain/
  19. Issues to be addressed: •Identity •Privacy •Attribution •Gaming •Filtration standards / best practices
  20. NISO Altmetrics Standards participants: Alfred P. Sloan Foundation, American Library Association, California Institute of Technology, Center for Research Libraries, EBSCO, Elsevier, Harvard, Internet Archive, Wiley, Library and Information Technology Association, Library of Congress, Los Alamos National Laboratory, National Institutes of Health, National Library of Medicine, OCLC, Princeton, Columbia, Smithsonian, Stanford, …
  21. NISO Altmetrics Standards: •Types of sources: Mendeley, Twitter, views, downloads, Facebook •Quality of sources: collection, reporting, and aggregation methods; provenance; availability •Use cases: discovery and assessment (of people and objects)
  22. Types of sources. Most altmetrics providers use the following: •Page views or downloads •Mendeley readers (articles only for now) •Tweets •Facebook events •Comments •GitHub
  23. Quality of sources. Collection methods vary and counts are inconsistent; further study is needed. For reporting, transparency is key: show the raw data, not just a derived number. Aggregation of raw data is generally done by the recipient (institution, funder, publisher, author, etc.) according to their needs, instead of through one central source.
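The recipient-side aggregation described on slide 23 can be sketched as a simple roll-up of raw per-source counts. This is a minimal illustration, not any provider's actual API; the event tuples and counts below are invented for the example.

```python
from collections import defaultdict

# Hypothetical raw altmetric events: (DOI, source, count).
# The shape and the counts are illustrative assumptions only.
raw_events = [
    ("10.1371/journal.pmed.0030291", "mendeley_readers", 120),
    ("10.1371/journal.pmed.0030291", "tweets", 45),
    ("10.1371/journal.pmed.0030291", "mendeley_readers", 15),
]

def aggregate(events):
    """Roll raw counts up into totals keyed by (DOI, source)."""
    totals = defaultdict(int)
    for doi, source, count in events:
        totals[(doi, source)] += count
    return dict(totals)

totals = aggregate(raw_events)
```

Because the raw events are kept and only summed at the point of use, an institution, funder, or publisher can each apply its own weighting or filtering rather than consuming a single pre-derived score.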
  24. Quality of sources. Understanding and open reporting of provenance is important for community buy-in and long-term stability. Raw data should be available under an open license, via API, with identifiers. Identifiers include DOI (object), ORCID (person), and ISNI/Ringgold (institution). Example: this person (ORCID), at this institution (ISNI), released this object (DOI).
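The person/institution/object triple on slide 24 amounts to a small linked record. A minimal sketch, using the ORCID and DOI that appear elsewhere in this deck; the ISNI value and all field names are placeholders, not a real mapping.

```python
# Hypothetical attribution record joining the three identifier schemes
# named on the slide. Field names are assumptions; the ISNI is a placeholder.
record = {
    "person":      {"scheme": "ORCID", "id": "0000-0002-3555-2054"},
    "institution": {"scheme": "ISNI",  "id": "0000 0000 0000 0000"},
    "object":      {"scheme": "DOI",   "id": "10.3789/isqv25no2.2013.04"},
}

def attribution_statement(rec):
    """Render the slide's example sentence from an identifier record."""
    return (
        f"This person ({rec['person']['scheme']} {rec['person']['id']}), "
        f"at this institution ({rec['institution']['scheme']} {rec['institution']['id']}), "
        f"released this object ({rec['object']['scheme']} {rec['object']['id']})."
    )
```

Keeping the scheme alongside each identifier makes provenance explicit, which is the property the slide argues is needed for community buy-in.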
  25. Use cases. Two main use cases exist: discovery and evaluation. Because the data sources remain variable, discovery can be done now; accuracy of numbers matters less in recommendation than in assessment. Precision is important for both.
  26. Next steps: •The NISO white paper will enter its public comment period soon. •Working groups will be established to develop best practices and standards. •Pending approval, NISO will issue a recommended practice or published standard. •NISO will develop training to implement and adopt any recommended standards.
  27. There is no gold standard. Amgen: 47 of 53 "landmark" oncology publications could not be reproduced. Bayer: 43 of 67 oncology and cardiovascular projects were based on contradictory results. Dr. John Ioannidis: of 432 publications purporting sex differences in hypertension, multiple sclerosis, or lung cancer, only one data set was reproducible. http://reproducibilityinitiative.org
  28. www.mendeley.com william.gunn@mendeley.com @mrgunn https://orcid.org/0000-0002-3555-2054
  29. Mendeley extracts research data… and aggregates it in the cloud. Install Mendeley Desktop. Collecting rich signals from domain experts.
  30. Home, Work, Mobile: Cloud Library
  31. Shared Folder
  32. Mendeley Research Catalog
  33. Mendeley Research Catalog
  34. Read papers + keep track of notes. 470M documents
  35. Taking some misery out of writing
  36. Information Extraction
  37. We are publishing this data to the LOD cloud. http://code-research.eu/
  38. Defining readership: •Each document addition is a "read" •stamped with metadata describing the context of the read event •a read is like a citation, but faster and captures more
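The "read" event defined on slide 38 (a document addition stamped with contextual metadata) can be sketched as a small record type. This is an illustration of the idea only; the field names are assumptions, not Mendeley's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ReadEvent:
    """A document addition treated as a 'read', stamped with context.

    Field names are hypothetical; the slide only says a read carries
    metadata such as who the reader is and when the read happened.
    """
    doi: str
    timestamp: datetime
    discipline: str        # reader's field, e.g. "Biological Sciences"
    academic_status: str   # e.g. "Professor", "PhD Student"
    country: str

event = ReadEvent(
    doi="10.3789/isqv25no2.2013.04",
    timestamp=datetime.now(timezone.utc),
    discipline="Information Science",
    academic_status="PhD Student",
    country="US",
)
```

Unlike a bare citation count, each event carries the demographic payload the earlier slides mention (discipline and academic status), which is what enables breakdowns such as "professors cluster in applied math, graduate students in engineering."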
