
Towards a multidimensional valuation model of scientists

Presentation given at the Atlanta Conference on Science and Innovation Policy


  1. Towards a multidimensional valuation model of scientists. http://nrobinsongarcia.com, @nrobinsongarcia
  2. Motivation
     • Critiques of individual assessment: DORA (Impact Factor), Leiden Manifesto (misuse of indicators), Metric Tide (indicators are not adequate)
     • Threats from current research evaluation: (1) evaluation schemes rely heavily on journal publication and citation-based indicators; (2) invisible profiles are degraded despite their growing importance; (3) changes in the scholarly system are ignored
  3. Motivation
     • Critiques of individual assessment: DORA (Impact Factor), Leiden Manifesto (misuse of indicators), Metric Tide (individual assessment)
     • Threats from current research evaluation: (1) evaluation schemes rely heavily on journal publication and citation-based indicators; (2) invisible profiles are degraded despite their growing importance; (3) changes in the scholarly system are ignored
  4. Effects of current evaluation schemes
     • Effects on the scientific workforce: task reduction (de Rijcke et al., 2016), homogeneity of profiles (Milojević et al., 2018), mistrust and conservatism (Abramo et al., 2015)
     • Effects on knowledge production: quality of research and transparency (Moher et al., 2018), research integrity (Naudet et al., 2018), short-sighted research agendas (publish-or-perish culture)
  5. Conceptual framework
     • What is expected of a scientist? Evaluative dimensions of performativity
     • What is their potential to achieve such expectations? Constraints and confounding effects
  6. Conceptual framework: opportunity to perform
     • Personal features: nationality, gender, language, ...
     • Evaluative dimensions: performance, expectations, ...
     • External factors: work environment, institutional logics, national policies, ...
  7. Evaluative dimensions
     • Scientific engagement: community career (Laudel & Gläser, 2008), overall scientific production
     • Social engagement: outreach, participatory science
     • Capacity building: resources, human capital
     • Trajectory: past experience (e.g. international, non-academic)
     • Open practices: transparency, reproducibility, participatory approaches
  8. Research questions
     1. Are there different research profiles?
     2. Is there a correspondence with career stage?
     3. Do current research evaluation schemes favor certain profiles?
  9. Roadmap
     1. Proof of concept: desktop research, interviews, CV analysis, exploratory analyses
     2. Descriptive phase: operationalization, profiling of scientists
     3. Analytic phase: career trajectories; analyses by gender, nationality and field; contextualization
  10. Roadmap (same content as the previous slide)
  11. Research design: multiple case study
      • 6 research groups, 228 scientists
      • 2 universities: technical vs. multidisciplinary
      • 3 research fields: Physics, Biomedical Sciences and Social Sciences

                               Multidisciplinary Univ   Technical Univ
        Biomedical Sciences    15                       19
        Social Sciences        9                        61
        Physics                118                      6
  12. Research design: data sources
      • Web of Science and Google Scholar: research outputs
      • CVs and personal websites: trajectory, model validation
      • Social media activity: outreach
      • Interviews: motivations, model validation
  13. Exploratory analysis: variables (an illustrative computation sketch is given after the slide list)
      • Scientific engagement: share of co-authored papers
      • Social engagement: share of papers co-authored with industry
      • Capacity building: number of first-year authors on papers where the scientist is in last position who continued publishing
      • Trajectory: years since first publication
      • Open practices: share of papers available in Open Access
  14. Exploratory analysis: note that all of these variables are bibliometric!
  15. Exploratory analysis: archetypal analysis
      • A statistical data representation technique for characterizing multivariate data sets (Cutler & Breiman, 1994), first used in scientometrics by Seiler & Wohlrabe (2013)
      • It defines archetypes of individuals based on extreme values of one or more variables
      • Individuals are then characterized as pure archetypes or as mixtures of archetypes
      • A minimal implementation sketch is given after the slide list
  16. Preliminary results, all scientists: 228 scientists in 3 distinct fields (Physics, Social Sciences, Biomedicine)
  17. Preliminary results, all scientists: choosing the number of archetypes
      • Iterative process fitting solutions with up to 10 archetypes
      • Check the residual sum of squares (RSS) of each solution and apply the elbow rule
      • 4 archetypes retained (see the model-selection sketch after the slide list)
  18. Preliminary results, all scientists (archetype profiles over collaboration, industry, pupils, age and open access)
      • Archetype 1: high industry collaboration
      • Archetype 2: high age and pupils
      • Archetype 3: high collaboration and OA
      • Archetype 4: middle age, middle collaboration and middle OA
  19. Preliminary results, Social Sciences
      • Archetype 1: middle age and middle OA
      • Archetype 2: high age and pupils
      • Archetype 3: high industry and collaboration
      • Archetype 4: high collaboration and high OA
  20. Preliminary results, Physics
      • Archetype 1: high OA, middle collaboration and middle industry
      • Archetype 2: middle collaboration, high industry and middle pupils
      • Archetype 3: middle OA and middle age
      • Archetype 4: high age, pupils and industry
  21. Preliminary results, Biomedicine
      • Archetype 1: high collaboration and high industry
      • Archetype 2: middle industry and middle age
      • Archetype 3: high OA and high collaboration
      • Archetype 4: high age and high pupils
      (For reading such profiles off a fitted model, see the profile-reading sketch after the slide list.)
  22. Preliminary conclusions
      • Need for constructive discussion of the limitations of current research assessment schemes for individuals: what is expected from scientists, how research careers and trajectories are modelled, and the need to stop isolating performance
      • Need for the development of balanced valuation models: what do we value and how can it be observed? Ambiguity vs. reductionism
  23. Preliminary conclusions: archetypes across fields
      • The four archetypes recur across Social Sciences, Physics, Biomedicine and the overall model: industry-oriented, mentors, (open) collaborators, and the middle class
      • Despite the fields being very different, and despite the limitations of the indicators identified, we observe some consistency in the profiles
      • OA differences stand out, especially for Biomedicine in archetype 1
      • Some profiles (e.g. archetype 2, the mentors) are driven mostly by career stage
      (Slide figure: archetype profiles per field over the social, scientific, trajectory, capacity and open dimensions.)
  24. Thank you! http://nrobinsongarcia.com @nrobinsongarcia
  25. References
      Abramo, G., D’Angelo, C. A., & Rosati, F. (2015). The determinants of academic career advancement: Evidence from Italy. Science and Public Policy, 42(6), 761–774. https://doi.org/10.1093/scipol/scu086
      Cutler, A., & Breiman, L. (1994). Archetypal Analysis. Technometrics, 36(4), 338–347. https://doi.org/10.1080/00401706.1994.10485840
      Laudel, G., & Gläser, J. (2008). From apprentice to colleague: The metamorphosis of Early Career Researchers. Higher Education, 55(3), 387–406. https://doi.org/10.1007/s10734-007-9063-7
      Milojević, S., Radicchi, F., & Walsh, J. P. (2018). Changing demographics of scientific careers: The rise of the temporary workforce. Proceedings of the National Academy of Sciences, 115(50), 12616–12623. https://doi.org/10.1073/pnas.1800478115
      Moher, D., Naudet, F., Cristea, I. A., Miedema, F., Ioannidis, J. P. A., & Goodman, S. N. (2018). Assessing scientists for hiring, promotion, and tenure. PLOS Biology, 16(3), e2004089. https://doi.org/10.1371/journal.pbio.2004089
      Naudet, F., Ioannidis, J. P. A., Miedema, F., Cristea, I. A., Goodman, S. N., & Moher, D. (2018, June 4). Six principles for assessing scientists for hiring, promotion, and tenure. Impact of Social Sciences blog. Retrieved 7 June 2018 from http://blogs.lse.ac.uk/impactofsocialsciences/2018/06/04/six-principles-for-assessing-scientists-for-hiring-promotion-and-tenure/
      de Rijcke, S., Wouters, P. F., Rushforth, A. D., Franssen, T. P., & Hammarfelt, B. (2016). Evaluation practices and effects of indicator use – A literature review. Research Evaluation, 25(2), 161–169. https://doi.org/10.1093/reseval/rvv038
      Seiler, C., & Wohlrabe, K. (2013). Archetypal scientists. Journal of Informetrics, 7(2), 345–356. https://doi.org/10.1016/j.joi.2012.11.013
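
The slides themselves contain no code. As a purely illustrative complement to slides 13 and 14, the sketch below shows one way the bibliometric proxies could be derived from a paper-level table for a single scientist. The column names (n_authors, has_industry_coauthor, is_open_access, year), the helper name scientist_indicators and the toy numbers are assumptions made for illustration only; they are not the project's actual Web of Science or Google Scholar schema.

    import pandas as pd

    def scientist_indicators(papers: pd.DataFrame, current_year: int = 2019) -> pd.Series:
        """Illustrative bibliometric proxies for one scientist.

        Assumed (hypothetical) columns in `papers`:
        n_authors, has_industry_coauthor, is_open_access, year.
        """
        return pd.Series({
            # Scientific engagement: share of co-authored papers
            "collaboration": float((papers["n_authors"] > 1).mean()),
            # Social engagement: share of papers co-authored with industry
            "industry": float(papers["has_industry_coauthor"].mean()),
            # Trajectory: years since first publication ("academic age")
            "age": float(current_year - papers["year"].min()),
            # Open practices: share of papers available in Open Access
            "open_access": float(papers["is_open_access"].mean()),
        })

    # Capacity building ("pupils") needs author-position data and is omitted here:
    # per the slides it counts first-year authors on papers where the scientist is
    # in last position and who continued publishing afterwards.

    # Toy usage example; the numbers are made up purely to show the output shape.
    toy = pd.DataFrame({
        "n_authors": [1, 3, 4],
        "has_industry_coauthor": [False, True, False],
        "is_open_access": [True, True, False],
        "year": [2012, 2016, 2018],
    })
    print(scientist_indicators(toy))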
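The archetypal analysis on slide 15 follows Cutler & Breiman (1994): the data matrix X is approximated as A Z with Z = B X, where the rows of A and B are convex weights, minimising the residual sum of squares. The following is a minimal from-scratch sketch of that alternating scheme, not the authors' own implementation (the study presumably used an existing archetypal analysis package). The sum-to-one trick of appending a heavily weighted constraint row is standard; the function names are illustrative.

    import numpy as np
    from scipy.optimize import nnls

    def _convex_weights(T, y, big=200.0):
        """min ||T w - y|| with w >= 0 and sum(w) close to 1, enforced by
        appending a heavily weighted row to the least-squares system."""
        T_ext = np.vstack([T, big * np.ones((1, T.shape[1]))])
        y_ext = np.append(y, big)
        w, _ = nnls(T_ext, y_ext)
        return w

    def archetypal_analysis(X, k, n_iter=50, seed=0):
        """X (n x m) is approximated as A @ Z with Z = B @ X; rows of A and B
        are (approximately) convex weights. Returns A, Z and the RSS."""
        rng = np.random.default_rng(seed)
        n, _ = X.shape
        Z = X[rng.choice(n, size=k, replace=False)].astype(float)  # init from data
        A = np.zeros((n, k))
        for _ in range(n_iter):
            # 1) express each individual as a convex mixture of the archetypes
            A = np.array([_convex_weights(Z.T, x) for x in X])        # n x k
            # 2) best unconstrained archetypes for these mixtures
            Z_free = np.linalg.lstsq(A, X, rcond=None)[0]             # k x m
            # 3) pull each archetype back into the convex hull of the data
            B = np.array([_convex_weights(X.T, z) for z in Z_free])   # k x n
            Z = B @ X
        rss = float(np.sum((X - A @ Z) ** 2))
        return A, Z, rss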
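The model-selection step on slide 17 then amounts to fitting solutions with 1 to 10 archetypes, recording the RSS of each and applying the elbow rule. A sketch, reusing archetypal_analysis from above; the matrix X here is a synthetic stand-in, since the study's 228 x 5 indicator matrix is not available.

    import numpy as np

    # Synthetic stand-in for the 228-scientists x 5-variables indicator matrix;
    # in the study X would hold the scaled bibliometric proxies per scientist.
    rng = np.random.default_rng(1)
    X = rng.random((228, 5))

    rss_per_k = {}
    for k in range(1, 11):                                # slide 17: up to 10 archetypes
        _, _, rss = archetypal_analysis(X, k, n_iter=30)
        rss_per_k[k] = rss

    for k, rss in sorted(rss_per_k.items()):
        print(f"k = {k:2d}   RSS = {rss:10.4f}")
    # Plot RSS against k and apply the elbow rule: keep the smallest k after which
    # extra archetypes barely reduce the RSS; the slides settle on 4 archetypes.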
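Finally, the high/middle/low readings on slides 18 to 21 and the pure-versus-mixture characterisation on slide 15 can be read off the fitted matrices. The cut-offs below (0.66 and 0.33) are arbitrary illustrative choices, and the sketch assumes each variable was scaled to [0, 1] before fitting, a preprocessing step the slides do not describe.

    import numpy as np

    VARIABLES = ["collaboration", "industry", "pupils", "age", "open_access"]

    def describe_archetypes(Z, variables=VARIABLES):
        """Label each archetype by which variables take high / middle / low values
        (assumes variables scaled to [0, 1]; cut-offs are illustrative)."""
        for j, profile in enumerate(Z, start=1):
            levels = ["high" if v > 0.66 else "middle" if v > 0.33 else "low"
                      for v in profile]
            summary = ", ".join(f"{lvl} {name}" for name, lvl in zip(variables, levels))
            print(f"Archetype {j}: {summary}")

    def dominant_archetype(A):
        """Each scientist is a mixture of archetypes (rows of A sum to ~1);
        the largest weight gives the closest 'pure' archetype."""
        return np.argmax(A, axis=1) + 1

    # Example, continuing from the sketches above:
    # A, Z, _ = archetypal_analysis(X, 4)
    # describe_archetypes(Z)
    # print(np.bincount(dominant_archetype(A))[1:])   # scientists per archetype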
