The document discusses the problems with using quantitative metrics and rankings to assess university performance and research. It provides examples of how rankings have influenced policy decisions around the world despite their questionable scientific validity. Concerns raised include the lack of transparency in methodology, arbitrary weighting of indicators, and incentives for gaming the system. While metrics promise objective measures, they fail to capture many important qualitative aspects of universities and research. Overall, the document questions the practice of "governing by numbers" and emphasizes the need for more holistic evaluation that considers national priorities and systemic impacts.
1. Harassing with Numbers:
the Uses and Abuses of
Bureaucracy and Bibliometry
Giuseppe De Nicolao
Università di Pavia
2. • Quality assurance, accountability, rankings,
efficient allocation of resources, objective
measures of performance, ...
• Boring stuff, lots of bureaucracy ... but it changes our lives and also affects research and teaching
• Let’s begin with a story
3. Outline
• The most accurate ranking ever produced
• Should you believe in university rankings?
• Governing by numbers
• The end is not near. It is here
• The Good, the Bad, and the Ugly
• The quest for the Holy Grail
• You’re gonna hear me roar!
10. New York Times, November 14, 2010
Alexandria’s surprising prominence was actually
due to “the high output from one scholar in one
journal” — soon identified on various blogs as
Mohamed El Naschie, an Egyptian academic
who published over 320 of his own articles in a
scientific journal of which he was also the
editor.
15. • ... of the 400 papers by El Naschie indexed in
Web of Science, 307 were published in Chaos,
Solitons and Fractals alone while he was editor-
in-chief.
• El Naschie’s papers in CSF contain 4,992 citations, about 2,000 of which are to papers published in CSF, largely his own.
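A back-of-the-envelope sketch (in Python) of why this matters for the journal Impact Factor: citations that a journal’s own papers direct back at the journal count toward its IF just like external ones. All numbers below are invented for illustration, except the roughly 2,000 internal citations reported on the slide.

```python
# Illustrative sketch, not actual JCR data: how citations from a journal's
# own papers back to the journal can inflate its 2-year Impact Factor.
# IF(y) = citations received in year y to items of years y-1 and y-2,
#         divided by the number of citable items in years y-1 and y-2.

def impact_factor(cites_to_recent_items: int, citable_items: int) -> float:
    return cites_to_recent_items / citable_items

citable_items  = 800    # invented: papers published in the two preceding years
external_cites = 1200   # invented: citations arriving from other journals
internal_cites = 2000   # order of magnitude from the slide: ~2,000 of the
                        # 4,992 references in El Naschie's CSF papers pointed
                        # back at CSF (assumed here, for illustration, to
                        # target the two preceding years)

print(impact_factor(external_cites, citable_items))                   # 1.5
print(impact_factor(external_cites + internal_cites, citable_items))  # 4.0
```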
16. All ingredients in one story
• reputation race at work
• “the most accurate picture ... ever produced”
• gaming affecting:
– university rankings
– journal Impact Factor
– individual bibliometrics
Objection: this is just an outlier.
Be serious: no one would be so foolish as to let important decisions depend on questionable rankings.
18. Who is a “highly skilled migrant”
in the Netherlands?
Decided by the rankings
19. Highly skilled migrants
Can I become a highly skilled migrant in the Netherlands - even if I
haven't got a job yet?
To be eligible, you must be in possession of one of the following
diplomas or certificates:
• a master's degree or doctorate from a recognised Dutch institution
of higher education or
• a master's degree or doctorate from a non-Dutch institution of
higher education which is ranked in the top 150 establishments in
either the Times Higher Education 2007 list or the Academic
Ranking of World Universities 2007 issued by Shanghai Jiao Tong University in 2007
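The rule above is literally a computable predicate. A minimal sketch (hypothetical institution names; the real 2007 lists are not reproduced here) of how a career-shaping decision reduces to set membership in two league tables:

```python
# Minimal sketch of the quoted Dutch eligibility rule. The two sets are
# placeholders standing in for the actual THE 2007 and ARWU 2007 top-150
# lists, which are not reproduced here.

THE_2007_TOP150  = {"Example University"}   # placeholder
ARWU_2007_TOP150 = {"Example Tech"}         # placeholder

def highly_skilled_eligible(degree: str, institution: str, dutch: bool) -> bool:
    if degree not in ("master", "doctorate"):
        return False
    if dutch:
        return True  # any recognised Dutch institution of higher education
    return institution in THE_2007_TOP150 or institution in ARWU_2007_TOP150

print(highly_skilled_eligible("doctorate", "Example Tech", dutch=False))  # True
print(highly_skilled_eligible("doctorate", "Elsewhere U.", dutch=False))  # False
```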
20. Sardegna (an Italian region): am I eligible for a scholarship to attend a PhD programme?
Decided by the rankings
21. APPLICATION WILL BE SCORED
BASED ON PRESTIGE OF PHD
SCHOOL ACCORDING TO QS WORLD
UNIVERSITY RANKINGS
22. Outline
• The most accurate ranking ever produced
• Should you believe in university rankings?
• Governing by numbers
• The end is not near. It is here
• The Good, the Bad, and the Ugly
• The quest for the Holy Grail
• You’re gonna hear me roar!
29. Should you believe in the Shanghai ranking? An MCDM view
J.-C. Billaut D. Bouyssou P. Vincke
• all criteria used are only loosely connected with what they are intended to capture.
• several arbitrary parameters and many micro-decisions that
are not documented.
• flawed and nonsensical aggregation method
• «the Shanghai ranking is a poorly conceived quick and dirty
exercise»
«any of our MCDM students that would have proposed such a methodology in her Master’s Thesis would have surely failed according to our own standards»
34. Twenty Ways to Rise in the Rankings (1/3)
by Richard Holmes http://rankingwatch.blogspot.it/2013/12/twenty-ways-to-rise-in-rankings-quickly.html
1. Get rid of students. The university will then do better on the faculty/student ratio indicator (see the sketch after this list).
2. Kick out the old and bring in the young. Get rid of ageing
professors, especially if unproductive and expensive, and hire
lots of temporary teachers and researchers.
5. Get a medical school. Medical research produces a
disproportionate number of papers and citations which is good
for the QS citations per faculty indicator and the ARWU
publications indicator. Remember that this strategy may not help with THE, which uses field normalisation.
35. 7. Amalgamate. What about a new mega-university formed by merging LSE, University College London and Imperial College? Or a très grande école from all those little grandes écoles around Paris?
9. The wisdom of crowds. Focus on research projects in fields that have huge multi-“author” publications (particle physics, astronomy and medicine, for example). Such publications often have very large numbers of citations.
10. Do not produce too much. If your researchers are
producing five thousand papers a year, then those five hundred
citations from a five hundred “author” report on the latest
discovery in particle physics will not have much impact.
Twenty Ways to Rise in the Rankings (2/3)
by Richard Holmes http://rankingwatch.blogspot.it/2013/12/twenty-ways-to-rise-in-rankings-quickly.html
36. 13. The importance of names. Make sure that your researchers
know which university they are affiliated to and that they know
its correct name. Keep an eye on Scopus and ISI and make sure
they know what you are called.
18. Support your local independence movement. Increasing the
number of international students and faculty is good for both
the THE and QS rankings. If it is difficult to move students
across borders why not create new borders?
20. Get Thee to an Island. The Leiden Ranking includes a little-known indicator that measures the distance between collaborators. At
the moment the first place goes to the Australian National
University.
Twenty Ways to Rise in the Rankings (3/3)
by Richard Holmes http://rankingwatch.blogspot.it/2013/12/twenty-ways-to-rise-in-rankings-quickly.html
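To see how mechanical some of these moves are, here is a minimal sketch of item 1 (all enrolment figures invented): the faculty/student ratio indicator improves sharply when a university simply sheds students.

```python
# Sketch of item 1: shedding students mechanically improves the
# faculty/student ratio indicator, with no change in teaching or research.
# All figures are invented.

def faculty_student_ratio(faculty: int, students: int) -> float:
    return faculty / students

before = faculty_student_ratio(faculty=2000, students=30000)
after  = faculty_student_ratio(faculty=2000, students=20000)

print(f"indicator rises by {after / before - 1:.0%}")  # 50%, for 10,000 fewer students
```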
37. Rankings
• Fragile scientific grounds
• Cost of providing data
• Incentives for gaming
• Raw data are obscured
Why, then?
38. Outline
• The most accurate ranking ever produced
• Should you believe in university rankings?
• Governing by numbers
• The end is not near. It is here.
• The Good, the Bad, and the Ugly
• The quest for the Holy Grail
• You’re gonna hear me roar!
42. Conflicting opinions
• Non-aggregators:
key objection to aggregation: the arbitrary
nature of the weighting process by which the
variables are combined
• Aggregators:
value in combining indicators: extremely
useful in garnering media interest and hence
the attention of policy makers
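The non-aggregators’ objection can be made concrete with a small sketch (all indicator values and weights invented): two equally defensible weight vectors reverse the final order of the same two institutions.

```python
# Rank reversal under arbitrary weights: the same indicator values yield
# opposite orderings depending on how the composite score is weighted.
# Values and weights are invented for illustration.

indicators = {                      # (teaching, research, citations), 0-100
    "University X": (90, 60, 55),
    "University Y": (60, 85, 80),
}

def composite(scores, weights):
    return sum(s * w for s, w in zip(scores, weights))

for weights in [(0.5, 0.3, 0.2), (0.2, 0.4, 0.4)]:
    ranking = sorted(indicators,
                     key=lambda u: composite(indicators[u], weights),
                     reverse=True)
    print(weights, "->", ranking)

# (0.5, 0.3, 0.2) -> ['University X', 'University Y']  (74.0 vs 71.5)
# (0.2, 0.4, 0.4) -> ['University Y', 'University X']  (78.0 vs 64.0)
```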
44. Can you really govern by numbers?
Let’s see a survey of reactions
to university rankings
45. Germany
• “We look back decades and people came to
German universities; today they go to US
universities.”
• The Exzellenzinitiative (2005): a shift from the traditional emphasis on egalitarianism towards competition and hierarchical stratification
46. France
• The Shanghai ranking
“generated considerable embarrassment
among the French intelligentsia, academia
and government: the first French higher
education institution in the ranking came only
in 65th position, mostly behind American
universities and a few British ones”
47. Australia
• The Shanghai and QS rankings place at least two Australian universities among the top 100.
• Opposing strategic options:
– fund a small number of top-tier competitive
universities
– “creation of a diverse set of high performing,
globally-focused institutions, each with its own
clear, distinctive mission”.
48. Japan
• “The government wants a first-class university for international prestige”
• “in order for Japanese HEIs to compete
globally, the government will close down some
regional and private universities and direct
money to the major universities”
• some institutions will become teaching only.
49. Italy
The Education and University Minister: «We are lagging behind in the world rankings. For this reason we are going to present the reform of the University [...] I hope I will never again see the first Italian university ranked 174th.»
54. E. Hazelkorn on rankings
• 90% or 95% of our students do not attend elite
institutions. Why are we spending so much on what
people aren’t attending as opposed to what they are
attending?
• May divert resources from pensions, health, housing, ...
• Are “elite” institutions really driving national or
regional economic and social development?
55. Does trickle-down work?
E. Hazelkorn: “Governments and universities must stop obsessing
about global rankings and the top 1% of the world's 15,000
institutions. Instead of simply rewarding the achievements of
elites and flagship institutions, policy needs to focus on the
quality of the system-as-a-whole.”
There is little evidence
that trickle-down works.
56. Where are we?
• (Even) Phil Baty (Times Higher Education)
admits that there are aspects of academic life
where rankings are of little value
• Can we/you afford the ‘reputation race’?
• We will have to live in a world in which
extremely poor rankings are regularly
published and used.
What can be done then?
57. some advice from the authors of “Should you believe in the Shanghai Ranking?”
58. “Stop being naive”
• There is no such thing as a ‘‘best university’’ in
abstracto.
• Stop talking about these ‘‘all purpose
rankings’’. They are meaningless.
• Lobby in our own institution so that these
rankings are never mentioned in institutional
communication
63. from “Is There Life After Rankings?”
• Not cooperating with the rankings affects my
life and the life of the college in several ways.
Some are relatively trivial; for instance, we are
saved the trouble of filling out U.S. News's
forms, which include a statistical survey that has
gradually grown to 656 questions
• The most important consequence of sitting out
the rankings game, however, is the freedom to
pursue our own educational philosophy, not
that of some newsmagazine.
64. Outline
• The most accurate ranking ever produced
• Should you believe in university rankings?
• Governing by numbers
• The end is not near. It is here
• The Good, the Bad, and the Ugly
• The quest for the Holy Grail
• You’re gonna hear me roar!
70. Let us ask SCOPUS:
no evidence of collapse
Italy’s scientific documents 1996-2010
71. No trace of collapse for all other countries (Europe and non-Europe)
Then, what explains the graph in the scientific paper?
72. EXAMPLE: DUE TO WELL-KNOWN RECORDING DELAYS IN BIBLIOMETRIC DATABASES, IN 2010 THE NATIONAL SCIENCE FOUNDATION REGARDED 2008 AND 2009 DATA AS UNRELIABLE
IT’S JUST A MATTER OF DELAYS
73. The moral of the story
Bibliometric data for the last two years are not at steady state: do not use them for scientific (or assessment) purposes
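A hedged simulation of this artefact (all figures invented): true output is perfectly flat, yet incomplete indexing of the most recent years makes the raw counts look like a collapse.

```python
# The recording-delay artefact: a flat publication series looks like a
# collapse if the database has not yet finished indexing recent years.
# Output levels and coverage fractions are invented.

true_output = {year: 50_000 for year in range(2004, 2011)}  # flat series
coverage    = {2009: 0.85, 2010: 0.55}   # indexing still catching up

observed = {y: round(n * coverage.get(y, 1.0)) for y, n in true_output.items()}

for y in sorted(observed):
    print(y, observed[y])
# 2004-2008: 50000 each; 2009: 42500; 2010: 27500 -- a spurious 45% "drop"
```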
74. Comment
1. For the (bibliometric) bureaucrat, a number is something objective and trustworthy.
2. Awareness of errors, uncertainty, relevance and manipulability is usually very low.
3. Administrative and normative use of bibliometry is extremely fragile.
4. In the last two years, hundreds if not thousands of Italian researchers have spent a lot of time asking Web of Science and Scopus to update/correct their bibliometric profiles.
75. Outline
• The most accurate ranking ever produced
• Should you believe in university rankings?
• Governing by numbers
• The end is not near. It is here
• The Good, the Bad, and the Ugly
• The quest for the Holy Grail
• You’re gonna hear me roar!
89. Outline
• The most accurate ranking ever produced
• Should you believe in university rankings?
• Governing by numbers
• The end is not near. It is here
• The Good, the Bad, and the Ugly
• The quest for the Holy Grail
• You’re gonna hear me roar!
92. The Holy Grail of research assessment
• Peer review is subjective, lengthy, and expensive
• We have a lot of bibliometric data relating to journals (e.g. IF) and scientists:
– # papers
– # cites
– h-index
– ...
• Solution: work out indicators to obtain objective, quick and inexpensive bibliometric assessment, also at the individual level
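As a concrete example, a minimal sketch of one indicator from the list above, the h-index (the largest h such that h papers have at least h citations each); note how very different citation profiles collapse to the same number:

```python
# h-index: the largest h such that h of the papers have >= h citations each.

def h_index(citations: list[int]) -> int:
    ranked = sorted(citations, reverse=True)
    # the condition c >= rank is monotone down the sorted list,
    # so counting the ranks where it holds gives h
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

print(h_index([100, 90, 80, 3, 2, 1]))  # 3
print(h_index([4, 4, 4, 3, 2, 1]))      # 3 -- same h, very different records
```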
95. Assessing poetry
... determining a poem's
greatness becomes a
relatively simple matter. If
the poem's score for
perfection is plotted
along the horizontal of a
graph, and its importance
is plotted on the vertical,
then calculating the total
area of the poem yields
the measure of its
greatness.
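Written out as a formula (the satire’s point being precisely that the recipe fits in one line), the method amounts to:

```latex
% The satirical "area" method from the slide: greatness G as the area of
% the rectangle spanned by perfection P (horizontal axis) and
% importance I (vertical axis).
\[
  G = P \times I
\]
```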
97. ANVUR proposal: use bibliometry, # of citations (and informed peer review)
For the matrix entries labeled IR we rely on informed peer review.
[Figure: two classification matrices, one for recent articles and one for old articles. Each crosses a bibliometric class of the journal (IF, ...; columns A-D) with a citation class (rows A-D). Concordant cells are assigned a merit class (A, ..., D, with some borderline cells marked A?), while discordant cells are labeled IR, i.e. referred to informed peer review.]
98. Research as target shooting
A paper is
an arrow aiming
at high IF & cites
99. ITALIAN RESEARCH ASSESSMENT: COLORS AND SCORES
E = 1, B = 0.8, A = 0.5, L = 0
102. Due to a flawed design, the actual targets did not match the specification and differed between scientific areas
[Figure: target specification]
103. Medical Sciences vs. Industrial & Information Engineering
[Chart: shares of papers in each merit class for the two areas. Scienze Mediche (Medical Sciences): 40%, 25%, 14%, 21%; Ingegneria Industriale e dell’Informazione (Industrial & Information Engineering): 22%, 21%, 13%, 44%.]
104. The moral of the story: since the
targets in different areas (and
disciplines!) are not matched, scores
are not comparable and any
subsequent aggregation (e.g. for
funding) becomes nonsensical.
We failed the quest, but the Grail
may still exist ...
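A minimal sketch of the non-comparability, using the class scores from slide 99 (E = 1, B = 0.8, A = 0.5, L = 0) and the area shares from slide 103. Caveat: mapping the chart’s four percentages onto the classes E, B, A, L in that order is an assumption made here for illustration.

```python
# Why cross-area aggregation is nonsensical when class shares differ by area.
# Weights are the slide's E=1, B=0.8, A=0.5, L=0. ASSUMPTION: the chart's
# percentages correspond to classes E, B, A, L in that order.

weights = {"E": 1.0, "B": 0.8, "A": 0.5, "L": 0.0}

shares = {
    "Medical Sciences":              {"E": 0.40, "B": 0.25, "A": 0.14, "L": 0.21},
    "Industrial & Information Eng.": {"E": 0.22, "B": 0.21, "A": 0.13, "L": 0.44},
}

for area, dist in shares.items():
    mean_score = sum(weights[c] * p for c, p in dist.items())
    print(f"{area}: mean score {mean_score:.2f}")

# Medical Sciences: 0.67 vs Engineering: 0.45 -- with mismatched targets, the
# gap says nothing reliable about relative quality across areas.
```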
106. And the Holy Grail? Let’s ask HEFCE
Report on the pilot exercise to develop bibliometric indicators for the REF [the UK research assessment]:
“Bibliometrics are not sufficiently robust at this stage to be used formulaically or to replace expert review in the REF”
http://www.hefce.ac.uk/pubs/year/2009/200939/
107. And the Holy Grail? Let’s ask the EMS
Code of Practice, European Mathematical Society
http://www.euro-math-soc.eu/system/files/COP-approved.pdf
1. “... the Committee sees grave danger in the routine use of bibliometric and other related measures to assess the alleged quality of mathematical research and the performance of individuals or small groups of people.”
2. “It is irresponsible for institutions or committees assessing individuals for possible promotion or the award of a grant or distinction to base their decisions on automatic responses to bibliometric data.”
108. And the Holy Grail? Let’s ask the EPS
On the use of bibliometric indices during assessment, European Physical Society
http://www.eps.org/news/94765/
“... the European Physical Society considers it essential that the use of bibliometric indices is always complemented by a broader assessment of scientific content taking into account the research environment, to be carried out by peers in the framework of a clear code of conduct.”
109. And the Holy Grail? Let’s ask the Académie des Sciences
“Du Bon Usage de la Bibliométrie pour l’Évaluation Individuelle des Chercheurs” (On the proper use of bibliometrics for the individual evaluation of researchers), Institut de France, Académie des Sciences
http://www.academie-sciences.fr/activite/rapport/avis170111gb.pd
Any bibliometric evaluation should be tightly associated to a
close examination of a researcher’s work, in particular to
evaluate its originality, an element that cannot be assessed
through a bibliometric study.
110. And the Holy Grail? Let’s ask IEEE
IEEE Board of Directors: Position Statement on “Appropriate
Use of Bibliometric Indicators for the Assessment of Journals,
Research Proposals, and Individuals”.
http://www.ieee.org/publications_standards/publications/rights/iee
Any journal-based metric is not designed to capture qualities
of individual papers and must therefore not be used alone
as a proxy for single-article quality or to evaluate individual
scientists.
112. The San Francisco Declaration on Research Assessment (DORA) recommends:
1. Avoid using journal metrics to judge individual papers or individuals for hiring, promotion and funding decisions.
2. Judge the content of individual papers and take into
account other research outputs, such as data sets,
software and patents, as well as a researcher’s influence
on policy and practice.
113. Signed by 484 organizations including:
- American Association for the Advancement of Science (AAAS)
- American Society for Cell Biology
- British Society for Cell Biology
- European Association of Science Editors
- European Mathematical Society
- European Optical Society
- European Society for Soil Conservation
- Federation of European Biochemical Societies
- Fondazione Telethon
- Higher Education Funding Council for England (HEFCE)
- Proceedings of the National Academy of Sciences (PNAS)
- Public Library of Science (PLOS)
- The American Physiological Society
- The Journal of Cell Biology
- Institut Pasteur
- CNRS – University Paris Diderot
- INGM, National Institute of Molecular Genetics; Milano, Italy
- Université de Paris VIII, France
- University of Florida
- The European Association for Cancer Research (EACR)
- Ben-Gurion University of the Negev
- Université de Louvain
114. And the Holy Grail?
Let’s ask the literature
Interpretation and Impact: “... analysts should also be aware of the potential effect of the results in terms of future behavioural changes by institutions and individuals seeking to improve their subsequent ‘ranking’.”
121. Outline
• The most accurate ranking ever produced
• Should you believe in university rankings?
• Governing by numbers
• The end is not near. It is here
• The Good, the Bad, and the Ugly
• The quest for the Holy Grail
• You’re gonna hear me roar!
123. Italy
• Media provide distorted information about the expenditure, performance and efficiency of higher education.
• This has justified heavy budget cuts (-18.7% from 2009 to 2013 in real terms)
• “Bureaucratic delirium”
• Flawed bibliometric assessments that incentivize gaming
What to do?
125. A blog devoted to research evaluation and higher-education policy
• Born: October 2011
• Members of the editorial board: 14
• Collaborators: > 200
• Contacts from November 2011 to June 2014: 8 million
• More than 13,000 daily contacts in 2014
• Articles published: 1,627
• 25,000 comments by readers
• Funding: donations from readers
• Often cited by national newspapers and magazines
• Good visibility among cultural blogs (see e.g. 8th position in http://labs.ebuzzing.it/top-blogs/cultura)