The document discusses the impact and implications of university rankings. It notes that while rankings aim to measure quality and compare institutions, they often reduce quality to a few quantifiable indicators and ignore important factors like teaching quality, student experience, and community engagement. As a result, rankings can distort institutions' priorities and behaviors. The document reviews research showing that rankings significantly influence students, employers, universities, governments, and academic work. Many countries are using rankings to restructure their higher education systems and concentrate resources in a small number of elite institutions.
5. www.dit.ie/researchandenterprise
Pursuing Quality
• Quality and excellence are the main drivers impacting on and affecting
higher education, nationally and globally;
• Recognition of the key role higher education plays within society and as an
economic driver;
• Quality assurance provides needed confidence for prospective students
and employers;
• Growing necessity to regulate the global marketplace – especially as more
providers appear and talent is mobile;
• Society has a right to know whether its institutions are capable of meeting
its expectations: value-for-money and investor-confidence.
Quality is a Concern for All Stakeholders
• National geo-political positioning and pride;
• Beacon to attract/retain investment, business and talent;
• Institutional reputation and status;
• Performance assessment of scientific-scholarly research;
• Graduate capability and opportunities;
• Link between qualification and career opportunities and life-style;
• Value-for-money and return-on-(public) investment;
• Growing importance of global networks.
Understanding Rankings
• There is no such thing as an objective ranking
• Because:
– The evidence is never self-evident;
– Measurements are rarely direct but consist of indicators;
– Choice of indicators and weightings reflect value-judgements of the
rankings organisations.
• Each indicator is considered independently of the others – with no
consideration of context, history, mission, etc.
– In reality, there is a relational aspect to the indicators, or multicollinearity
Evolution of Rankings
• Global Rankings emerged in 2003 –
– Rankings have been part of the US academic system for 100 years, but today their popularity is worldwide;
– A significant force impacting and influencing policymakers and the academy;
• Four phases:
– Phase 1 (1900 -1950s) Sub-National/Elite Rankings
– Phase 2 (1959 – 2000) National Rankings
– Phase 3 (2003-) Global Rankings
– Phase 4 (2008-) Supra-national Rankings
• Today, 10 major global rankings and 150+ national/specialist rankings.
Global Rankings
(red = government sponsored)
• Academic Ranking of World Universities (ARWU) (Shanghai Jiao Tong
University, China), 2003
• Webometrics (Spanish National Research Council, Spain), 2004
• National Taiwan University Rankings (formerly Performance Ranking of
Scientific Papers for Research Universities, HEEACT), 2007
• Leiden Ranking (Centre for Science & Technology Studies, University of Leiden),
2008
• SCImago Journal and Country Rank (SJR) (Spain), 2009
• University Ranking by Academic Performance (URAP) (Informatics Institute of
Middle East Technical University, Turkey), 2009
• QS World University Rankings (Quacquarelli Symonds, UK), 2010
• THE World University Ranking (Times Higher Education, UK), 2010
• U-Multirank (European Commission, Brussels), 2014
• Best Global Universities rankings (USNWR, US), 2014
Select National Rankings (red = government sponsored)
INSTITUTIONAL
• University Ranking System (Bulgaria)
• CHE-HochschulRanking (Germany)
• Expert University Ranking (Russia)
• Good University Guide (Australia)
• Guardian University Guide (UK)
• University Rankings of Islamic Countries (Iran)
• Higher Education Commission Rankings (Pakistan)
• La Repubblica Grande Guida Università (Italy)
• Maclean’s On Campus (Canada)
• National Rankings of Best Universities (Kazakhstan)
• Netbig Chinese University Ranking (China)
• Nigeria Universities Commission Ranking
• OHEC (Thailand)
• Perspektywy University Ranking (Poland)
• Ranking U-Sapiens (Colombia)
• Sunday Times Good University Guide (Ireland)
• Times Higher Education University Guide (UK)
• Top 200 University Rankings (Ukraine)
• URANK-rank (Sweden)
• US News and World Report (USNWR) College Rankings (US)
DISCIPLINE/SUB-CATEGORIES
• Dataquest (India)
• India Today (India)
• Outlook (India)
• Le Nouvel Observateur (France)
• Sherif Magazine (Iran)
• National Research Council Ranking of Doctoral Programmes (US)
• Toplawschools.com (US)
• American Universities Admission Programme: Undergraduate American Universities Rankings for International Students (US)
• US News and World Report (USNWR) Top Med Schools (US)
• WPROST MBA (Poland)
SPECIALIST
• CollegeNET Social Mobility Index Ranking (US)
• Georgetown Public Policy Review Placement Efficiency Ranking (US)
• Metroversities (US)
• New York Times Most Economically Diverse Top Colleges (US)
• Online Study Australia Online University Ranking List (Australia)
• Princeton Review (US)
• Saviours of Our Cities (US)
• Social Mobility Index (CollegeNet and Payscale, US)
• Washington Monthly College Guide (US)
• Washington Monthly Ranking of Community Colleges (US)
Who Uses Rankings
Students, public opinion and government are the biggest users of rankings and
the most likely to be negatively influenced
•Domestic undergraduate students
•Internationally mobile students and faculty
•Postgraduate students
•Government/Policymakers
•Academic partners and academic organisations
•Employers
•Sponsors, philanthropists and private investors
•Industrial partners
•Higher education institutions
•Public opinion
What People Want To Know
• Teaching and learning: environment and quality;
• Fields of specialisation/department: level of intensity, expertise, quality
and competence;
• Faculty quality: qualifications, expertise and track-record, research;
• Efficiency level: how much output vis-à-vis funding;
• Graduate expectations: career, salary and lifestyle;
• Employability of graduates: trends and competences;
• Research capacity of HEI & research team;
• Research infrastructure: level of use and efficiency;
• Performance benchmarked regionally, nationally & internationally;
• Attraction capacity and internationalisation;
• Etc.
What Rankings Measure
Rankings Measure
• Bio- and Medical Sciences Research
• Publications in Nature and Science
• Student and Faculty Characteristics (e.g. productivity, entry criteria, faculty/student ratio)
• Internationalization
• Reputation – amongst peers, employers, students
Rankings Do Not Measure
• Teaching and Learning, incl. “added value” and the impact of research on teaching
• Arts, Humanities and Social Science Research
• Technology/Knowledge Transfer
• Impact and Benefit of Research
• Regional or Civic Engagement
• Student Experience
What Global Rankings tell Us
Because age and size matter, there is a super-league of large, well-endowed,
comprehensive universities, usually with medical schools and located in
English-language countries.
Advantages
• Provide a simple, quick and easy way to measure/compare HE performance
and “quality”;
• Place HE within wider comparative and international framework;
– Inform student choice and stakeholder opinion;
– Beacon to attract/retain mobile capital and talent;
– Performance assessment of scientific-scholarly research;
– Signal of what to expect upon graduation and from graduates;
– Value-for-money and return-on-(public) investment;
• Accountability tool, esp. in societies/for HEIs where QA culture/practices
weak or immature;
• Heighten attention to quality and drive up performance:
– Accelerate modernisation agenda;
– Emphasize institutional strategic decision-making and data collection/analysis.
Disadvantages
• HEIs are complex organisations meeting diverse needs, but rankings usually
measure/compare “whole institutions” using the same set of indicators;
– Undermines mission diversity, and ignores diversity of student cohort;
– Drives isomorphism/norming around single model of HE or quality/excellence;
• Academic quality is complex and not easily reduced to quantification;
– Use of proxy variables can misrepresent and lead to unintended consequences;
– Difficulty obtaining meaningful indicators and (international) comparative data;
– Bibliometric data is not equally reliable across disciplines, and doesn’t capture
the impact or benefit of research;
• Leads to simplistic comparisons, even though statistical differences between
institutions are often insignificant;
• International differences can be very great;
• Indicators can encourage perverse behaviour – over-emphasis on small set
of indicators.
Institutional Reaction: Some Findings
• 83% HEIs unhappy with their rank compared with 58% in 2006;
• 32% HEIs want to be first nationally compared with 19% in 2006;
• 29% HEIs want to be in the top 5% internationally compared with 24% in
2006;
• The overwhelming majority of HEIs use rankings to inform strategic decisions,
set targets or shape priorities, and inform decisions about international
partnerships;
• 84% HEIs use rankings to monitor peer institutions in their own country,
and ~77% monitor peers worldwide;
• 84% HEIs have a formal internal mechanism to review their institution’s
rank; in 40% of cases this is led by the Vice Chancellor, President or Rector.
Student Reaction: Some Findings
• 80% undergraduate and postgraduate (taught and research) students
have a high interest in rankings, with no real difference between
undergraduate and postgraduate students (i-graduate, 2014);
• High achieving and high socio-economic students are most likely to make
choices based on non-financial factors, e.g. reputation and rankings;
• International students continue to rate reputation and position in
rankings as key determinants in their choice of institution, programme
and country;
• Strong correlation between rankings, perceptions of quality,
institutional reputation and choice of destination, at the national and
institutional level;
Stakeholder perceptions
• EMPLOYERS have implicit rankings based on own experience:
• US accounts claim law firms regularly use USNWR rankings to
"determine the threshold for interviews" (Espeland and Sauder, 2007, 19);
• 25% of UK graduate recruiters interviewed "cited league tables as their
main source of information about quality and standards" (University of
Sussex, 2006, 87, 80, also 87-92);
• ACADEMIC PARTNERSHIPS:
• 40% say rankings integral to decision-making about international
collaboration, academic programmes, research or student exchanges;
• 57% think rankings influence the willingness of other HEIs to partner with
them;
• 34% say rankings influence the willingness of other HEIs to support their
institution’s membership of academic or professional organisations.
Impact on Faculty and Academic Work
• Increased emphasis on academic performance/outputs:
– Contracts tied to metrics/performance;
– New salary and tenure arrangements;
– Active head-hunting of high-achievers.
• Rankings used to identify under-performers and ‘reputational’ disciplines;
• Can impact negatively or positively on staff morale;
• Faculty not innocent victims:
– Faculty use rankings to identify partnerships;
– Rankings confer social and professional capital on faculty in high-ranked HEIs.
Government Responses
•National governments and supra-national organizations interpret
rankings as a proxy for capacity/capability to be globally competitive in a
world dominated by new knowledge generated by talented people;
•Deliberate steps to restructure HE/research systems and institutions to
create “world-class” or flagship universities (France, Germany, Russia, Spain,
China, South Korea, Taiwan, Malaysia, Finland, India, Japan, Singapore, Vietnam and
Latvia, etc.)
–Concentrate excellence and resources in a small number of elite universities (Neo-liberal Model);
–Create greater vertical or hierarchical (reputational) differentiation;
–Greater differentiation between teaching and research universities;
–Link resource allocation to competitive processes, often informed by
rankings.
Policy Impact beyond HE
•Serbia, Albania, Romania, Jordan, Czech Republic use rankings to classify
universities;
•Russia, Brazil, Chile, Singapore, Saudi Arabia, Kazakhstan, Mongolia and
Qatar restrict state scholarships to students admitted to high-ranked
universities;
•India, Russia, Singapore use rankings as criteria for collaboration;
•Dutch (2008) and Danish (2011) immigration laws target foreigners from
top universities (top 150 and top 20, respectively);
•Macedonia: Law on HE (2008) automatically recognises degrees from the top
500 universities in the THE-QS, SJTU or USNWR rankings, and uses rankings to
evaluate university performance.
•US states benchmark salaries (Florida and Arizona) or ‘fold’ rankings into
performance measurement system (Minnesota, Indiana and Texas).
Rankings changing how we think about HE
•Cross-national/jurisdictional comparisons are inevitable by-product of
globalization and will intensify in the future;
•Creating sense of urgency and accelerating modernisation agenda;
•Driving up institutional performance and providing some public
accountability and transparency;
•Pushing HE to focus on quality and accurate data
collection/benchmarking;
•Good quality, international comparative information is essential to
underpin strategic leadership and decision-making at the institutional level,
and to demonstrate value, impact and benefit.
What Are You Trying To Achieve?
• Is the aim to create world-class universities or a world-class system –
– Should the aim be to improve the capacity and quality of the whole system OR
reward the achievements of elite flagship institutions?
– Should resources be directed to the few universities which perform best
against rankings OR should national policy avoid distortions in resource
allocation and ensure resources meet the needs of the wider tertiary
education sector?
• Does a rankings-led strategy strengthen national competitiveness OR
undermine national sovereignty?
• Should you use indicators chosen by rankings organisation OR develop
indicators which meet the strategic requirements of your country or
institution?
• Should HE data be collected and monetised by commercial organisations
or by an independent international organisation?
Rankings-led Strategy
• Quality traditionally assessed via “self-regulating” QA and peer-review,
but:
– QA can be difficult to compare internationally;
– Interest in going beyond measuring and evaluating quality to linking
performance and productivity to resource allocation.
• Rankings have filled the gap: many governments and institutions have
adopted a rankings-led strategy:
– Restructure HE/research systems/HEIs to create “world-class” or flagship
universities;
– Embed indicators in strategic planning, and use to measure performance and
reward success;
– Use indicators for scholarships, and to target collaboration and professionals;
– Re-orientation in research priorities towards "reputational" disciplines;
– Etc.
Beware Unintended Consequences (1)
• Prestige and reputation become dominant drivers of the “system” leading
to steep(er) hierarchy – rather than pursuance of equity and diversity;
• Quality is a complex concept:
– Many indicators measure wealth/socio-economic advantage, and privilege the
most resource-intensive institutions/students;
• Concentrating resources and research activity may be counter-productive
and undermine national economic capacity
– Widens privilege gap, affecting other HEIs and their students, but may also
threaten the cities and regions in which they reside, exaggerating long-standing
inequality issues;
– No evidence that more concentrated national systems generate higher citation
impact;
– Financial costs can be very high – and threaten other policy goals.
Obsession with Elites
• ~18,000 HEIs worldwide (as per WHED data);
• 196m student enrolments worldwide in 2012 (World Bank);
– 20m HE students in EU28 (20.5m incl. Switzerland);
• A ranking’s top 100 = ~0.5% of HEIs, or ~0.4% of students, worldwide;
• Obsession with rankings is skewing our understanding of the student cohort.
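The proportions on this slide follow from simple arithmetic; a minimal sketch (Python, illustrative only) checks them against the figures quoted above (~18,000 HEIs, 196m students; the student share is taken from the slide, not computed from top-100 enrolment data):

```python
# Arithmetic behind the "Obsession with Elites" figures.
heis_worldwide = 18_000        # ~18,000 HEIs (WHED)
students_worldwide = 196e6     # 196m enrolments, 2012 (World Bank)

# Share of institutions covered by a ranking's top 100:
top100_share_of_heis = 100 / heis_worldwide
print(f"{top100_share_of_heis:.2%}")          # ~0.56%, i.e. the slide's ~0.5%

# The slide's 0.4% of students worldwide implies roughly this many
# students enrolled in top-100 institutions:
top100_students = 0.004 * students_worldwide
print(f"{top100_students / 1e6:.2f}m students")
```

The point survives any reasonable rounding: a top-100 fixation describes well under 1% of institutions and students.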
Beware Unintended Consequences (2)
• International comparisons not always appropriate or relevant;
• Why should a country or institution align itself to indicators chosen by others?
• Adopting a rankings-led strategy can affect/reorient priorities and
practices, leading to perverse behaviour and “gaming”;
• Because rankings incentivise behaviour, what is measured is critical.
Dos and Don’ts
Don’t
• Use rankings as a stand-alone evaluation tool;
• Change national policies or institutional priorities to conform to rankings;
• Use rankings to inform policy/priorities or resource allocation decisions;
• Direct resources to a few elite universities and neglect the needs of the wider tertiary
education sector and society.
Do:
• Ensure rankings are aligned with national values and objectives, have a clear purpose;
• Use rankings only as part of an overall quality assurance, assessment or
benchmarking system and not as a stand-alone evaluation tool;
• Ensure indicators are fit-for-purpose, and measure outcomes in preference to inputs
whenever possible;
• Understand the limitations of rankings, and the unintended consequences.
Alternative Rankings
• Multi-dimensional Rankings/Banding
– U-Multirank (EU)
– CHE-HochschulRanking (Germany)
• System-level Rankings
– Lisbon Council (Brussels)
– Universitas 21 (Australia)
• Measuring Value to Community, Value-for-Money
– Washington Monthly (US)
– Postsecondary Institution Rating System (US Government)
Alternatives To Rankings
• Institutional profiling
– U-Map (EU)
– HE Performance Evaluation Framework (Ireland, Norway, Australia)
• Assessment of Learning Outcomes
– Survey of Student Engagement (US + Canada, Australia, China, South Africa,
New Zealand, Ireland)
– Degree Qualifications Profile (Lumina Foundation, US)
– AHELO: Assessment of Higher Education Learning Outcomes (OECD)
– Learning Gain (Germany, Australia, Brazil, Colombia Canada, China, Russia, US,
UK)
– Voluntary System of Accountability (VSA) (US)
• Open/On-line and Social Media
– UniStats (UK), MyUniversity (Australia)
– Rate-my-Professor
Speaker notes
Nations, their institutions, and all aspects of daily life are regularly measured against each other;
Emphasis on quality and excellence has become matter of great concern and interest for all stakeholders:
National geo-political positioning and pride;
Attract mobile investment and talent;
Performance assessment of scientific-scholarly research is increasingly important, especially for publicly funded research;
Greater focus on outputs and performance as mechanism for financing higher education and actively encouraging differentiation and modernisation.
Students have become savvy participants, consumers and customers as the link between HE and career/salary grows;
Growing importance of global networks – for education exchange, joint programmes, research, staff development and training, etc.
Traditionally, higher education quality has been measured by input factors: student entry numbers and qualifications, academic qualifications, budget/income, etc. Today, there is an increasing focus on outputs, impact, benefit and relevance.
Universities should be funded more for what they do than for what they are, by focusing funding on relevant outputs rather than inputs,…Competitive funding should be based on institutional evaluation systems and on diversified performance indicators with clearly defined targets and indicators supported by international benchmarking (Europa, 2006).
However, there is no definitive, internationally agreed definition of quality because context is critical – differences between publicly-funded and for-profit or well-endowed not-for-profit private institutions can be considerable; likewise socio-economic background and national circumstances are key determinants.
‘Weightings are decided upon by Times Higher Education based on their opinion of the importance of the measured criteria balanced against the appropriateness of the indicator to evaluate the intended measure’. http://www.topuniversities.com/university-rankings/world-university-rankings/methodology/simple-overview
There is no internationally agreed set of indicators to measure quality;
Appropriateness of these metrics/methodology?
Bias towards English-language publications
Bias towards traditional research outputs
Bias towards recording publications in science, biomedical and technology disciplines and not arts, humanities and social sciences
Bias towards high graduation rates, highly selective admission criteria, etc. which favour more select/elite institutions with ‘wealthy white applicants’ against those institutions which admit many low-income students (see Jaschik, 17/08/07, Inside HE)
Value of ‘peer review’
Phase 1 (1900 -1950s) Sub-National/Elite Rankings
Focus on “distinguished persons”, looking at academic origins, e.g. characteristics such as nationality, birthplace and family;
Phase 2 (1959 – 2000) National Rankings
Emphasis on reputational factors relying on Citation Index; response to mobility, aspirant middle class and ideological shift towards markets;
USNWR (1987) – CHE (1998)
Phase 3 (2003-) Global Rankings
Shanghai ranking created to highlight the position of Chinese universities vis-à-vis competitor universities, in response to the government’s desire to establish world-class universities;
Has become “gold standard” – with many of the advantages associated with “first mover”.
Phase 4 (2008-) Supra-national Rankings
Supra-national authorities (EU U-Multirank; OECD AHELO; US federal government Postsecondary Institution Rating System) marks significant paradigm shift
Governments compelled to step in to regulate the marketplace – arguably an issue of global economic security;
Education recognized as globally traded service under GATS (General Agreement on Trade in Services)
Key ‘system-wide’ messages from the Shanghai Jiao Tong University (SJTU) Academic Ranking of World Universities (Sheil, paper to Leiden conference, 2009):
Of the world’s 10,000+ universities, research performance is concentrated in the top 500 and is virtually undetectable (on that index) beyond 2,000.
There is a band of around 250 world-class research-intensive institutions however within this there is a ‘super-league’ of approximately 25 world-leading institutions.
These 25 world leaders are distinguished by large budgets, large endowments, age, excellent staff to student ratios, and most importantly, access to large pools of highly developed human capital (staff and students).
There are very few ‘movers’ on the SJTU index. The biggest non-US movers in the Top 100 (since 2003) are the result of mergers and strategic alliances, such as Manchester (gained 49 places), Copenhagen (21 places), Paris XI (24 places) and Paris VI (UPMC) (21 places).
Access to top 25, for the foreseeable future, is beyond most nations. For example, Harvard with 187 ‘Highly Cited’ researchers matches Canada (as a nation) with 186. (Note that Harvard has grown by 16 Hi-Cis during the past 18 months, double the number of Hi-Cis in Ireland and two fewer than New Zealand).
We know that the top global academic talent is highly concentrated. Alumni from 198 universities have gone on to win Nobel Prizes but at the time of award these were working in just 136 universities. Nearly all of the World’s 6,950 High Citation researchers are concentrated in 424 universities.
Universities from the smaller nations can however compete well at the ‘field’ level:
Swiss Fed Inst Tech – ETH Zurich – 15th in Natural Sciences and Mathematics
Karolinska Institute – 9th in Clinical Medicine and Pharmacy, 18th in Life and Agricultural Sciences
Australian National University – 38th in Natural Sciences and Mathematics, 40th in Life and Agricultural Sciences
They can also be a signpost to participation in “the global knowledge network on an equal basis with the top academic institutions in the world” (Altbach and Salmi, 2011, 1), exposing the “rot in the higher education system” (Okebukola, 2013, 150) or “contributing to an improved human value system” as part of the evolution of humanity (Luo, 2013, 181).
Doing well in rankings is seen as a “more powerful asset for a nation than possession of weapon[s] of mass destruction” (Billal, 2011, 2), equivalent to an “instrument of competitive battle and influence” (Fursenko, Russian Education Minister, quoted in Kishkovsky, 2012) or comparable to the “performance of a nation’s football team in an international competition…The image of the whole country is based on the perception of a few” (Chapman et al., 2014, 41).
Disregards relational aspect of indicators/multicollinearity: e.g. older, well-endowed private universities are more likely to have better faculty/student ratios and higher per-student expenditure compared with newer or publicly funded institutions.
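The multicollinearity point above can be illustrated with synthetic data (a hypothetical sketch; all numbers are invented for illustration): when a common factor such as institutional wealth drives two indicators, the indicators move together even though a ranking weights them as if they were independent.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical data: institutional wealth drives both indicators.
wealth = rng.lognormal(mean=0.0, sigma=0.5, size=n)          # endowment proxy
expenditure_per_student = 10_000 * wealth + rng.normal(0, 500, n)
faculty_student_ratio = 0.05 * wealth + rng.normal(0, 0.005, n)

# The two "separate" indicators are strongly correlated, so summing
# their weighted scores double-counts the underlying wealth factor.
r = np.corrcoef(expenditure_per_student, faculty_student_ratio)[0, 1]
print(f"correlation between indicators: {r:.2f}")
```

Under these assumptions the correlation is close to 1, which is the sense in which treating each indicator "independently" misrepresents what is actually being measured.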
Overall, the trend is for HE leaders to desire a higher ranking than they currently hold – and this has increased over time. For example, whereas in 2006, HE leaders were content to be within the top 10-25 percent, now they wish to be in the top 5 percent.
Given linkages between performance and resource allocation, there is a strong focus on pursuing ambitions nationally (Hazelkorn, 2015, 96-97).
And, despite the statistical impossibility and financial cost of achieving the desired status, this has not halted the number of HEIs, plus ministers and other policymakers, worldwide proclaiming a desire to be more highly ranked.
"You should hold a degree from a Times top 100 university ranked at no 33 or higher"
Those who rise to the top of the academic league table accumulate ‘research power’ (Marginson and Considine, 2000) and are well rewarded in the deregulated academic labour market
Macedonia: Article 159 of the Law on Higher Education, 26 February 2008, number 35/2008. Unfortunately there is no official English translation, but basically degrees from the top 500 universities listed in the THES, SJTU or US News and World Report rankings are automatically recognised without going through the otherwise complex recognition process.
Scholarships for study abroad restricted to students admitted to highly ranked universities – e.g. Mongolia, Qatar (Salmi & Saroyan 2007)
‘At the annual meeting of the Liberal Party, Prime Minister Lars Løkke Rasmussen launched an ambitious goal to have at least one Danish university among the top 10 in Europe by 2020, as measured by the THE ranking. At present, Copenhagen is 15th in Europe on the list while Aarhus is 20th.’ http://www.universityworldnews.com/article.php?story=20091211083831760
State funding increased by an average of 58% from 1987-1995 for colleges that first appeared in the rankings by 1990. By comparison, funding increased only 49% for colleges that were never ranked and 48% for those already on the list. The increase in state expenditure attributable to US News exposure = 6.5% per student.
Re Finland (from Ian Dobson): for the past three academic years, the Finnish MoE has provided an additional 20 million euros a year (on top of the previous year's funds) to be distributed among Finland's universities according to the standard formula. The component of that which is relevant to the three institutions that have come together to form Aalto University was 3 million euros. The same arrangement applies for 2010, with Aalto to receive 3 million and the others 17 million between them. In addition, the MoE is providing 67 million euros to help with the creation of a 'world class university'. The MoE has also agreed to double any private funds raised by any university in 2010 (i.e. if the university raises 10 euros, the MoE will provide 20).
Runs counter to/undermines other policy objectives: widening access, regionalism, research-informed teaching, Mode 2 research
Narrowly defines ‘excellence’ around a few categories
Widens gap between elite and mass education with illusion of diversity;
Distorts focus of HE away from teaching/research-informed teaching towards research, in most narrow sense;
Misaligns research focus and weakens innovation capacity.
Onus on government to inform the public about the use and abuse of rankings.
To Perfect Methodology (inter alia)
EU Classification Project
OECD AHELO project
Rankings Journals – scientific societies, Denmark, Australia
To Improve Position/Drive Performance
National benchmarking, mapping and research assessment – Ireland, Netherlands
Performance targets and funding – Norway, Australia, Denmark
EU Expert Group: Assessment of University-Based Research
EU Ranking of European Higher Education Institutions
Birnbaum argues that such World Class Universities can only be built if they are firmly grounded in strong and indigenous educational and social foundations. Trying to develop them by using imported rhetoric, imported models and large sums of money is destined to fail:
"Attempting to build World-Class Universities without attending first to the educational and social ground on which such institutions might stand is, as Ivan Illich once said, 'like trying to do urban renewal in New York City from the twelfth story up.' Rather than more World-Class Universities, what we really need in countries everywhere are more world-class technical institutes, world-class community colleges, world-class colleges of agriculture, world-class teachers colleges, and world-class regional state universities."
“To the extent that research does promote economic development, targeted support for universities that have more narrowly-defined programmes of demonstrated excellence in areas of strategic importance may be equally or more fruitful than focusing on only those institutions that achieve overall top rankings” (Chapman et al., 2014, 69).
Salmi: http://tertiaryeducation.org/2014/07/world-class-universities-or-systems/
“At the end of the day, world-class systems are not those that can boast the largest number of highly ranked universities. They are, instead, those that manage to develop and sustain a wide range of good quality and well articulated tertiary education institutions with distinctive missions, able to meet collectively the great variety of individual, community and national needs that characterize dynamic economies and healthy societies.”
All possible improvements to bibliometric methods should be undertaken, but even if corrected, bibliometrics still emphasize a ‘narrow’ definition of research, e.g. only peer-reviewed articles.
Current methodologies work on principal that present performance is a good indicator of future performance;
Therefore, indicators seek to capture existing ‘excellence’ in known fields and by well-established researchers and teams;
Focus on past undermines potential, new ideas, new and younger teams.
Existing indicators and methodologies rely on data that is easily measured, which encourages research that is more predictable.
‘New knowledge is new precisely because it was unanticipated. Consequently, it is hard to predict which projects are going to generate useful and informative data that will add to our body of knowledge and which will generate that homerun finding. Today, too many of our postdocs believe that getting a paper into a prestigious journal is more important to their career than doing the science itself.’ http://www.pnas.org/content/107/50/21233.full
Multidimensional Rankings: user-driven whereby each individual or stakeholder group can rank according to his/her own preferences according to different characteristics, e.g. U-Multirank.
System-level Rankings assess quality, impact and benefit of HE system as-a-whole using broad set of indicators, e.g. investment, access and participation rates, contribution of higher education and research to society, internationalisation, government policy and regulation, e.g. Universitas 21, Lisbon Council.
Measuring Value to Community:
Washington Monthly Review
Unlike U.S. News and World Report and similar guides, this one asks not what colleges can do for you, but what colleges are doing for the country;
Others include:
Saviours of Our Cities (2009); Metroversities (2012) (Dobelle)
US Government Ratings 2015 based on such things as average student debt, graduation rates, and graduates’ earnings;
Voluntary Institutional Metrics Project proposes to provide college-by-college comparisons of cost, dropout and graduation rates, pg employment, student debt and loan defaults, and how much people learn.
Student engagement represents two critical features of collegiate quality. The first is the amount of time and effort students put into their studies and other educationally purposeful activities. The second is how the institution deploys its resources and organizes the curriculum and other learning opportunities to get students to participate in activities that decades of research studies show are linked to student learning.
NSSE doesn’t assess student learning directly, but survey results point to areas where colleges and universities are performing well and aspects of the undergraduate experience that could be improved.
Lumina Foundation Degree Qualifications Profile 5 primary areas of competence:
Specialized Knowledge,
Broad/Integrated Knowledge,
Applied Learning,
Intellectual Skills
Civic Learning.