1. The document discusses using the Web of Science database and its citation indexing tools to analyze and gauge research output and influence for individuals, institutions, topics, and countries.
2. Key features highlighted include citation mapping and reports, comparisons to peer institutions, and custom datasets to establish baselines and metrics for research performance evaluation.
3. For serious assessment work, citation data must be contextualized by field and publication year; simply averaging raw citation counts is misleading.
Gauging Research Output and Influence
1. Gauging Research Output and Influence Using Traditional Citation Tools and Unique Analytical Solutions. ALA 2010 Midwinter Meeting, Jan 15-19, Boston. Paul Torpey, Jeff Dougherty
7. Web of Science – More than “Science”. Social Sciences Citation Index – 2,700+ titles. Diversity – Web of Science supports all academic programs: Anthropology; Area Studies; Business; Communication; Criminology & Penology; Demography; Economics; Education; Environmental Studies; Ergonomics; Ethics; Ethnic Studies; Family Studies; Geography; Gerontology; Health Policy & Services; History; Hospitality, Leisure, Sport & Tourism; Industrial Relations & Labor; Information Science & Library Science; International Relations; Law; Linguistics; Management; Nursing; Planning & Development; Political Science; Psychiatry; Psychology; Public Administration; Public, Environmental & Occupational Health; Rehabilitation; Social Issues; Social Sciences; Social Work; Sociology; Substance Abuse; Transportation; Urban Studies; Women’s Studies
8. Web of Science – More than “Science”. Arts & Humanities Citation Index – 1,500 titles. Diversity – Web of Science supports all academic programs: Archaeology; Architecture; Art; Asian Studies; Classics; Dance; Film, Radio, Television; Folklore; History; Humanities; Language & Linguistics; Literary Reviews; Literary Theory; Literature; Medieval & Renaissance Studies; Music; Philosophy; Poetry; Religion; Theater
11. Web of Science Forward Citation Map. Subject categories shown on the map include Psychology, Experimental; Computer Science, Artificial Intelligence; Business; and Religion.
20. Web of Science – powerful “Analyze Results” features. Analyze Results is available in every ISI Web of Knowledge database, as well as in All Databases search results.
22. Web of Science – Backward Citation Map. Second-generation citations reveal citation pathways.
32. For Your Institution – Papers. Average citations specific to the subject category, publication year, and document type of each article.
33. For Your Institution – Papers. Percentile position specific to field and year; the closer to zero, the more highly cited.
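The percentile-position idea above can be sketched in a few lines. This is an illustrative calculation, not the Web of Science implementation; the helper name `percentile_position` and the peer citation counts are invented for the example.

```python
def percentile_position(cites, peer_cites):
    """Percentile position of a paper among peer papers from the
    same field and publication year: the share of peers cited MORE
    than this paper. Closer to zero means more highly cited."""
    more_cited = sum(1 for c in peer_cites if c > cites)
    return 100.0 * more_cited / len(peer_cites)

# Hypothetical peers: citation counts for papers in the same
# subject category and year.
peers = [0, 1, 2, 2, 3, 5, 8, 13, 40, 120]
print(percentile_position(40, peers))  # 1 of 10 peers cited more -> 10.0
print(percentile_position(1, peers))   # 8 of 10 peers cited more -> 80.0
```

Note that the comparison set must be restricted to the same field and year, since citation rates vary widely across disciplines and older papers have had more time to accumulate citations.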
34. For Your Institution – Authors. Relative performance at a glance: with a baseline of 1.0 for each measure, you immediately have a gauge of the influence of an author’s collective works compared with other published work in the field.
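The baseline-of-1.0 gauge amounts to dividing actual citations by the citations expected for papers of the same category, year, and document type. A minimal sketch, with entirely hypothetical numbers and a made-up `baseline` field standing in for the expected field/year average:

```python
# Each paper pairs its actual citation count with the expected
# (baseline) citations for its category, year, and document type.
papers = [
    {"cites": 12, "baseline": 8.0},
    {"cites": 3,  "baseline": 6.0},
    {"cites": 30, "baseline": 10.0},
]

total_cites = sum(p["cites"] for p in papers)        # 45
expected = sum(p["baseline"] for p in papers)        # 24.0
relative_impact = total_cites / expected

# 1.0 would mean exactly the field average; here the author's
# collective works sit well above the baseline.
print(round(relative_impact, 2))
```

The same ratio works at any level of aggregation: an author, a department, or a whole institution, as long as each paper is normalized against its own field-and-year baseline before the sums are taken.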
36. For Your Institution – Disciplines The published intellectual output of a university presented by discipline. Here too, gauge relative influence at a glance.
42. Citation analysis is objective in nature, a key piece of the overall research evaluation puzzle. The puzzle’s pieces: citation metrics, awards/honors, funding data, and peer review.
Editor’s notes
By “characterizing” we mean going beyond simply “how much” to examining output by discipline, or journal, or conference – as well as identifying trends in growth or decline of output.
Counting, measuring, comparing quantities, analyzing measurements: quantitative analysis is perhaps the main tool of science. Scientific research itself, and recording and communicating research results through publications, has become enormous and complex. It is so complex and specialized that personal knowledge and experience are no longer sufficient tools for understanding trends or for making decisions. Yet the need to be selective, to highlight significant or promising areas of research, and to manage investments in science better is only increasing.

Those in universities, government offices and labs, and boardrooms must decide what research should be supported and what should not, or which research projects and researchers should receive more support than others. Until relatively recently, peer review was the main route by which science policymakers and research funders made policy decisions about science. A library faced with collection decisions, a foundation making funding choices, or a government office weighing national research needs must rely on expert analysis of scientific research performance.

Increasingly, universities everywhere must demonstrate their special capabilities to a variety of constituencies. Universities that seek research funding from government agencies and foundations must provide evidence of their accomplishments and capacities. And in many countries, universities, whether public or private, must account for their performance as part of national or professional accountability protocols. Indeed, every university must have a clear, evidence-based understanding of the institution’s performance towards its goals and mission. This understanding is achieved and maintained through ongoing evaluation of all of the institution’s functions. Because research is a central function, the university must evaluate its performance. Data on research performance helps to inform strategic decisions about what areas of research to support or build.
It also helps university leaders understand the institution’s position relative to global and domestic standards of research production: How much research is conducted? What is its impact? How many of the faculty members’ articles are published in first-class journals? Is that number of publications increasing or decreasing? With solid, objective information about production and impact, the university has a strong basis for setting goals, charting progress, making budgetary and hiring decisions, investing in facilities, and working with external agencies.
Bibliometric data should always be interpreted in context.
The first major award for the scientometric field, the Derek John de Solla Price Award of the journal Scientometrics, was first given in 1984 to Eugene Garfield. Price’s Little Science, Big Science (1963) contains four introductory lectures on quantitative methods in the analysis of historical and modern science: 1. Prologue to a science of science (exponential and logistic growth of scientific publication and manpower); 2. Galton revisited (productivity distributions, the laws of Lotka and Zipf); 3. Invisible colleges and the affluent scientific commuter (multiple discovery and authorship, Bradford’s law, half-lives of papers); 4. Political strategy for big scientists (saturation, language distribution, emergence of Japanese science, big-science phenomena).
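Lotka’s law, one of the productivity distributions mentioned above, says roughly that the number of authors producing n papers falls off as 1/n². A minimal sketch, with an assumed (hypothetical) count of single-paper authors:

```python
def lotka_authors(n, authors_with_one_paper):
    """Lotka's inverse-square law of scientific productivity:
    the number of authors producing exactly n papers is roughly
    C / n**2, where C is the number of one-paper authors."""
    return authors_with_one_paper / n ** 2

# Suppose 1,000 authors in some field each wrote exactly one paper.
for n in (1, 2, 3, 4):
    print(n, round(lotka_authors(n, 1000)))
```

The steep drop-off (250 two-paper authors, only about 62 four-paper authors in this example) is why a small core of prolific authors accounts for most of a field’s output, a pattern closely related to Bradford’s law for journals and Zipf’s law for word frequencies.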