1. EVALUATING AND SELECTING ONLINE
RESOURCES:
AN AMERICAN LIBRARY ASSOCIATION
TECH SOURCE WORKSHOP
Jill E. Grogg,
Electronic Resources Librarian and
Associate Professor
The University of Alabama Libraries
jgrogg@ua.edu
Rachel A. Fleming-May,
Assistant Professor
School of Information Sciences,
The University of Tennessee-Knoxville
rf-m@utk.edu
2. OUR INTEREST IN THIS ISSUE:
Jill:
Electronic Resources Librarian, ARL
Library
Rachel:
Former Practitioner, ARL Library
Research interest in “Use” on a
practical and theoretical level
Use…USAGE: measurement of e-resource usage
The Concept of Electronic
Resource Usage and
Libraries (Library
Technology Reports,
Aug./Sept. 2010)1
3. COLLECTION PRACTICES REDUX
OLD: Supply-side
• Use is not primary
• Print-based
• Input-only measures left large institutions held hostage by rankings (e.g., ARL)
• Growth rate not sustainable
NEW: Demand-driven (the 21st c. library)
• Information is widely, cheaply available
• Patron demand and use analysis drive collection decisions
• Assessment culture
4. DATA-DRIVEN DECISION-MAKING
• Evolution of “cost-effective”
• Use multiple variables to make big decisions
• Triangulate:
1. Usage statistics and cost per use
2. User feedback, even something as simple as SurveyMonkey
3. External measures of quality, where applicable
(Eigenfactor or Impact Factors)
• Seek continuing education opportunities for
statistical analysis
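
A minimal sketch of the cost-per-use arithmetic behind point 1, in Python; the resource names, costs, and download counts are all invented for illustration, and real inputs would come from invoices and COUNTER usage reports:

# Hypothetical subscription data: (annual cost in dollars, annual downloads)
subscriptions = {
    "Database X": (25000.00, 12500),
    "Database Y": (8000.00, 400),
}

for name, (cost, uses) in subscriptions.items():
    # Guard against division by zero for unused resources.
    cpu = cost / uses if uses else float("inf")
    print(f"{name}: ${cpu:.2f} per use")

Here Database X works out to $2.00 per use and Database Y to $20.00, but cost per use is only one leg of the triangle: weigh it against user feedback and external quality measures before acting on it.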
5. AGENDA:
• Discuss approaches to
understanding the use and value
of e-resources
Overview of current concepts and
practice related to e-Resource usage
measurement
Discussion of approaches to
augmenting data
• Learn the basics of negotiation for
librarians, including a BATNA
(Best Alternative to a Negotiated
Agreement)
8. TO MEASURE, WE FOCUS ON INPUTS AND OUTPUTS
Inputs: number of patrons who enter the building; number of dollars invested in e-resources
Outputs: number of book circulations; number of electronic article downloads
9. HOW VALUABLE IS DATABASE X TO UNDERGRADUATE STUDENTS AT OUR INSTITUTION?
Current approach to assessment:
Focus (ostensible): Undergraduate students’ use of a specific information resource
Presumption: Use is proof of usefulness/value
Tool(s): Usage reports (log-ons to and downloads from Database X); cost data
Data: Statistical; time-specific. Some granularity: possible to identify some detail about individual sessions
Focus of assessment (actual): Cost of Database X / number of times Database X is logged on to by a segment of the user population
Enhanced understanding? Compare measures of access to those of other databases; measures for Database X at peer/aspirational institutions
Potential outcomes: Decision to keep/eliminate the subscription based on perceived “performance” or importance of Database X to students
10. “Among other changes, the Complete College Tennessee Act funds higher education based in part on success and outcomes, including higher rates of degree completion.”
So What? Why can’t we continue to assess things that way?
11. USE IS FREQUENTLY ASSESSED IN ORDER TO GENERATE “OBJECTIVE” DATA FOR DECISION MAKING.
• Many instances of use are removed from the library, thus unobservable
• Statistical data about usage provides a sketch of when and where, and what (in a more limited sense)…
• …but no “why.”
12. Understanding
USE Matters.
“Questions such as, ‘Who uses these
resources?’ or ‘Are these huge outlays
of funds justified in terms of use, or
value derived from use?’ or ‘What
difference do all of these resources
make to students and faculty in
universities?’ must be answered if
university administrators, trustees,
students, and faculty are expected to
support ever-increasing levels of
funding for the acquisition and
development of these resources and
services.”5
13. Use is often treated as a
PRIMITIVE CONCEPT in
Library and Information
Science:
an idea so fundamental to
the theoretical framework
as to be indefinable, even
when presented as a
phenomenon to be
measured and quantified.
15. “some of the basic ‘natural laws of library and
information science’ may not apply as well or
as consistently in the realm of electronic
information discovery and use”4
16. SO, IS USE A PRIMITIVE CONCEPT?
No. Use does not, in fact, have a singular conceptual
meaning in the LIS domain and can signify many
actions, processes, and events.
17. THE USE TYPOLOGY: DIMENSIONS OF USE
I. Use as an Abstraction
Ia. Use as a Facilitator
II. Use as an Implement
III. Use as a Process
IV. Use as a Transaction
IVa. Use as a Connector
18. “Of the 57,148 households [surveyed], 27,511
(48.1%) had a household member who used the
public library in the past year. ”6
• A GENERAL TERM FOR ALL TYPES OF
LIBRARY/INFORMATION USE
• DISASSOCIATED FROM ANY SPECIFIC
INSTANCE OF THE PHENOMENON
19. • Isolated instances of library or
information use
• Can be recorded and quantified
• Removed from the user
Vendor-supplied data (COUNTER-compliant or otherwise)
Transaction log analysis, including page-view time measurement (are they really reading?)
Log-ons (what about database timeouts?)
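
A sketch of the page-view time measurement idea above, assuming a hypothetical transaction log of (session, timestamp, page) events; the log layout and the 10-second “skim” cutoff are assumptions for illustration, not a COUNTER rule:

from datetime import datetime

# Hypothetical transaction log; real platforms record different fields.
log = [
    ("s1", datetime(2011, 3, 1, 9, 0, 0), "article-123"),
    ("s1", datetime(2011, 3, 1, 9, 0, 2), "article-456"),
    ("s1", datetime(2011, 3, 1, 9, 45, 0), "logout"),
]

# Estimate view time as the gap to the next event in the same session.
# The final event has no successor, so its dwell time is unknowable:
# one reason log data sketches "when and where" but never "why."
SKIM_CUTOFF = 10  # seconds; an assumed threshold, not a standard
for (sid, t, page), (next_sid, next_t, _) in zip(log, log[1:]):
    if sid != next_sid:
        continue
    seconds = (next_t - t).total_seconds()
    label = "skim" if seconds < SKIM_CUTOFF else "sustained view"
    print(f"{page}: {seconds:.0f}s ({label})")

The two-second view of article-123 and the 45-minute view of article-456 each count as one “use” in a raw log-on tally, which is exactly the one-second click versus five-hour immersion problem the next slide describes.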
20. “statistics provided by electronic book
vendors…show that [our] community uses
e-books quite heavily. The data do not
show, however, how books are used. For
instance, the available statistics show
that a book has been accessed but do not
differentiate between a one-second click
on a title and a five-hour immersion in
a book…
21. UNDERSTANDING OF USE AS A PROCESS
[Diagram: Db A log-on; article download; visit to the reference desk]
Application of library/information resources, materials, and/or services…
To complete a complex or multi-stage task
To the solution of a problem
“This study reveals that undergraduate students experience information use in a complex, multi-tiered way that needs to be addressed by higher educators when creating information literacy pedagogy.”7
22. AUGMENT STATISTICAL ASSESSMENT WITH OTHER APPROACHES:
REQUIRES MULTIPLE DATA COLLECTION METHODS
REQUIRES “BIPARTISAN” SUPPORT, I.E., WORKING WITH
PUBLIC SERVICES TO GAIN A FULLER UNDERSTANDING OF
HOW AND WHY PATRONS USE THE RESOURCES THEY DO.
23. HOW VALUABLE IS DATABASE X TO UNDERGRADUATE STUDENTS AT OUR INSTITUTION?
Current approach to assessment:
Focus (ostensible): Undergraduate students’ use of a specific information resource
Presumption: Use is proof of usefulness
Tool(s): Usage reports (log-ons to and downloads from Database X); cost data
Data: Statistical; time-specific. Some granularity: possible to identify some detail about individual sessions
Focus of assessment (actual): Cost of Database X / number of times Database X is logged on to by a segment of the user population
Enhanced understanding? Compare measures of access to those of other databases; measures for Database X at peer/aspirational institutions
Potential outcomes: Decision to keep/eliminate the subscription based on perceived “performance” or importance of Database X to students
24. • Usage statistics, plus…
• Students’ own words: interviews; focus groups; surveys; research journals
• Students’ actions: observed behavior (incl. usability); citations in school work; improved understanding (after instruction in use of a specific resource)
• Who are the students using this resource… or not?
• Why do they use this particular resource (e.g., JSTOR)… instead of another?
• When and where do they use it?
• How do they use it? For what purposes?
• How do they feel about this resource and its role in their schoolwork?
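
One way to pair the two streams listed above, in Python: usage counts joined with students’ own words, keyed by resource; the figures and sample comments are invented for illustration:

# Hypothetical triangulation: quantitative counts plus qualitative comments.
usage_downloads = {"JSTOR": 4200}   # e.g., from COUNTER reports
survey_comments = {                 # e.g., from surveys or focus groups
    "JSTOR": ["essential for my history thesis",
              "only opened it once for a class assignment"],
}

for resource, downloads in usage_downloads.items():
    comments = survey_comments.get(resource, [])
    print(f"{resource}: {downloads} downloads, {len(comments)} student comments")
    for comment in comments:
        print(f"  - {comment}")

The point is not the code but the join: every count of what and when gets at least one voice explaining why.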
25. HOW VALUABLE IS DATABASE X TO UNDERGRADUATE STUDENTS AT OUR INSTITUTION?
Focus of assessment: Undergraduate students’ use of a specific information resource
Presumption: It’s not possible to fully understand the importance of an information resource through observation alone
Tool(s): Multi-method
Potential partnerships? Instruction librarians; undergraduate course instructors; student workers; graduate students
Data: Statistical (what and where); granular and individual; affective
Enhanced understanding? True value of a particular resource or resource type in the learning process
Potential outcomes: Decision to keep/eliminate the subscription based on an enhanced picture of Database X’s “performance” or importance to students; generation of reportable data regarding resource usage outcomes
26. • Lack of time for more sophisticated data collection
• Lack of research expertise
Partnerships outside the institution: similar institutions; consortial partners
Within the institution: academic departments; students; instructors
Other entities: research groups; data collection units; student success programs
27. THE LIB-VALUE PROJECT:
• Grant funded by IMLS, December 2009-2012
• Principal Investigators: Carol Tenopir, UTK; Martha Kyrillidou, ARL; Paula Kaufman, UIUC
• Purpose: “…to study the value of academic libraries to students, faculty, policymakers, funders…” and Return on Investment (ROI) in academic libraries
• Comprehensive: models incorporate all inputs in the library system (faculty, staff, students, library resources) and determine how each influences the system; articulate all values of the library and areas of investment and return
[Diagram: Functional Areas (Teaching/Learning, Research, Social/Professional) mapped against Scholarly Endeavors (e-Science, Collaborative Scholarship, Institutional Repositories)]
Slide adapted from Carol Tenopir’s presentation, “Value, Outcomes, and Return on Investment of Academic Libraries (Lib-Value) (funded by IMLS),” at the January 2010 ARL Assessment Forum, Boston, MA.
28. KEY QUESTIONS:
• Does the reputation of a
university’s library influence
Enrollment?
Recruitment of faculty and
students?
Material or financial
donations?
• Do library resources and/or
services play a role in
Student success?
Retention?
29. • Unsure of focus/type of
research to conduct
Institutional priorities
Regional accreditation
Supporting student
learning
Facilitating a culture of
assessment
Engagement in the
Scholarship of Teaching
and Learning (SoTL)
30. QUESTIONS:
• What are you doing at your library NOW to evaluate
electronic resource purchases?
• What would you like to be doing?
• Why are you NOT doing it?
• What could you STOP doing in order to perform
more granular and expansive data analysis?
31. ACTIVITY:
• Think of an evaluative project at your institution
What is the objective of this project?
What is the question this project is designed to answer?
Which specific tools are being applied to answering the
question?
What kind of data is being generated or gathered to answer
this question?
Are there tools, approaches, or data that might be more
suitable to the job?
Which?
Why?
32. • Different evaluations for different products – one
checklist cannot fit everything
• Determination of “cost-effective” = community
needs analysis
• Discipline-specific, project-based = manageable
• Stewards of e-resources = using evidence rather than
assumption to justify expenses
34. COMMUNITY NEEDS ANALYSIS, SIMPLIFIED
• Why do you want to conduct a community needs analysis for e-resources? (Goals & objectives)
• Assess the current collection/services situation: what data do you need to conduct the assessment, and how will you do this (methods)?
• Who is your community? What data do you need to answer this question, and how will you do this (methods)?
• What does your community need? What data do you need to answer this question, and how will you answer it (methods)?
35. PATRON-DRIVEN ACQUISITION
• Exploit vendor-provided resources – they can help
with workflow (e.g., ebrary, EBL)
• Collection management becomes risk management
Maintaining largest pool of titles possible?
Removing and adding titles based on demand?
Building rules for different publishers
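
To make “building rules for different publishers” concrete, a small Python sketch of demand-driven trigger rules; the publisher names, loan thresholds, and price caps are all invented, and real triggers would be configured in the vendor platform (e.g., ebrary or EBL):

# Hypothetical PDA rule set: thresholds vary by publisher so that the
# risk of an automatic purchase stays proportional to price and demand.
PDA_RULES = {
    "Publisher A": {"loans_before_purchase": 3, "max_auto_price": 150.00},
    "Publisher B": {"loans_before_purchase": 1, "max_auto_price": 75.00},
}

def next_action(publisher: str, prior_loans: int, list_price: float) -> str:
    """Decide what a new patron use of a title should trigger."""
    rule = PDA_RULES[publisher]
    if list_price > rule["max_auto_price"]:
        return "mediate: route to a selector for review"
    if prior_loans < rule["loans_before_purchase"]:
        return "short-term loan"
    return "auto-purchase"

# A fourth qualifying use of a Publisher A title under the price cap:
print(next_action("Publisher A", prior_loans=3, list_price=90.00))

Encoding the rules this way also makes the risk-management trade-off explicit: loosening a threshold for one publisher changes spending exposure for that publisher alone.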
36. NEGOTIATION AND E-RESOURCES
• In current environment, negotiation is a given
• Licenses control risk and are written by lawyers
Who is at greatest risk for infringement? The library? The vendor/publisher/content provider? The user?
• Approach all with dispassion
• Licenses, like evaluation, should reflect local needs
37. BATNA
• BATNA: Best alternative to a negotiated agreement
– alternatives almost always exist
• Use it in lieu of a bottom line
• Fisher and Ury: BATNA as key to going into
negotiation confidently
• Is free good enough for your user community based
on your community needs analysis, including
evaluation of usage statistics?
38. NEGOTIATION CHECKLIST(S)
• Conducted background research?
• Identified BATNA?
• Determined deal breakers?
• Differentiated less important items from true deal
breakers?
• Communicated with appropriate personnel?
• Created a shared document that outlines internally agreed-upon definitions, etc., for:
Authorized users
Signatory authority
Jurisdiction
39. LICENSING CHECKLIST(S)
Licensing experts* advise including:
• The name of the licensor and licensee
• The name of the person in your organization who
has negotiating and signing authority for
agreements
• A description of the content being licensed
• The duration of time for licensing the content
*Becky Albitz, Rick Anderson, Trisha Davis, Fiona Durrant, Ann Okerson, and more
40. SAMPLE ITEMS TO INCLUDE
• Rights (should all be yes)
• Organization X’s Responsibilities (should all be no)
• Vendor Responsibilities (should all be yes)
• Unacceptable terms (and why, with applicable
policies, state statutes, etc.)
42. 1. Fleming-May, Rachel A., and Jill E. Grogg. 2010. The concept of electronic resource usage and libraries. Vol. 46, Library Technology Reports.
2. Swigger, Keith, and Adeline Wilkes. 1991. The use of citation data to evaluate serials subscriptions in an academic library. Serials Review 17 (2): 41-46, 52.
3. Ibid.
4. Peters, Thomas A. 2002. What’s the use? The value of e-resource usage statistics. New Library World 103 (1172/3): 39-47.
5. Miller, Rush, and Sherrie Schmidt. 2002. E-Metrics: Measures for Electronic Resources. Serials: The Journal for the Serials Community 15 (1): 19-25.
6. Sin, Sei-Ching Joanna, and Kyung-Sun Kim. 2008. Use and non-use of public libraries in the information age: A logistic regression analysis of household characteristics and library services variables. Library & Information Science Research 30 (3): 207-215.
7. Maybee, Clarence. 2006. Undergraduate Perceptions of Information Use: The Basis for Creating User-Centered Student Information Literacy Instruction. The Journal of Academic Librarianship 32 (1): 79-85.
8. Levine-Clark, Michael. 2006. Electronic Book Usage: A Survey at the University of Denver. portal: Libraries and the Academy 6 (3): 285-299.
9. Luther, Judy. 2008. University investment in the library: What’s the return? Library Connect White Papers.
10. Tenopir, Carol. 2010. University Investment in the Library, Phase II: An International Study of the Library’s Value to the Grants Process. Library Connect White Papers.
Editor’s notes
Justifying expenditures; evaluating performance; comparing resources or services, often for the purpose of acquiring/“deacquiring”
“Among other changes, the Complete College Tennessee Act funds higher education based in part on success and outcomes, including higher rates of degree completion.”
E-Metrics: Measures for Electronic Resources, by Rush Miller (University Librarian and Director, University of Pittsburgh) and Sherrie Schmidt (Dean of University Libraries, Arizona State University Library). Keynote delivered at the 4th Northumbria International Conference on Performance Measurement in Libraries and Information Services, p. 3.
(Swigger & Wilkes, 1991, p. 42)
Such as the 80/20 rule…
Sin, Sei-Ching Joanna, and Kyung-Sun Kim. Library & Information Science Research 30 (2008): 207-215, at p. 210.