Presented at the 2010 Electronic Resources & Libraries Conference.
Sally R. Krash, Southwest Research Institute
Abstract: Usage statistics, together with cost per use, have become an important factor in evaluating electronic resources. This session considers ways to gather usage statistics and looks at other points of consideration in e-resource evaluation. Discussion also covers the value of usage statistics and whether we are simply being overwhelmed by numbers.
3. Gathering
• 2003-2007
– Do it yourself/spreadsheet
• 2008
– Tested 2 different solutions
• 2009-present
– Harrassowitz E-Stats (Solution 2)
[Table: workflow steps — gather data, manipulate data, upload data, generate spreadsheet, generate charts/graphs, decision — compared across Do-it-yourself, Solution 1, and Solution 2]
4. Gathering
• Solution 1
– Gathering & manipulating data was time-consuming & difficult
– Database structure did not reflect our subscriptions, so it didn't work for us
• Solution 2
– Vendor did the hard parts
– Spreadsheet, delivered twice a year, can be sorted by title or publisher, which works for us
– We have requested that subject information be included to help with collection development.
5. Evaluation
• 2004
– Cost-per-use
• Counter JR3 + DB1/DB3
• 2005-2007
– Cost-per-use
– Cost-per-article
• Counter JR1
• 2008-present
– Cost-per-download
• Counter JR1 + BR1/BR2
– Cost-per-search (citation databases)
• Counter DB1
JR1 Successful Full-Text Article Requests by Month & Journal
JR3 Number of Successful Item Requests by Month & Journal
DB1 Total Searches & Sessions by Month & Database
DB3 Total Searches & Sessions by Month & Service
BR1 Number of Successful Title Requests by Month & Title
BR2 Number of Successful Section Requests by Month & Title
http://www.projectcounter.org/
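As a rough sketch of the cost-per-download calculation described above, dividing an annual subscription cost by the year's JR1 full-text request total (the figures and layout here are invented for illustration, not taken from an actual JR1 report):

```python
# Sketch: cost-per-download from COUNTER JR1-style monthly counts.
# The cost and the monthly figures below are made-up examples.

def cost_per_download(annual_cost, monthly_requests):
    """Annual subscription cost divided by total successful
    full-text article requests (JR1) for the year."""
    total = sum(monthly_requests)
    if total == 0:
        return None  # no usage: flag for review rather than divide by zero
    return annual_cost / total

# Twelve months of hypothetical JR1 counts for one journal:
print(cost_per_download(1200.00, [10, 8, 12, 9, 11, 10, 7, 6, 9, 10, 4, 4]))
# → 12.0
```

Cost-per-search for citation databases would follow the same pattern, with DB1 search counts in place of JR1 request counts.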
6. Evaluation
                   2004    2005    2006    2007    2008
Cost-per-use       $6.58   $6.15   $10     $7.61   n/a
Cost-per-download  n/a     $22.52  $24     $15.52  $22
Cost-per-search    n/a     n/a     n/a     n/a     $9.23
• Review cost data. (May)
• Set cut-off point (cost and usage).
• Generate list of all resources that fall beyond that point. (Cancellation list)
• Initial review of list. (in Library)
• Submit preliminary renewal list to subscription agent for pre-payment.
• Send adjusted list to Library Committee, along with list of needed additions.
• Revise lists after Library Committee review.
• Submit final renewal/addition/cancellation lists to subscription agent. (Aug.)
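The cut-off step above could be sketched as follows; the cut-off value, field names, and resource figures are illustrative assumptions, not SwRI's actual data or process:

```python
# Sketch of the May review step: flag resources whose cost-per-use
# exceeds a cut-off for the preliminary cancellation list.
# All names and numbers below are hypothetical.

def cancellation_candidates(resources, max_cost_per_use):
    """Return resources beyond the cut-off point, worst first;
    zero-use resources are always flagged."""
    flagged = [r for r in resources
               if r["uses"] == 0 or r["cost"] / r["uses"] > max_cost_per_use]
    return sorted(flagged,
                  key=lambda r: r["cost"] / max(r["uses"], 1),
                  reverse=True)

resources = [
    {"title": "Journal A", "cost": 900.0, "uses": 30},   # $30/use
    {"title": "Journal B", "cost": 400.0, "uses": 80},   # $5/use
    {"title": "Journal C", "cost": 250.0, "uses": 0},    # no usage
]
for r in cancellation_candidates(resources, max_cost_per_use=22.0):
    print(r["title"])
```

With these numbers, Journal C and Journal A land on the cancellation list for Library review; Journal B is renewed.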
11. Future
• Counter Reports:
– Counter JR2 & BR3 (Turnaways). We would like to see more publishers provide this data; we can use it for collection development.
– Counter JR5 (Number of Successful Full-Text Article Requests by Year-of-Publication & Journal). Now we can finally see current-year usage vs. older but still recent content.
– Counter J/BR1 (optional; Number of Successful Full-Text Item Requests by Month & Title). We don't care whether it's a book or a journal; we care about the download. We are gathering JR1 & BR1 anyway, and we are looking at individual titles to assess a cost-per-download.
• New for 2010 (for SwRI)
– Referex: We recently added Referex, an e-book subscription package that offers only search data, no download data. How do we evaluate this full-text resource with only search data?
– E-book purchases: In this case, we are making the purchase up-front and then getting usage data. How will we evaluate usage/cost in this case?
Speaker notes
Renewal considerations: cost-per data + current research needs + research trends + needed new resources. "Core Collection" is no longer a factor for us.
We look at downloads by journal & publisher.
We look at searches by platform (citation databases); we look at trends and we prefer to see an upwards trend.
We also look at activity by subject area for both journals & books, and use this information for collection development.
Usage = downloads + searches (citation databases only). We would like to see the blue line get closer to the red line.
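The usage formula in that note could be sketched as follows; the resource records are hypothetical, added only to make the "searches count for citation databases only" rule concrete:

```python
# Sketch of the combined usage metric from the speaker notes:
# usage = downloads + searches, where searches are counted
# only for citation databases. Records below are invented.

def total_usage(resource):
    """Downloads always count; searches count only when the
    resource is a citation database."""
    usage = resource["downloads"]
    if resource.get("citation_database"):
        usage += resource["searches"]
    return usage

# Full-text journal package: searches are ignored.
print(total_usage({"downloads": 120, "searches": 999,
                   "citation_database": False}))
# Citation database: searches are the usage.
print(total_usage({"downloads": 0, "searches": 450,
                   "citation_database": True}))
```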