The Importance of Being Earnest: 
Marrying COUNTER Statistics to Inter-Library Loan Data 
to Produce a Sincere Collection Management Device 
2014 Annual Charleston Conference 
Nancy Abashian & Stephanie Hess 
Binghamton University Libraries
Act 1. Scene 1. A chance social meeting 
• Buy vs. Borrow 
– Aligning tools with data 
– Incorporating collection development criteria 
– Role of subject specialists 
– Budget demands
Lady Bracknell: I have always been of the opinion that a man who desires to get married 
should know either everything or nothing. Which do you know?
Web Reports -- Atlas
Generate Web Report
Export To Microsoft Excel
Search Criteria: Journal Report 1
ILLiad Custom Search Results
Parsing ILLiad Reports for COUNTER 
Finished Requests (Borrowing / Document Delivery / Lending) 
Document Type: Article (Journal Reports) / Loan (Book Reports) 
WebRequestForm: ArticleRequest / BookChapterRequest
Lending Costs: IFM Cost
ILLiad data → ERMS payment record
Algernon: I don't play accurately - any one can play accurately - but I play with wonderful expression.
Algernon: The truth is rarely pure and never simple. 
Use existing 360 Core database? 
No! We need individual titles to hang cost data & usage statistics.
Create a Library Specific Holdings Database 
Uncheck the default boxes as necessary
Adding Titles to the ERMS database: Manually Add / Batch-load
Holdings Import message
Upload Status for Batch Title Upload 
** If an upload fails, a link will appear directly below the Upload Status; click it to view all flagged errors.
Library Specific Holdings Database
Cost Data Template: Download & Upload
Algernon: I am eating muffins because I am unhappy.
Outcome: Multiple Payments
Payment Record: Base Fee (Borrowing) 
= Lending Library OCLC Symbol 
Fund names include Process Type, Department, Lending Library, Patron Status, & ILLiad Fee/Cost type
Payment Record: IFM Cost (Borrowing) 
= Lending Library OCLC Symbol
Payment Record: IFM Cost (Lending)
Jack Worthing and his “brother” Earnest 
360 Counter → Intota Assessment 
– Platforms currently running side by side 
– Full transition deadline was December 6, 2014 
• ProQuest has extended the transition time for users to move to the new reporting interface to January 26, 2015
Waiting for Aunt Augusta...
Algernon: The very essence of romance is uncertainty.
(De)Duplication
Column A: Sort A to Z 
Conditional Formatting to highlight Duplicate Values
Beware of Formatting & Corrupt ISSNs 
See http://www.projectcounter.org/code_practice.html
Choose Vendor > File > Upload
Jack Worthing a.k.a. 360 Counter
A successful engagement?
Great expectations...
Baby in a hand-bag ≠ Manuscript in a pram 
• Avoiding confusion during data output/ input 
– The glories of Excel and csv files 
• Easy to manipulate; Use conditional formatting feature to highlight duplicate values 
– Auto-harvest is not an option (no SUSHI) 
– Label all files very clearly 
• The challenges of normalization 
– Inconsistent/ incomplete ILL patron request forms 
– Relevant paid subscriptions/ DDA/ POD 
– Open/ free / reciprocal access 
• Advantage: Serials Solutions citation linker in ILLiad
Happily ever after? 
Nancy Abashian 
Head of Reader Services 
abashian@binghamton.edu 
Stephanie Hess 
Electronic Resources Librarian 
shess@binghamton.edu

More related content

Similar to Charleston conference - importance of being earnest 20141201 final revised

So much data so many uses with notes (GailBordenTech)
Research Statement - 22 Examples, Format, Pdf E (Brenda Thomas)
One Small Step for Fundraising: Online Donations with iATS Payments (Idealist Consulting)
Show Me the Money grants training 11-10-15 (Laura Helle)
GIST Acquisitions Manager Preview - 2011 ILLiad Conference (Getting It System Toolkit)
Treasury management resume november 2015 (llamunyon)
How to Elevate Recruiting Performance with Competitive Intelligence (IntelCollab.com)
You Put *What* in Your Stream?! Patterns and Practices for Event Design with ... (HostedbyConfluent)
Kobotoolbox: Most powerful quantitative data collection tool (AlexZayarPhyoAung)
Garbage In = Garbage Out – Keeping Your Donor Database Healthy, Wealth, and Wise (Bloomerang)
OCLC Resource Sharing Stats Overview (kramsey)
Keeping Governments Accountable with Open Data Science: Extracting and Analyz... (odsc)
Microservices, Events, and Breaking the Data Monolith with Kafka (VMware Tanzu)
Access PA and interlibrary loans (Frances Vita)
How, what, where, and what's next in online giving (FirstGiving)
CV Ayomipo Ajayi (Ayo Ayo)



Editor's notes

  1. Act 1. Scene 1. Please allow me to set the stage for our presentation. As the Head of Reader Services, midway through this year I was grappling with bringing Resource Sharing into the fold of my department after it had lived in Technical Services for several years. Having researched best practices, I shared some of what I found interesting, specifically about the data within ILLiad, with my colleagues, piquing the interest of Stephanie, our Electronic Resources Librarian. We convened to discuss the possibility of marrying ILL data to COUNTER reports and brainstormed through the anticipated challenges: how would we correlate ILL transactions to COUNTER report types, and how would we reconcile different COUNTER releases (R1, R2, R3 & R4) in the ERMS? The conversation centered on what types of reports ILLiad can provide and which COUNTER reports can be matched with that ILL data. Speaking from a Resource Sharing perspective, this conversation was really born out of the buy vs. borrow dilemma: when exactly is it more advantageous for a library to purchase an item outright rather than expend the time, energy, and money to borrow it through inter-library loan? Libraries operate within many parameters when attempting to answer this question. Thresholds are often set by subject bibliographers and include, to name just a few, the number of ILL requests for a specific item, the bottom line they are willing to pay, and the number of reciprocal libraries willing to lend. In some cases, the thresholds seemed somewhat arbitrary. In any case, we did not have a great deal of data to determine whether we were following the trends in the buy vs. borrow arena or whether we needed to look at the problem using different measurements. Stephanie and I set out to develop a cost-per-use model that might assist in the decision-making process. We began by examining the tools available to us for data collection, ILLiad and COUNTER in our ERMS, and what information we could compare between the two. We then hoped to incorporate specific criteria developed by the subject specialists and the parameters they must work with in terms of their subject budgets.
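The threshold logic described above reduces to a simple comparison. A minimal, illustrative sketch in Python follows; every dollar figure, request count, and the decision rule itself are invented for the example and are not Binghamton's actual thresholds.

```python
# Illustrative buy-vs.-borrow comparison. All numbers are hypothetical.

def cost_per_use(total_cost: float, uses: int) -> float:
    """Cost-per-use: total cost divided by number of uses."""
    return total_cost / uses if uses else float("inf")

ill_requests = 14            # filled ILL requests for one journal this year (hypothetical)
avg_ill_cost = 22.00         # average base fee + IFM cost per request (hypothetical)
subscription_price = 410.00  # quoted annual subscription (hypothetical)

ill_cpu = cost_per_use(ill_requests * avg_ill_cost, ill_requests)   # 22.00 per use via ILL
subscription_cpu = cost_per_use(subscription_price, ill_requests)   # ~29.29 per use if we subscribed

# One bibliographer-style rule: buy once a year's ILL spending approaches the subscription price.
decision = "buy" if ill_requests * avg_ill_cost >= subscription_price else "borrow"
print(f"ILL cost/use: {ill_cpu:.2f}, subscription cost/use: {subscription_cpu:.2f} -> {decision}")
```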
  2. Let me begin with a brief description of ILLiad for those of you who are unfamiliar with it. ILLiad is the engine that drives inter-library loan for many libraries. Others may use OCLC's WorldShare Resource Manager, but at Binghamton we use ILLiad. ILLiad transmits borrowing requests, for those items our library does not own, and lending requests, for those items other libraries would like to borrow from our collection. It is also used to track document delivery: requests created by our Binghamton patrons for items held at Binghamton (effectively making us our own biggest lender). In any case, ILLiad is a powerhouse of data when looking for information that might identify gaps in a collection, the most requested items, and the like. In order to harness this data, we have a couple of options.
  3. We began by examining our options for reporting within ILLiad. Conveniently located within the ILLiad client, we have access to canned web reports for all functions within ILLiad: Borrowing, Lending, and Document Delivery. For the purposes of our project, though, we determined that we were primarily looking at the Borrowing reports, that is, the reports on requests for items we do not own. By focusing on borrowing, we were looking at what it cost our library to acquire a single item for a single user.
  4. Further review of all of the available borrowing reports within ILLiad allowed us to identify the Requests Finished report as the one with the most valuable data for our project. Put differently, this report included the information best suited for comparison with COUNTER statistics in the ERMS.
  5. After selecting the Requests Finished borrowing report, which incidentally includes both borrowing requests that were successfully completed or "filled" and those that were cancelled, you must set the parameters for the data you are looking for, which include dates and request type (in our case, articles or loans). Once you've established these parameters, you may generate your report.
  6. Generating the report initially results in this brief, fairly tidy graphical representation of which requests have been finished, which have been cancelled and why, among other things. At this point you will need to take the additional step of exporting this data into an Excel spreadsheet in order to strip out the unnecessary details I will describe shortly; this process needs to be completed before the data is ingested into the ERMS for further evaluation.
  7. Another option for reporting within ILLiad is the custom search function. We found that this option allowed for a more granular result overall. Harvesting ILLiad data by transaction date, transaction status, and document type refined the process and allowed for greater flexibility in reporting. By custom searching, we were able to better match the criteria for the JR1 and other COUNTER reports, and to match these ILLiad results with successful full-text requests by title and month in COUNTER.
  8. This is what a custom search looks like. We took a few additional steps to customize our searches: strip out all confidential information (names, addresses, e-mail, etc.) as well as extraneous columns such as "Cited In", "Document Type", and "Web Request Form"; filter by lending library (no BNG/Annex/Science, because those are tracked via the Document Delivery report); filter by month to get totals (all totals were treated as PDF requests because that's our standard delivery format); and de-dupe the title list. Fields to use in ERMS records: the transaction number as the Local System Reference ID and the Process Type of Borrowing/Lending. The Document Type determines which COUNTER report to use (articles = Journal Reports, books = Book Reports). The System ID was mostly OCLC, so we use OCLC as the vendor for COUNTER report uploads.
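The clean-up steps listed in this note are the kind of work that can be scripted instead of done by hand in Excel. Below is a minimal pandas sketch under stated assumptions: the file name and every column label (e.g. "Lending Library", "Transaction Date", "Journal Title") are placeholders for whatever headings your own ILLiad custom search export uses, and the Binghamton symbols are written as hypothetical strings.

```python
import pandas as pd

# Minimal sketch of the clean-up steps above. File name and column labels are
# placeholders; match them to the headings in your own ILLiad export.
df = pd.read_excel("illiad_custom_search.xlsx")

# 1. Strip confidential patron information and extraneous columns.
drop_cols = ["Patron Name", "Address", "Email", "Cited In", "Web Request Form"]
df = df.drop(columns=[c for c in drop_cols if c in df.columns])

# 2. Exclude our own symbols (tracked separately via the Document Delivery report).
own_symbols = {"BNG", "BNG-ANNEX", "BNG-SCI"}   # hypothetical symbol strings
df = df[~df["Lending Library"].isin(own_symbols)]

# 3. Total finished requests per title per month.
df["Month"] = pd.to_datetime(df["Transaction Date"]).dt.to_period("M")
monthly = (df.groupby(["Journal Title", "Month"])
             .size()
             .rename("Requests")
             .reset_index())
monthly.to_csv("illiad_monthly_totals.csv", index=False)

# 4. De-duplicated title list, later used for the holdings database upload.
titles = monthly["Journal Title"].drop_duplicates().sort_values()
titles.to_frame().to_csv("illiad_title_list.csv", index=False)
```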
  9. Following the data harvesting, initially looking at the finished borrowing requests, we discovered that our patrons had created 195,319 unique borrowing requests using the web request form. The web request form requires a user to indicate the type of material needed: article, book chapter, or loan. Knowing the document type enabled us to make comparisons with COUNTER, using the JR1/JR1a reports for articles and chapters and the BR1 report for loans. This step was the tie-breaker between the canned web reports and the custom search feature within ILLiad: we found the web reports helpful initially, but cumbersome when parsing out the details.
  10. Another important consideration on the Resource Sharing side is the cost analysis that must be understood before making data comparisons. First is an understanding of IFM, OCLC's ILL Fee Management service and the standard monetary exchange between institutions during inter-library loan transactions. This money is transacted behind the scenes, often without the ILL processing clerk knowing the bottom line at the transactional level. Many libraries, like our own, have reciprocal relationships with other libraries that result in free lending back and forth; nonetheless, other OCLC and non-reciprocal libraries do charge fees, which are paid through IFM.
  11. Last but not least, when extracting and interpreting ILLiad data, consider the Base Fee category when incorporating the information into the ERMS. Where the IFM Cost represents what libraries charge one another to transact an ILL request, the Base Fee includes copyright and other associated charges for the rights to the item.
  12. So, as Nancy was saying, this could help generate cost-per-use by combining COUNTER usage stats + ILL stats and Aleph payments + ILL fees (if any). We're using the information contained in ILLiad's Base Fee and IFM Cost fields when available, but are considering adding other ILL-associated costs, such as postage, copyright fees, estimated staff time, license/subscription fees, office supplies, etc., in the future. This is what the basic cost data template looks like in Serials Solutions 360 Resource Manager. Once you've populated the template with your selected cost data, you can upload the file into the ERMS, which will batch-process it, thereby creating your payment records. Certain fields, such as the Fiscal Year, are mandatory for every type of payment in order to successfully upload your cost data. However, required fields can also vary depending on what level of payment you're entering. For example, entering cost data at the provider level requires that the provider name and code be completed, although you don't need to include that particular data when entering database or individual title level payments. Inputting payment information at the database level allows you to omit individual titles, Title ID, and Serials Solutions ID numbers. Now, 360 Counter (the accompanying assessment tool for 360 Resource Manager) can ingest many different versions and types of Project COUNTER reports, but it can only deploy its cost-per-use metric for individual titles (which is what ILLiad gave us) when sourcing from select COUNTER reports such as JR1. We also decided to use the most recent COUNTER release (R4) because of Serials Solutions' pending transition of 360 Counter to Intota Assessment; the new assessment interface will only accept R3 and R4 COUNTER reports. Due to this infrastructure feature of the ERMS and the nature of our ILLiad-supplied data, we opted to go with the individual title template, so we had to include the Database Name, Database Code, Title Name, Title ID, and SSID.
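To make the individual-title template concrete, here is a minimal sketch that writes one ILLiad-derived payment row using the fields named above (Database Name, Database Code, Title, Title ID, SSID, Fiscal Year). The exact header labels, their order, and all sample values are assumptions for illustration; the template downloaded from 360 Resource Manager is the authority on what the ERMS actually expects.

```python
import csv

# Minimal sketch: one hypothetical ILLiad-derived payment row for the
# individual-title cost data template. Headers, order, and values are illustrative.
fields = ["Database Name", "Database Code", "Title", "Title ID", "SSID",
          "Fiscal Year", "Payment Amount", "Local System Reference ID", "Fund Name"]

payments = [{
    "Database Name": "Library Specific Holdings",   # our new ILLiad database
    "Database Code": "LSH001",                       # hypothetical code
    "Title": "Research on Social Work Practice",
    "Title ID": "123456",                            # hypothetical
    "SSID": "ssib0000001",                           # hypothetical
    "Fiscal Year": "2014",
    "Payment Amount": "18.00",                       # base fee + IFM cost (hypothetical)
    "Local System Reference ID": "TN:000001",        # ILLiad transaction number (hypothetical)
    "Fund Name": "Borrowing-IFM-SocialWork",         # fund-name handle, as described later
}]

with open("illiad_cost_template.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=fields)
    writer.writeheader()
    writer.writerows(payments)
```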
  13. We initially thought that we might be able to utilize an existing Serials Solutions database, but found this option to be infeasible because there are no individual titles in the existing OCLC ILLiad database entry. Given that the details of the new database are very much localized and only pertinent to our library, we subsequently decided to create a new Library Specific Holdings database that would allow the ERMS to generate the requisite fields (i.e. Database Name, Database Code, Title Name, Title ID, and SSID) and provide us with the proper means of hanging our ILLiad cost data and usage statistics.
  14. Our initial load consisted of ILLiad journal and book titles that had a Transaction Status of "Finished Request," since we planned to begin with the JR1 and BR1 COUNTER reports. To create the new Library Specific Holdings database, we harvested the relevant data according to the ILLiad custom search criteria described earlier and exported the entire title list as an Excel file. At this point we sorted, de-duped, and normalized the ILLiad search results list, using the Conditional Formatting feature to identify duplicated titles and then hiding the duplicates. We next created a master ILLiad Database Title file using the Serials Solutions Database Title template; with the duplicates hidden, it was easy enough to copy and paste the titles for each year into the template and then save the whole kaboodle as a CSV file. We also performed another de-duplication of that compilation prior to uploading the file, to weed out any overlooked duplicates. (Please note that we retained the original ILLiad files as-is for use in compiling the usage statistics into the relevant COUNTER templates, which we'll discuss in a few moments.) A few tips: be sure that the "Display in" boxes for 360 Core, 360 Link, and 360 MARC Updates are unchecked, or you might experience most unpleasant effects in your catalog, on both the public and staff sides. Please do not rush this part of the process. Develop your own local standards and adhere to them as much as possible. Aim for accuracy in normalizing your titles and consistency while populating templates, and refer to the vendor's Knowledge Base and Ulrichsweb when verifying title metadata. You may also wish to confer with a specialist if you have a lot of foreign-language titles with which you are unfamiliar.
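For anyone who prefers to script this normalize-and-de-duplicate pass rather than use Conditional Formatting, here is a minimal pandas sketch. The normalization rules and the output column header ("Title") are illustrative assumptions, not the vendor's specification; as noted above, develop and document your own local standards.

```python
import pandas as pd

# Minimal sketch of the normalize-then-de-duplicate step, done in pandas rather
# than with Conditional Formatting. Rules and output header are illustrative.
titles = pd.read_csv("illiad_title_list.csv")          # from the earlier sketch

def normalize(title) -> str:
    t = " ".join(str(title).split())    # collapse stray whitespace
    return t.rstrip(" .,;:")            # drop trailing punctuation

titles["Normalized"] = titles["Journal Title"].map(normalize)
titles["_key"] = titles["Normalized"].str.lower()

# Rows that would have been highlighted as duplicates in Excel:
dupes = titles[titles["_key"].duplicated(keep=False)]

# Keep the first occurrence of each title and write the Database Title file.
deduped = titles.drop_duplicates(subset="_key", keep="first")
(deduped[["Normalized"]]
    .rename(columns={"Normalized": "Title"})
    .to_csv("database_title_template.csv", index=False))
print(f"{len(dupes)} duplicate rows flagged; {len(deduped)} unique titles written")
```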
  15. Here we see the Upload Library Specific Holdings page, where we can download the empty template and then upload the populated version. The ERMS processes Database Title template files rather quickly, depending on their size. When we uploaded our revised ILLiad Database Title list, the ERMS automatically generated the Database Name, Database Code, Title ID, and SSIDs that we needed for the cost data template in less than an hour. Tip: once you've input the bulk of your titles, be careful not to overwrite the existing database afterwards, or you will lose any cost data or notes attached to the earlier holdings. To prevent this problem, be sure to uncheck the "Overwrite Existing" box. I also highly recommend downloading a backup file of the entire database before uploading a new set of titles, by clicking the "Download Backup" button and saving the .txt file. Personally, I prefer to save our backups to the library shared drive, to my own hard drive, and to a USB drive.
  16. You’ll receive a holdings import e-mail message once the ERMS processes the database title template. This message will indicate whether or not the upload was successful.
  17. However, even before receiving this e-mail, you can view the status of a pending upload by going into the database details' general titles list. And if an upload does fail, a link captioned "Failures" will appear directly below the Upload Status, which you can click to view all of the flagged errors. This mini-report also provides the corresponding line numbers in the CSV file, allowing you to quickly locate and rectify the incorrect data before attempting another upload.
  18. And here we behold the end result of our successful Database Titles upload. As you can see, the Library Specific Holdings database has been populated with our ILLiad title list and is ready to have cost data added. We'd like to take this opportunity to point out that this landing page houses the Upload Title and Download Titles buttons. As an alternative to the Download Backup button, you may also use the Download Titles button to generate backup files of the entire database. And you can utilize the "Add Title" button to manually add a single title.
  19. As previously mentioned, to generate a cost-per-use metric for individual titles in 360 Counter, it’s necessary to upload COUNTER reports that contain individual titles such as Journal Reports 1, 1a, 1GOA, etc. / Book Report 1, 2, 3, etc. / Title Report 1, 2. Serials Solutions 360 Resource Manager has a cost data upload tool that can create cost data templates for your individualized holdings by database. Now that we’ve created our OCLC ILLiad test database, it will appear in the drop down list under Journal Titles Cost Template. This allowed us to download a complete list of ILLiad holdings and populate the required fields.
  20. Once you’ve uploaded a populated cost data template, the ERMS will generate a payment ID and you’ll be able to generate a Serials Solutions Management report with the full inventory of cost data that’s now available. This is rather helpful in facilitating quality control since you can download the cost data details report as a CSV file and verify if you’ve overlooked any fields or make changes if necessary. Right now we have a total of ~16K payments in the ERMs (only 150 of which are for ILLiad) so I definitely filter and sort according to ERMS status type, database, etc. and then scan for empty fields.
  21. And here we see an individual title (i.e. Research on Social Work Practice) with multiple ILLiad base fee payments attached to it.
  22. And when you open a payment line for Research on Social Work Practice, you'll see the details relevant to this particular ILL transaction. Note here that the cost associated with this item is derived from the Base Fee, the fee I previously described as covering copyright payments. You'll also notice a little rainbow symbol obscuring the OCLC symbol of the lending library; discretion is the better part of valor, after all.
  23. Here we see an example of a payment record for an IFM Cost (Borrowing) transaction: The Wildean, in the Edit view. We've added multiple fund names to indicate the payment type, ILLiad process type, lending/borrowing library as represented by the OCLC symbol, discipline, and patron classification. Remember: the IFM Cost for borrowing is associated with the amount libraries charge one another for a transaction. 360 Counter does have a Fund Name Data report, which permits us to filter our payment data by fund name, so we've decided to use fund names as convenient handles to retrieve information about multiple facets of ILLiad data, including Process Type, subject area/department, lending library, patron status, etc.
  24. If, and when, you see a negative IFM Cost, it can be interpreted as a lending transaction: an instance where another library paid your library to borrow your item.
  25. In The Importance of Being Earnest, the motif of double identities creates several problems for the players. Ernest has always been Jack's unsavory alter ego, and Jack has used Ernest as an escape from real life. Like Jack, we are currently scrambling to reconcile two environments, or rather, assessment platforms, in order to get what we ultimately want, i.e. valid cost-per-use figures based on ILLiad data. Right now ProQuest/Serials Solutions is in the midst of transitioning from 360 Counter (which uses PivotLink as the architecture for generating reports) to Intota Assessment (which is Oracle-based). The platforms are currently running side by side until the new platform has completed its beta stage and all existing data has been migrated. The original deadline for the transition was December 6, 2014, but it has now been extended to January 26, 2015.
  26. Unfortunately, several known issues with 360 Counter have cropped up recently that are delaying our ability to generate cost-per-use reports at the moment. Most notably these include the system’s temporary inability to reliably display the cost-per-use column in reports. Based on updates we receive from ProQuest (such as the one shown in this slide), the vendor is aggressively pursuing remedies for this and other problems so we anticipate being able to view our cost-per-use for ILLiad data fairly soon. Hopefully, we’ll be able to include an update in our forthcoming proceedings paper, complete with screen shots that show the cost-per-use for ILLiad reports.
  27. In the meantime, we're forging ahead with inputting our cost and usage data (eating muffins, so to speak). To date, we've input Journal Report 1 (R4) and Book Report 1 (R4) for the years 2011 through 2014. This is what the completed COUNTER JR1 (R4) looks like. As you can see, we omitted certain data such as publisher, journal DOI, and proprietary identifier, but used the titles and usage stats from the original ILLiad report. So how exactly did we "COUNTER-ize" the ILLiad data?
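The "COUNTER-izing" step amounts to pivoting the monthly ILLiad totals into one row per journal with one column per month. A minimal sketch follows; the column names only paraphrase the JR1 (R4) layout, so verify the exact headers, report header rows, and date formats against the COUNTER Code of Practice before uploading.

```python
import pandas as pd

# Minimal sketch of reshaping monthly ILLiad totals into a JR1 (R4)-like sheet:
# one row per journal, one column per month, plus a Reporting Period Total.
monthly = pd.read_csv("illiad_monthly_totals.csv")     # Journal Title, Month, Requests

usage = monthly.pivot_table(index="Journal Title", columns="Month",
                            values="Requests", aggfunc="sum", fill_value=0)

jr1 = pd.DataFrame(index=usage.index)
for col in ["Publisher", "Platform", "Journal DOI", "Proprietary Identifier",
            "Print ISSN", "Online ISSN"]:
    jr1[col] = ""                                      # fields we chose to leave blank
jr1["Reporting Period Total"] = usage.sum(axis=1)
jr1 = jr1.join(usage)                                  # append the monthly columns

jr1.rename_axis("Journal").reset_index().to_csv("illiad_jr1_r4.csv", index=False)
```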
  28. Given that 360 Counter will reject reports if titles are duplicated, we had to spend a great deal of time normalizing the list and removing duplicates. Again, the Conditional Formatting function to highlight duplicate values was key.
  29. We sorted the titles in Column A after highlighting the duplicate values and then painstakingly went through the list line by line, normalizing each title and compiling the usage for each month. Obviously this was a lengthy, tedious process, but the good news is that the bulk of the ILLiad data is archival, so once you complete a year, it's done. Future maintenance will be minimal by comparison.
  30. Another thing to be aware of when working with the raw ILLiad data is the all-too-common presence of corrupted ISSNs, largely due to numerous, miscellaneous patrons entering whatever identifying information they can find into the web request forms. Always be sure to properly format the spreadsheet according to the COUNTER Code of Practice, especially with regard to dates. All cells with null values must contain a zero; they cannot be blank or contain text. See http://www.projectcounter.org/code_practice.html
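Both rules in this note (salvage or discard corrupted ISSNs, and put zeros in empty usage cells) lend themselves to a scripted check. A minimal sketch: the ISSN check-digit rule (weights 8 down to 2, modulus 11, 'X' for 10) is the standard one, while the file layout and column names carry over from the earlier illustrative sketches and are assumptions.

```python
import re
import pandas as pd

# Minimal sketch of the two clean-up rules above. File/column names are assumptions.

def clean_issn(raw) -> str:
    """Return a normalized NNNN-NNNX ISSN, or '' if the value is unusable."""
    chars = re.sub(r"[^0-9Xx]", "", str(raw)).upper()
    if len(chars) != 8 or "X" in chars[:7]:
        return ""
    total = sum(int(d) * w for d, w in zip(chars[:7], range(8, 1, -1)))
    check = (11 - total % 11) % 11
    expected = "X" if check == 10 else str(check)
    return f"{chars[:4]}-{chars[4:]}" if chars[7] == expected else ""

jr1 = pd.read_csv("illiad_jr1_r4.csv")
jr1["Print ISSN"] = jr1["Print ISSN"].map(clean_issn)   # discard corrupt ISSNs

# Null usage cells must contain a zero, not a blank or text.
month_cols = [c for c in jr1.columns if re.fullmatch(r"\d{4}-\d{2}", str(c))]
jr1[month_cols] = jr1[month_cols].fillna(0).astype(int)
jr1.to_csv("illiad_jr1_r4_clean.csv", index=False)
```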
  31. Once the files are ready, go into the 360 Counter module and choose Upload, which will take you to the next page (pictured here), where you can select your vendor and file, then click Upload. Here we can see that our file has been accepted by the system, where it will go through a multi-step validation process before being fully ingested, which usually requires ~48 hours. However, the duration of this process has lengthened a bit because of the development work being done on the ERMS.
  32. To monitor the status of recently uploaded reports, simply go to the 360 Counter page and look for a status symbol next to each report. An hourglass indicates that a file is in process, a green (or occasionally yellow) checkmark means that a file has successfully uploaded, and a red ban symbol means that an upload failed.
  33. After the COUNTER reports and the cost data for the corresponding years have been input, you should be able to run various reports that will show the cost-per-use by title. As mentioned earlier, recently added cost data for ILLiad titles (e.g. Camerawork) isn't displaying properly at the moment, but we can see that some older cost data for other providers/platforms does appear in the cost-per-use column.
  34. Our great expectation is that, as soon as the platform anomalies have been rectified, we will be able to generate reports that include ILLiad costs and usage, and to compare and chart this information against other COUNTER-compliant vendors. We also hope that the practices we have formulated during this project will be transferable and will help us COUNTER-ize the usage and cost data for other vendors that collect statistics in non-standardized ways or follow a different set of standards, like ICOLC.
  35. Harvest the data one year at a time and clean all aspects of that data before proceeding to another time frame; allow yourself ample, uninterrupted time for this part of the process, and do not rush. Perform regular quality-control checks. Once the archived data has been input, the bulk of the work will be done; ongoing maintenance can be performed on a quarterly, semi-annual, or annual basis to coincide with your regularly scheduled usage statistics upload. Normalization can be challenging: the web request forms from which we harvest our data are often incomplete or include incorrect or corrupt information. In the case of articles, Resource Sharing often purchases from document suppliers, working outside of ILLiad; how can we include this information in the future? When we borrow freely within our consortia or within reciprocal relationships, how do we assess the hidden costs of couriers, memberships, manpower, etc.? One significant advantage, and one that deserves further consideration for our project, is the Serials Solutions citation linker. Because of our participation in the Information Delivery Services (IDS) Project, we are able, using an ILLiad add-on, to do a full Serials Solutions citation search for an article and process the ILL licensing data through what is known as the ALIAS API. ALIAS allows ILLiad to perform unmediated article request processing by managing licenses and performing load leveling for partner libraries. By utilizing this service we spend far less time processing requests and request more frequently from free partners.