2. Collections in the late 1990s
Apple Computer Inc. records
Douglas Engelbart papers
Stephen Cabrinety collection
By 2000, over 7,000 items of legacy computer media received as part of hybrid collections
Now over 26,000 items recorded during the accessioning process
4. First Digital Lives Research Conference: Personal Digital Archives for the 21st Century
5. FRED (Forensic Recovery of Evidence Device; Digital Intelligence)
Software: FTK suite (AccessData); EnCase
6. AIMS Born-Digital Collections: An Inter-Institutional Model for Stewardship
University of Virginia, Yale University, Hull University, Stanford University
Funded by the Andrew W. Mellon Foundation
7. Robert Creeley papers; Stephen Jay Gould papers; Keith Henson papers (re: Project Xanadu); Peter Rutledge Koch papers
9. Stephen Jay Gould
Influential American paleontologist, evolutionary biologist and historian of science, Gould began his career at Harvard University in 1967 and worked there until his death in 2002.
98 3½″ floppy diskettes
61 5¼″ floppy diskettes
4 sets of punch cards
3 computer tapes
10. "Dear Peter, Unfortunately we do not manufacture any motherboards now a days which can support the 5.25 floppy. The interface are different than 3.5 and they are becoming obsolete and are no longer available on the newer motherboards."
38. Email Mining on Peter Koch’s Emails http://suif.stanford.edu/~hangal/muse/
39. What are our Roles?
Donors & users expect us to acquire, organize, preserve and provide access to b-d collections.
Special Collections staff capture, appraise, arrange and describe b-d materials AND contribute to requirements for both access and delivery as well as arrangement and description tools.
Our digital group will preserve in our preservation repository (SDR) and provide public access and invite participation – Hypatia (under development).
40. Challenges
Read contents from storage media (punch cards, tapes, 8″/5.25″/3.5″ floppy diskettes, Zip disks, etc.)
View contents in different formats (WordPerfect, Lotus 1-2-3, Quark files, etc.)
Organize "large" collections (420,000 files or multiple computers in one collection)
Long-term preservation (hardware failure, obsolete file formats, unknown future, etc.)
Wide scope of knowledge needed (computer hardware, operating systems, applications, repository and virtualization software, archival processing, security (authentication, encryption), Web 3.0, digital preservation, natural language processing, etc.)
Descriptive standards are in flux
Accessioning procedures under development
Delivery options for different formats
[INTRO] A little over two years ago, a few elements converged and a core group of us at Stanford began to get more serious about developing a viable method for processing the born-digital "papers" in our collections. Most of my talk centers on our first trials, but I'd first like to describe the context and pressures at SUL that put us on this path …
The major pressure was the growing quantity of legacy media in our "backlog." With Stanford situated in Silicon Valley, it's no big surprise that we have a lot of computer collections containing old legacy media. Hence our acquisition in the late 1990s of the records of Apple Computer Inc., the papers of Douglas Engelbart, and a really large collection of computer games and software. [images: mouse/Engelbart, box of Atari games/Cabrinety]
Because of those very acquisitions, in 1997 the Manuscripts Division began tracking the incoming quantities – just an overall count – of legacy computer media contained in new accessions. By the end of the decade we had recorded over 7,000 "items." Increasingly, our b-d material comes from faculty, artists, writers, and organizations. Today, we have over 26,000 items of legacy media recorded in our backlog. [University Archives has ~700 listed]
The other element was an event in February 2009. A staff member on our digital team (Michael Olson), who had previously worked in Manuscripts, attended the Digital Lives Project's first conference at the British Library. Two things occurred: he heard about a study* done at the B.L. on data loss in legacy computer media (3% per year), and he saw that the B.L. was exploring the use of forensic tools for capturing data from media. Based on this, and coupled with the weight of our growing backlog of media, we decided on two courses of action.
*McLeod, Rory. "Risk Assessment: using a risk based approach to prioritise handheld digital information" (2008)
First, we purchased forensic hardware and software to enable us to capture and view legacy media and files: hardware from Digital Intelligence (FRED), and software – we purchased and tested both the FTK and EnCase forensic packages. This formed the nucleus of our digital lab. And yet most forensic equipment is geared toward current, modern media, so we searched eBay for old floppy disk drives to use with FRED.
Next, we partnered with three other institutions (U.Va., Yale and Hull) as part of the AIMS Project, funded by the Andrew W. Mellon Foundation (AIMS Born-Digital Collections: An Inter-Institutional Model for Stewardship). The goals of the project were to process b-d material from 13 (mostly legacy) collections and to deliver the b-d material in some fashion by the end of the grant.
Each repository hired a Digital Archivist. Peter Chan was hired at SUL in January 2010 and began actual work on imaging the disks for our 4 collections and trying out various methods for "processing" the data. We chose collections that contained different types of media and content. They are:
Robert Creeley papers (poet – mostly email, some writing)
Stephen Jay Gould papers (paleontologist, author – writing and some data sets)
Peter Rutledge Koch papers (fine press printer – a mix of files: email, text, image, and design files like Adobe's InDesign; this was the only collection with files transferred directly from a donor's current computer)
Xanadu Project records (early hypertext project – software program on 6 hard drives)
It is now 1.5 years later and we have created a viable (although under constant development) workflow for accessioning and processing b-d materials using forensic tools. This is the more detailed workflow for collections that will be "fully" processed. We are also working on a minimal processing workflow, and on one that would fully "accession" the data – i.e., remove it from the physical storage media – and store it for later processing.
One of the collections that has informed our development of b-d practice is that of Stephen Jay Gould, which contains both paper (analog) material – over 500 linear feet – and 3 cartons of digital material:
98 3.5-inch floppy disks
61 5.25-inch floppy disks
4 sets of punch cards
3 computer tapes
In total, over 550 linear feet have been received in 8 accessions. His papers and the audio/video are being processed concurrently – by archivist Jenny Johnson – and will be done this August. This month, the processing team discovered another 5 cartons of punch cards in the 2008 accession (21 sets). [This recent find won't be resolved by the end of the grant]
Using the two different capture stations – FRED & the floppy/zip station – we created disk images of all the disks. 8 sets of punch cards were successfully read by our neighbors at the Computer History Museum; 1 set was unreadable, as it had no sorting key. We also began tracking our own loss statistics – "success or failure" of captures – in a spreadsheet, which we link to our accession records in Archivists' Toolkit.
The loss rate for floppies in Gould is 5%; loss in other collections was higher.
Creeley: 6% loss – 1 out of 12 CDs unreadable; 3 out of 53 floppies unreadable. [1987–2004?]
Xanadu: 4 of 6 hard drives inoperable – or 67% damaged. [Peter Chan's report: there were mechanical or electrical problems with two of the drives (one didn't spin after it was powered up and one gave a "dong" sound after it was powered up); we are not sure what the problems with the remaining two drives are – they do spin after power up but we cannot access the data.] Cost to recover: ~$10,000 ($2.5K per drive)
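The loss statistics above reduce to a simple success/failure tally per collection. A minimal sketch of how such a tracking sheet might be summarized – the CSV columns, identifiers, and sample rows here are illustrative, not our actual spreadsheet schema:

```python
import csv
import io
from collections import defaultdict

# Illustrative capture-tracking rows; our real sheet is a spreadsheet
# linked from accession records in Archivists' Toolkit, and its actual
# column names may differ from this sketch.
SAMPLE = """collection,media_id,outcome
Gould,fd-001,success
Gould,fd-002,failure
Gould,fd-003,success
Gould,fd-004,success
Xanadu,hd-01,failure
Xanadu,hd-02,success
"""

def loss_rates(rows):
    """Return, per collection, the fraction of media whose capture failed."""
    totals = defaultdict(int)
    failures = defaultdict(int)
    for row in rows:
        totals[row["collection"]] += 1
        if row["outcome"] == "failure":
            failures[row["collection"]] += 1
    return {c: failures[c] / totals[c] for c in totals}

rates = loss_rates(csv.DictReader(io.StringIO(SAMPLE)))
```

Keeping the raw per-item outcomes (rather than only the percentages) is what makes it possible to recompute rates later as unreadable media are recovered or new accessions arrive.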
To process the materials during our initial trial, we used Windows Explorer. Folders were created that mirrored "series" and "titles" in EAD, and files were moved from the original media folder into the appropriate place. This, however, changed data associated with the files – such as the original path. At this point, Peter Chan attended a week-long session on the use of forensic software at Digital Intelligence, focusing on FTK. While much more robust than we needed for archival work, he decided that many of the tools in FTK could easily be adapted for archival processing. We discovered that this practice mirrored work beginning at both the B.L. and Oxford.
Technical metadata for the disk images are displayed here. They are arranged by floppy disk and display file format (where identifiable), file size, checksum, creation dates, etc. One can change the view to add additional columns, such as duplicate or primary file.
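For readers unfamiliar with what "technical metadata" covers here, the following is a hedged sketch of gathering comparable per-file fields (format, size, checksum, dates) with Python's standard library. It is not how FTK works internally – FTK identifies formats by signature, while this sketch guesses by extension only – and the field names are invented for illustration:

```python
import hashlib
import mimetypes
import os
import tempfile
from datetime import datetime, timezone

def technical_metadata(path):
    """Collect basic per-file technical metadata, similar in spirit to
    the columns FTK displays (format, size, checksum, dates). Format is
    guessed by file extension only; FTK's signature-based identification
    is not attempted here."""
    stat = os.stat(path)
    with open(path, "rb") as f:
        digest = hashlib.md5(f.read()).hexdigest()
    guessed_type, _ = mimetypes.guess_type(path)
    return {
        "path": path,
        "format": guessed_type or "unidentified",
        "size_bytes": stat.st_size,
        "md5": digest,
        "modified": datetime.fromtimestamp(
            stat.st_mtime, tz=timezone.utc).isoformat(),
    }

# Usage sketch: write a small file and inspect its metadata.
with tempfile.NamedTemporaryFile(suffix=".txt", delete=False) as tmp:
    tmp.write(b"hello")
meta = technical_metadata(tmp.name)
```

The checksum column is what makes the "duplicate or primary file" view possible: two files with the same MD5 can be flagged as duplicates regardless of their names or locations on the disks.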
The embedded viewer in FTK – from the same company that does Quick View Plus – allows you to quickly see the contents of many of the files.
Here are two quick screen shots showing archival HIERARCHY using FTK's "bookmark" feature. Series or subseries can be added as metadata to individual files or groups of files by highlighting or checking the boxes of the files in the lower panel.
Description for the three different formats in Gould – paper, audio/video and born-digital files – will be merged at the end of the summer or early fall, but the level of description will be different:
Gould's papers are processed to the folder level for most of the collection.
The audio and video are listed at the item level to facilitate any future digitization.
The born-digital material will have series-level description with notes about original media, capture and processing methods, loss/damaged media, and delivery methods.
Here is a partial view of our working draft for processing notes for Gould b-d “series”
We encountered different issues in our other "AIMS" collections; the main one I will mention is the Robert Creeley collection. His papers originally contained: 53 floppies, 5 zip disks, and 3 CDs. Initially the computer media was segregated into a separate collection, but it will need to be merged into the main collection record and finding aid in the fall. After processing with FTK, the disk images yielded:
50,000 emails identified
8 files related to health records identified
69 files with Social Security numbers identified
A recent addendum complicates the processing of Creeley's born-digital material: received in May 2011, it contains b-d media that will need to be processed – and may allow us to have a more complete set of emails, drafts, etc.:
7 computers
3 zip drives
121 optical discs
422 3.5-inch floppy diskettes
1 Zip 250 USB drive
1 Olympus C-4000 Camedia digital camera & flash cards
1 20-gigabyte iPod
We have yet to analyze the data in the new accession and compare it to the original data, but two issues cropped up:
How to process and deliver multiple computers spanning a creator's life cycle?
Data was captured from various CDs and computers to create an overview of the b-d material before transfer to SUL – what got changed in the process?
[image from Wikipedia taken by Elsa Dorfman]
In processing the initial computer media, Peter Chan used folder titles on the disks as keywords for the files.
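The idea of reusing folder titles as keywords can be sketched very simply: each folder name along a file's original path becomes a candidate tag. The path below is invented for illustration; in practice the paths come from the file-system listing inside each disk image:

```python
import posixpath

def keywords_from_path(file_path):
    """Split a file's original path into its folder titles, which can
    then be attached to the file as keyword tags. This assumes the
    creator's folder names are meaningful, which was the case here."""
    folders = posixpath.dirname(file_path).strip("/").split("/")
    return [name for name in folders if name]

tags = keywords_from_path("/Correspondence/Letterpress/colophon_draft.doc")
```

This is also a reason the earlier Windows Explorer approach was a problem: moving files out of their original media folders destroys exactly the path information this technique depends on.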
Using Creeley's initial text data, we have worked with two individuals, one working in the Digital Humanities (Elijah Meeks), who took the header info from the 50,000+ emails and created a network graph: header information from Robert Creeley's 50,000+ emails emphasizing the connection between the poet and Gerard Malanga.
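A graph like this is built entirely from header fields: every From/To (or From/Cc) pair becomes a weighted edge between correspondents. A minimal sketch with Python's standard-library email parser – the messages and addresses below are synthetic, and this is not Meeks's actual pipeline:

```python
from collections import Counter
from email.parser import Parser
from email.utils import getaddresses

def header_edges(raw_messages):
    """Tally sender -> recipient pairs from message headers: the raw
    material for a correspondence network graph like the one built
    from Creeley's email."""
    edges = Counter()
    parser = Parser()
    for raw in raw_messages:
        msg = parser.parsestr(raw, headersonly=True)
        senders = [addr for _, addr in getaddresses(msg.get_all("From", []))]
        recipients = [addr for _, addr in getaddresses(
            msg.get_all("To", []) + msg.get_all("Cc", []))]
        for sender in senders:
            for recipient in recipients:
                edges[(sender, recipient)] += 1
    return edges

# Two synthetic messages between invented addresses.
messages = [
    "From: poet@example.org\nTo: friend@example.org\n\nhello\n",
    "From: poet@example.org\nTo: friend@example.org\nCc: other@example.org\n\n",
]
edges = header_edges(messages)
```

The resulting weighted edge list can be handed directly to a graph-layout tool; the edge weights are what make the strongest correspondents (such as Malanga in the Creeley graph) stand out visually.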
To wrap up:
Donors & users expect us to acquire, organize, preserve and provide access to b-d collections.
Special Collections staff capture, appraise, arrange and describe b-d materials AND contribute to requirements for both access and delivery as well as arrangement and description tools.
Our digital group will preserve in our preservation repository (SDR) and provide public access and invite participation – Hypatia (under development).