The following is a guest post by Kristin Snawder, a 2011 Junior Fellow working with NDIIPP.
When I came to the Library, I never imagined that my first project would be to face off with the hard issues underlying digital preservation policy development.
Policy development was new to me, and the task seemed daunting. But I dug in, determined to do the best job possible.
The project involved analyzing a set of published digital preservation policies from libraries and archives around the world. My task was to determine what these policies cover, what they do not, and what level of detail they provide.
The 13 selected policies represent a diverse group of institutions:
- Columbia University Library
- Cornell University Library
- Florida Digital Archive
- Georgia Archives
- Library and Archives Canada
- National Library of Australia
- State Library of North Carolina (3 policies)
- State Library of Queensland (Australia)
- State Library of Victoria (Australia)
- The Royal Library, The National Library and Copenhagen University Library (Denmark)
- University of Michigan, Inter-University Consortium for Political and Social Research
I read them all and set to work creating a taxonomy for categorizing pertinent sections and issues in each policy. My list started out long and overly broad. By focusing on categories that would be inclusive across all the documents yet specific enough to be applied consistently, I was able to narrow them down to 15:
- Content Scope
- Preservation Model/Strategy
- Storage, Duplication and Backup
- Security Management
- Rights and Restriction Management
- Access and Use
- Financial Planning
- System Parameters
- Staffing and Training
- Roles and Responsibilities
I used the taxonomy to create a crosswalk for the policy documents. My judgments took some time and quite a bit of discussion. The objective was to apply a category when it received reasonably full treatment in a document. I had to make subjective decisions and return to the policies for closer reading, but I feel comfortable with the results—which turned out to be quite interesting.
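The crosswalk described above can be pictured as a simple matrix of policies against taxonomy categories, where a category is marked only when a policy gives it reasonably full treatment. The following sketch is purely illustrative: the institutions named are from the list above, but the markings are hypothetical placeholders, not my actual findings.

```python
# Hypothetical crosswalk sketch: which taxonomy categories each policy
# treats in reasonable depth. The markings below are illustrative only.
policies = ["Columbia", "Cornell", "Florida Digital Archive"]
categories = ["Content Scope", "Preservation Model/Strategy", "Staffing and Training"]

# crosswalk[policy] = set of categories that received full treatment
crosswalk = {
    "Columbia": {"Content Scope", "Preservation Model/Strategy"},
    "Cornell": {"Content Scope", "Preservation Model/Strategy", "Staffing and Training"},
    "Florida Digital Archive": {"Content Scope"},
}

def coverage(category):
    """Fraction of policies that address a given category in depth."""
    hits = sum(1 for p in policies if category in crosswalk[p])
    return hits / len(policies)

for c in categories:
    print(f"{c}: {coverage(c):.0%}")
```

Tabulating coverage this way is what makes patterns visible, such as a category appearing in nearly every policy while another appears in almost none.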
An unexpected result was that only two of the 13 policies mentioned staffing and training in any detail. This topic attracts lots of attention in professional circles and would seem to play a key role in implementing institutional policy over time. Also, with digital preservation requiring technical proficiency and an understanding of the changing information environment, I assumed that there would be emphasis on staff and training. An explanation for the lack of inclusion and specificity may be that institutions prefer to consider the matter separately, or perhaps it requires further development.
It was illuminating to see that most of the policies included a general preservation model or strategy. Even though we are still in the early days of working out digital preservation strategies, it would seem that many institutions are comfortable outlining a specific approach. Much the same can be said for the categories of content scope, selection and appraisal, and access and use. Overall, there was close to a 50/50 split between the number of categories included in a high percentage of policies and the number included in only a low percentage.
From working on this project, I have come to realize that digital preservation policies are highly variable. Considering that institutions differ in terms of their mandates, requirements and local practices, it is unlikely that we will see a “one size fits all” policy template.
As organizations continue to refine their approach to digital preservation, an awareness of how existing policies compare is helpful. Now slightly more experienced, I am glad to have survived my policy face off!