This is a guest post by Julie Seifert.
As part of the National Digital Stewardship Residency, I am assessing the Harvard Library Digital Repository Service, comparing it to the ISO 16363 standard for trusted digital repositories (which is similar to TRAC). The standard comprises over 100 individual metrics addressing aspects of a repository ranging from financial planning to ingest workflows.
The Harvard Digital Repository Service (DRS) provides long-term preservation of and access to materials from more than fifty libraries, archives, and museums at Harvard, and has been in production for about fifteen years. The next generation of the DRS, with expanded preservation capabilities, launched recently, making this an ideal time to evaluate the service and consider how it might be improved. I hope to identify areas needing new policies and/or documentation and, in doing so, help the DRS improve its services. The DRS staff also hope eventually to seek certification as a trusted digital repository, and this project will help prepare them to do so.
When I started the project, my first step was to become familiar with the ISO 16363 standard. I read through it several times, trying to parse out the meaning of each metric. Sometimes this was straightforward and the metric was easy to understand; others I had to read a few times before I fully grasped what was being asked. I also found it helpful to take notes restating each metric in my own words. Reading about other people's experiences performing audits gave me ideas about how to approach the process. In particular, David Rosenthal's blog posts about the CLOCKSS self-audit were helpful, as that audit used the same standard, ISO 16363.
Inspired by the CLOCKSS audit, I created a Wiki with a separate page for each metric. On each page, I copied the text from the standard and left space for my notes. I also created an Excel sheet to track my findings: each metric got its own row, with a column for relevant documentation and a column linking to its Wiki page. (I blogged more about the organization process.)
I reviewed the DRS documentation, interviewed staff members about the metrics, and asked them to point me to relevant documentation. I realized that many of the actions required by the metrics were being performed at Harvard, but these actions and policies weren't documented; everyone in the organization knew they happened, but sometimes no one had written them down. In my notes, I distinguished between things that were being done but not documented and things that were not being done at all. In the Excel sheet, I used a green/yellow/red color scheme for the metrics, with yellow indicating things that were done but not documented.
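The color-coded tracking sheet can be sketched as a simple data structure. This is only an illustration: the metric IDs, statuses, and notes below are invented for the example, not actual DRS findings.

```python
from collections import Counter

# Each metric maps to a status: "green" (done and documented),
# "yellow" (done but not documented), or "red" (not done).
# Metric IDs and notes here are hypothetical.
metrics = {
    "3.1.1": {"status": "green",  "notes": "Mission statement published"},
    "4.1.2": {"status": "yellow", "notes": "Ingest checks performed, undocumented"},
    "4.2.4": {"status": "yellow", "notes": "Fixity workflow undocumented"},
    "5.1.1": {"status": "red",    "notes": "No written risk-management plan"},
}

# Count metrics in each status for a quick gap summary.
summary = Counter(m["status"] for m in metrics.values())
print(summary)
```

Keeping the statuses as data, rather than only as cell colors, makes it easy to tally or filter the metrics later.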
The assessment was the most time-consuming part. In thinking about how best to summarize and report on my findings, I am looking for commonalities among the gap areas. Many of the gaps may be similar, so several could be filled with a single piece of documentation. For example, many of the "yellow" areas have to do with ingest workflows, so a single document describing that workflow could fill all of those gaps at once. Finding these commonalities should help the DRS close the gaps as effectively and efficiently as possible.
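The gap-grouping idea above can be sketched in a few lines: tag each "yellow" metric with a theme, then group by theme so that one document per theme could close several gaps at once. The metric IDs and themes here are invented for illustration.

```python
from collections import defaultdict

# Hypothetical (metric ID, theme) pairs for the "yellow" gaps.
yellow_gaps = [
    ("4.1.2", "ingest workflow"),
    ("4.1.5", "ingest workflow"),
    ("4.2.4", "fixity checking"),
    ("4.1.7", "ingest workflow"),
]

# Group gap metrics by theme: each theme suggests one piece of
# documentation that would fill all of its gaps together.
by_theme = defaultdict(list)
for metric_id, theme in yellow_gaps:
    by_theme[theme].append(metric_id)

for theme, ids in sorted(by_theme.items()):
    print(f"{theme}: {len(ids)} gaps ({', '.join(ids)})")
```

In this toy example, a single ingest-workflow document would cover three gaps at once, while fixity checking would need its own write-up.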