The following is a guest post by Jimi Jones, Digital Archivist with the Office of Strategic Initiatives.
The Federal Agencies Audiovisual Working Group held a meeting on July 28, 2011. The meeting was attended by professionals from several agencies including the Library of Congress, the National Archives (NARA) and the Smithsonian. The Library of Congress was represented by staff from the American Folklife Center, the Office of Strategic Initiatives and the Library’s Packard Campus for Audio-Visual Conservation.
Some of the highlights of the meeting included:
- Kira Cherrix of the Smithsonian discussed the Institution’s use of a tool developed by NARA to analyze the logging metadata output by its SAMMA video digitization equipment.
- Carl Fleischhauer discussed the Library of Congress’ work on its MXF AS-AP application specification. Carl also gave a brief overview of several audiovisual metadata schemas, including PBCore, among others.
- Kate Murray talked about the extensive Quality Assurance/Quality Control (QA/QC) initiative at NARA for their Digitization Services Branch. This initiative will, among other things, include (from Kate’s presentation): “establishing division-wide quality baselines utilizing specific quality control thresholds for all current products, to establish appropriate technical metadata (embedded and/or external) for each product, to develop a system to optimize relevant equipment and systems including calibration and monitoring and to identify system infrastructure requirements needed to support the QA/QC effort (metrics gathering, metrics reporting, system alerts, audit trail, business records).”
- Kate also talked about NARA’s current efforts related to video metadata. Kate and her colleagues are looking at current audiovisual metadata schemas to see what kinds of technical information they would need to harvest from the AVI files they produce from their analog video assets. The goal would be to then embed that information into a “chunk” of that AVI file. Among the tasks of this exploration, according to Kate, are:
- The development of an XML-compliant technical metadata schema for complex reformatted video objects including appropriate controlled vocabularies.
- The development and pilot testing of an XML metadata export/extraction tool that will organize and assemble data to meet the schema specification.
- The identification and evaluation of appropriate areas within the AVI file header to embed limited and controlled metadata for preservation purposes.
- The development and pilot testing of a tool that supports embedding, validating and exporting of metadata in AVI files.
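The embedding task described above builds on a property of the RIFF container format that underlies AVI: readers ignore chunks they do not recognize, so a custom metadata chunk can be appended without disturbing playback. The following Python sketch illustrates the general mechanism only; the FourCC `meta` is a hypothetical placeholder, and this is not NARA’s actual tool or design.

```python
import struct

def embed_chunk(avi_path, out_path, fourcc, payload):
    """Append a custom metadata chunk to a RIFF/AVI file and update
    the top-level RIFF size field to account for the new bytes."""
    with open(avi_path, "rb") as f:
        data = bytearray(f.read())
    if data[:4] != b"RIFF" or data[8:12] != b"AVI ":
        raise ValueError("not an AVI file")
    # RIFF chunks are word-aligned; the size field counts only the payload
    pad = b"\x00" if len(payload) % 2 else b""
    data += fourcc + struct.pack("<I", len(payload)) + payload + pad
    # top-level RIFF size = file length minus the 8-byte RIFF header
    struct.pack_into("<I", data, 4, len(data) - 8)
    with open(out_path, "wb") as f:
        f.write(data)
```

A production tool would also need to validate the chunk against a schema and handle the extraction half of the round trip, which is exactly the scope of the pilot work Kate described.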
- Chris Lacinak of Audiovisual Preservation Solutions talked about his proposed audio digitization equipment performance testing. This testing is actually two related efforts. The first relates to testing the performance of audio analog-to-digital converters. Chris plans to test several different converters and judge them against the pass-fail standards of the International Association of Sound and Audiovisual Archives. Some of the models of analog-to-digital converters that Chris will test are in use at the Library of Congress and the National Archives. Chris’ second effort is an exploration of the problem of “interstitial errors.” These errors manifest as samples dropped between the digitization of the analog source and the writing of that information to the recording medium (the computer’s hard drive, for example). Chris discussed his proposed method for testing for these errors. He also talked about his desire to send out a survey (he has since sent this survey to several listservs) to gauge how widespread the perception of the problem is. The survey also sought to get a sense of how interested users would be in solution products for these kinds of errors.
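The details of Chris’ test method were not presented, but the underlying idea is that a dropped sample leaves a telltale jump in an otherwise smooth test signal. The toy Python sketch below (an illustration of the concept only, not Chris’ method) flags sample-to-sample steps that are anomalously large relative to the signal’s typical step.

```python
import math

def find_discontinuities(samples, threshold=2.0):
    """Flag indices where the sample-to-sample step is much larger
    than the typical step size -- the kind of jump a single dropped
    sample leaves in a smooth test tone. The threshold is tuned for
    clean test signals, not for real program audio."""
    diffs = [abs(b - a) for a, b in zip(samples, samples[1:])]
    typical = sorted(diffs)[len(diffs) // 2] or 1e-9  # median step
    return [i + 1 for i, d in enumerate(diffs) if d > threshold * typical]

# a 100 Hz test tone sampled at 48 kHz, with one sample dropped
tone = [math.sin(2 * math.pi * 100 * n / 48000) for n in range(4800)]
damaged = tone[:1000] + tone[1001:]   # sample 1000 goes missing
```

Running `find_discontinuities(damaged)` points at the drop location, while the undamaged tone yields no flags; a real test rig would of course compare the capture against the known source signal rather than rely on smoothness alone.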
- Jimi Jones of the Library of Congress discussed the European Broadcasting Union’s revision of the Broadcast WAVE format specification. The big change in this new version (version 2.0) is the inclusion of several metadata fields related to the “loudness” of an audio file. The broadcast application of this loudness is fairly obvious – for example, to regulate the loudness of audio broadcasts for the comfort of listeners – but the use of this metadata in a preservation/cultural heritage setting is not yet clear. However, it is apparent that version 2.0-compliant audio materials will soon begin to find their way into heritage institutions. To that end, Jimi has drawn up a revision of the Federal Agencies’ Broadcast WAVE guidelines document to include – as optional fields – these loudness fields. Jimi unveiled the revised guidelines document at this meeting and asked for volunteers to review the document. This activity continues. The current version of the guidelines – from 2009 – can be found at http://www.digitizationguidelines.gov/guidelines/digitize-embedding.html. The new version – once it is approved by the working group – will be posted on this page.
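For readers curious about where these new fields live: version 2.0 of the specification (EBU Tech 3285) adds five signed 16-bit loudness values to the bext chunk, each stored as 100 times the value in LUFS, LU or dBTP, following the fixed-length text fields, Version and UMID. A minimal Python sketch of reading them, assuming a well-formed version 2.0 bext chunk laid out per the published specification:

```python
import struct

# Offset of the loudness block within the bext chunk data:
# fixed-length text/time fields (346 bytes) + Version (2) + UMID (64)
LOUDNESS_OFFSET = 256 + 32 + 32 + 10 + 8 + 4 + 4 + 2 + 64  # = 412

def read_bext_loudness(bext_data):
    """Return the five version 2.0 loudness fields from a bext chunk's
    data, converting each stored integer back to its real value."""
    raw = struct.unpack_from("<5h", bext_data, LOUDNESS_OFFSET)
    names = ("LoudnessValue", "LoudnessRange", "MaxTruePeakLevel",
             "MaxMomentaryLoudness", "MaxShortTermLoudness")
    return {name: value / 100.0 for name, value in zip(names, raw)}
```

In a version 1 bext chunk those bytes are reserved and set to zero, so a reader should check the Version field (at offset 346) before trusting the loudness values.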
- Hot on the heels of Jimi’s presentation was a discussion by Dave Rice of Audiovisual Preservation Solutions. Dave talked about his plans to update the BWF MetaEdit tool. There are a few small issues that Dave plans to fix. There will also be some changes to the tool to make it conform to the new version of the Broadcast WAVE specification.
In all, this was a very well-attended meeting of the Audiovisual Working Group. The group and its members have several initiatives in the works, so keep an eye on the FADGI website!