Reading the (Same) Signals: Using FADGI’s ADCTest for Quality Control in Outsourced Audio Digitization

This is the second in a series of updates from the Federal Agencies Digital Guidelines Initiative (FADGI) Audio-Visual working group. See That’s Our Cue! Updates for the FADGI Embedded Metadata Guidelines and BWF MetaEdit for the Cue Chunk in Broadcast Wave Files for the first installment. This post is co-authored by Kate Murray, Digital Projects Coordinator in Digital Collections Management and Services, and Rebecca Chandler, Senior Consultant at AVP.


This month, the Federal Agencies Digital Guidelines Initiative (FADGI) and AVP have released a new version of ADCTest with an additional feature: a built-in signal generator. This version, 0.2 build 303, adds functionality to support users who wish to test the performance of ADCs that are not immediately accessible to them.

First released in 2018, ADCTest is a free, open source, Windows-based software application that tests and reports on audio analog-to-digital converters (ADCs) in accordance with the limited-scope FADGI test and performance methods for low-cost audio ADCs used in preservation reformatting workflows. Note that more robust ADCs, which usually come with a higher price tag, follow a more comprehensive testing protocol described in the FADGI high-level performance guideline. The high-level performance guideline includes 12 metrics, several with very exacting measurements. The low-cost approach examines seven metrics, including frequency response, dynamic range, and crosstalk, each with modified aim points. It is this low-cost, low-barrier approach that ADCTest automates.
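To make one of those metrics concrete, here is a minimal Python sketch of how a dynamic-range figure can be derived from two captures: a test tone passed through the converter, and a "silent" capture of its noise floor. This illustrates the general idea only; it is not ADCTest's actual measurement method, and the function names are our own.

```python
import math

def rms_db(samples):
    """RMS level of a block of samples in dBFS (full scale = 1.0)."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms)

def dynamic_range_db(tone_samples, noise_samples):
    """Crude dynamic-range estimate: tone level minus noise-floor level."""
    return rms_db(tone_samples) - rms_db(noise_samples)
```

A real measurement applies weighting filters and notches out the test tone before measuring the residual; this sketch only shows why both a stimulus capture and a noise-floor capture are needed.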

The ADCTest app was designed for everyone from archivists to audio engineers, generating its own test signals and providing users with simple pass/fail reporting along with more detailed results for advanced users.

ADCTest automates the testing protocol for all criteria listed in the FADGI low-cost guidelines with the goal of answering three basic questions:

  1. Is my ADC/system failing?
  2. How does my calibrated, healthy ADC perform relative to the guideline and other ADCs?
  3. Is my ADC/system performing optimally relative to its own specifications?

ADCTest comes preloaded with FADGI guidelines and parameters but, should you feel inclined to get your hands dirty, you can customize those tests and parameters to be in alignment with your own institution’s standards for ADC performance.

In the autumn of 2019, FADGI circulated a survey asking users to weigh in on the changes they would like to see in ADCTest. Fifty-seven percent of respondents requested a “compact stand-alone signal generator,” a request aimed at improving the efficiency of one of the use cases that shaped the creation of ADCTest.

Our initial release of ADCTest focused on two use cases: “test your own ADC” and “test someone else’s ADC.” Both use cases were served by that initial release in 2018, but to be honest, the second use case workflow, which we refer to as the “Offline” use case, was a bit clunky. So at the urging of ADCTest users, we set out to improve that experience in this new release.

Imagine that you are an archivist who wishes to outsource the digitization of your institution’s collection of ¼ inch open reel audio tapes. As part of your Statement of Work (SOW), your vendor has agreed to perform tests on their ADCs to ensure there are no significant failures. The SOW requires that you deliver a stimulus file to the vendor to use in testing. The stimulus file is an audio file composed of specific audio signals that, once run through the vendor’s ADC, should return results within certain defined parameters. There are more technical details about the stimulus or “stim” file in the user guide, but non-expert users don’t need to know anything more than that. ADCTest is built to be user friendly and will create and test that stimulus file for you so you can tell at a glance if there are ADC errors impacting your digitized audio files.
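To give a feel for what a stimulus file contains, here is a toy Python sketch that writes a mono WAV made of a few sine tones using only the standard library. The tone frequencies, duration, and file name are invented for illustration; ADCTest generates its own, more elaborate signal sequence for you.

```python
import math
import struct
import wave

SAMPLE_RATE = 96000  # Hz, matching the IASA-recommended preservation rate

def sine(freq_hz, seconds, amplitude=0.5):
    """Generate one test tone as a list of float samples in [-1, 1]."""
    n = int(SAMPLE_RATE * seconds)
    return [amplitude * math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE)
            for t in range(n)]

# A toy "stimulus": a few tones spanning the audio band. These signals
# are illustrative stand-ins, not ADCTest's actual testSequence.wav content.
tones = sine(20, 1) + sine(997, 1) + sine(20000, 1)

with wave.open("toy_stim.wav", "wb") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(4)  # 32-bit integer PCM samples
    wav.setframerate(SAMPLE_RATE)
    frames = b"".join(struct.pack("<i", int(s * 2147483647)) for s in tones)
    wav.writeframes(frames)
```

Running a file like this through an ADC and examining what comes back is, in miniature, what the Offline workflow automates.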

Figure 1: Creating a new test project.

Open ADCTest and create a new test project (figure 1). Fill in a title and choose your file path. Be sure to select “Offline” as the project type, as this will enable you to generate that stimulus file. Select the Sample Rate you have specified for your preservation master files in your Statement of Work. To be in alignment with digitization standards for analog audio materials (specifically IASA Guidelines on the Production and Preservation of Digital Audio Objects), select 96000 here.

The remainder of the fields may not be relevant to you if you do not know what type of digital-to-analog converter (DAC) and ADC your vendor plans to use. You may skip those fields, if you prefer.

Next up, you have the opportunity to generate that stimulus file. You may choose to enable/disable certain tests under the “Test procedures” column, but if you don’t have a lot of engineering experience, just leave it as is. The Offline test defaults to having every test enabled, with the exception of the crosstalk (also known as x-talk) tests.

Note: Crosstalk tests require the tester to pull cables and insert a shorting plug at various points in the testing process. In the Offline test, this would be a tricky undertaking requiring precise timing of the plug insertion, so we chose to have those tests default to disabled. You may enable them by right-clicking on the tests and selecting enable.

You can also request that the vendor perform the crosstalk tests using ADCTest themselves and send those reports to you along with the response.wav file.

Figure 2: Generating the offline stim file.

Next, you click the “Generate offline stim file” button (figure 2). This generates a 32-bit WAVE file named “testSequence.wav” in your project folder. You send that .wav file over to your vendor and ask them to run it through their ADC(s), recording a response file called “response.wav” which they then return to you. Ask the vendor to send you one for each ADC they plan to use on your assets, naming each file to clearly identify the corresponding ADC it was run through.

Place the “response.wav” file returned from the vendor in your project folder. In ADCTest, click the “Analyze offline response file” button at the bottom center (figure 3).

Figure 3: Analyzing the offline response file.

Figure 4: Pass/fail results.

ADCTest checks that all enabled tests elicit responses from the vendor’s system that fall within the FADGI guidelines. Failed tests appear in pink; click a test to view more detailed results (figure 4). Yellow “retest” results indicate there may have been a glitch in the signal or response file, and the specific test(s) should be selected and rerun. You can then reach out to your vendor with any findings you wish for them to investigate.
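The pass/fail logic above can be pictured as a simple comparison of each measured metric against limits. The metric names and limit values below are placeholders for illustration; the real aim points ship with ADCTest, preloaded from the FADGI low-cost guidelines.

```python
def judge(measured, lo=None, hi=None):
    """Return 'pass' or 'fail' for one metric against optional limits."""
    if lo is not None and measured < lo:
        return "fail"
    if hi is not None and measured > hi:
        return "fail"
    return "pass"

# Hypothetical measurements as might be pulled from a response-file analysis:
results = {
    "frequency response deviation (dB)": judge(-0.2, lo=-0.5, hi=0.5),
    "dynamic range (dB)": judge(92.0, lo=95.0),
    "THD+N (dB)": judge(-85.0, hi=-80.0),
}
```

Here the dynamic-range entry would come back "fail" (92 dB against a 95 dB floor), which is the kind of finding you would raise with your vendor.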

Using ADCTest for outsourced audio digitization ensures that you and your vendor have a shared understanding of each ADC’s performance over time, building confidence in your digitized audio files.

In keeping with FADGI’s goal to provide wide access to improved tools and workflows, ADCTest carries a BSD-3 license and the low-cost guidelines carry a CC0 1.0 Universal license for worldwide use and reuse.

For more information about ADCTest, including instructions for downloads, please see http://www.digitizationguidelines.gov/guidelines/digitize-audioperf.html.
