Authenticity Amidst Change: The Preservation and Access Framework for Digital Art Objects

The following is a guest post by Chelcie Juliet Rowell, Digital Initiatives Librarian, Z. Smith Reynolds Library, Wake Forest University.

In this edition of the Insights Interview series for the NDSA Innovation Working Group, I was excited to talk with collaborators on Cornell University Library’s Preservation & Access Framework for Digital Art Objects project:

  • Madeline (Mickey) Casad, Curator for Digital Scholarship and Associate Curator of the Rose Goldsen Archive
  • Dianne Dietrich, Digital Forensic Analyst and Physics & Astronomy Librarian
  • Jason Kovari, Head of Metadata Services
  • Michelle A. Paolillo, Digital Curation Services Lead

Chelcie: Tell us about the Preservation & Access Framework for Digital Art Objects (PAFDAO) project and the wicked problem it tackles.

Jason: PAFDAO is a recently completed NEH-funded research and development project to preserve CD-ROM-based new media art objects from the Rose Goldsen Archive of New Media Art. Many of these items were created in the early 1990s and are at significant risk of data loss. On top of that, the vast majority of the works in question could not be accessed by researchers at all without the use of legacy hardware and software, which can be difficult and costly to sustain for public use. The project developed preservation and access solutions for these highly interactive and complex born-digital artworks stored on fragile optical media. Our primary aim was to create a solution that worked for these artworks while being generalizable for the rest of the Archive and, most importantly, the community as a whole.

Chelcie: The PAFDAO project’s test collection included more than 300 interactive born-digital artworks created for optical disc and web distribution, many of which date back to the early 1990s. Using one particular artwork as an exemplar, describe for us the aesthetic experiences it was intended to elicit, as well as its numerous digital objects and dependencies.

Shock in the Ear by Norie Neumark with visual concepts by Maria Miranda and music by Richard Vella. Used with permission.

Mickey: An excellent example of the kind of work we focused on can be found in Shock in the Ear, a 1997 CD-ROM artwork by the Australian artist Norie Neumark.

Visually, this work presents the user with screens that look like painterly collages, with embedded images and handwritten text rendered in deep, saturated colors. It’s also an intricate work of immersive, abstract sound art, with sounds sometimes sharp and unsettling, sometimes deep and enveloping, sometimes alarming. The sounds often fade in and out of aural focus, change in volume, or blend into one another as the user moves a cursor around the screen. By the same token, new images might be revealed, fade in and out of focus, or change color subtly in concert with the changing soundscape of the work, as the user explores rollover areas of the interactive screen.

As the user moves through the artwork, he or she hears personal accounts of experiences of shock or trauma: for example, the story of an accident, an experience of electroshock, or the shock of cultural displacement. These are distinct narratives, and they are related by distinctive voice characters, but there’s not a consistent match of voice to story. You hear only fragments of these stories at a time. Clicking on the screen might take the user deeper into one story, or it might cause the thread to abruptly switch to another, randomly selected fragment of another narrative. The user can also intermittently be trapped in the program’s own timeline, forced to wait while looking at a clock face, before new screens and new sections load.

All in all, this adds up to a sophisticated aesthetic meditation on the experience of shock and its aftermath, where the elements of randomness, multimedia content, and interactivity are all absolutely essential to the work’s impact. It’s an incredibly refined realization of the artistic potential of the CD-ROM medium.

The work consists of hundreds of image files, sound files, and programs to coordinate the interactive collaging of all this media content in the rollover responsive screens as well as the structural progression of the user’s movement from screen to screen.

Shock in the Ear by Norie Neumark with visual concepts by Maria Miranda and music by Richard Vella. Used with permission.

Dianne: Like Neumark’s Shock in the Ear, many artists used — or collaborated with programmers using — software called Macromedia Director to create their artworks. Director allowed them to embed all sorts of multimedia files into their artwork, including audio files, video files, and still images. I remember once seeing a book on Macromedia Director that boasted that its goal was to help users publish professional-quality, standalone CD-ROMs to distribute their interactive content. We found that many of the works had very specific dependencies: they only ran on a Mac, or they needed a particular version of QuickTime, or a certain Netscape browser plugin, and so forth. We did find patterns in the collection as a whole, which helped us predict what kinds of works would have certain dependencies.

Chelcie: The project team conducted a survey of users of media archives in hopes of better understanding the research questions animating their work. What sorts of questions are researchers asking? How do these questions reveal a more nuanced understanding of the digital preservation concept of “authenticity”?

Mickey: We developed the survey to be a fairly broad and qualitative investigation. Initially, we had hoped that this would help us identify distinctive profiles of media art researchers with specific kinds of needs — for example, what percentage of researchers are likely to want access to historical hardware. But the responses we received didn’t really coalesce like this. In part this is because the community of researchers for this kind of artwork is still relatively small and disparate, though we expect this to change as the cultural significance of this work becomes more and more apparent, and as the technological and institutional impediments to archiving it become less daunting.

We were surprised to see less emphasis on code and hardware than we might have expected, but, again, as an archiving institution, we expect the field of new media research to evolve rapidly in the years ahead, as archives begin to address the technological challenges of providing access to new media collections, and as researchers become more and more technologically sophisticated and also more accustomed to the idea of digital artifacts as objects of cultural study.

“Authenticity” is a tricky question. Dianne has written elsewhere about distinctions between forensic authenticity, archival authenticity, and cultural authenticity. Forensic authenticity and archival authenticity refer to different ways of ensuring that an object is what it claims to be, and has not been falsified or corrupted. Cultural authenticity is a much more complex issue that needs to be taken into account, especially with art collections. It has a lot to do with the patron’s faith in the archiving institution, and this is one of the things that came out of our researcher survey. Respondents wanted to know that a work would be presented in a way that gave due respect to its original context, and to the artistic vision of its creator, and we needed to adapt our preservation strategies to better address this need.

Dianne: It is important to point out that “authenticity” wasn’t so cut and dried even when these artworks were being created. We have countless examples of works where the system requirements say something like, “Compatible with either Mac or PC” — and so, even back in the day, artists knew that people were going to experience these works on a variety of personal computer setups.

Mickey: There’s a lot of “variability” built into these artworks, just by their very nature. And anything iterative or random or built to run on multiple platforms offers its own subtle defiance of the idea of singular artistic uniqueness — which might be part of a layperson’s definition of an artwork’s authenticity. So there’s a lot of grey area, and a lot of work about this grey area being done in the arts archiving community. We knew we would need to get this right — or as right as we could, working at scale with a large research collection.

Chelcie: The project team also developed a questionnaire for guiding interviews with artists about what they view as the most significant properties of their artworks. How is the questionnaire intended to inform the preservation and access strategy for a particular artwork?

Mickey: The artist questionnaire, which can be found in Appendix C of the white paper, is simple and flexible. It’s based on questionnaires and user interview processes developed by other media arts archiving organizations (for example, the Turbulence.org questionnaire or the Variable Media Questionnaire), but very much tailored to our preservation workflow, our need to work at large scale, and the requirements of a research archive. It allows us to open a conversation with artists about the most significant properties of their artworks.

Shock in the Ear by Norie Neumark with visual concepts by Maria Miranda and music by Richard Vella. Used with permission.

In the case of the artwork described above, Shock in the Ear, artist Norie Neumark confirmed for us that sound quality was of vital importance in any rendering of the artwork. She also gave us important information about the work’s compiling environment and production history, and ultimately sent us working files and an updated version that we will archive alongside the 1997 version for future researchers. We were able to discuss some of the rendering shortcomings that might be associated with our use of emulation as an access strategy, and she agreed to this approach, with important caveats about sound quality.

Dianne: From my perspective, I think the artist interview can also serve as an important tool to prompt for more technical details. For instance, we can ask whether the artist still has working documentation or source code for the artworks, or whether they’ve upgraded the work to a newer platform. It’s also a helpful way to probe what the artist views as important to preserve about their work, knowing that any access strategy is likely to alter the user experience in some way. For instance, we might notice that when a work is viewed on an LCD monitor, the colors are different than they would appear on a CRT. We might spend a lot of time trying to mediate that problem before realizing, say, through an artist interview, that it’s not their priority. Perhaps rendering the speed of the work faithfully is most important to them. I wouldn’t say that the artist interview dramatically changes a preservation or access strategy for a particular artwork, but rather clarifies and prioritizes which metadata is most important for the ongoing preservation of an artwork.

Chelcie: The project’s objective was to provide the “best feasible” access to artworks, and the project team was somewhat surprised to conclude that emulation presented the “best feasible” preservation strategy. What interpretive possibilities are lost in a “feasible” implementation? On the other hand, what interpretive possibilities are preserved?

Dianne: Many of these works consisted of a collection of multimedia files, and it is still possible to navigate through the files contained on each CD-ROM and view many (but not all) of them on current computers. Of course, the most important part of the artwork was usually a hardware-specific executable file that ran a program that responded to interactions from the user. And so what emulation preserves is the experience of interacting with the artworks and seeing how all of the individual assets on a CD-ROM relate to one another.

Adriene Jenik, Mauve Desert, 1997, Shifting Horizons Productions. Screenshot by Dianne Dietrich, Mac OS 9 in SheepShaver.

Emulation can be a challenging concept because it can radically alter the user experience. In our example, we’ve migrated data off of physical media, so users no longer have to load an actual CD-ROM into a drive to start an artwork. There are other more subtle changes, too: the colors are different and the speed of the work is usually faster. However, one of the ways I looked at these artworks — and digital files more generally — is that they’ve always been subject to a variable user experience. Many of the works could run on a number of hardware/software combinations, and that was built in, I think, to the artists’ expectations. Everybody had a different monitor with a unique calibration, or a different mouse, or keyboard. Given the range of personal computing environments that has always existed, it is helpful to see emulation as an extension of the variance in experience that was always the case for these materials. We have taken the time to document some of the changes that emulation introduces, including speed, color rendering, the effect of new input devices (like trackpads) and so forth.
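The first migration step Dianne mentions, moving bit-faithful data off the physical media, is conventionally done with a sector-level imaging tool. As a minimal sketch, assuming a Unix-like system with `dd` and `md5sum` available (the file names are hypothetical, and a dummy file stands in here for an actual optical drive such as `/dev/cdrom`):

```shell
# A dummy file stands in for an optical disc device (hypothetical stand-in).
dd if=/dev/zero of=dummy_disc.raw bs=2048 count=16 2>/dev/null

# Image the "disc" sector by sector; conv=noerror,sync continues past
# read errors and pads bad sectors, as one might for aging media.
dd if=dummy_disc.raw of=artwork.iso bs=2048 conv=noerror,sync 2>/dev/null

# Record a checksum alongside the image so its fixity can be verified later.
md5sum artwork.iso > artwork.iso.md5
md5sum -c artwork.iso.md5
```

The checksum file travels with the image from this point on, so any later copy of the disk image can be verified against the original capture.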

Chelcie: If any component of a new media artwork fails, the entire artwork can become unreadable. Metadata supports the rendering of the work, as well as its interpretation. How does the framework define the appropriate metadata to describe interactive born-digital artworks?

Jason: The framework includes a combination of descriptive (MARCXML), technical (DFXML) and preservation (PREMIS) metadata for each artwork, as well as a classifications document to help guide rendering and restoration decisions at the “class” level. The root of the metadata requirements stemmed from an assessment of the nature of what we’re preserving: disk images. We were careful not to over-describe objects, but still gathered metadata about the disk images as well as the files on the disks. Further, we considered data derived from the user survey to determine whether the metadata were rich enough to support user needs. If rendering of an artwork fails, we believe we’ve captured enough depth of description (both artistic and technical) to allow staff to begin identifying methods to make the artwork renderable.
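PREMIS preservation metadata of the kind Jason describes typically records fixity information (a message digest) for each disk image. As a hedged sketch of how such a digest might be computed — the function name and parameters here are illustrative, not taken from the project’s workflow — a chunked read keeps memory use flat even for large CD-ROM images:

```python
import hashlib


def fixity_digest(path, algorithm="sha256", chunk_size=1 << 20):
    """Compute a checksum of a disk image, suitable for a PREMIS fixity entry.

    Reads the file in 1 MB chunks so a multi-hundred-megabyte CD-ROM
    image never needs to fit in memory at once.
    """
    digest = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Recomputing the digest on each later copy and comparing it with the stored value is what underwrites the “forensic authenticity” Mickey distinguishes above.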

Dianne: One thing we learned during the project was that, because there are so many utilities for characterizing digital objects, there was so much metadata and description that we could have created, and so we had to select what we thought was essential for our future colleagues.

Chelcie: What objects does the framework recommend should constitute the package (SIP or AIP) stewarded by a collecting institution?

Michelle: The works affected by the grant have item-level representation in our catalog, so for each conceptual work, we created an aggregate named after its bibliographic identifier and described by the MARCXML of that record. In this container, we placed other aggregates, named “disk_images” and “coverscans”, and optionally “derivative_disk_images”. “disk_images” contains the bit-faithful copies of the disks we made, along with their technical and PREMIS metadata, and any informational files relevant to that particular work (for example, notes describing issues with technical playback). “coverscans” are raster images that document the physical packaging of the original disks, which are often creative artworks in and of themselves. If there were opportunities to make derivatives (most often, disk images with operating environment changes that led to improved playback), those were included in the optional “derivative_disk_images”. In addition, all configured emulators, environments, and hardware ROMs were included in the deposit. (The complete deposit structure is described in Appendix A of the white paper.) We also have narrative documentation that explains how to relate artwork system dependencies to emulation environments, complementing the structured metadata we created. The overall deposit reflects the complexity inherent in these objects, so there is a lot of cross-referencing to assist our future colleagues in matching any given work with the appropriate emulator and ROM to play it back.
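Laid out as a file tree, the deposit structure Michelle describes might look something like the following. The bibliographic identifier and file names are hypothetical illustrations; Appendix A of the white paper gives the authoritative structure:

```
bibid_4966105/                    (aggregate named for the catalog record)
├── bibid_4966105_marcxml.xml     (descriptive metadata: MARCXML)
├── disk_images/
│   ├── artwork.iso               (bit-faithful disk image)
│   ├── artwork_dfxml.xml         (technical metadata: DFXML)
│   ├── artwork_premis.xml        (preservation metadata: PREMIS)
│   └── playback_notes.txt        (notes on technical playback issues)
├── coverscans/
│   └── cover_front.tiff          (raster scans of the physical packaging)
└── derivative_disk_images/       (optional: images modified for playback)
    └── artwork_derivative.iso
```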

It is worth noting that the deposit of any given work is driven by its affiliation to a specific Archives collection, and PAFDAO involved 12 of these, as the Rose Goldsen Archive of New Media Art is a collecting area encompassing several collections. The intellectual value of these assets drove the deposit arrangement. While their associated technical analysis and description are important, at the end of the day, these assets are valued for their properties as exemplary works of a time, and assets within their collections. Our mission as a research institution guides us to preserve these objects within the context of their collections.

Chelcie: What project outputs do you anticipate will prove the most useful to institutions other than the Goldsen Archive that are preparing to meet the preservation challenges of similar materials?

Dianne: We wrote up a classifications document outlining the various (and often overlapping) characteristics of the works in the testbed collection. For each classification, we included a short summary, implications for access, and projected restoration issues — that is, what we expect it would take to get a work running in a current system.

Since we focused so much on emulation in this project, I think the most valuable part of that document is the explanations of the issues we encountered when trying to provide access (using emulation) for each type of artwork. As a companion, we also wrote up a fairly comprehensive how-to document for locally configuring the various emulation software we tested, which includes how we approached determining a compatible emulation environment for various artworks.

While we used a combination of file type analysis and manual review to classify artworks and determine suitable emulation environments, I’m personally pleased to see work done elsewhere in developing frameworks for classifying digital materials to support automatic detection of compatible emulation environments. Similarly, as Emulation as a Service develops into a viable solution for many archives, I still think there’s value in the detail we provided in our emulation documentation, since we used the same emulators that EaaS is running in the backend of their system.

Chelcie: What opportunities for conversation or collaboration do you foresee, either among cultural institutions or between cultural institutions, games communities, and hobbyists?

Dianne: I really like that we can tap into a kind of collective nostalgia for our older technology in order to help preserve these important artifacts. We were so fortunate that emulators exist for the exact environments we needed for these artworks — and really, that’s the result of enthusiasts who wanted to preserve the computing environments of their youth. As more people get involved in this space, there’s a greater awareness of not only the technical, but social and historical implications for this kind of work. Ultimately, there’s so much potential for synergy here. It’s a really great time to be working in this space.

To learn more about the PAFDAO project, check out the project team’s white paper, Preserving and Emulating Digital Art Objects.
