The following is a guest post by Emily Reynolds, a 2012 Junior Fellow.
One of the many highlights of the Digital Preservation 2012 conference last month was the Preserving Digital Culture panel, which featured speakers discussing the preservation of born-digital art and other creative output. While much of the conference addressed the often automated management of big data, these speakers focused on materials that require much more individual attention. As with physical artwork, digital art must be preserved meticulously to maintain the artist's original vision.
Doug Reside spoke about the complexities of preserving digital materials from the playwright Jonathan Larson. Excavating layers of digital text, he was able to reconstruct several iterations of the play RENT as it was written. With digital text, determining the exact sequence of versions that the creator went through becomes much more complicated than with printed text, as materials can easily be overwritten or slightly modified.
Megan Winget talked about which properties are most essential to capture in preserving new media. She described digital preservation as a wicked problem; her ideas in this regard are outlined in an earlier post. Winget referenced the article Twisty Little Passages Almost All Alike: Applying the FRBR Model to a Classic Computer Game, which ties many of these issues into videogame preservation. As the distinctions between versions of items become less clear, the need for systems that will help to manage these networks of complex works becomes ever more important.
Ben Fino-Radin discussed the Rhizome ArtBase, a web-based art archive. The project began as a web index of contemporary digital art, but the linked information often disappeared. The ArtBase now captures artwork to preserve it independently of the original website. To ensure each work is captured exactly as the artist intended, Rhizome works directly with artists to determine what constitutes a successful capture. They employ a flexible, individualized strategy, using a variety of tools for each project. The focus on individual art objects, as well as the importance of capturing all properties of the original work, distinguishes the project from most web harvesting projects.
The Digital Archaeology project, which Jim Boulton spoke about, takes a somewhat different approach to preserving web content thought to be culturally important. Boulton collected hardware and software contemporary to several older websites so they could be displayed in their original context. This isn’t an approach that can be replicated on a mass scale, and was only intended for exhibits at Internet Week Europe and Internet Week New York.
Slide presentations from the panel can be found on the NDIIPP DigitalPreservation 2012 website.
I’d like to hear more about what Megan Winget has to say on this matter. This is something I have been curious about for years. There are two issues (maybe three):
1) Preservation of the aesthetics/content of the work in question. By this, I mean preserving the story/meaning. With video games, there are multiple formats for a given game (note that this also applies to any software program). If we look back to the coin-popping 80s, a game would first appear in the arcade, and then appear later on game consoles. I liken it to when you see a movie in the theater, and within the year, it comes out on VHS–I mean DVD–I mean Blu Ray (or streaming on Netflix).
Adding a further complication is the existence of multiple versions. Just as a movie (Stanley Kubrick's Eyes Wide Shut or Ridley Scott's Blade Runner) may have multiple versions both in the theater and for home viewing (and may also appear on VHS, Laser Disc, DVD, Blu-ray, etc.), the same can be said for games like…Double Dragon and Strider.
So which versions are preserved? All of them? This raises the second issue.
2) Preserving the hardware. Preserving early film reels, and using them, requires specific projectors. Should those, too, be preserved? There’s always the concern of trying to capture the “original” feel of a work. So projectors, BETA/VHS, etc. need to be used and preserved. Or do they?
It brings up the question of whether old game consoles and computers should be preserved and maintained (costly) in order to play, say, Oregon Trail on the Apple II or King’s Quest on an IBM with the floppy disc(s). The acids and corrosive elements within the systems that make these games playable/accessible would also need to be addressed.
3) With all of these complications upon complications (multiple formats, requiring multiple systems), there are two solutions which require…editorial discretion and go against the idea of preserving/maintaining the best original copy:
First, there is the use of emulation to make works playable/viewable/usable on a single system. There is always the concern of what is lost by deviating too far from the original intent.
Secondly, there is the question of only preserving the “best version.” If a work has 7-10 versions, which ones should be acquired and preserved/emulated?
Much of this is tied to determining the importance of preserving “digital zeitgeist.”
Regarding hardware preservation vs. emulation, I would just add that gaming hardware is a means to an end. I somehow doubt that many people bought an Xbox because they wanted the hardware. More realistically, it was a means to a game like the Halo series or another game only available on that system. Today it is common practice to get an emulator to play a game for an obsolete platform. For some of us, that's the only way we knew the game. Certainly something is lost when we don't get to use the original controller or the game doesn't display exactly right, but given that so many games today are ported between systems (Xbox to PS3 to PC to iPhone), maintaining the original platform seems like hair-splitting. From a cost/benefit perspective, does it make sense to maintain, repair, and order parts for a legacy platform? Who will have access to the platform? Emulators at least make sense as an "access" copy. Maybe forward compatibility can be seen as a form of damage control.
As for versioning, this becomes a real mess with today's online games, as I expect happens with Second Life. Do we maintain version 1.1, 1.2, 1.2.1 and so on? When each of these can be 10GB, a reasonable compromise might be to maintain the original release and the last stable version. Originals have a nostalgic appeal for many, see http://www.project1999.org/. The last stable version may have the best game balance and fewest bugs.
Easy for me to say. I’m glad though that there is such an NDIIPP project.