
Picturing Perpetual Beta for Digital Preservation


The perfect digital preservation system does not exist. It may someday, but I don’t expect to live to see it.

Norfolk Record Office, on Facebook

Instead, people and organizations are working on iterations of systems, and system components, that are gradually improving how we steward digital content over time. This concept of perpetual beta has been around for a while; Tim O’Reilly explained it lucidly in What Is Web 2.0 in 2005.

I gave a presentation recently in which I expressed hope that prospective infrastructure developments for stewarding big data would bring benefits to the work of libraries, archives and museums to preserve digital content.

My intent was to convey that change should be iterative along a path to radical. In the spirit of avoiding bulleted presentation slides wherever possible, I searched for graphics that might help tell the story.

Yvonne, by Lawrence Luckham

The one I ended up using was a picture from the Norfolk Record Office (UK) that showed delivery of a computer system some years ago. In its day, the Elliott computer was an advanced machine that cost the modern equivalent of nearly a million dollars. It read paper tape at 500 characters per second and had a CPU that was housed in a “cabinet about 66 inches long, 16 inches deep and 56 inches high.”

The picture got a good response from the audience, and I wondered if I should have used others, perhaps from a later era, such as this one from Bell Labs in the late 1960s. This IBM mainframe was many iterations ahead of the Elliott, but any computer big enough to hide in surely needed to be delivered by truck as well.

These pictures are useful in illustrating a point that Clay Shirky and others made some time ago: the system should never be optimized. In other words, iteration and change should be embraced as a design principle. Any system surely can be improved, often radically, in the future. And, as time passes and successful migrations occur (our intent), the way we used to do things will inevitably seem quaint in retrospect.

Comments (2)

  1. I was thinking about this all weekend. One of the fundamentals they reemphasize in archives and preservation courses is that nothing lasts: everything will eventually deteriorate and be lost to time. The challenge, of course, is extending the lifespan of a given information source/medium in order to *somehow* see to it that the information is handed down to future generations (if possible).

    The cave paintings in Spain and France have survived for 20,000 to 40,000 years because they haven’t been exposed to sunlight. Sumerian clay tablets written in cuneiform are about 5,000 years old and have survived because of the material, lack of exposure to the elements, and the lack of overall handling.

    Fast forward to now, and I guess the question is what is important: preserving the digital medium as it stands? Because the medium is continuously changing and evolving, there’s no stable foundation on which to base a preservation effort in terms of hardware or software. (There are best practices, of course.)

    This is the catch-22 of the electronic medium. It’s light and can carry vast amounts of digital information, but in comparison to, say, the Rosetta Stone, its lifespan is extremely short, and every access contributes further to its eventual loss.

    If it’s accepted that “all information is eventually lost” is not a defeatist sentiment, but merely a pragmatic one, then maybe there are other alternatives to consider. I was wondering if going back to microfilm as a means of providing indices or abstracts for digitized information (websites?) may prove beneficial, in the way that encyclopedias offer an abbreviated alternative to reading a juicy 600-page biography. More importantly, microfilm has a life expectancy of approximately 500 years, so it can weather the storm of technological advances; all one needs is a magnifying glass and some light.

    http://www.dcc.ac.uk/sites/default/files/documents/Microfilm_2011_Final.pdf

  2. “Quaint” is a good way of looking back on “Perpetual Beta”…remember the boxes of software that used to sit on the shelf beside our PCs?

    Tim O’Reilly certainly put his patterns succinctly; they have stood the test of time and should be core to most businesses these days, and not just in software.

    I also agree that radical change and disruptive innovation could occur in the future, so businesses need to keep an eye out for that too!

    http://bronwynshimminclarke.wordpress.com/2013/05/01/is-perpetual-beta-a-good-thing/
