Tending the Machines

The computers that store and serve our digital collections are multiplying rapidly to keep up with our escalating data demands. All the while the servers guzzle power, radiate heat, crowd bandwidth and place an unprecedented burden on the power grid. During these tough economic times, as institutions scrutinize their operations for budget-trimming opportunities, they must evaluate their data centers for effectiveness and waste. In a recent announcement, the Federal government said it plans to close 800 of its data centers to help reduce its energy budget. The encouraging news is that modern advances in data-center technology lean toward cost savings and energy efficiency.

Server Farm by Burnt Pixel on Flickr

Data centers filled with arrays of computers have grown over the decades to become a 21st century heavy industry. Out of sight and out of mind to most of us who demand their services, data centers require intelligent engineering solutions to maintain them and keep them cost effective. Most cultural institutions, such as the Library of Congress, maintain their own data centers – servers and storage devices that reside on the premises in server rooms. By contrast, commercial data lords have mega-data centers in national and international locations, each center sprawling across acres of land.

Regardless of the size and scale, all digital-information institutions and companies face similar challenges: storing and serving data and maintaining and powering the machines. Access to data depends on the health of the machines and the environment in which the machines reside.

As technology evolves and servers become more compact, more units can fit into the space the older units occupied. But along with the added server muscle comes greater physical density of machines and more hot components to cool.

Indeed, one of the highest maintenance priorities of a data center is keeping the machines cool. Racks and racks of continuously running servers throw off heat that, if not contained, damages the servers. Chris Jordan, formerly of the San Diego Supercomputer Center and now data management and collections group lead at the Texas Advanced Computing Center at the University of Texas at Austin, said, “Cooling is typically more than a quarter of infrastructure costs.”

Cooling is mostly done by air conditioners forcing air over chilled-water coils (and depending on the size and design of the cooling unit, water can become another resource to factor into a facility’s overhead costs). To reduce energy costs, many data centers are taking advantage of their site’s geographic location and cooling their servers with outside air, or partnering with distant data centers that reside in cooler climates.

Banhof, photo by reggestraat on Flickr

As an alternative to building-centric data centers, a new generation of data centers is being built into shipping containers, such as those hauled by big-rig trucks. These modular data centers are gaining wide acceptance for their flexibility, energy efficiency and cost effectiveness. The smart, climate-controlled containers are filled with servers and the other computer equipment needed to run the center, plus the wiring and cooling systems. Some modular data centers have prefabricated, standardized components for quick assembly and rapid deployment. Some arrive pre-assembled, configured and ready to plug in.

Some modular data centers utilize a combination of air conditioning and cool air from outside the container. The container’s on-board air economizers “know” when to cool off with outside air (and reduce power consumption) if the outside temperature is right or when to turn on the air conditioning if it isn’t.
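To make the economizer idea concrete, here is a minimal sketch of that decision logic in Python. The temperature thresholds and mode names are hypothetical illustrations, not the controls of any particular vendor’s container, and real units also weigh humidity, dew point and air quality.

```python
# Minimal sketch of air-economizer control logic.
# Thresholds are assumed values for illustration, not vendor settings.

ECONOMIZER_MAX_F = 65   # outside air alone is cool enough below this (assumed)
SUPPLY_TARGET_F = 75    # desired server-inlet air temperature (assumed)

def choose_cooling_mode(outside_temp_f: float) -> str:
    """Return which cooling source the container should rely on."""
    if outside_temp_f <= ECONOMIZER_MAX_F:
        return "outside air"       # open the dampers; fans only, lowest power draw
    if outside_temp_f <= SUPPLY_TARGET_F:
        return "mixed"             # blend outside air with some mechanical cooling
    return "air conditioning"      # too warm outside; close dampers, run the chillers

if __name__ == "__main__":
    for temp in (52, 70, 88):
        print(f"{temp}F outside -> {choose_cooling_mode(temp)}")
```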

San Francisco-based Internet Archive, for example, uses a modular data center located 45 miles away in Santa Clara, California. It is an early-generation, water-cooled, air-conditioned unit. Kris Carpenter, director of the Web Group at the Internet Archive, said the container is efficient, fully functional and busy. Carpenter said, “The container has 3 petabytes of data, all web data. When you browse the global Wayback Machine, it pulls up content from the Santa Clara facility.”

On a side note, the Internet Archive – in a display of Bay Area know-how – warms its San Francisco headquarters with the heat radiating from its servers. IA occupies a cavernous building, a former Christian Science church, in a neighborhood blanketed part of the year in chilly fog and swept by cool offshore winds year round. The temperature averages around 56 degrees Fahrenheit. “We are often cold in the main building,” Carpenter said. “We have an air circulation mechanism that’s able to pull that cool air through the equipment and vent hot air back into the main building.”

Cloud storage is an attractive option for cultural institutions because the cloud service bears much of the data-center infrastructure cost. And as cloud data centers move to cooler climates, into mountainsides and near hydroelectric power sources, those cloud services optimize their own operating costs. Also, the cooler countries that attract the cloud data centers – such as Canada, Finland, Sweden and Switzerland – benefit economically.

Fortunately, there are many prudent solutions these days not only for cooling but for power conservation. David Sagstetter, senior network administrator of the Minnesota Historical Society, favors hardware virtualization as a power-saving solution, eliminating the need to run all of the machines at full power around the clock. With virtualization, one machine does the light work of several others at the appropriate times. “At night, after most of the workers have left and the server usage has slowed down, a number of servers will automatically power down and one server will run several virtual machines,” Sagstetter said.
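The scheme Sagstetter describes amounts to an off-hours consolidation plan: measure how lightly loaded each host is, pack the virtual machines onto one box, and power the rest down until morning. The sketch below illustrates the planning step only; the host names, the 80-percent capacity figure and the function itself are hypothetical stand-ins, not the Minnesota Historical Society’s actual configuration or any hypervisor’s real API.

```python
# Simplified sketch of off-hours server consolidation planning.
# Loads are expressed as fractions of one host's capacity; all values assumed.

def plan_night_consolidation(host_loads: dict[str, float], capacity: float = 0.8):
    """Pick one host to keep running and list the hosts that can power down,
    provided the combined overnight load fits within the target's capacity."""
    total_load = sum(host_loads.values())
    if total_load > capacity:
        return None, []                               # too busy to consolidate tonight
    target = max(host_loads, key=host_loads.get)      # keep the busiest host up
    power_down = [host for host in host_loads if host != target]
    return target, power_down

if __name__ == "__main__":
    overnight_loads = {"host-a": 0.15, "host-b": 0.05, "host-c": 0.10}
    target, idle = plan_night_consolidation(overnight_loads)
    print(f"Keep {target} running; power down {', '.join(idle)}")
```

A real deployment would rely on the hypervisor’s own live-migration and power-management features rather than a script like this, but the underlying decision is the same.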

It still comes down to risk assessment and budget. Can your institution face rising costs and shrinking budgets and still guarantee the integrity of your data center? Kris Carpenter tells a cautionary tale about a national library that couldn’t pay any more for power and was forced to reassess its data center and reduce waste. In the end, its austerity measures worked, but it was a difficult and painful process.

Carpenter envisions a network of data centers devoted solely to memory and cultural institutions, another collaborative challenge for the benefit of digital preservation. She said, “But how can we partner to shoulder the responsibility? We’re not forced to deal with it just yet but we should start discussing it.”
