The extremely air-conditioned computer farms known as data centers are the gas-guzzling jalopies of the technology world. Some require 40 or 50 times more power than comparably sized office space.
So with energy costs high and environmental friendliness making for good public relations, more tech companies are touting ways they are “greening” data centers, which serve up Web pages, swap Internet traffic, and process and store business information.
But it’s a lot easier to put out a news release than to build a data center with a significantly smaller environmental footprint. Even as efficiency improvements are reducing the energy gulped by many kinds of hardware, the industry’s overall electricity consumption could double from 2006 to 2011 as demand grows.
“It’s somewhat analogous to someone who decides to purchase an energy-efficient automobile and says, ‘Gee, I’m using 30 percent less gasoline with this, that means I can drive 30 percent more miles than I used to, and still do something for the environment,’” said Charles King, an analyst with Pund-IT Research. “It’s an interesting philosophical question.”
A new report from the Environmental Protection Agency estimates that the easiest, least expensive changes to data center operations — involving tweaks to software, layout and air conditioning — could boost efficiency by 20 percent.
But even that level of improvement would still lead to higher overall electricity use in the coming years. Going further, and actually reducing information technology’s strain on the electric grid, will require a more aggressive commitment. The EPA says a 45 percent improvement — enough to lower electricity usage by 2011 — can be achieved with existing technologies.
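The back-of-envelope arithmetic makes the distinction concrete. The figures below are illustrative assumptions, not numbers from the EPA report; only the ratios matter:

```python
# Why a 20% efficiency gain still means higher total use when demand
# roughly doubles over five years. Units are arbitrary.
baseline_2006 = 100.0                  # usage in 2006
projected_2011 = 2.0 * baseline_2006   # demand roughly doubles, per the article

after_easy_fixes = projected_2011 * (1 - 0.20)   # the "easy" 20% improvement
print(after_easy_fixes)   # 160.0 -- still 60% above the 2006 level;
                          # a far deeper cut is needed to fall below it
```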
To understand the scope of the problem, it helps to grasp why data centers are so power hungry.
Depending on the configuration and the equipment involved, sometimes only 30 to 40 percent of the juice flowing into a data center is used to run the computers. Most of the rest goes to keeping the hardware cool, since heat saps performance.
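The industry has since begun measuring that split as power usage effectiveness, or PUE: total facility power divided by the power that actually reaches the computing gear. A quick sketch of what a 30 to 40 percent share implies:

```python
# If only 30-40% of a facility's power reaches the computers, the
# overhead ratio (total power / IT power, known as PUE) is steep.
for it_fraction in (0.30, 0.40):
    pue = 1.0 / it_fraction
    print(f"IT share {it_fraction:.0%} -> PUE {pue:.1f}")
# IT share 30% -> PUE 3.3
# IT share 40% -> PUE 2.5
```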
Unlike in ordinary office space, the air conditioning cranks year-round to overcome the 100-degree-plus air that the computers themselves throw off. That challenge has increased in recent years with the rise of compact “blade” servers that are crammed into server racks.
This is why big data centers can devour several megawatts of power, enough for a small city.
Neil Rasmussen, chief technical officer of data center equipment supplier American Power Conversion Corp., calculates that even a 1 megawatt data center will ring up $17 million in electric bills over its 10-year life span. Even so, few data centers have taken obvious steps to reduce that load.
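Rasmussen’s figure checks out under a couple of plausible assumptions — the overhead multiplier and electricity rate below are illustrative, not from the article:

```python
# Rough check of the $17 million, 10-year figure for a 1 MW data center.
# Assumed: cooling and other overhead roughly double the IT draw, and
# electricity costs about 10 cents per kilowatt-hour.
it_load_kw = 1_000            # 1 megawatt of computing equipment
total_kw = 2 * it_load_kw     # assumed overhead doubles the draw
hours = 24 * 365 * 10         # ten-year life span
rate = 0.10                   # assumed dollars per kWh

bill = total_kw * hours * rate
print(f"${bill:,.0f}")        # $17,520,000 -- in line with the $17M estimate
```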
“I don’t know too many people who have really tackled this and pushed things to the limit with what we already have,” said Mark Bramfitt, a program manager for Pacific Gas & Electric Co., a California utility that has offered cash incentives for data centers to reduce their energy bills — and found relatively few takers. “We’ve got a whole lot of room for improvement.”
For example, almost all the energy that goes into the air conditioning systems is used to run giant chillers that make the air pumped through the rooms’ raised floors a brisk 55 degrees or so, sometimes as low as the 40s. Such extremely cold air is blasted in to guarantee that no single server’s temperature gets much above the optimum level, which is around 70 degrees.
But the A/C doesn’t have to be so cold if the layout of server rooms is better designed to improve air flow, smoothing out all the various microclimates that can develop.
And in many places, the outside air is plenty cold enough much of the year, for free. Yet only recently have data centers adopted systems that can take filtered outside air for cooling the computer rooms.
To be fair, some data centers are buried too deep within buildings to gulp fresh air. But the main reason for the A/C over-reliance is that data centers were built for one thing — to maximize the performance of the Web sites, computer programs and networking equipment that they run. If the air conditioning is colder than necessary, so be it.
“There are probably two key metrics for the IT guy: no downtime — if the boss’ e-mail doesn’t work, he hears about it right away — and ‘no security breaches on my watch,’” says Eric Birch, executive vice president of Degree Controls Inc., which sells a system that increases electronics cooling efficiency. “They normally do not know, don’t care and aren’t measured by their electric bill.”
In fact, in many companies, any given department’s responsibility for the overall utility bill is determined by such factors as employee head count or square feet of office space. By that measure, the IT department comes out way ahead.
Steve Sams, a vice president for data-center services at IBM Corp., estimates this is still the state of affairs 70 to 80 percent of the time. The tech shops “aren’t actually paying their real energy bill,” Sams says. “What it shows me is how immature we are in this area as an industry.”
Not until recently have the industry’s concerns about the issue crystallized. Chip manufacturers such as Intel Corp. and Advanced Micro Devices Inc. have ratcheted up the electrical efficiency of their microprocessors — a metric that no one cared about until the past few years. IBM and Hewlett-Packard Co. have invested in better ways to manage cooling systems.
One data-center operator, Rackspace Inc., just announced a new facility in Slough, England, powered by renewable sources such as biomass. Some smaller providers have gone solar, including California-based Affordable Internet Services Online (AISO) Inc., which recently ran the Web infrastructure for the Live Earth concerts.
The “green” value in other steps is harder to discern.
One commonly talked-about effort involves virtualization, which lets one computer handle the functions of multiple machines at once. Rather than having dozens of servers operating at far less than their maximum level of utilization, data centers can use virtualization to consolidate those same machines’ functions on just a few computers.
The result can be striking — in its solar-powered center, AISO uses virtualization to mimic the functions of 120 servers on just four machines — and clearly it saves electricity.
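The savings follow from a quirk of server hardware: a mostly idle machine still draws a large share of its peak power. A sketch with AISO’s 120-to-4 ratio and assumed wattages (the watt figures are illustrative, not AISO’s):

```python
# Consolidation arithmetic: under-used servers waste power, so fewer,
# busier hosts come out far ahead. Wattages are illustrative assumptions.
idle_watts = 200        # draw of a lightly loaded server (assumed)
busy_watts = 350        # draw of a heavily utilized host (assumed)

before = 120 * idle_watts    # 120 under-used physical servers
after = 4 * busy_watts       # same work on 4 virtualized machines

print(before, after)                   # 24000 1400
print(f"{1 - after / before:.0%}")     # 94% less power
```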
IBM claims it will save $250 million — partly in reduced power costs — by taking 16,000 internal servers out of action and shifting their work onto 30 big mainframe computers. IBM also is using virtualization and better cooling technologies to expand the capacity of its largest data center, in Boulder, Colo., by nearly half, while leaving it with “the same energy footprint” as before, Sams says.
That’s a solid step, to be sure. But it’s fair to say the information-technology industry — which, after all, makes machines packed with toxic chemicals — will have to do much more to truly merit green credentials.
By consolidating servers and taking other steps to overhaul its internal computing setup, Sun Microsystems Inc. says it cut IT energy usage by a few percentage points in its last fiscal year — and expects to slice it by another 20 percent this year. Dave Douglas, Sun’s vice president for eco-responsibility, says the company is still learning how to make deeper improvements.
“We view green as a destination, in which case we focus on what we have left to do,” he says. “When we look at that, we say, geez, we’re not really green yet. We’re greener — but we’re not green.”