The heat is rising--and costs, too--as tightly packed servers consume gobs of electricity
Nine million servers hum in computer rooms across the United States, driving our information-obsessed, transaction-fueled economy every second of every day. It's an astonishing display of computer-processing power--and an insatiable electricity hog that's become a huge expense for many companies.
If racks and racks of Unix, Windows, and Linux servers deliver megaflops of computational speed, megawatts of power consumption are the price businesses pay. Data center electricity costs are soaring as companies deploy growing numbers of servers, consuming ever more power, and, in the process, throwing off heat that needs to be cooled using still more juice.
The problem could get worse before efforts to contain it catch up. Data center electricity costs are already in the range of $3.3 billion annually, and the number of servers in the United States will jump 50% over the next four years, IDC predicts. The data center utility bill exceeds the cost of acquiring new computers for some companies. And it can cost more to cool a data center than it does to lease the floor space to house it. Edward Koplin, a principal at engineering firm Jack Dale Associates, estimates the average annual utility cost for a 100,000-square-foot data center has reached $5.9 million.
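Koplin's $5.9 million figure can be sanity-checked with back-of-the-envelope arithmetic. The electricity rate and the assumption that the bill is purely energy are illustrative, not from the article:

```python
# Back-of-the-envelope check on the data center utility figure cited above.
# The electricity rate is an assumed value, not a number from the article.

FLOOR_AREA_SQFT = 100_000          # data center size from the article
ANNUAL_UTILITY_COST = 5_900_000    # Koplin's estimate, dollars per year
RATE_PER_KWH = 0.09                # assumed commercial electricity rate, $/kWh

# Annual energy implied by the bill, in kilowatt-hours
annual_kwh = ANNUAL_UTILITY_COST / RATE_PER_KWH

# Convert to a continuous draw (8,760 hours per year), then to watts
# per square foot of floor space.
avg_kw = annual_kwh / 8_760
watts_per_sqft = avg_kw * 1_000 / FLOOR_AREA_SQFT

print(f"Implied continuous load: {avg_kw:,.0f} kW")
print(f"Average power density:   {watts_per_sqft:.0f} W/sq ft")
```

At that assumed rate, the bill implies a continuous load of roughly 7,500 kW, or about 75 watts per square foot averaged across the whole floor.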
Ironically, state-of-the-art computers are part of the problem. Blades, the fastest-growing segment of the server market, can be packed into a smaller space than rack-mounted servers. As density increases, however, the amount of heat produced by blades and their processor cores rises, and you have computing's double whammy--pay once to power servers and a second time to cool them.
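The double whammy is easy to put in dollar terms. The IT load, cooling overhead ratio, and electricity rate below are all illustrative assumptions, not figures from the article:

```python
# Illustrating the "double whammy" described above: every watt delivered to
# a server requires additional watts to remove the heat it throws off.
# All three inputs are assumed values chosen for illustration.

IT_LOAD_KW = 500          # assumed server (IT) load for a mid-size room
COOLING_OVERHEAD = 0.7    # assumed: 0.7 W of cooling per 1 W of IT load
RATE_PER_KWH = 0.09       # assumed electricity rate, $/kWh

# Total draw once cooling is included, and the resulting annual bill
total_kw = IT_LOAD_KW * (1 + COOLING_OVERHEAD)
annual_cost = total_kw * 8_760 * RATE_PER_KWH

print(f"Total draw with cooling: {total_kw:,.0f} kW")
print(f"Annual electric bill:    ${annual_cost:,.0f}")
```

Under these assumptions, cooling turns a 500-kW server room into an 850-kW electrical load, and the cooling share alone adds hundreds of thousands of dollars a year.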
The more servers, the bigger the problem, and that's got Web powerhouses such as Google and Yahoo working furiously to find solutions. In a paper published three years ago, Google engineers foresaw the challenge, calculating that an 80-unit rack of midrange servers, each with two 1.4-GHz Pentium III processors, required about 400 watts of electricity per square foot. The dilemma: Most data centers couldn't handle more than 150 watts per square foot.
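The arithmetic behind the Google engineers' figure can be sketched as follows. The per-server draw and rack footprint are illustrative assumptions chosen to show how an 80-server rack reaches the cited density; the paper's exact inputs may differ:

```python
# Reconstructing the rack-density arithmetic behind the ~400 W/sq ft figure
# cited above. WATTS_PER_SERVER and RACK_FOOTPRINT_SQFT are assumptions for
# illustration, not values from the Google paper.

SERVERS_PER_RACK = 80        # from the article
WATTS_PER_SERVER = 120       # assumed draw for a dual-Pentium III 1U server
RACK_FOOTPRINT_SQFT = 24     # assumed footprint including service clearance

DATA_CENTER_LIMIT = 150      # W/sq ft most data centers could handle (article)

rack_watts = SERVERS_PER_RACK * WATTS_PER_SERVER    # total draw for the rack
density = rack_watts / RACK_FOOTPRINT_SQFT          # watts per square foot

print(f"Rack draw: {rack_watts:,} W -> {density:.0f} W/sq ft")
print(f"Demand vs. design limit: {density / DATA_CENTER_LIMIT:.1f}x")
```

Under these assumptions the rack draws 9,600 watts across 24 square feet, more than two and a half times what a 150-watt-per-square-foot floor was designed to power and cool.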
That was three years ago, yet Google principal engineer Luiz Andre Barroso, one of the authors of the report, is still worried. "Although we're generally unhappy with the power efficiency of today's systems, our concerns are more with trends than with today's snapshot," says Barroso via E-mail. Power efficiency is now a design goal for many tech vendors, and Google wants the industry to "execute aggressively," he says. New computer models, however, must be both energy- and cost-efficient, he warns. "A solution that has superior power efficiency but worse overall cost efficiency is unlikely to be competitive," Barroso says.
Meanwhile, technology vendors are reinventing themselves as air-cooling specialists to bring data centers-turned-saunas under control. Hewlett-Packard last month introduced its first environmental-control system, which uses water cooling to lower temperatures. "Data centers, compared to Moore's law, have been fairly slow-moving animals and haven't changed much in the last 20 years," says Paul Perez, VP of storage, networking, and infrastructure for industry standard services at HP. "Moore's law is running smack into the wall of physics."
Feel The Heat
At Pomona Valley Medical Center, the problem reached a melting point when the data center temperature spiked to 102 degrees, causing hard drives to go on the blink. The medical center had been centralizing servers scattered across the 426-bed facility, and as the heat rose in its data center, two 5-ton air conditioners couldn't keep up. "We had box fans hanging from the ceiling. It was the most ridiculous thing you've ever seen just to try to move air around that room," CIO Kent Hoyos says.