But by 2010, if current trends continue, the U.S. will still be a server power hog -- just not quite as big a pig, relatively speaking -- says new research released on Tuesday.
The Asia-Pacific region -- excluding Japan -- will increase its share of worldwide server-related power consumption to 16%, up from 10% in 2000 and 13% today, while the U.S. chunk of the power consumption pie will shrink to about 33%, says the report, entitled "Estimating Regional Power Consumption By Servers, A Technical Note."
That's mainly because the Asia-Pacific region -- which includes China, India and Indonesia but excludes Japan -- is developing quickly and increasing its hunger for electricity to power and cool computers, says the report's author, Jonathan Koomey, project scientist at Lawrence Berkeley National Laboratory and consulting professor at Stanford University.
The new study, commissioned by chip maker AMD, is follow-up research to another Koomey report released earlier this year, which estimated that worldwide energy use by servers and related gear doubled from 2000 to 2005 -- but excluded analysis of how regions of the world outside the U.S. consumed that power.
In total, electricity to power computer servers and related infrastructure worldwide reached 123 billion kWh in 2005, equivalent to the output of fourteen typical 1,000-megawatt nuclear or coal-burning power plants, says Koomey.
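The plant-equivalence figure checks out with simple arithmetic. The sketch below is my own back-of-the-envelope reconstruction, assuming a 1,000-megawatt plant running continuously all year; the report's actual methodology may differ.

```python
# Back-of-the-envelope check of the "fourteen 1,000-MW plants" claim
# (my arithmetic, assuming continuous year-round output).
plant_mw = 1_000
hours_per_year = 8_760                       # 24 hours * 365 days
kwh_per_plant = plant_mw * 1_000 * hours_per_year   # MW -> kW, then kWh/year

world_server_kwh = 123e9                     # 123 billion kWh in 2005, per the report

plants = world_server_kwh / kwh_per_plant
print(round(plants))                         # -> 14
```

Real plants run below 100% capacity, so this is the most conservative conversion; at a lower capacity factor the plant count would be higher.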
Estimated growth in worldwide server energy consumption by 2010 (excluding Japan) will require the equivalent of another ten 1,000-megawatt power plants, predicts Koomey.
Japan was analyzed separately from the rest of the Asia-Pacific area because Japan is already "developed," says Koomey. Japan consumed about 12% of server and related cooling electricity from 2000 to 2005. Add Japan with the U.S. and Europe, and those regions consumed about three-quarters of all server-related electricity worldwide from 2000 to 2005, says Koomey.
Europe consumed about 25% of the worldwide energy used to power servers from 2000 to 2005, and its power consumption is growing about 17% annually. The average worldwide growth rate (including the U.S.) is about 16%. In Asia-Pacific (excluding Japan), the growth rate in annual server electricity consumption is about 23%.
U.S. server-power consumption in 2005 was 1.2% of all electricity used in the country, or about the same amount of electricity used to power all U.S. residential TVs, he says. If you add in routers, switches, disk arrays and other assorted gear, then total data center power consumption in the U.S. was about 1.5% of electricity used here, he says.
If trends continue, from 2005 to 2010, energy consumption worldwide for servers, cooling equipment and related infrastructure gear will grow about 76%, says Koomey.
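To put the 76% figure in perspective, the sketch below works out what it implies for 2010 consumption and for the compound annual rate over the five years. This is my own arithmetic applied to the report's headline numbers, not a calculation from the report itself.

```python
# What 76% growth over 2005-2010 implies (my arithmetic, not the report's).
base_2005_kwh = 123e9                        # worldwide server-related use in 2005
growth = 0.76                                # projected growth, 2005-2010

projected_2010 = base_2005_kwh * (1 + growth)
annual_rate = (1 + growth) ** (1 / 5) - 1    # implied compound annual growth rate

print(f"{projected_2010 / 1e9:.0f} billion kWh")   # -> 216 billion kWh
print(f"{annual_rate:.1%} per year")               # -> 12.0% per year
```

Note the implied ~12% annual rate for 2005-2010 is lower than the ~16% worldwide rate cited for the earlier period, consistent with growth slowing somewhat.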
However, that growth could be reduced by about 20% if certain energy-efficiency processes and technologies are deployed, including virtualization software, better management of hot and cold zones in data centers, and changes in corporate policies such as budgeting, says Koomey.
"A lot of changes aren't technology, but are institutional and people changes," he says. That includes combining the budgets of IT and facilities expenses of data centers, so that IT leaders have more incentive to deploy energy-saving technologies and processes.
Currently, at most companies, the cost to power data centers isn't part of the IT budget. Rather than looking at data centers as a "cost per square foot," companies need to look at data centers as "cost per kilowatt," Koomey says.
"Floor area isn't as important as cost per kilowatt," he says.