Data Center Energy Consumption Has Doubled Since 2000
Worldwide, the electricity consumption for data center servers in 2005 was equivalent to the output of 14 1,000-megawatt power plants, according to a new study.
The energy consumed by data center servers and related infrastructure equipment in the United States and worldwide doubled between 2000 and 2005, according to a new study.
What's driving this consumption? Users' hunger for everything Web, from video on demand and music downloads to Internet telephony and more, says the study's author, Jonathan Koomey, a consulting professor at Stanford University and staff scientist at Lawrence Berkeley National Laboratory. The study was commissioned by microprocessor vendor Advanced Micro Devices.
The spike in power consumption was also driven by a number of other trends, especially the proliferation of low-end servers costing less than $25,000, in the United States and worldwide, says Koomey.
In fact, only 5% to 8% of the jump in overall power consumption was attributable to increased power use per server. Growth in the sheer volume of servers in data centers accounts for 90% of the growth in power consumption, Koomey says.
"Trends in software -- such as the move to Linux and distributed platforms and away from operating systems that charge per server"--have fueled the demand for greater numbers of low-end servers, he says.
For the study, Koomey estimated the power use of each type of server and multiplied it by the total installed base of that server class, using installed-base data provided by analyst firm IDC.
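To make that bottom-up method concrete, here is a minimal sketch of the calculation. The installed-base counts are the 2005 U.S. figures cited below; the per-server wattages and the infrastructure overhead multiplier are illustrative assumptions, not numbers from the study.

```python
# Sketch of a bottom-up estimate: installed base of each server class
# multiplied by an assumed average power draw per server in that class.
# Installed-base counts are the 2005 U.S. figures from the article;
# wattages and the overhead multiplier are assumptions for illustration.

INSTALLED_BASE_US_2005 = {      # servers
    "low_end": 9_900_000,
    "midrange": 387_000,
    "high_end": 22_200,
}

ASSUMED_WATTS_PER_SERVER = {    # assumed average draw per server, watts
    "low_end": 200,
    "midrange": 500,
    "high_end": 5_000,
}

ASSUMED_OVERHEAD = 2.0          # assumed multiplier for cooling and other infrastructure

def estimated_power_mw(installed_base, watts_per_server, overhead=1.0):
    """Return total estimated power draw in megawatts."""
    watts = sum(installed_base[c] * watts_per_server[c] for c in installed_base)
    return watts * overhead / 1e6

servers_only = estimated_power_mw(INSTALLED_BASE_US_2005, ASSUMED_WATTS_PER_SERVER)
with_infrastructure = estimated_power_mw(
    INSTALLED_BASE_US_2005, ASSUMED_WATTS_PER_SERVER, ASSUMED_OVERHEAD
)
print(f"Servers only:        ~{servers_only:,.0f} MW")
print(f"With infrastructure: ~{with_infrastructure:,.0f} MW")
```

Under these assumed wattages, the servers-plus-infrastructure total works out to roughly 4,600 MW, in the same ballpark as the five 1,000-megawatt power plants the study cites for the United States.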
While there's been a move toward lower-end servers, there's been a shift away from midrange servers, and high-end server volumes didn't change much from 2000 to 2005, Koomey says.
In U.S. data centers in 2000, there were approximately 5.6 million servers installed in total, including about 4.9 million low-end servers, 663,000 midrange servers, and 23,000 high-end servers, according to Koomey's study.
By 2005, U.S. data centers housed about 10.3 million installed servers, including 9.9 million low-end servers, 387,000 midrange servers, and 22,200 high-end servers.
Worldwide, data centers in 2000 held about 14.1 million installed servers, including 12.2 million low-end servers, 1.8 million midrange servers, and 66,000 high-end servers.
However, by 2005 that worldwide server total climbed to about 27.3 million, including nearly 26 million low-end servers, 1.2 million midrange servers, and 59,000 high-end servers.
That spike in data center servers contributed to a doubling of energy consumption rates in a mere five years, according to the study.
In the United States, the power consumed in 2005 by servers and related equipment in data centers was equivalent to about five 1,000-megawatt power plants, or about five typical nuclear or coal power plants, says Koomey. Worldwide, the electricity consumption for the servers was equivalent to 14 such power plants.
The total electric bill to operate those servers and related infrastructure equipment was $2.7 billion in the United States and $7.2 billion worldwide, says Koomey.
Total power consumed by data center servers in 2005 represented 0.6% of all electricity consumption in the United States that year. When you throw in the power consumed by data centers' auxiliary infrastructure equipment, including network and cooling gear, that figure jumps to 1.2% of all U.S. electricity consumption. That's roughly equivalent to the power consumed by all color televisions in the United States.
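As a rough back-of-the-envelope check of how those figures hang together, consider the sketch below. The five-plant capacity and the $2.7 billion bill come from the article; the electricity price and the U.S. total electricity consumption are assumed round numbers for illustration, not figures from the study.

```python
# Back-of-the-envelope check tying together the capacity, cost, and
# percentage figures reported for U.S. data centers in 2005.

capacity_mw = 5 * 1_000            # five 1,000-megawatt plants (from the article)
hours_per_year = 8_760
assumed_price_per_kwh = 0.06       # assumed average electricity rate, USD (assumption)
assumed_us_total_twh = 3_800       # assumed total U.S. consumption in 2005, TWh (assumption)

annual_twh = capacity_mw * hours_per_year / 1e6          # MWh -> TWh
annual_bill = annual_twh * 1e9 * assumed_price_per_kwh   # kWh times $/kWh
share_of_us = annual_twh / assumed_us_total_twh

print(f"Annual energy:             ~{annual_twh:.0f} TWh")
print(f"Annual bill:               ~${annual_bill / 1e9:.1f} billion")   # near the $2.7B cited
print(f"Share of U.S. electricity: ~{share_of_us:.1%}")                  # near the 1.2% cited
```

Running continuously, five 1,000-megawatt plants supply about 44 terawatt-hours a year, which at an assumed 6 cents per kilowatt-hour comes to roughly $2.6 billion and a bit over 1% of assumed total U.S. consumption, consistent with the study's figures.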
Koomey suggests a number of things companies can do to get a better handle on energy consumption in their data centers, including deploying virtualization software, improving cooling strategies, becoming more aware of the total cost of ownership of operating computer gear, and something as simple as changing power supplies.
"Power supply is a big thing, and that's just something you can drop in," he says.