5 Data Center Trends For 2013
By Charles Babcock
InformationWeek
1. Location Drives Energy Efficiency
There is one data center concern that overwhelms all others: the need for energy efficiency. At one time, energy costs were treated as a given, a minor line item next to hardware purchases and the labor of operations. But as hardware has become more efficient and automated procedures more prevalent, energy has steadily risen to about 25% of total operating costs, putting it near the top of the list.
In addition, a clash is building between environmentalists on one side and smartphone and tablet users and data center operators on the other. As the evidence for global warming mounts, the unbridled growth of computing in many forms is coming under attack as a wasteful contributor to it. Indeed, such an attack was the theme of a landmark New York Times story published Sept. 22, "The Cloud Factories: Power, Pollution and the Internet."
This clash will take place even though data center builders are showing a remarkable ability to reduce the amount of power consumed per unit of computing executed. The traditional enterprise data center uses just under twice as much electricity as it needs to do the actual computing. The extra amount goes to run cooling, lighting and systems that sustain the data center.
The measure of this ratio is PUE, or power usage effectiveness: total facility power divided by the power delivered to the computing equipment. An ideal PUE would be 1.0, meaning all the power brought to the data center is used for computing -- probably not an achievable goal. But instead of 2.0, Google showed it could build multiple data centers that operated with a PUE of 1.16 in 2010, reduced to 1.14 in 2011.
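As a rough illustration, the PUE arithmetic is simple division. This sketch is ours, not any vendor's tooling, and the function name and sample loads are hypothetical:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power.

    1.0 would mean every watt entering the building reaches the servers;
    a traditional enterprise data center runs near 2.0.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical loads: a facility drawing 10,000 kW that delivers 8,800 kW to IT gear
print(round(pue(10_000, 8_800), 2))  # 1.14 -- roughly Google's 2011 fleet-wide figure
```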
Each hundredth of a point cut out of the PUE represents a huge commitment of effort. As Jim Trout, CEO of Vantage Data Centers, a wholesale data center space builder in Santa Clara, Calif., explained, only difficult gains remain. "The low-hanging fruit has already been picked," he said in an interview.
Nevertheless, with its construction of a new data center in Prineville, Ore., Facebook illustrated that the right location can drive energy consumption lower. The second-biggest energy hog, just below the electricity used for computing, is the power consumed for cooling. Facebook built an energy-efficient data center east of the Cascades, close to cheap hydropower. By using a misting technique with ambient air, it can cool the facility without an air conditioning system.
That approach drove the PUE at Prineville down to 1.09, though Facebook mechanical engineer Daniel Lee conceded that few enterprise data centers can locate in the high, dry-air plains of eastern Oregon, where summer nights are cool and winters cold. "These are ideal conditions for using evaporative cooling and humidification systems, instead of the mechanical chillers used in more-conventional data center designs," Lee wrote in a Nov. 14 blog post.
Most enterprise data centers remain closer to expensive power and must operate year-round in less-than-ideal conditions. Facebook also built a new data center in Forest City, N.C. (60 miles west of Charlotte), where summers are warm and humid, attempting to use the same ambient-air technique. To Lee's surprise, during one of the three hottest summers on record, the misting method worked there as well, although at higher temperatures and humidity. Instead of needing 65-degree air, the facility can operate with air as warm as 85 degrees. And instead of a maximum 65% relative humidity, it can function at 90%. That most likely required increasing the flow of fan-driven air. Nevertheless, a conventional air-conditioning system, with its power-hungry condensers, would have driven the Forest City PUE far above the Prineville level.
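To make the widened operating envelope concrete, here is a toy sketch using the thresholds quoted above; the function and its logic are illustrative guesses, not Facebook's actual building-management controls:

```python
# Illustrative only: the 85-degree and 90% limits are the figures quoted above;
# the decision logic is a toy, not Facebook's building-management system.
def evaporative_cooling_ok(supply_air_f: float, relative_humidity_pct: float,
                           max_temp_f: float = 85.0, max_rh_pct: float = 90.0) -> bool:
    """True if ambient-air misting can cool the room without mechanical chillers."""
    return supply_air_f <= max_temp_f and relative_humidity_pct <= max_rh_pct

print(evaporative_cooling_ok(65, 65))  # True under the original conservative limits
print(evaporative_cooling_ok(85, 90))  # True -- the wider envelope proven at Forest City
print(evaporative_cooling_ok(95, 95))  # False -- fall back to mechanical cooling
```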
The equipment and design used to achieve that PUE are available for all to see. In 2011, Facebook initiated the Open Compute Project, with the designs and equipment specifications of its data centers made public. Both Prineville and Forest City follow the OCP specs.
Thus, Facebook has set a standard likely to be emulated by more and more data center builders. In short, 2013 will be the year when the Open Compute Project's original goal is likely to be put into practice: "What if we could mobilize a community of passionate people dedicated to making data centers and hardware more efficient, shrinking their environmental footprint?" wrote Frank Frankovsky, Facebook's director of hardware design and supply chain, in an April 9 blog post.
Google is another practitioner of efficient data center operation, using its own designs. Across a broad mix of older and newer data centers, it achieved an overall PUE of 1.14 in 2011, with a typical modern facility coming in at 1.12, according to Joe Kava, VP of Google data centers, in a March 26 blog post; in 2010, the overall figure was 1.16.

Charles Babcock is an editor-at-large for InformationWeek.