New technology, security, and reliability requirements are changing the data-center infrastructure.
Recent legislation such as the Sarbanes-Oxley Act makes it imperative that companies avoid data loss, or even downtime that could jeopardize timely access to information, says Michael Fluegeman, VP at Syska Hennessy Group, a data-center consulting company. "You can't operate without a rock-solid infrastructure," he says. So companies "are digging deeper and deeper and spending more and more money to make these facilities more robust and safe from every possible threat of downtime or loss of critical information."
Major upgrades are in order for many data centers in the coming years to accommodate new technologies and improve operating efficiencies.
The data center is hot, hot, hot. Many data-center professionals worry that new equipment is being bought without adequate concern for power or cooling requirements.
Despite challenges, companies haven't moved en masse to utility-computing models.
To help with reliability, Randy MacCleary, a VP and general manager of Liebert's uninterruptible-power-supply business, expects to see as many as three-quarters of data centers introducing dual-bus UPS systems within the next five years. Only about a third of data centers now include these systems, which permit one UPS bus and its associated distribution system to be shut down for maintenance while the load continues to be supplied by the second UPS bus. Another step to improve reliability is to create partitioned data-center environments, he says. For example, in a 10,000-square-foot data center, a company could create four 2,500-square-foot sections, with each section addressed by a separate dual-bus UPS system. If for some reason one section were to fail, the three remaining sections would be able to carry the data center forward with minimal disruption.
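The partitioned design MacCleary describes can be sketched in a few lines of code. This is a hypothetical illustration, not anything Liebert ships: the section and bus names are invented, and the model simply shows why a section survives the loss of one UPS bus, and why losing a whole section still leaves three-quarters of the floor in service.

```python
# Hypothetical model of the example above: a 10,000-sq-ft data center split
# into four 2,500-sq-ft sections, each served by its own dual-bus UPS.
# All names (section-1, ups-1-A, etc.) are illustrative assumptions.
SECTIONS = {
    f"section-{i}": {"sq_ft": 2500, "buses": [f"ups-{i}-A", f"ups-{i}-B"]}
    for i in range(1, 5)
}

def available_capacity(failed_sections):
    """Square footage still served when whole sections fail outright."""
    return sum(s["sq_ft"] for name, s in SECTIONS.items()
               if name not in failed_sections)

def section_online(name, failed_buses):
    """A section stays up as long as at least one of its two UPS buses is
    live -- which is what lets one bus be shut down for maintenance."""
    return any(bus not in failed_buses for bus in SECTIONS[name]["buses"])

# Losing one section entirely leaves 7,500 of 10,000 sq ft in service.
print(available_capacity({"section-1"}))  # 7500

# Taking every A bus offline for maintenance keeps all sections online,
# since each section's B bus continues to carry the load.
a_buses = {s["buses"][0] for s in SECTIONS.values()}
print(all(section_online(name, a_buses) for name in SECTIONS))  # True
```

The point of the sketch is the independence of failure domains: a bus outage is absorbed within a section, and a section outage is absorbed within the floor.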
Despite some challenges, nearly half the companies in the survey say they're not looking to utility computing to offload those problems. Vanguard, which recently surpassed $800 billion in assets, says utility computing is a possibility in the future. For now, it's doing just fine running two data centers, with more than 100,000 square feet of raised floor and more than 1,000 Unix and Windows systems and mainframes, for what it cost to run one data center in 2001. Vanguard doubled its capacity without raising costs chiefly through automation and server consolidation. "We've been able to grow our multifunction business without increasing costs for our data center," Samanns says. "We have redundancy of the data center at no cost to shareholders."
Not every company runs its data center as efficiently as the servers in those centers run their most important applications. But now's the time for that to change so these data centers are ready for the future, whatever it may bring.