Recent legislation such as the Sarbanes-Oxley Act makes it imperative that companies not risk losing data, or even risk downtime that could jeopardize timely access to information, says Michael Fluegeman, VP at Syska Hennessy Group, a data-center consulting company. "You can't operate without a rock-solid infrastructure," he says. So companies "are digging deeper and deeper and spending more and more money to make these facilities more robust and safe from every possible threat of downtime or loss of critical information."
Despite some challenges, nearly half the companies in the survey say they're not looking to utility computing to offload those problems. Vanguard, which recently surpassed $800 billion in assets, says utility computing is a possibility in the future. For now, it's doing just fine running two data centers, with more than 100,000 square feet of raised-floor space and more than 1,000 Unix and Windows systems and mainframes, for what it cost to run one data center in 2001. Vanguard doubled its capacity without raising costs, chiefly through automation and server consolidation. "We've been able to grow our multifunction business without increasing costs for our data center," Samanns says. "We have redundancy of the data center at no cost to shareholders."
Not every company runs its data center as efficiently as the servers in those centers run their most important applications. But now's the time for that to change so these data centers are ready for the future, whatever it may bring.
-- with Martin J. Garvey
Sidebars: "CPU Cool: Getting Faster But Not More Power-Hungry" and "Utility Interest: The Model's Catching On"