Public cloud infrastructure, the pay-as-you-use server and storage services sold by the likes of Amazon, Microsoft, and Rackspace, is getting some serious test-drives this year. About one-fourth of IT execs in our Global CIO Survey say they plan a major project this year to bring the public cloud into their IT infrastructure, compared with just 11% who have such services in place now.
But there's a cloud of doubt hanging over cloud infrastructure. IT execs, especially at big, established companies, are asking themselves: Are these services really less expensive than the capacity provisioned from my own data center?
Art Wittmann, InformationWeek's resident cloud curmudgeon, laid out a simple thesis in a column earlier this year: CPU performance and drive storage capacity keep climbing at exponential rates, but cloud infrastructure vendors aren't passing all of those implied cost savings on to their customers. "The Moore's Law advantage is immense and isn't something you should give up lightly, but some cloud providers are asking you to do exactly that," Wittmann wrote (see "Does The Cloud Keep Pace With Moore's Law?").
A big part of the problem is that the pricing structures are difficult to compare. Enterprise data centers have an annual budget that accounts for a mix of operational and capital expenses. Cloud services are usually based on a per-hour or other per-use charge. Even if IT organizations can approximate the hardware and software costs to run a given computing task on premises, they still need to figure in the job's share of the data center space and electricity bill. What share of network management, storage management, and system administration time should be allocated to one on-premises IT service? Usually, IT has only limited visibility into those factors on a service-by-service basis, so it becomes a guesstimate.
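The allocation guesstimate described above can be sketched as back-of-the-envelope arithmetic. In the sketch below, every dollar figure, amortization period, and utilization rate is a hypothetical placeholder, not a number from any company in this story; the point is only the shape of the calculation, converting a mix of capital and operational expenses into a cost per useful server-hour that can sit next to a cloud provider's hourly rate:

```python
# Back-of-the-envelope comparison: on-premises cost per useful
# server-hour vs. a cloud provider's hourly rate.
# All input figures are hypothetical placeholders.

HOURS_PER_YEAR = 8760

def on_prem_hourly_cost(
    hardware_capex=6000.0,     # server purchase price, amortized below
    amortization_years=3,
    software_annual=1200.0,    # OS and virtualization licenses
    power_space_annual=900.0,  # allocated electricity and floor space
    admin_annual=1500.0,       # allocated sysadmin/network/storage time
    utilization=0.40,          # fraction of capacity actually used
):
    """Approximate cost per *useful* server-hour on premises."""
    annual = (hardware_capex / amortization_years
              + software_annual + power_space_annual + admin_annual)
    return annual / (HOURS_PER_YEAR * utilization)

cloud_hourly_rate = 0.50  # hypothetical provider list price

print(f"on-prem: ${on_prem_hourly_cost():.2f} per useful hour")
print(f"cloud:   ${cloud_hourly_rate:.2f} per hour")
```

Note how sensitive the answer is to the utilization assumption: doubling utilization halves the on-premises cost per useful hour, which is exactly the kind of hidden variable that makes these comparisons a guesstimate.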
"It's a problem that has always existed," says Larry Godec, CIO at First American Financial, which had an acute case of hard-to-decipher costs, thanks to heterogeneous systems and duplicate applications that came with several acquisitions of title insurance and mortgage settlement companies. Shortly after becoming CIO in 2010, Godec implemented a cloud-based IT accounting system, Apptio, to get a better view into the cost of his systems. It required lots of data on existing applications, the hours of programmer time that went into building them, and the resources on which they run. A service modeling component shows all of the costs for internally provisioned services, and Godec thinks that feature will help his team compare those costs with public cloud services, a comparison it has just started. But even a year into production, his staff continues adding information to the modeling system to get the assessment right. "It's an ongoing process," says Godec, who has since taken a seat on the Apptio board. "I wouldn't call it immature, but we still have a lot of work to do."
Even if IT organizations can get a handle on their in-house costs, comparing the pricing from the various infrastructure vendors can be confusing.
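One source of that confusion is that vendors quote in different units, some per hour, some as a flat monthly rate, so quotes have to be normalized before they can be ranked. A minimal sketch, with made-up vendor names and prices rather than any real provider's list prices:

```python
# Normalizing differently structured infrastructure price quotes to a
# common monthly cost. Vendors and prices here are hypothetical.

HOURS_PER_MONTH = 730  # average hours in a calendar month

quotes = [
    {"vendor": "A", "unit": "per_hour",  "price": 0.12},
    {"vendor": "B", "unit": "per_month", "price": 95.00},
    {"vendor": "C", "unit": "per_hour",  "price": 0.10},
]

def monthly_cost(quote, hours_used=HOURS_PER_MONTH):
    """Convert a quote to a monthly cost at a given usage level."""
    if quote["unit"] == "per_hour":
        return quote["price"] * hours_used
    return quote["price"]  # flat monthly rate, usage-independent

for q in sorted(quotes, key=monthly_cost):
    print(q["vendor"], f"${monthly_cost(q):.2f}/month")
```

Even this toy version shows why the ranking isn't stable: at full-time usage the flat rate may lose, but rerun it with `hours_used=200` and the per-hour vendors pull far ahead, so the "cheapest" provider depends on a utilization estimate the buyer may not have.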
Despite this uncertainty, a good number of IT pros still seem confident in the cloud's cost-saving potential over the long term. In the InformationWeek IT Spending Priorities Survey, we asked 58 IT pros who are using or adopting public cloud services about the economics. While 59% think cloud computing will raise costs short term, 74% say it'll lower costs long term. Fifty-nine percent think it'll lower operating costs long term, and half think it'll lower capital costs.
Those lower operating costs are what Jim Comfort, IBM's VP of cloud management, hangs his hat on. The real payoff is from a more efficient operating environment, from easier management and economies of scale in a standardized, x86-server cloud data center, he says. But since most companies can't say with much clarity what it costs them to deliver a particular IT service, measuring the payoff from moving to the cloud becomes a murky exercise.