Practical Analysis: What Cloud Computing Really Means - InformationWeek
Art Wittmann


If cloud adherents are using the technology as a means to drive the hard work of process reengineering, then bravo. But I don't give them so much credit.

From the InformationWeek Green digital issue, Apr. 19, 2010.

I can't tell whether all the talk about cloud computing, and the benefits attached to it, comes from talkers who are fox-like clever or just woefully misguided. Hype is hype, and there will always be those who make excessive claims about whatever the new hot thing is. Cloud computing won't cure the common cold, even if someone somewhere claims that it will.

The first excessive claim was brought to my attention by colleague John Foley, who puzzled over an analyst note pointing out that federal data center storage utilization rates are a pathetically low 12% and, you guessed it, claiming that cloud computing is the fix.

At the same time, we just got the results of our survey of federal government workers and contractors on their plans for cloud computing. There's a lot of interest, planning, and, for lack of a better word, hope for cloud computing. The feds do like to plan--but not every plan becomes reality and not every envisioned benefit pays off. Here, the comments from our survey respondents are instructive. One contractor bemoaned the efforts of federal overseers to quash any effort he made to improve efficiency or save money. He wanted to spin up a new service for FEMA in Amazon's Elastic Compute Cloud, but was told he needed to do it in a Department of Homeland Security data center, where, he noted, the lead time to allocate a new server was one year.

While it isn't hard to imagine why EC2 might not be an appropriate place for a FEMA application, that 12-month provisioning time is enough to stop you in your tracks. I can imagine a backlog of requests, and I can imagine some lag time introduced by the need to coordinate server, storage, and networking considerations, but I can't imagine those things adding up to a year's delay. Knowing that it's DHS, I can imagine that applications have to be profiled and assessed for security and privacy before any resources are allocated. My point is that the actual provisioning might take days or weeks, but there's another 40 to 45 weeks of stuff going on here that probably has nothing to do with provisioning. So if the feds were 100% cloud tomorrow, how much would this provisioning problem actually change? Saving 10 to 12 weeks is nothing to sneeze at, but there's other systemic bureaucracy that's a far bigger deal.

The storage issue has the same root problem. It's not technology that causes a 12% utilization rate; it's organizational policies and procedures. Sure, technologies like storage virtualization, thin provisioning, and data deduplication could help here (note that cloud computing isn't on my list), but the real issue is that storage needs to be managed differently.

While the federal government can be a poster child for waste and inefficiency, it simply exhibits a more extreme form of the maladies common to non-governmental organizations as well. For server provisioning, the majority of the delay from request to delivery won't be fixed by cloud computing per se; it'll be fixed by changing the way IT looks at the job of provisioning services. The problem is that most organizations have no stomach for reengineering their procedures to wring out the waste and delay, but they can get excited about a new technology like cloud computing.

So if the fox-like clever adherents of cloud computing are using the new technology as a means to drive the really hard work of process reengineering, then bravo. The thing to remember, especially for large organizations, is that the process mapping and engineering need to come first. If you implement technology without knowing your process, you can bet you'll be reworking that technology almost immediately.

Art Wittmann is director of InformationWeek Analytics, a portfolio of decision-support tools and analyst reports. Write to him at


