Cloud // Infrastructure as a Service
Commentary | 1/3/2012 01:01 PM

Top 12 Cloud Trends Of 2012

Five years into the cloud computing phenomenon, we're much more aware of the limitations and consequences. Here are 12 trends to watch in the coming year, starting with numbers 7 to 12.

Only a few years ago, cloud computing didn't exist. Or rather, it went by a dozen other names--such as virtualization, managed hosting, or simply The Internet. Today, it's the must-have feature of every product or service, from mobile phones to cameras to TVs.

Nobody knows this better than enterprise IT professionals, who have to deal with a rising tide of hyperbole and insatiable consumer expectations even as their budgets shrink and the role of technology in business grows. What nobody disputes, however, is that on-demand IT is here to stay.

While companies have been relying on software as a service and third-party tools for decades, it has been roughly five years since clouds entered the enterprise IT psyche, introduced by public providers such as Amazon, Google, and Salesforce.com and via private stacks from VMware, Microsoft, and Citrix. Five years is plenty of time to mature. We're much more aware of the limitations and consequences of utility computing. Here, then, are a dozen insights into what the next year will bring, nearly half a decade into the cloud era. In Part I of this two-part series, I'll cover trends seven to 12, in reverse order. In Part II, I'll cover cloud trends one to six.

Cloud Trend No. 12: Infrastructure, Code, And Data Are Intertwined.

We talk about writing code, storing data, and managing infrastructure, but these three things will soon be one and the same.

While much of the emphasis around cloud computing has been on virtual machines, it's really about data.

-- Compared to the cost of moving bytes around, nearly every other part of computing is free, according to research done by Microsoft nearly a decade ago.

-- Data is what we're worried will leak out. The reason the analogy between clouds and the electrical grid falls apart is that when someone steals your electrons, they don't have your corporate secrets.

-- Availability is a data problem. I can have 50 instances of an application running around the world. That's easy. But getting them to cooperate on sharing and updating a single user record is the hard part. The more copies we make, the more the data can be corrupted, get out of sync, and so on.

-- With the ability to scale out horizontally, we can make applications fast for millions of users. Scaling the data, however, is an entirely different matter. Ask any architect where the bottleneck is, and more often than not they'll point to data and I/O.
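To make the availability point concrete, here is a minimal, purely illustrative sketch (no real system or data, just Python standing in for the idea): each copy of a record may apply the same pair of concurrent writes in a different order, and with nothing reconciling them, the copies quietly diverge.

```python
# Illustrative only: scaling out application instances is the easy half;
# keeping their shared data consistent is the hard half.
import itertools

# Two writes to the same field arrive at roughly the same time
# from two different application instances.
updates = [
    {"field": "email", "value": "pat@example.com"},
    {"field": "email", "value": "pat@corp.example"},
]

def apply_in_order(order):
    """Naively apply updates in arrival order, with no conflict resolution."""
    record = {"name": "Pat", "email": "old@example.com"}
    for u in order:
        record[u["field"]] = u["value"]
    return record

# Each replica may observe the concurrent writes in a different order...
replicas = [apply_in_order(order) for order in itertools.permutations(updates)]
for i, r in enumerate(replicas, 1):
    print(f"replica {i}: {r}")

# ...so the copies disagree until versioning, a single writer, or some
# other reconciliation scheme is layered on top.
print("replicas agree:", all(r == replicas[0] for r in replicas))
```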

Tomorrow's applications will include three kinds of code: instructions for the business process itself; instructions for how to handle the data needed; and instructions for how to manage growth, shrinkage, and failure.

Consider, for example, a customer service application that can run on both public and private clouds. This application accesses a database that contains both innocuous information (the name of a customer, or his purchase history) as well as heavily regulated information (his social security number).

When the application is running in a trusted on-premises environment, the call center operator has access to all of the data. The operator can, after properly verifying a caller's identity, make changes to it. But when the application is running in a different environment--such as a public cloud used as part of a disaster recovery plan--the application can't access the social security data.

To accomplish this task, we need to encrypt the information not at the device or file level, but at the table or field level. The application needs to run with different permissions depending on its circumstances. It also needs to be smart enough to tell the operator what's happening, so that the operator can explain the situation to a caller.
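As a rough sketch of what that could look like in code (my illustration, not the article's design: the DEPLOY_ENV flag, the key path, and the Fernet cipher from the third-party cryptography package are all assumptions), the Social Security number is stored only as ciphertext, and the decryption key is provisioned only in the trusted on-premises environment:

```python
# Sketch: field-level encryption with environment-dependent access.
# Assumes the `cryptography` package (pip install cryptography) plus a
# hypothetical deployment flag and key file; the names are illustrative.
import os
from cryptography.fernet import Fernet, InvalidToken

KEY_PATH = "/etc/app/ssn-field.key"          # provisioned on-premises only

def load_field_cipher():
    """Return a cipher for the SSN field, or None when the key is absent."""
    if os.environ.get("DEPLOY_ENV") == "on-prem" and os.path.exists(KEY_PATH):
        with open(KEY_PATH, "rb") as f:
            return Fernet(f.read())
    return None                               # public cloud / DR site: no key

def show_customer(record, cipher):
    """Render a customer record for the call-center operator."""
    view = {"name": record["name"], "history": record["history"]}
    if cipher is None:
        # Tell the operator *why* the field is unavailable, as the article
        # suggests, rather than failing silently.
        view["ssn"] = "[unavailable: running outside the trusted environment]"
    else:
        try:
            view["ssn"] = cipher.decrypt(record["ssn_ciphertext"]).decode()
        except InvalidToken:
            view["ssn"] = "[unavailable: key mismatch]"
    return view
```

The useful property is that the access decision travels with the field itself, so the same application behaves correctly wherever it lands.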

Similarly, if a data center has a problem, the application can re-launch in another data center. But as machines and programs come online, they need to adapt to the new environment: different unique names, addresses, latencies, and so on. We do this through DevOps practices and platform orchestration systems like Chef, Puppet, and Pallet.
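Much of that adaptation boils down to re-rendering configuration from whatever the new environment reports. A bare-bones sketch of the idea in Python (standing in for what a Chef or Puppet run would template; the variable names and defaults are invented):

```python
# Sketch: rebuild instance configuration at launch time, the kind of work
# Chef/Puppet/Pallet automate. Environment-variable names are made up.
import os
import socket

def discover_environment():
    """Gather the facts that change when a workload lands in a new data center."""
    peers = os.environ.get("PEER_LIST", "")
    return {
        "hostname": socket.gethostname(),
        "region":   os.environ.get("REGION", "dc-primary"),
        "db_host":  os.environ.get("DB_HOST", "db.internal"),
        "peers":    peers.split(",") if peers else [],
    }

def render_config(facts, template="db={db_host} region={region} host={hostname}"):
    """Turn discovered facts into the application's runtime configuration."""
    return template.format(**facts)

if __name__ == "__main__":
    facts = discover_environment()
    print(render_config(facts))
    # A real orchestration run would also register with peers, adjust
    # timeouts for the new latencies, and restart dependent services.
```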

When a machine moves to a new location, it needs to take with it the data required to run. The more data it takes along, the sooner it can run at full speed. But the more it takes, the longer the move will take--and the more it will cost. As a result, there's a tradeoff to be made when moving a workload: just enough metadata and application logic to function, but not so much that things slow down.
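Back-of-the-envelope, the tradeoff looks something like the sketch below; the bandwidth, egress price, and dataset sizes are placeholders, not figures from the article.

```python
# Sketch: how much data should a relocating workload carry?
# All numbers below are illustrative placeholders.
DATASETS_GB = {"app code + metadata": 2, "hot customer records": 40, "full history": 900}
BANDWIDTH_GBPS = 1.0          # assumed link between data centers
COST_PER_GB = 0.09            # assumed egress price, USD

def relocation_cost(gigabytes):
    seconds = gigabytes * 8 / BANDWIDTH_GBPS   # GB -> gigabits / (Gbit/s)
    return seconds, gigabytes * COST_PER_GB

cumulative = 0
for name, size in DATASETS_GB.items():
    cumulative += size
    secs, dollars = relocation_cost(cumulative)
    print(f"take '{name}' and everything above it: "
          f"{cumulative:>4} GB, ~{secs/60:.1f} min, ~${dollars:.2f}")

# The sweet spot is the smallest bundle that lets the workload function;
# everything else can follow lazily or stay behind a remote call.
```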

There are nascent standards that let programmers declare how data should be handled, so that workloads can move around the world efficiently and adapt to changing circumstances.
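None of those standards has won out, so the following is only a generic illustration of the idea: declare, next to the schema, where each field may live, and strip whatever the destination may not hold before the workload moves (the policy vocabulary and field names are invented).

```python
# Sketch of a declarative data-handling policy: the schema says where each
# field may live, and a mover drops whatever the destination may not hold.
FIELD_POLICY = {
    "name":             {"sensitivity": "low",  "allowed": {"on-prem", "public-cloud"}},
    "purchase_history": {"sensitivity": "low",  "allowed": {"on-prem", "public-cloud"}},
    "ssn":              {"sensitivity": "high", "allowed": {"on-prem"}},
}

def project_for(destination, record):
    """Return only the fields the destination environment is allowed to hold."""
    return {k: v for k, v in record.items()
            if destination in FIELD_POLICY.get(k, {}).get("allowed", set())}

customer = {"name": "Pat", "purchase_history": ["order-1"], "ssn": "000-00-0000"}
print(project_for("public-cloud", customer))   # ssn is dropped before the move
print(project_for("on-prem", customer))        # full record stays available
```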

Cloud Trend No. 11: What Can't I Put In The Cloud?

Several weeks ago, I was at a doctor's office with 10 physicians and two assistants. One wall of the waiting room was lined with manila file folders, each emblazoned with colored stickers and numbers.

I spent a half hour waiting to see the doctor, and in that time, I saw at least three data errors. In one case, a doctor picked up the wrong folder, opened it, and then realized her mistake. In another, an assistant dropped a folder, spilling patient records across the floor. And in a third, an assistant couldn't find a patient's record because it had been misfiled.

The benefits of electronic health records are huge. In addition to overcoming these kinds of errors, health practitioners can work together on a patient, transferring information from specialist to specialist. And researchers can mine the information to understand the efficacy of a cure or the spread of a disease.


Today, we're concerned about putting data in the cloud. For large organizations, that might be a real concern, but for small organizations like doctors' offices, police precincts, and schools--all of which deal with regulated data--leaving information out of the cloud could be a huge mistake.

We criticize the cloud, but we don't compare apples to apples. We don't really understand the costs of paper medical records, evidence stored on analog tape, or student information saved in a single spreadsheet. In 2012, we'll start to do a real comparison of on- and off-cloud solutions, and realize that, for many businesses, the real question is what can't be better done in a cloud.

Cloud Trend No. 10: Inception, The Brain In The Vat, And Hardware.

An ever-increasing percentage of our enterprise applications run in virtual environments. We no longer use virtualization solely for increased utilization--that is, putting several virtual machines on one physical one in order to make the best use of its processing capacity. We also do it for operational efficiency, because it's easier to work with virtual bits than physical atoms.

Between the virtual machine and the bare metal on which it runs is a hypervisor, a piece of code whose core function is to trick the operating system into thinking it's running on bare metal. In some cases, companies add another layer beneath the hypervisor, to further streamline operations.
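One way to see that layer of trickery from inside a guest, as a hedged, Linux-only sketch: modern CPUs expose a hypervisor bit that the kernel reports in /proc/cpuinfo (a hypervisor can choose to hide it, so absence proves nothing).

```python
# Linux-only sketch: ask the kernel whether a hypervisor sits underneath.
# The CPUID "hypervisor" bit shows up in the flags line of /proc/cpuinfo;
# note that a hypervisor may deliberately hide this bit.
def running_under_hypervisor():
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags") and " hypervisor" in line:
                    return True
    except OSError:
        return None  # not Linux, or /proc is unavailable
    return False

if __name__ == "__main__":
    verdict = running_under_hypervisor()
    print({True: "virtualized", False: "looks like bare metal", None: "unknown"}[verdict])
```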

Comments
Becky@FracRack | 2/8/2012 9:50:33 PM
re: Top 12 Cloud Trends Of 2012
This is a great article for Cloud trends. I particularly like #7 which talks about the large use of Cloud for DR. I also found another relevant article on DR in the Cloud here http://www.fracrack.com/blog/2...
LordWabbit | 2/4/2012 12:09:18 AM
re: Top 12 Cloud Trends Of 2012
Cloud = mainframe. We are going backwards with new buzzwords. All these years coding for distributed servers and we are going back to consolidated operating systems with multiple cores. Mainframes - yay - I used to work on them. At least with mainframes you had the option to keep your data in house behind your own protection. With the cloud anyone can take a crack at it.

Today companies like Verisign are being hacked daily, and they are supposed to be trust authorities. How safe do you think your data will be on a mainframe (sorry, cloud) that is accessible by millions of people? When Anonymous gets hold of an administrator password, it will not matter how compartmentalized your data is. At least when you are behind hardline firewalls you are not subject to random hacks ("I pick you, Pikachu"). The hackers have to be in the same geographical location, have access to your hard lines, and have the technical ability to both hack your comms and hack your encryption (a rare combination).

For any company where every transaction means money (banks, stock exchanges, clearing houses, etc.) doing business in the cloud is clear insanity. For music stores and book sales, go for it. It saves a lot on infrastructure costs. Shared mainframe time. Also, if a music store goes down it means 10,000 people out of a job, not an entire bank and all of its investors out of their homes.

And don't trot out the adage that the data is encrypted; everyone knows that the data cannot be attacked directly. They attack the people with the passwords to the data. Key loggers, etc. It takes one little mistake in the wee hours of the morning when you are half awake to accidentally load a key logger. One person in your trust chain gets compromised and you may as well not have encryption. Why? Because encryption keys should be cycled. But they aren't. They are normally hard coded, because they are a pain to change. All it takes is one annoyed ex-employee and you may as well be sending clear text. DES has been compromised. MD5 lookup tables are prevalent; it may not be your password, but it ends up appearing the same to a computer due to failures in the MD5 hash algorithm. So the attacker does not need your password, he needs the hash of your password and your encrypted data - and with your encrypted data he can decrypt the rest of your data. Maintaining decent, solid, rapidly changing security is beyond our current programming models. The hackers are ahead of us on this one, and until we catch up there will be a lot more hacking to come.

Back on topic - I realize I deviated, but I will leave it because it seriously applies to the whole 'cloud' / 'mainframe' concept. If you are a little startup, by all means go cloud. If you deal in billions, I would stay away and hire a good CIO.
ahendryx213 | 1/12/2012 3:04:03 PM
re: Top 12 Cloud Trends Of 2012
Good post especially point 8 on the SLA detente.

With nearly all of the Cloud Providers that I've worked with as a consultant, it's amazing how their SLAs are only centred around uptime and availability. This is quite ironic: you wouldn't buy a PC, for example, based solely on the fact that it will turn on and stay turned on when its performance is atrocious!

It is here where I think the Cloud will certainly need to mature as more and more critical applications are considered for it. SLAs need to be refined around performance metrics as opposed to just uptime and availability.

With the company that I work for namely Virtual Instruments we can uniquely measure the infrastructure performance of critical applications deployed in the cloud by looking across the SAN fabric. What we've found to be incredibly successful is enabling our Cloud provider clients to in fact mature their SLAs based on performance metrics such as response times.

We've also helped these Cloud Providers to help establish SLAs for their end users who didn't necessarily have any in place for their low tier apps. This has been a clear differentiator for them in gaining new customers and convincing them to deploy more key applications into the Cloud.

From what I'm seeing 2012 will certainly be the year where Performance will take more precedence in the drive towards the Cloud and that means a better grasp of SLA distinction and definition.

Archie Hendryx
lzelkha570 | 1/4/2012 8:18:47 PM
re: Top 12 Cloud Trends Of 2012
One of the issues with cloud computing addressed in the article is scale -- and sharding of data is increasingly used as a way to deal with this problem. We've got a general blog post on what database sharding is and how it can be implemented. (http://www.scalebase.com/datab...