The private cloud based on a heavily virtualized environment in the enterprise data center is going to have a limited shelf life, according to a new study.
The share of IT managers planning that approach was down to 28% of respondents in a recent survey, and it dropped to 16% when they were asked about their future plans.
Use of public cloud computing, on the other hand, continues to mushroom. Two years ago, 20% of a similar set of respondents expected to get half or more of their IT services from the cloud. Today that number has nearly doubled to 39%.
At the other end of that question, the share expecting to get 10% or less of their IT services from the cloud has dropped from 32% to 11%. The anticipated era of IT services delivered from the public cloud is registering on the radar of practical, caution-minded IT managers and system administrators.
The report, 2017 State of Cloud, was generated by InformationWeek and Interop ITX, organizer of the independent Interop show each year in Las Vegas. Interop ITX and InformationWeek are both owned by UBM Americas. The report was written by Joe Emison, CIO of Xceligent, a supplier of U.S. real estate information, who reviewed the invitation-only responses from 307 IT managers and technology professionals at a variety of companies. Those responses were collected in December. Forty percent of responses were from companies with more than 1,000 employees. Half of the respondents had a title of CIO, VP of IT, IT manager or IT director.
One of the surprising findings of the survey was that 85% of respondents reported using more than one cloud supplier, and Emison concluded that number is likely to increase. At the same time, a majority of respondents said they hadn't adopted software that orchestrates workloads or manages their cloud operations, indicating future cloud deployments will impose increasing complexity and amount to a new management problem.
Another surprise was that among this sample group, use of both the Google Cloud Platform and Amazon Web Services was going up, while use of Microsoft Azure was going down. AWS use increased among the respondents from 39% to 52%. Google Cloud increased from 23% to 38%. But Microsoft Azure declined 10 percentage points, from 48% to 38%. Since respondents were using more than one cloud, the 2015 vs. 2016 figures across providers don't add up to 100%.
Emison was wary of the prospects for hybrid cloud operations, where virtual machines running in the enterprise data center transfer smoothly into the public cloud, when needed. His skepticism was evident even though 2016 was the year that AWS announced it would run a VMware-like environment on its EC2 cloud so that enterprises managing their infrastructure through vSphere virtualization would have a look-alike environment in the cloud. That offering supposedly is finding many takers.
"Real practitioner experience (including my own) provides strong cautionary tales against going halfway to the public cloud," he wrote. Organizations that seek a hybrid operation want to be able to launch virtual machines either on their own hardware or in the public cloud. They wish to do so from a familiar management console, such as VMware's vCenter. They also want to be able to "cloud-burst," or send a heavily taxed portion of an application into the cloud where it can be scaled up, as demand warrants.
"Unfortunately, neither of these use cases seem to work well in practice…. Our survey showed a decline in past years in the number of respondents who claimed they could deploy on either the private or public cloud – from 30% to 26%," Emison wrote.
Traditional tools used to launch VMs in the public cloud had limited ability to take advantage of general cloud services, while the VMs launched there carried all the disadvantages of a remotely operated and managed server: less reliable, with limited ability to set configurations, he said.
The "most painful" lessons have been learned by those trying to run part of an application in a private cloud and another part in the public cloud. Such practitioners "have learned there are many assumptions about network reliability that simply aren't true. Latency and dropped connections between the two clouds cause countless errors that are extremely hard to debug and fix," Emison said.
Respondents also said such operations have security issues embedded in them, which pose "the most significant challenge in private and hybrid cloud adoption so far," the report said. Sixty percent of respondents still list security in cloud operations as one of their top concerns, it noted.
Based on the respondents' feedback, Emison concluded that running applications divided between public and private environments introduced too much complexity, increased the challenges of application design and introduced unpredictable network latencies. Skipping the hybrid phase and moving straight to public cloud "solves these issues… Attempting to bolt a private cloud onto a public cloud does not," Emison noted.
Container Management Evolving
Meanwhile, some of these same issues of workload management are leading to widespread interest in Docker containers as a solution to some of the problems, the report continued. Only 7% of respondents were actually using containers, but 50% said they were interested in doing so.
That interest has led to the test driving of Docker Swarm, a container cluster management system, but Swarm has yet to show that it's part of the overall solution, the report continued. That in turn has led to "increased interest in Kubernetes and Cloud Foundry," Emison said.
But it's still early in container evolution and container management. Whatever tool is selected, it is "almost guaranteed" to be obsolete in the near future, he said, ending that phase of the report on an unpromising note.
"Serverless computing" is an approach to cloud computing that will gain traction in the future, the report continued, through services such as AWS' Lambda. Functions or microservices can be placed on AWS servers with a software event triggering their activation and use by a running application. The process is sometimes called FaaS or functions as a service.
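As a rough illustration of the event-driven model the report describes, a minimal FaaS-style function in Python might look like the sketch below. The handler signature follows the shape AWS Lambda expects for Python functions; the event field ("name") and the function's behavior are invented for illustration, not taken from the report.

```python
import json

def handler(event, context):
    # A hypothetical event-triggered function. The triggering service
    # (an API gateway, a storage event, a queue message) delivers the
    # event as a dict; 'context' carries runtime metadata. The function
    # exists only while it runs -- there is no server to manage.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```

Because the function is just a plain callable, it can be exercised locally by invoking it directly, e.g. `handler({"name": "Interop"}, None)`, before wiring it to a cloud trigger.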
For such an approach to work with existing applications, however, their code must be re-architected into a FaaS-based operation. That will not be a lay-up on the AWS platform, which still brings a lot of complexity to the problem, Emison warned.
Both Google and Microsoft offer FaaS-type services. Where FaaS services can be employed, they make some applications easier to build in the cloud. Emison cited the example of Twilio, a cloud-based third party that makes voice- and text-based applications easier to build without the need to write code for telephony cards or hooks into telephone or cellular systems. Algolia does something similar for search, Cloudinary for image manipulation and Auth0 for authentication. But the serverless field is not for the faint of heart.
However the public cloud evolves, its growth at this point is assured and Amazon "will continue to be the default choice of many for the near future," the report concluded. What's less clear is how all this activity – whether via VMs or containers – will be managed and which tools will become a default choice.
It remains a world in which companies must beware of competitors or disrupters whose software overturns the established way of doing things and puts existing companies at a disadvantage. "Software will continue eating the world, and organizations that can leverage the cloud will continue to beat those that cannot," Emison concluded.
Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld and former technology editor of Interactive ...