Cloud computing is beginning to transform the way enterprises buy and use technology resources, and that was evident at the Interop 2014 conference and exhibition in New York this week. Cloud experts and practitioners of all stripes were in attendance and provided some insights -- positive and negative -- on where this trend is heading.
In a workshop on Designing Infrastructure for Private Clouds, Ivan Pepelnjak, the network architect for UpSpace.net AG, a consulting company in Slovenia, hit upon one of the defining characteristics of cloud computing. "Cloud is all about self-service. … You need to be able to allow your [internal] customer to change the rules on the load balancer and firewall. When someone says, 'This will never fly in my organization,' just tell them, 'Your developers are already using Amazon.'"
In weighing an open source internal cloud versus commercial products, he asked: "How expensive will it be to operate cloud? On the open source side you will have total control. But can you afford that? You either pay vendors or pay staff."
A questioner later in the session noted his academic institution had commissioned a staff-implemented, open source cloud to save money, but when it was up and running after a year of work by five to six IT staffers, the institution decided "it needed technical support. We proceeded to sign a contract for support with Red Hat" that erased most of the savings of a year's worth of work.
The audience member, who didn't identify his employer, said open source cloud builders should be realistic when they undertake such a project themselves: by the time they reach their goal, they may end up doing the very thing -- hiring a vendor for support -- that they set out to avoid.
[Learn more about the role of containers in platform as a service. Read Are Docker Containers Essential To PaaS?]
Paul Savill, senior VP of Level 3 Communications, said during the Next Generation Applications workshop that he once received a call from a customer, "a major beverage company," complaining of a slowdown in a network application that required coast-to-coast message exchanges. The service had been delivered at 14 milliseconds, but the network added 2 milliseconds of latency, bringing the round trip to 16 milliseconds. The service level agreement allowed a maximum 40-millisecond round trip, so the service provider didn't consider the increase a problem. The customer's finely tuned application, however, felt the impact of the extra 2 milliseconds, and customer requests backed up in the queue. "The performance of the network is critical to cloud services. If you think the network is just the last piece and not that important, then you're going to make some mistakes," he warned.
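Why would 2 extra milliseconds, well inside a 40-millisecond SLA, back up a queue? If an application issues requests on a fixed schedule and services them serially, one per round trip, then any round trip longer than the arrival interval means work arrives faster than it completes. The sketch below illustrates the arithmetic; the 15-millisecond arrival interval is an assumed number for illustration, not a figure from Savill's account.

```python
# Illustrative sketch: how a small latency increase backs up a serial queue.
# Assumption (not from the article): one request arrives every 15 ms, and
# each request needs one coast-to-coast round trip before the next proceeds.

def queue_depth(arrival_ms, round_trip_ms, duration_ms):
    """Backlog after duration_ms, given one arrival every arrival_ms
    and serial service taking round_trip_ms per request."""
    arrived = duration_ms // arrival_ms
    serviced = duration_ms // round_trip_ms
    return max(0, arrived - serviced)

# At 14 ms round trips, service outpaces a 15 ms arrival rate: no backlog.
print(queue_depth(15, 14, 60_000))  # 0
# At 16 ms, requests arrive faster than they complete; over one minute
# the backlog grows to 250 requests.
print(queue_depth(15, 16, 60_000))  # 250
```

The point of the sketch is that SLA headroom (40 ms) is irrelevant once the round trip crosses the application's own timing threshold; the backlog then grows without bound rather than stabilizing.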
A debate on the future of PaaS, and on whether Docker, a file packaging system for Linux containers, is really a necessary element of a platform-as-a-service, drew an emphatic response from Krishnan Subramanian, director of strategy for Red Hat's OpenShift platform. Red Hat is heavily invested in a partnership with Docker and is bringing out a version of Red Hat Enterprise Linux geared to running Docker containers, Atomic RHEL. "Offering PaaS that doesn't support Docker Linux containers is the equivalent of selling snake oil," said Subramanian.
Michael Biddick, CEO of Fusion PPT and a writer for InformationWeek, spoke on hybrid clouds. "Hybrid operations provide us with agility, scalability, and economic benefits. But things will get in the way of delivering those benefits. There are concerns about compliance, security, and the previous investment in legacy systems. It's difficult too because people don't fully understand it and they resist it," he said.
Biddick later made some pointed comments on the value of public cloud SLAs. "The service level agreements have you agreeing to a lot of things that aren't really too good for you. … We see service credits given and cash payments, but rarely in significant amounts. … They contain language like 'will make a commercially reasonable effort to provide continuous service.' … It's almost irrelevant after an hour of downtime for your business to get a $200 check back," he said.
Docker makes it much easier to turn a cloud workload into a set of files that is portable across cloud environments, said Shashi Kiran, senior director of market management for Cisco, in a session called Managing the Hybrid Cloud. Linux "containers are probably the next major push that we will see that will drive hybrid cloud," Kiran said. On the same day, Cisco announced it had added 150 data centers to its Intercloud partnership that has a group of companies offering compatible OpenStack cloud services. BT, the former British Telecom, NTT Data, Deutsche Telekom, and Equinix were among the biggest service providers in the group. Cisco sees containers as one way to make workloads portable across such a group. "Portability across hybrid environments. That's Intercloud," he said.
In the same discussion, Payal Chakravarty Jain, senior product manager for IBM Cloud and Smarter Infrastructure, said: "We think DevOps will mature and become the standard way to practice IT operations and development in the future. … New startups are way ahead that way."
Mark Russinovich, CTO of Microsoft Azure, said in an interview that Microsoft expects Azure in the future to host many Linux workloads running in Windows Hyper-V virtual machines. Until now, Microsoft hasn't branded itself as a welcoming and hospitable place for Linux. The company has also developed, for its own internal use, a way to run Windows workloads in containerized form -- without adding the overhead of virtual machines. Russinovich said Microsoft uses its "Drawbridge system to provide greater security and isolation for our internal operations."