Commentary
12/29/2014 08:36 AM
Charles Babcock

9 Cloud Trends For 2015

In 2015, expect to see the cloud mature as a platform for hybrid operations, with cloud data centers achieving new efficiencies through containers.




It's that time of year again: From the enterprise's land rush to the cloud to more obscure software-defined security, here are nine cloud computing trends we expect to see in 2015.

At Amazon Web Services' November Re:Invent conference, a rumor circulated in the hallways of the Sands Convention Center that Amazon had hired dozens of experts in enterprise operations and enterprise service development over the last year.

Amazon never announced such a move, and indeed it may be apocryphal. But one person who shared that tidbit was Ellen Rubin, the astute former VP of products at CloudSwitch who now serves as CEO of ClearSky Data, a Waltham, Mass.-based firm in stealth mode. (CloudSwitch was sold for an undisclosed sum to Verizon in 2011.)

Regardless of whether the report was true, AWS took a decidedly enterprise-oriented turn in November, adding services such as Amazon Aurora, a MySQL-compatible database service on steroids that can replace Oracle, and AWS CodeDeploy, which makes it easier to launch your workload on EC2. And that turn marks our first prediction for 2015:

1. Next year will mark the start of the land rush of enterprise workloads moving into the cloud.
This has been said before, but so far the cloud migration has been mostly just talk. This time we're referring not just to AWS, but to Google Compute Engine, Microsoft Azure, and the managed service veterans CenturyLink Savvis, Verizon Terremark, and Rackspace.

[Want to learn more about DigitalOcean and OVH.net? See 6 Cloud Startups To Watch.]

2. Containers will gain momentum.
One reason we expect to see the cloud rush intensify in 2015 is that containers have helped solve some of the problems the cloud poses for IT operations. Developers already love containers, but operations teams need to be able to containerize different parts of an application, move them into different types of cloud infrastructure, and manage them as discrete units while keeping the parts acting as a whole.
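To make that operations challenge concrete, here's a minimal sketch of treating two tiers of an application as discrete, individually managed units, written against the Docker SDK for Python. The image names, container names, and port mapping are placeholders for illustration, not anyone's production setup.

```python
# Minimal sketch: two parts of an application run as separately managed
# containers, via the Docker SDK for Python ("docker" package).
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Run the web tier and the cache tier as discrete, individually managed units.
web = client.containers.run(
    "nginx:latest", detach=True, name="web-tier", ports={"80/tcp": 8080}
)
cache = client.containers.run("redis:latest", detach=True, name="cache-tier")

# Operations can inspect, stop, or relocate each unit independently.
for c in client.containers.list():
    print(c.name, c.status)
```

Orchestration tools take over from there, keeping such units acting as a whole across hosts and clouds.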

The Kubernetes project, Docker's own expanding set of orchestration features, CenturyLink's Panamax container management project, and perhaps CoreOS' Rocket container runtime will all help bring about practical implementations of this denser form of computing. AWS will bring its EC2 Container Service out of preview and into beta in 2015. Will it make it generally available before the end of the year? That's a big maybe.


3. Containerization and the public cloud's newfound enterprise orientation will result in "continuous delivery" replacing agile programming as the most sought-after skill set among cloud customers.
The gains of agile programming are largely lost if operations doesn't follow through with frequent code updates. Organizations have struggled with the disconnect between development and operations teams since the two were created. Now cloud providers will compete by creating services that consolidate these two fiefdoms.

At Re:Invent, Amazon took the lead in this space. Its Service Catalog, CodeDeploy, EC2 Container Service, and CodePipeline, along with its established Elastic Load Balancer, Elastic MapReduce, and other services, are creating a new, DevOps-oriented way for enterprises to get continuous delivery without building it themselves. The cloud helps them get around the considerable barriers to an on-premises approach, and both developers and operations see it as a promising neutral ground.
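For a flavor of what automated code deployment looks like at the API level, here's a hypothetical sketch of a pipeline step kicking off a CodeDeploy deployment with Amazon's boto3 Python SDK. The application, deployment group, and S3 artifact names are placeholders, not real resources.

```python
# Hypothetical sketch: a delivery-pipeline step triggering a CodeDeploy
# deployment from an S3 build artifact, using boto3.
import boto3

codedeploy = boto3.client("codedeploy", region_name="us-east-1")

response = codedeploy.create_deployment(
    applicationName="example-app",           # placeholder
    deploymentGroupName="example-fleet",     # placeholder
    revision={
        "revisionType": "S3",
        "s3Location": {
            "bucket": "example-build-artifacts",  # placeholder
            "key": "releases/app-1.2.3.zip",
            "bundleType": "zip",
        },
    },
    description="Automated deployment from the delivery pipeline",
)
print("Deployment started:", response["deploymentId"])
```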

Formerly isolated cloud products are being connected in a service chain by platform-as-a-service providers such as IBM Bluemix, Heroku, and Engine Yard, and by Amazon as part of its infrastructure services. Companies that require fast, reliable updates to retail, financial services, healthcare, or trading applications, for example, will want to do them in the cloud. That way, they're more likely to get automated code deployment, managed code updates, application performance management, and automated analysis of lingering code issues. In addition, the more uniform environment of the cloud often allows the deployment environment to closely resemble the development environment.

Developers already show a willingness to embrace the advances available through the cloud. The most recent North American developer survey by Evans Data found that 80% of developers with DevOps experience said continuous delivery was critical to the future success of their organizations, and 77% said they accomplish the DevOps part of their responsibilities inside cloud infrastructure or with tools available as software-as-a-service in the cloud.

4. Price leadership will shift to the second tier.
A two-tier public cloud structure will increasingly take shape throughout 2015. The top tier will continue to be Amazon, Azure, SoftLayer, and Google Compute/App Engines. But independent developers, startups, small businesses, and tech-savvy dabblers will gravitate to a low-price, minimalist infrastructure tier. A slew of them are now springing up, seeking to capitalize on the model set by DigitalOcean, which is now the third-largest hosting provider in the world, according to Netcraft.

That doesn't mean DigitalOcean is the third-largest cloud provider in terms of revenue or physical data center servers. In its ranking, Netcraft counts the number of public-facing IP addresses inside a service and rates services on that measure. Netcraft counted 124,000 such servers at DigitalOcean, which earned it the number-three ranking.

Two years ago Netcraft counted just 140 such servers at DigitalOcean, which means the provider has advanced from 568th in January 2012 to 3rd at the end of this year. This raw number, however, says nothing about the size of the public-facing site or the amount of traffic that uses it. Amazon remains the number-one public-facing site overall.

What's DigitalOcean's secret? Advanced technical infrastructure, heavily based on servers equipped with solid-state disks, and simple services delivered at high speed. A current customer can spin up a cloud workload in 55 seconds or less; a new customer can get a first virtual server running in about the same time. One open source KVM-based virtual server, termed a "droplet," costs less than a penny per hour, or $5 per month, to run.
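Provisioning is equally simple at the API level. Here's a rough sketch of requesting a $5/month droplet through DigitalOcean's public v2 API with Python's requests library; the token is a placeholder, and the region, size, and image slugs are examples from the current catalog that may change.

```python
# Rough sketch: spinning up a $5/month "droplet" via DigitalOcean's v2 API.
import requests

API = "https://api.digitalocean.com/v2/droplets"
headers = {"Authorization": "Bearer YOUR_API_TOKEN"}  # placeholder token

droplet = {
    "name": "example-droplet",
    "region": "nyc3",             # example region slug
    "size": "512mb",              # the $5/month tier
    "image": "ubuntu-14-04-x64",  # example image slug
}

resp = requests.post(API, headers=headers, json=droplet)
resp.raise_for_status()
print("Droplet ID:", resp.json()["droplet"]["id"])
```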

While it lacks the load balancing, cluster management, and other advanced services of Amazon, DigitalOcean's service nevertheless holds a strong appeal for developers. Other developer-oriented services are following in DigitalOcean's footsteps, including Atlantic.net, Linode, OVH.net and Exponential-e (in Europe), and Hot Drupal, among others. The trend: Get started with a lightweight, fast cloud service. As your company grows larger, move up to the big boys.

5. Clouds will spawn enterprise risk investments.
This prediction is based on a discussion with Allan Leinwand, the architect of Zynga's initial use of Amazon services and the Zynga private Z Cloud. Now VP and CTO of cloud platform at San Francisco-based service desk SaaS provider ServiceNow, Leinwand sees CIOs using both public and private cloud services to give business teams a chance to either succeed or fail quickly. "In the past, the CIO got you the servers, desktops, and printers you needed," he noted. "Now they're providing the services that can drive the business forward."

In effect, he says, cloud service providers are enabling CIOs to act more like Series A investors. They invest small amounts in high-risk business teams that might find the next big thing -- or might fail quickly and minimize the loss.

6. Software-defined security will protect workloads.
Software-defined security will become part of the software-defined data center and accompany workloads into the cloud. If we can define the network, the storage system, and the containers and virtual machines on host servers in software, then we can also define the security that must accompany them in software. Guarding the data center perimeter is a hardware-based, if not fortress-based, concept. Security needs to travel with the software asset.

In the software-defined data center, software mapping systems identify system perimeters and feed intelligence into a central monitoring system. That mapping capability must be extended to define the permissions and activities allowed to the software system, with a surveillance agent ensuring that it adheres to only those activities. Any exceptions must trigger an inspection and potential intervention.
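Reduced to its essentials, that surveillance agent is a whitelist enforcer. The Python sketch below is purely conceptual; the workload names, policy format, and event strings are invented for illustration.

```python
# Conceptual sketch only: an agent that knows the activities a workload is
# permitted and flags anything else for inspection or intervention.
ALLOWED = {
    "web-tier": {"listen:443", "connect:db-tier:5432"},
    "db-tier":  {"listen:5432"},
}

def alert(workload: str, activity: str) -> None:
    print(f"ALERT: {workload} attempted unpermitted activity {activity!r}")

def check_event(workload: str, activity: str) -> bool:
    """Return True if the activity is permitted; otherwise raise an alert."""
    if activity in ALLOWED.get(workload, set()):
        return True
    alert(workload, activity)  # exception: trigger inspection/intervention
    return False

check_event("web-tier", "connect:db-tier:5432")     # permitted
check_event("web-tier", "connect:198.51.100.7:22")  # flagged
```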

Figuring out what happened after several million user identities and credit card numbers have been offloaded to Russia is not security. Building security that can accompany the movements of a system, regardless of where it's about to be run, is an extension of the logical mobility that came into the data center with the first virtual machines.

7. Custom chips for cloud partners will take hold.
At Re:Invent, Amazon CTO Werner Vogels announced that Intel has designed a chip exclusively for use in Amazon's cloud servers. The chip, he said, will become the heart of Amazon's future cloud hosts.

This comes on the heels of Google's expressed interest in IBM's OpenPower Consortium, in which the specifications for the Power chip are shared with a group of companies. Members such as Google can then work with IBM to design a custom version that can be used to launch the third generation of warehouse-scale computing, Gordon MacKean, chairman of the OpenPower Initiative and chief of Google's platform engineering group, explained at a press conference in San Francisco last April. "We're ready to start innovating," he added.

OpenPower follows Facebook's project to create a cloud server specification through the Open Compute Project. At the Open Compute conference last January in San Jose, Andrew Feldman, corporate VP of AMD, said, "By 2019, ARM will command 25% of the server market." He also predicted that energy-sipping ARM CPUs in the form of purpose-designed chips "will be the norm for mega-datacenters," such as those used by Facebook, Microsoft, and Google. For that to happen, more 64-bit (rather than 32-bit) ARM chips are needed. AMD produced Seattle, its first 64-bit ARM chip, in January. Feldman also acknowledged that the ecosystem of software developers for ARM "is in its gangly youth" compared to the mature x86 software market.

John Engates, CTO at Rackspace, observed in a recent blog post that in addition to low energy use, ARM in the past has been accompanied by "low performance and a fragmented software ecosystem. That changed this year. 2014 saw big news around ... Cavium bringing out a server-grade ARM platform (ARMv8 Data Center and Cloud processors). 2015 will be the year that alternative silicon really begins to rise. Watch this space."

8. The public cloud will go hybrid.
It's not enough for a cloud provider to simply connect its data centers to the Internet -- it needs multiple private-line carriers to bring enterprise data into the facility in a secure and compliant way. Equinix has been Amazon's preferred hub for connecting its users to a private line, with a typical Equinix facility joining 50 to 60 private carriers. A few months ago, Amazon added Verizon and AT&T data centers as well. IBM's SoftLayer unit built out a private-line network when fiber was cheap, and Google and the fast-growing Exponential-e in the UK followed suit. Access to private-line service can also be arranged through a third party, such as Digital Realty Trust's Global Cloud Marketplace, or an Equinix-like provider in secondary markets, such as the Cologix facilities in Minneapolis, Dallas, and Jacksonville, Fla. With private-line access readily available, hybrid is the new normal for public cloud operations.
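On the AWS side, such a private line surfaces as a Direct Connect port, which can be requested programmatically. Here's a hedged sketch using boto3's Direct Connect client; the location code and connection name are placeholders (real codes correspond to facilities such as Equinix exchanges).

```python
# Hedged sketch: requesting a private-line port into AWS via Direct Connect.
import boto3

dx = boto3.client("directconnect", region_name="us-east-1")

connection = dx.create_connection(
    location="EqDC2",                      # example Direct Connect location code
    bandwidth="1Gbps",
    connectionName="example-private-line", # placeholder
)
print(connection["connectionId"], connection["connectionState"])
```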

9. IoT meets big data.
Intel launched its Internet of Things Platform on Dec. 10 to help manage the connectivity and security of proliferating sensors and devices. The value of those secure, connected devices will only increase in 2015 as the Internet of Things meets big data. As data streams into Hadoop and other big-data repositories, groups like the Industrial Internet Consortium, whose members include Intel, IBM, GE, and Cisco, can collect information from large manufacturing operations and facilities and discover ways to reduce energy consumption, save time, and shave costs off manufacturing lines. Managing large buildings and individual homes will likewise benefit from analysis of data collected from local networks. The Internet of Things has been primarily an abstraction so far, but it will become real in 2015.
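As a small illustration of the ingestion side, here's a stdlib-only Python sketch that batches sensor readings as JSON lines into a local spool file, the sort of staging step a plant might use before bulk-loading data into Hadoop. The field names and values are invented.

```python
# Illustrative sketch (stdlib only): spooling sensor readings as JSON lines
# for later bulk ingestion into a big-data repository such as Hadoop.
import json
import random
import time

def read_sensor(sensor_id: str) -> dict:
    # Stand-in for a real device read; returns a fake power-draw sample.
    return {
        "sensor": sensor_id,
        "ts": time.time(),
        "kilowatts": round(random.uniform(40.0, 60.0), 2),
    }

with open("sensor-spool.jsonl", "a") as spool:
    for sample in (read_sensor(f"line-{i}") for i in range(3)):
        spool.write(json.dumps(sample) + "\n")  # one record per line
# A periodic job would then push sensor-spool.jsonl into HDFS in bulk.
```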


Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld, and former technology editor of Interactive Week.