Amazon Web Services feels the breath of competition, responds with lower-cost, "bursty" T2 instances.
Amazon Web Services has introduced a lower-cost, general-purpose instance type for users who want a modest baseline CPU plus the ability to burst to full CPU capacity when needed. That describes the T2 micro, small, and medium instances, launched Tuesday.
A T2 micro has a single virtual CPU, meaning it gets a share of an Intel Xeon core equivalent to a 2007 Xeon CPU running at 1 GHz. That's known in Amazon's lexicon as an EC2 Compute Unit (ECU). The clock speed of today's Xeon cores in EC2 is closer to 2.5 GHz, but the virtual CPU gets only a share of the core. Likewise, the T2 medium is assigned 2 ECUs, meaning two virtual CPUs and a larger share of a core.
Despite the virtual CPU limits they've been assigned, each instance type can burst into full-core use when the workload demands it. In the announcement Tuesday, Amazon EC2 VP Matt Garman said the T2s "optimize their performance and cost for applications that don't use the full CPU capability frequently, but require the full CPU resources for short bursts."
The announcement doesn't make clear how long a "short burst" may last, or whether more frequent use will result in the customer being reassigned to an instance type more appropriate for the workload being run.
The lack of clarity about such issues indicates that Amazon has placed a spotlight on first-time and small-use customers. It appears that the growing popularity of services such as Linode and Digital Ocean, serving the large programmer population working in and around New York City, has caught Amazon's eye.
New York-based Digital Ocean has implemented $5-a-month virtual servers and solid-state drives as standard storage. Its small servers spin up rapidly and exhibit fast I/O performance based on solid state. At the Structure 2014 conference in San Francisco June 20, Adrian Cockcroft, the former Netflix cloud architect, now technology fellow at Battery Ventures, noted the rise of such service providers in 2012 and 2013. He said of Digital Ocean: "I'll take a twenty-fold growth rate any time."
Amazon is not about to let such upstarts grow into real challengers without some attempted countermeasure. Its T2 micro, small, and medium instances come with 1, 2, and 4 GB of memory respectively, giving it an edge over Digital Ocean. The latter's $5 micro server comes with only 512 MB of memory; to get to 1 GB, the customer has to pay $10 a month; 2 GB, $20 a month; and 4 GB, $40 a month. Those prices also include storage and network bandwidth, something that costs extra on Amazon. Nevertheless, T2 appears aimed at Digital Ocean's weak point: solid-state storage is still more expensive than spinning disk.
Amazon's pricing comes in just under Digital Ocean's for comparable memory sizes, when only the instance price lists are compared. That closes the much larger gap that existed with Amazon's bigger M3 instance sizes, for example, while adding the option of "bursty" CPU power. The T2 micro is priced at 1.3 cents an hour, which comes out to about $9.50 a month versus Digital Ocean's $10.
The T2 small is priced at 2.6 cents per hour and the T2 medium at 5.2 cents per hour, with monthly rates of about $19 and $38 respectively.
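Those monthly figures follow directly from the hourly rates. A minimal sketch of the arithmetic, assuming the 730-hour month AWS conventionally uses for cost estimates (the instance names and hourly rates below are the T2 launch prices; the Digital Ocean figures are its flat monthly plans):

```python
# Sketch: convert EC2 hourly prices to approximate monthly cost.
# Assumes a 730-hour month (365 days * 24 hours / 12 months) and
# around-the-clock use; these are instance prices only, before any
# storage, bandwidth, or support charges.
HOURS_PER_MONTH = 730

t2_hourly = {"t2.micro": 0.013, "t2.small": 0.026, "t2.medium": 0.052}
digital_ocean_monthly = {"1 GB": 10, "2 GB": 20, "4 GB": 40}  # flat plans

def monthly_cost(hourly_rate, hours=HOURS_PER_MONTH):
    """Approximate monthly cost of running one instance continuously."""
    return round(hourly_rate * hours, 2)

for name, rate in t2_hourly.items():
    print(f"{name}: ${monthly_cost(rate):.2f}/month")
```

Running the loop yields roughly $9.49, $18.98, and $37.96 for the micro, small, and medium, which is how the T2 prices come in just under Digital Ocean's $10, $20, and $40 plans for the same memory sizes.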
But Marty Puranik, CEO of rival service Atlantic.net in Orlando, Fla., said Amazon's prices don't include persistent storage, high performance I/O, and network bandwidth, the way Digital Ocean and Atlantic.net prices do. "Amazon prices don't include data transfer, any local storage, or support," he said. And if you pay Amazon extra to get them, the upstarts have a price advantage again.
Puranik said Amazon's price list is the equivalent of "quoting the price of a car without the wheels included."
The concept of a bursty CPU isn't unique to Amazon. Rackspace divides its CPUs into virtual units to provision cloud users, and even a customer who pays for only part of a CPU gets the whole core when his application demands it, provided no other customer is already using it. As with the ill-defined "bursts" of Amazon's T2, customers can't be sure how this will work out in practice: probably fine, unless constant price cutting starts to load up cloud hosts with so many workloads that they press against the CPU's ceiling.
Indeed, the T2 instances are Amazon's answer to smaller customers worried about performance in the cloud, as well as price. AWS is by far the most successful cloud vendor, but Digital Ocean, Atlantic.net, and others are making inroads based on solid-state drive performance. Atlantic.net offers instances at prices ranging from $4.97 to $9.93 a month, and its SSD-based virtual servers spin up in 30 seconds, according to its website.
Getting new customers in the cloud, particularly the valued developer communities that will grow into using other resources, is as much about performance as it is about price.
Charles Babcock is an editor-at-large for InformationWeek, having joined the publication in 2003. He is the former editor-in-chief of Digital News, former software editor of Computerworld and former technology editor of Interactive Week. He is a graduate of Syracuse ...