Google Compute Engine Ignores VMware, Microsoft Users
Google's new cloud infrastructure service is geared to Linux workloads running in KVM. What does that mean for VMware, Microsoft, and Citrix users?
Google's Compute Engine represents a potentially high-value infrastructure service, particularly for users of Linux and KVM open source code. All of the cloud workloads hosted by Google's new service will run in KVM virtual machines.
But that may leave VMware, Microsoft, and Citrix users feeling left out. KVM is an efficient hypervisor incorporated into the Linux kernel, but it still has the smallest installed base among existing virtualization platforms.
Windows users might feel doubly slighted. Not only is there no provision for Microsoft's Hyper-V hypervisor in Compute Engine, but Windows workloads themselves are not welcome. That would seem to be an oversight, since Windows and Linux together now occupy 80-90% of the enterprise data center, by some estimates.
Even Google's Linux acceptance is limited at this stage. Google's infrastructure-as-a-service (IaaS) will accept jobs running on CentOS and Ubuntu. Red Hat helped Google architect its cloud to make efficient use of KVM, according to Joe Beda, lead software engineer for Compute Engine, in a talk at the Google I/O developer conference Thursday in San Francisco. Support for Red Hat Enterprise Linux may come later.
Google, with its search engine data center expertise, has the smarts to offer competitive infrastructure. Executives at Amazon Web Services are no doubt warily eyeing what Google is up to, but AWS's EC2 is not in any immediate danger. Amazon has years of experience in running both Windows and Linux workloads.
One important differentiator did emerge. Google Compute Engine automatically encrypts communications between virtual machines and will let all of a customer's virtual machines talk to each other in a secure fashion, even when they're located in separate data centers.
That's because Compute Engine's three data centers are connected by private communications lines, not the public Internet. Those data centers span three locations: the "central U.S." and two unnamed, but geographically separate, locations on the East Coast, said Beda. Google customers will be able to commission backup copies of virtual machines or disaster recovery systems with some ease because of the secure links in place between them. "If packets come in from an IP address that's your virtual machine, we will guarantee they came from there," said Beda during his talk. Google will charge $0.01 per GB of bandwidth consumed.
Similar arrangements are possible with other service providers, but they may require buying higher-priced private cloud services from the provider, plus contracting separately with network carriers for private lines between locations.
Google is charging less than Amazon or Microsoft for comparable servers, but it appears to be offering a slightly heftier resource package. For example, its base unit, referred to as n1-standard-1-d, includes 3.75 GB of RAM and 420 GB of local disk for a charge of $0.145 an hour. Amazon's "medium" Linux server is nearest in price at $0.16 per hour, and it includes 3.7 GB of RAM and 410 GB of storage. Microsoft charges $0.24 per hour for a medium-sized Windows server with 3.5 GB of RAM and 490 GB of storage.
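To put those hourly rates in perspective, here is a minimal back-of-the-envelope sketch in Python using only the prices and specs quoted above (mid-2012 figures, not current pricing; the 730-hour month is an assumption for illustration):

```python
# Rough monthly cost comparison from the hourly prices quoted in the article.
# Figures are mid-2012 list prices, not current pricing.
HOURS_PER_MONTH = 730  # assumed average hours in a month

# name: (price $/hr, RAM GB, local disk GB)
offers = {
    "Google n1-standard-1-d":     (0.145, 3.75, 420),
    "Amazon medium (Linux)":      (0.16,  3.7,  410),
    "Microsoft medium (Windows)": (0.24,  3.5,  490),
}

# Project each hourly rate out to a full month of continuous use.
monthly_cost = {
    name: price * HOURS_PER_MONTH
    for name, (price, ram_gb, disk_gb) in offers.items()
}

for name, cost in sorted(monthly_cost.items(), key=lambda kv: kv[1]):
    print(f"{name}: ${cost:.2f}/month")
```

On these quoted rates, Google's base unit works out to roughly $106 a month of continuous use, versus about $117 for Amazon's nearest Linux instance and about $175 for Microsoft's medium Windows server.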