When it comes to transforming the data center, VMware is writing the story. And the theme is automation.
There are 800,000 IT staffers now carrying the title of virtual machine administrator, said VMware CEO Paul Maritz in an address to 19,000 attendees at VMworld Monday. What he didn't say was that he's trying to make them network administrators, database administrators, and cloud administrators as well.
Virtualization is changing the data center, creating a new administrative title, and then slowly expanding that administrator's powers. Clear signs of it show in the scope and reach of VMware's product suite, including vSphere 5 for setting up the virtualized environment; vCloud Director for provisioning the private cloud; vCloud Data Center Manager for provisioning the public cloud; and vFabric, a connected, application deployment environment.
VMware is continuing to aggressively push out, from what was originally a narrow base of consolidated data center servers into multiple data center functions.
A better way of saying that is VMware has ceased to be merely the provider and supervisor of hypervisors. It's become a manager of operations, and the hypervisor is just a means of getting there. The goal is more fluid, automated operations--and we're certainly not there yet.
VMware itself thought it was farther down the path of converting customers' thinking to virtual assets than proved to be the case recently. It attempted to put a 48-GB virtual memory limit on virtual machines running under its high-end Enterprise Plus license for vSphere 5, and experienced fierce pushback from customers, who objected to such a restriction.
I remember VMware moving pricing from physical to virtual assets a year ago, when it launched vCenter Operations and charged $50 per virtual machine instead of per host CPU. VCenter Operations offered a wide range of system management functions, including, for the first time by anyone, the ability to do capacity management as an extension of configuration management.
VMware was bringing two skills together in one management console, far ahead of its time. It was also invading the space of the traditional system managers, and therefore, I felt, it had put a low price on the product to gain headway there. Perhaps that's why there was little objection voiced to the physical-to-virtual pricing move at that time.
Maritz Speaks To Pricing Controversy
But vSphere 5 is at the core of the virtualized data center and a change in its pricing scheme did not go down well with customers. I asked Maritz about that move in the Q&A that followed his address.
VMware, he answered, is taking the lead in moving from pricing based on physical assets to pricing based on virtual assets. As data centers become more cloud-like, "reliance on a single physical attribute like CPU will make less and less sense. It will not be the way the customer is looking at the world."
I agree that that's likely to be the case in the future. But based on recent events, it's also clearly not the way customers are looking at the world today. They are familiar with and can calculate their future by their accustomed vSphere pricing.
Maritz basically agreed. "Every time you try something new, you have to expect some feedback. We got that feedback and we had to adjust." VMware doubled the memory limit to 96 GB three weeks after its initial licensing move.
But he added: "I don't think that's the end of this journey… The whole industry is going to have to come to grips with moving to a virtual asset, cloud-based pricing model." To do so, "it will have to use some measure of dynamic workload as a pricing mechanism… Obviously, there are going to be some course corrections along the way."
I can agree that focusing on virtual assets is going to prove the more vital measure than CPUs in the long run. But what's a fair way to charge customers for virtual asset use? Charging simply by virtual machine punishes those who plan to get light use out of their virtual servers and rewards those who utilize them to the max. The latter are often the largest and most successful customers. It's a thorny issue and won't easily be resolved.
Charging by use of virtual memory is one such measure. I think VMware will have to keep it as a pricing mechanism in Infrastructure 5, but it would be wise to keep that limit moving upward with upgrades of Infrastructure 5. The virtual world is about paying for use; just don't expect to convert everyone to the concept overnight when they're used to physical asset pricing.
On the product front, VMware Monday picked up the gauntlet when it comes to moving into new areas of the data center. For example, databases have resisted virtualization so far. They're I/O intensive, and unless you're careful, virtualization introduces I/O latencies. A database server's work fluctuates, so in the past it wouldn't do to have it sitting on a multi-tenant server with noisy neighbors who are I/O hogs.
VMware has taken on the issue squarely and on Monday launched vFabric Data Director, a way for the virtual machine administrator to launch a virtualized database system in minutes when a user asks for one. Where's the DBA in the process? There isn't one, unless he helped compose the database golden image used in the launch.
By making database systems available in virtual machines, VMware lets business users' needs be answered more quickly, provided it's clear what data the user needs. vFabric knows how to connect a database to a data source, and the administrator can check its safeguards and policies against the prospective user's rights and privileges.
VFabric Data Director is also capable of providing a single supervising framework for many virtualized database systems, something that has tended to involve different DBAs in the past. It can provision virtual machines that run databases from different vendors, giving a virtual machine administrator with database administration skills the ability to see what's going on in several different systems at one time.
Global Web Of Data Centers
At the end of the show's first day, it's clear to me that VMware is gaining traction inside the data center and using it not just to build a private cloud architecture, but to bridge that architecture to an external, public cloud. Once such a hybrid operation is set up, VMware says it's the right party to supervise its operation. Again, I would emphasize the word "operations." Virtualization is no longer just about server consolidation and efficiency. It's about efficient and elastic operations--a new and rapidly evolving way of doing things.
If virtualized resources can be pooled into the private cloud with vCloud Director, VMware hasn't stopped there. It's stepped outside the data center to somewhat preposterously tell service providers that they need to adopt more VMware-enabled operations. Selectively, but on a large enough scale to make a difference, many service providers have responded, creating an alternative to Amazon Web Services' infrastructure as a service.
VMware started this process with vCloud Express, and several parties adopted it as a somewhat underpowered but quick way to start provisioning VMware virtual machines in their public infrastructure. VCloud Express died a lingering death, as best I know, but that's not because service providers lost interest. On the contrary, they insisted on upgrading to vCloud Director and Data Center Services.
BlueLock was one such early adopter. Aaron Branham, director of information technology, said in an interview that BlueLock uses vCloud Director with the Xsigo Director I/O appliance to put as many as 100 virtual machines on an HP DL 585 host. Xsigo juggles converged communications I/O, off-loading a time-consuming task from the hypervisor's software switch.
If an Indianapolis public cloud supplier can achieve this concentration of virtual machines on standard hardware, it's easy to see the economies of scale that can be taken advantage of in the public cloud--and further gains possible inside the enterprise data center.
BlueLock, by the way, was one of four VMware-compatible public cloud suppliers that self-selected to form a network of global data centers for VMware customers. SingTel in Singapore, SoftBank in Japan, and Colt in Europe are now linked so that customer workloads can be transferred between them, while staying under the same contract. This is a harbinger of things to come. In the future, the failure of a data center will not be a catastrophic event, as allied data centers form daisy chains of connections around the world.
I have a strong sense of how effectively VMware is spearheading these server-side changes. I am less certain how effectively VMware has upgraded its VMware View to the 5.0 version. View is the name it gives to its virtual desktop infrastructure. My overall sense is that Citrix Systems is competing hard and effectively for desktops. It has expertise with the desktop protocols that deliver the best desktop experience, and it has effectively tackled the task of creating security on roving clients. I would not count Citrix or Microsoft out on the desktop.
But when it comes to transforming the data center, VMware occupies the driver's seat. There are many reasons to be wary of a proprietary vendor in that position, and they apply as much to VMware as they did at one time to IBM or to Microsoft. But you can't say VMware is failing to execute on its strategy or failing to bring innovative software to market.
Charles Babcock is an editor-at-large for InformationWeek.