Traditional software licensing models are being derailed by virtualization and multicore processors. So how much is this going to cost us?
Last July IBM revamped its server software licensing to deal with multicore processors. Out went conventional per-processor or per-socket licenses. In came a more complex, but notionally fairer, scheme based on exactly how much computing capacity an application uses. Of course, measuring capacity is difficult, so IBM introduced a free planning and tracking tool, the Tivoli License Compliance Manager. The theory was that IT could see in real time how much a performance boost would cost, while IBM audited usage quarterly and sent a bill.
There was just one problem: virtualization. The management tool ran at the application layer, so it couldn't always tell a virtual machine from a real one, meaning it misreported the amount of computing capacity that virtualized applications used. IBM withdrew the tool and suspended quarterly audits earlier this year. It plans to release an updated version that works with virtualization by mid-2008. Until then, customers are on their own, with no easy way to know whether they're complying--or even how much they owe. "Right now, we're on the honor system," says Roger Kerr, software business strategist at IBM.
While IBM's experience is the most embarrassing we've heard, the company isn't alone. Multicore processors and virtualization are nails in the coffin for standard software licensing models, but there's no agreement on a replacement. And the problem isn't confined to the data center. Licensing issues have already slowed development of Intel's virtualization technologies aimed at desktop management, while Microsoft is using desktop virtualization as a way to drive adoption of its Software Assurance subscriptions.
While it's tempting for enterprise IT to chuckle at this state of affairs, you need to pay attention: Alternative licensing schemes range from the familiar, like open source and SaaS, to untested models like pricing based on memory or virtual cores. At best, they could mean lower costs and more flexibility. But let's be real--when have software vendors embraced low costs and flexibility? Worst case, the hardware savings from the server consolidation that virtualization enables will be gobbled up by software licensing charges.
As "Counting The Costs" (at right) shows, most server software is still licensed per socket or per CPU, which essentially mean the same thing. The reasoning is simple: Chips are easy to count and unlikely to change during the life of a server, and these licenses give IT a strong incentive to use the most powerful multicore chips available. But then, getting the most out of software has always required high-performance hardware. The only difference is that Intel and Advanced Micro Devices are now more likely to boost performance by adding cores than by raising clock speeds.
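The incentive described above is simple arithmetic: a per-socket fee is fixed regardless of how many cores the socket holds, so denser chips drive down the effective cost per core. A minimal sketch (the $2,000 fee is a made-up placeholder, not any vendor's actual price):

```python
# Illustrative only: under per-socket licensing the fee is fixed per socket,
# so doubling the cores per socket halves the effective license cost per core.
def cost_per_core(fee_per_socket: float, cores_per_socket: int) -> float:
    """Effective license cost per core for one socket."""
    return fee_per_socket / cores_per_socket

print(cost_per_core(2000, 2))  # dual-core socket -> 1000.0 per core
print(cost_per_core(2000, 4))  # quad-core socket -> 500.0 per core
```

The same logic explains why vendors are rethinking the model: the software does more work per license as core counts climb, while the vendor's revenue per server stays flat.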
Per-chip licensing makes sense for software that runs on clearly defined hardware. This used to mean every OS and most apps, but virtualization changes all that by adding a hypervisor that shields the OS from the underlying hardware (see "Virtualization Five Ways," below). So it's no surprise that VMware has also adopted per-socket licensing, as have open source hypervisor vendors XenSource (acquired by Citrix Systems) and Virtual Iron.
What does surprise us is that Microsoft and Sun Microsystems both stick to the same model for virtualized Windows and Solaris--that is, treating each VM as a physical server with the same number of sockets as the underlying hardware. On its own, this would be a powerful deterrent to virtualization. But both vendors have offsetting reasons. In Microsoft's case, higher-end versions of Windows Server 2003 include licenses for extra virtual instances of the software on the same CPU--one on the Standard Edition, four on the Enterprise Edition, unlimited on the Datacenter Edition. The same will apply to Windows Server 2008. All Microsoft server licenses also include downgrade rights, meaning a virtual instance can be replaced by Windows 2000 or Windows NT.
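Those per-edition allowances turn license planning into a counting exercise. The sketch below is illustrative only: the edition names and included-instance counts (one for Standard, four for Enterprise, unlimited for Datacenter) come from the article, while the helper function itself is hypothetical and ignores pricing.

```python
# Hypothetical sketch: how many Windows Server 2003 licenses a single host
# would need to cover a given number of virtual instances, assuming each
# license carries its edition's included-VM allowance (Standard = 1,
# Enterprise = 4, Datacenter = unlimited, per the article).
import math

ALLOWANCE = {"Standard": 1, "Enterprise": 4, "Datacenter": math.inf}

def licenses_needed(edition: str, vms: int) -> int:
    per_license = ALLOWANCE[edition]
    if per_license == math.inf:
        return 1  # one Datacenter license covers unlimited VMs on the host
    return math.ceil(vms / per_license)

for edition in ALLOWANCE:
    print(edition, licenses_needed(edition, vms=8))
# Standard 8, Enterprise 2, Datacenter 1
```

For a consolidated host running eight VMs, the math strongly favors the higher-end editions, which is presumably the point: virtualization becomes an upsell rather than a deterrent.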
In Sun's case, Solaris 10 includes Containers, a rival technology to hypervisor-based virtualization. Containers are closer to desktop streaming than VMware's approach in that they isolate applications without needing a separate OS instance for each. That saves on resources but means guests must get along with the host's kernel: in addition to Solaris applications, Containers can also host Linux ones.
Unlike Sun, Microsoft makes no distinction among virtualization technologies or vendors. This is likely because its own Hyper-V (formerly Viridian) hypervisor isn't due to ship until the third quarter of 2008. Microsoft says it will continue its hypervisor-agnostic policies even after Hyper-V is available, but in practice, most Windows Server 2008 users will make the switch.
Although Microsoft earlier this month announced that it would unbundle the Hyper-V technology from Windows Server--a change from its previous position that Viridian was an integral part of Windows Server 2008--the two are still designed to work together, and customers who choose to buy the server without it save only $28. Hyper-V represents a challenge to VMware; Microsoft already competes with hypervisors through Virtual Server 2005, a free tool that runs other operating systems on top of Windows Server 2003, rather than alongside it. At present, however, Virtual Server supports only Windows as guest OSes, though Microsoft has said it will support SUSE Linux.
BEA Systems is thus far the only vendor to abandon per-socket licensing, albeit only for LiquidVM, a virtualized Java platform that cuts out the OS and runs directly on VMware. LiquidVM is licensed per instance, regardless of whether a VM runs in a few spare CPU cycles or consumes all the resources in a cluster. This model looks relatively easy to game and likely will be attractive to very large customers.