A recent Reuters news item that equates virtualization and cloud computing is symptomatic of the ongoing confusion between the two terms. Without splitting hairs, are virtualization and cloud really the same?
Here's the statement from the article "Dell To Buy Storage Company 3PAR for $1.15 Billion": "International Business Machines Corp has been expanding its services business, as have other rivals like Hewlett-Packard Co and Oracle Corp. Such companies have also been stepping up investment in cloud computing, or "virtualization," a technology that enables users to access data and software over the Internet and corporate networks."
(The emphasis is mine. Also, let us ignore that last bit about technology enabling access to data and software over the Internet -- which could describe any of several "technologies," for example IP.) Turning to that modern oracle, Wikipedia, is helpful. Wikipedia doesn't provide a single definition for virtualization; instead, it defines (if you can call it that) various types of virtualization, e.g. hardware virtualization, virtual memory, storage virtualization, operating system-level virtualization, etc. Clicking through to Wiktionary for "virtualization" doesn't shed any additional light. But a careful reading of Wikipedia's coverage turns up two concepts repeatedly: simulation and abstraction.
Wikipedia does provide a definition for cloud computing: "Cloud computing is Internet-based computing, whereby shared resources, software, and information are provided to computers and other devices on demand, like the electricity grid." Not entirely adequate, but not bad.
The fact is, virtualization and cloud computing are related but entirely different concepts. Virtualization is simulation: requests are fulfilled by a simulation -- make-believe that behaves like the real thing. Several "make-believes" can be produced from one "real" -- often without significant impact on performance, and at far lower cost -- hence the attraction of virtualization. Loosely speaking, anything can be virtualized: memory, storage, operating system services, etc. The primary objective (and hence benefit) of virtualization is to "do more with less" and save on costs.
Cloud computing is something else entirely -- it is, literally, computing in the (Internet) cloud. The components that make up an application -- for example, application components, databases, external/real-time data feeds, etc. -- are typically dispersed across the Internet and made available in the form of services. A "cloud app" is then, in effect, an assembly of such cloud-based services. In turn, these services may use virtualization, but that's only incidental. Internal clouds -- which are increasingly prevalent -- are just that: mostly internal services, hosted within the organization. In that sense, internal clouds are an extension of classic SOA architecture. By "merely" dispersing these services across the Internet, we begin to truly realize the benefits of SOA. Cloud computing allows us to not just "do more with less" -- it enables us to "do more," period.
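To make the "assembly of services" idea concrete, here is a minimal, hypothetical sketch. The service names and stub implementations below are illustrative inventions, not from the article; in a real cloud app each stub would be a remote endpoint reached over the network, but stubbing them locally keeps the composition itself visible:

```python
# Hypothetical sketch: a "cloud app" as an assembly of services.
# In a real deployment each function below would be an independently
# hosted service (e.g., reached over HTTP); here they are local stubs.

def pricing_service(sku):
    # Stand-in for an externally hosted pricing service.
    prices = {"widget": 9.99, "gadget": 24.50}
    return prices[sku]

def tax_service(amount, region):
    # Stand-in for a real-time tax-rate data feed.
    rates = {"US": 0.07, "EU": 0.20}
    return amount * rates[region]

def checkout(sku, region):
    # The "cloud app": it owns no data or logic of its own beyond
    # orchestrating the dispersed services into one result.
    price = pricing_service(sku)
    total = price + tax_service(price, region)
    return round(total, 2)

print(checkout("widget", "US"))  # → 10.69
```

Whether `pricing_service` happens to run on a virtual machine is invisible to -- and irrelevant for -- the `checkout` app, which is exactly the point: virtualization may sit underneath a cloud service, but it is incidental to the cloud model.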
Virtualization and cloud computing make a good team, but they complement rather than supplant each other. And they are not identical.