The road to virtualized everything is paved with open everything.
When most people first learn about server virtualization, the question that usually comes up is “Does that mean I don’t have to buy as many servers?” The answer is yes: the machines you buy are more robustly configured, but there are fewer of them.
This is, of course, short-term thinking. Given more exposure, people come to appreciate the flexibility of having server “instances” that can be instantiated instantly, used, and collapsed just as quickly. This opens a whole host of new capabilities for developers as well as operators and users.
The Long-Term Benefit
The major benefit of virtualization that will pay off more and more over the next several years comes from the portability of server instances. No longer is the operating system bound to a specific machine, nor are any of the other resources required to service the workload. In an environment where more and more workloads are being offloaded to cloud servers, the ability to rapidly move whole servers and their workloads to a more efficient location is one of the key ways that operators will preserve their budgets now and in the future.
The Next Step Of Abstraction
Even as the monolithic server has given way to the virtualized server hosting many virtual machines, the monolithic application is being replaced by many microservices packaged in containers along with nearly all the resources they need: the runtime, the libraries, even the storage specifications. (Unlike a virtual machine, a container does not carry its own OS kernel; it shares the host’s.)
Here again a primary goal is portability. The new application is, in and of itself, virtualized. Any given microservice can be instantly instantiated, used in service of the currently required function, and then discarded. Should a microservice malfunction for any reason, it can be instantly discarded and replaced. This adds tremendous functional resilience above and beyond portability.
Since everything a microservice needs is packaged with it in its container, it no longer depends on libraries or configuration installed on a particular server or, in fact, anywhere else on the network. It is self-sufficient, and being self-sufficient makes it completely portable.
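The packaging idea above can be sketched with a container build file. This is a minimal, hypothetical example (the base image, file names, and application are illustrative, not from the article): the image bundles the runtime, libraries, and application code, while the kernel comes from whatever host runs it.

```dockerfile
# Hypothetical microservice image: the container carries its runtime,
# libraries, and code, so it runs the same on any compatible host.
FROM registry.access.redhat.com/ubi9/python-311

WORKDIR /app

# Install the service's library dependencies into the image itself,
# so nothing need be pre-installed on the host.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY app.py .

# The host supplies only the kernel and the container runtime.
CMD ["python", "app.py"]
```

Because every dependency lives inside the image, the same artifact can be instantiated, discarded, and replaced on any server, cloud, or virtual machine with a container runtime.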
Shifting operations from one cloud to another, from one data center to another, or even from one virtual machine to another, becomes nearly effortless. Each container knows what storage it needs and where to get it, no matter where the container itself may currently reside.
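The claim that “each container knows what storage it needs” can be illustrated with a declarative manifest. The sketch below uses Kubernetes conventions; the names and sizes are hypothetical. The pod declares a storage claim rather than naming a specific disk or server, so the platform can satisfy the request from whatever storage is available wherever the container lands.

```yaml
# Hypothetical manifest: the workload declares the storage it needs
# (a 10 GiB read-write volume) instead of binding to a specific disk.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: app-data
spec:
  accessModes: ["ReadWriteOnce"]
  resources:
    requests:
      storage: 10Gi
---
apiVersion: v1
kind: Pod
metadata:
  name: microservice
spec:
  containers:
  - name: app
    image: example.com/microservice:latest   # hypothetical image
    volumeMounts:
    - name: data
      mountPath: /var/data
  volumes:
  - name: data
    persistentVolumeClaim:
      claimName: app-data
```

Because the storage requirement travels with the workload as a declaration, moving the container to another cluster or data center does not require rewriting it, only re-satisfying the claim there.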
There was a time when the specific machine from which you logged into a network determined what your experience would be, how your desktop would look, and where all your resources would be found. That time passed with the introduction of desktop virtualization.
With the advent of microservices in containers backed by software-defined storage, we have completed a process that began when desktop virtualization was new. We are no longer confined by which device we use, where an application runs, or how far it sits from its storage. We can optimize resource utilization by moving workloads to whatever location is most efficient at the moment.
Abstracting even the concept of the desktop, we find ourselves with virtualized devices running virtualized applications in virtualized containers running on virtualized storage across a virtualized network.
The move toward virtualized everything is supported by a number of industry data points: the explosive growth of OpenStack’s Infrastructure-as-a-Service proposition, for instance, and the rapid adoption of Red Hat’s OpenShift as a Platform-as-a-Service. We are also seeing virtualization technology converge with other layers of the infrastructure stack; a case in point is the close integration between Red Hat’s enterprise virtualization and software-defined storage technologies.
One could argue that virtualizing the infrastructure control plane and standardizing on x86 hardware has been one of the most subtle but critical market trends of our time. The evidence is clear: some traditional hardware players are desperately seeking to reinvent themselves, while others have left the low-margin business for greener pastures. In the end, customers win, because they pay for the smarts in the software rather than the tight bundling of software to hardware that kept them locked into IT vendors for years. In other words, the road to virtualized everything is paved with open everything.
Irshad Raihan is a product marketing manager at Red Hat Storage. Previously, he held senior product marketing and product management positions at HP and IBM. He is based in Northern California and can be reached on Twitter @irshadraihan.