Docker containers, backed by an unlikely group of allies, are suddenly the talk of the cloud community. What do containers represent in terms of IT's existing investment in VMware?
Containers emerged at the DockerCon event in San Francisco this week as a technology that is backed by a surprisingly broad spectrum of users, including Google, which says its search engine and all other applications run in containers.
Docker is a particular format for Linux containers that has caught on with developers since its inception 15 months ago. Both Amazon Web Services and Microsoft are moving quickly to make Docker containers welcome guests on their respective cloud hosts.
Containers, sometimes described as lightweight virtualization, promise to move software around more easily and level the playing field between clouds. Does that mean IT should abandon its adoption of virtual machines and replace them with containers? What do containers represent in terms of IT's existing investment in VMware and other hypervisor-based management?
One way to answer the question is to look at one of Docker's clearest predecessors, which casts light on what Docker means. Docker has nothing to do with hypervisors and little to do with Solaris, the first containerized operating system. Rather, it more closely resembles the simple Red Hat Package Manager, or RPM. Because open source code was frequently modified, Red Hat early on standardized how discrete modules of code could be packaged, assigning them issue dates and version numbers so that a package manager could check for compatibility with other modules and assemble thousands of modules into an operating system (Linux). The importance of RPM lies not in the technology -- which is fairly simple -- but in the agreement it enforces among Linux developers to work together in a standard way. Docker does something similar, only for complex applications and on a much larger scale.
In the future, containers are expected to be nested. A software component that makes up a layer in one container might be called by another in a remote location. An update to the same layer might be passed on to any other containers that use the same component.
Ben Golub, CEO of Docker Inc., the firm that sponsors the Docker project, likes to draw an analogy with a shipping container: Docker makes it possible to move software around and handle it in a predictable way. But "shipping" falls short of all that Docker enables on the operational front.
Docker creates a sandboxed runtime on the computer on which it lands. It occupies a defined memory space and has access only to specified resources. A container sets up networking for an application in a standard way and carries, as discrete layers, all the related software that the application needs. This tweet from Red Hat's Dan Walsh came out of the second day of the conference: "A container is like Vegas, what happens in a container stays in that container."
The one exception is that the application in the container must rely on its new host to provide the operating system, which the host already has. A restriction is that the version of the Linux kernel the application moved from must match the version it is moving to, a relatively simple standard to meet in exchange for a big gain in workload portability.
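The layering and host-kernel sharing described above can be sketched in a minimal Dockerfile; the base image, port, and application file below are hypothetical examples, not anything shown at the conference:

```dockerfile
# Each instruction below produces one discrete, reusable layer.
FROM ubuntu:14.04                 # operating-system layer; the kernel itself comes from the host
RUN apt-get update && \
    apt-get install -y python     # dependency layer, cached and shared across images
COPY app.py /opt/app/app.py       # application-code layer
EXPOSE 8080                       # declare the port the app's networking will use
CMD ["python", "/opt/app/app.py"] # the sandboxed process the container runs
```

Because each instruction is its own layer, a change to `app.py` rebuilds only the application-code layer; the operating-system and dependency layers are reused as-is.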
In addition to portability, Docker injects a DevOps flavor into the workload package. DevOps requires a higher level of cooperation between developers and operations managers. By accepting the Docker format, developers can produce code without worrying much about where it's going to run. Developers who change code can find their changes automatically tested and added to the correct layer of the Docker workload, without being burdened with maintenance. Operations managers can accept code that has already been tested, has been certified as formatted in a standard way, and is guaranteed to be isolated from other code in a production environment. With Docker, developers and operations -- two groups that have perennially been at war -- can sit down at a table where a truce could break out, making it easier for both sides to get their jobs done.
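That developer-to-operations handoff might look like the following Docker command sequence; the image and registry names are hypothetical, and a running Docker daemon is assumed:

```shell
# Developer side: package the app and its layers into an image, then publish it
docker build -t registry.example.com/team/webapp:1.4 .
docker push registry.example.com/team/webapp:1.4

# Operations side: pull the published image and run it, isolated, in production
docker pull registry.example.com/team/webapp:1.4
docker run -d -p 8080:8080 registry.example.com/team/webapp:1.4
```

The image tag is the contract between the two sides: developers publish a tested, versioned artifact, and operations runs exactly that artifact without rebuilding it.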
On the opening day of the conference, Microsoft CEO Satya Nadella tweeted about a blog post at Microsoft.com about Docker running on Azure, noting Docker was "developer goodness."
With IBM, Google, Rackspace, Red Hat, and many others backing the emergence of Docker containers, it wasn't surprising that Stuart Miniman, principal research contributor and tech analyst at Wikibon, said in another tweet: "Fun fact -- Docker currently has 42 employees. Is it the answer to life, the universe, and everything?"
If enterprise IT is already committed to virtualization, will Linux containers supplant it? Can Docker, with 42 employees, displace VMware?
Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld, and former technology editor of Interactive ...