Commentary
4/17/2014 12:44 PM
Charles Babcock

Red Hat Linux Containers: Not Just Recycled Ideas

Red Hat and its partner, Docker, bring DevOps characteristics to Linux containers, making them lighter-weight vehicles than virtual machines for cloud workloads.

Some people accuse Red Hat of dusting off an old idea, Linux containers, and presenting them as if they were something new. Well, I would acknowledge Sun Microsystems offered containers under Solaris years ago and the concept isn't new. But Docker and Red Hat together have been able to bring new packaging attributes to containers, making them an alternative that's likely to exist alongside virtual machines for moving workloads into the cloud.

And containers promise to fit more seamlessly into a DevOps world than virtual machines do. Containers can provide an automated way for the components to receive patches and updates -- without a system administrator's intervention. A workload sent out to the cloud a month ago may have had the Heartbleed vulnerability. When the same workload is sent in a container today, it's been fixed, even though a system administrator did nothing to correct it. The update was supplied to an open-source code module by the party responsible for it, and the updated version was automatically retrieved and integrated into the workload as it was containerized.
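The rebuild-to-patch workflow described above can be sketched with a hypothetical Dockerfile (the image tag, package, and application paths are illustrative, not from the article). Rebuilding the image re-resolves the base layers and package versions, so a rebuild after the Heartbleed fix ships the patched OpenSSL without a system administrator touching the application:

```dockerfile
# Hypothetical Dockerfile for the workload described above.
# Each "docker build" re-runs the update step against the
# distribution's current repositories, so a post-Heartbleed
# rebuild picks up the patched OpenSSL automatically.
FROM fedora:20
RUN yum -y update openssl && yum clean all
ADD ./myapp /opt/myapp
CMD ["/opt/myapp/run.sh"]
```

A month-old workload redeployed today would simply be rebuilt and shipped; the fixed library rides along in the new image as a matter of course.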

[Want to learn more about Linux containers? See Red Hat Announces Linux App Container Certification.]

That's one reason why Paul Cormier, Red Hat's president of products and technologies, at the Red Hat Summit this week, called containers an emerging technology "that will drive the future." He didn't specifically mention workload security; rather, he cited the increased mobility a workload gains when it's packaged inside a container. In theory at least, a containerized application can be sent to different clouds, with the container interface navigating the differences. The container checks with the host server to make sure it's running the Linux kernel that the application needs. The rest of the operating system is resident in the container itself.

Is that really much of an advantage? Aren't CPUs powerful enough and networks big enough to move the whole operating system with the application, the way virtual machines do? VMware is betting heavily on the efficacy of moving an ESX Server workload from the enterprise to a like environment in the cloud, the vCloud Hybrid Service. No need to worry about which Linux kernel is on the cloud server. The virtual machine has a complete operating system included with it.

Paul Cormier at Red Hat Summit 2014. (Source: Red Hat)

But that's one of the points in favor of containers, in my opinion. Sun used to boast how many applications could run under one version of Solaris. In effect, all the containerized applications on a Linux cloud host are sharing the host's Linux kernel and providing the rest of the Linux user-mode libraries themselves. That makes each container a smaller-sized, less-demanding workload on the host and allows more workloads per host.

Determining how many workloads per host is an inexact science. It will depend on how much of the operating system each workload originator decided to include in the container. But if a disciplined approach was taken and only needed class libraries were included, then a host server that can run 10 large VMs would be able to handle 100 containerized applications of similar caliber, said Red Hat CTO Brian Stevens Wednesday in a keynote at the Red Hat Summit.

It's the 10X efficiency factor, if Stevens is correct, that's going to command attention among Linux developers, enterprise system administrators, and cloud service providers. Red Hat Enterprise Linux is already a frequent choice in the cloud. It's not necessarily the first choice for development, where Ubuntu, Debian, and Suse may be used as often as Red Hat. When it comes to running production systems, however, Red Hat rules.

Red Hat has produced a version of Red Hat Enterprise Linux, dubbed Atomic Host, geared specifically to run Linux containers. Do we need another version of RHEL? Will containers really catch on? Will Red Hat succeed in injecting vigor into its OpenShift platform for developers through this container expertise?

We shall see. But the idea of containers addresses several issues that virtualization could not solve by itself. In the future, containers may be a second way to move workloads into the cloud when certain operating characteristics are sought, such as speed of delivery to the cloud, speed of initiation, and concentration of workloads using the same kernel on one host.

Can the trendy tech strategy of DevOps really bring peace between developers and IT operations -- and deliver faster, more reliable app creation and delivery? Also in the DevOps Challenge issue of InformationWeek: Execs charting digital business strategies can't afford to take Internet connectivity for granted.

Charles Babcock is an editor-at-large for InformationWeek, having joined the publication in 2003. He is the former editor-in-chief of Digital News, former software editor of Computerworld and former technology editor of Interactive Week. He is a graduate of Syracuse ...

Comments
alex_freedland
User Rank: Apprentice
4/18/2014 | 8:40:10 PM
A view from OpenStack perspective
Red Hat is looking to address a real problem: RHEL is unsuitable for fast-moving cloud ecosystems. Its release cadence puts it years behind the pace at which essential new capabilities are introduced into the world of cloud. By introducing containers, Red Hat is looking to create a smaller, faster-moving core, while keeping its old OS for legacy apps.

While this is an elegant and wise marketing move for Red Hat itself (one that will most definitely improve the agility and performance of workloads), it does not solve the fundamental disconnect between Red Hat's desire to impose its operating system as a standard for OpenStack and the community's need to have its own Linux kernel that develops at the same pace as OpenStack itself.

 
mthiele570
User Rank: Apprentice
4/18/2014 | 5:48:50 PM
Great discussion with no obvious answer: RHAT container good or container bad
When VMware first offered server virtualization, the more advanced users among us were having some of these same debates (was VMware the company real, was the tech real, was it not just recycled mainframe tech?). The reality is it's too early to tell whether RHAT containers will be either the company or the technology winner. However, what I will postulate is that this technology is an elegant midway technology to fill the gap between legacy virtualization and what's next (a better OS?). The fact is most virtualization (especially after all the extras are added) becomes too expensive, top-heavy, or both. I see the future of virtualized environments as more closely mirroring HPC environments, as it's a natural progression from where most of us are today. In the HPC(ish) world, the ability to use a lighter, more cost-effective solution like containers will likely have very strong appeal to many. If I have to prognosticate further, I'd say the biggest risk to RHAT is that an alternative hardware-abstraction solution will arise before containers have gained a big enough foothold to be considered the de facto solution, a la VMware.
Charlie Babcock
User Rank: Author
4/18/2014 | 3:35:31 PM
Containers, what they're good for, debated here
Piston founder and CTO Joshua McKenty summarizes the pros and cons of Docker containers in a set of slides, linked below in his perceptive comments. Thanks for additional light shed by Jonathan Feldman, CIO of the City of Asheville, N.C., and Joe Emison, CTO of BuildFax, in the discussion below. Joe points out drawbacks of containers and says they favor the systems admin over developers. I still lean in, Joe. Thanks to Rich Wolski, founder and CTO of Eucalyptus, who got this debate going with points on the strengths and weaknesses of virtual machines vs. containers. Rich tips his hat to remarks by CTO Brian Stevens at the Red Hat Summit, which concluded Thursday. Reuven Cohen, Enomaly founder in Toronto, is not too impressed with containers; he's now top tech advocate at Citrix. Glad to see TeaPartyCitizen join in, and don't forget Andrew Binstock, editor of Dr. Dobb's, all below. I know there are others who want to join this debate; please do so. Ah, Alex Freedland, CEO of Mirantis, did so. Thanks!
TeaPartyCitizen
User Rank: Apprentice
4/18/2014 | 3:28:42 PM
Re: Too many features, not enough benefits
It costs money. Duh.
jemison288
User Rank: Ninja
4/18/2014 | 3:02:58 PM
Re: UX Matters
Josh -- I agree with what you're saying, but you do gloss over the fact that developers favor "working" over "beautiful interfaces." The challenge with Docker and PaaS is that you end up having to spend a lot of time shoehorning configurations into them, and the "interface" becomes more of a blockade to functionality than an asset. (For trivial applications, this is not an issue, but developers generally aren't working with trivial applications.)
joshuamckenty
User Rank: Apprentice
4/18/2014 | 2:17:21 PM
UX Matters
I put together a slide deck on Docker for some of our investors and partners a few months back, and it's been useful in clarifying the confusion between containers (an infrastructure technology), Docker (a unified user experience around lightweight and introspected configuration management of containers), and various PaaS options (which deal with the myriad environment details of running multi-server applications, including scaling and upgrades).

http://www.slideshare.net/joshuamckenty/but-what-about-docker

People (whether they're so-called Developers or SysAdmins, an increasingly blurry line) don't use software - they use interfaces. And the Docker interfaces are beautiful. At least when you're getting started.

Whether Red Hat can, through any amount of either honest contribution or grandstanding, convince developers that the best way to consume either containers or PaaS (or VMs, for that matter) is to buy RHEL, remains to be seen.

To the best of my knowledge, no significant IT disruption has been successfully commercialized by the legacy vendors it disrupted. This is why Red Hat is the major Linux vendor, instead of IBM or Microsoft. It's why most folks buy Hadoop from Hortonworks or Cloudera.

Is PaaS an important and transformative IT disruption? Absolutely.

Are containers likely to play a part in that story? I'd give it even odds.

Will a legacy OS vendor such as Red Hat or Microsoft become a dominant player in the PaaS space? History tells us that it's unlikely.
jemison288
User Rank: Ninja
4/18/2014 | 1:43:27 PM
Re: Too many features, not enough benefits
This is classic "think like a sysadmin".  In the AWS world, I can have as many boxes as I want right now.  Why is the "box" a limiting factor?  If you're talking about "how many different environments on a physical machine", you're already losing to AWS.  The box doesn't matter.  What matters is enabling the developers.
TeaPartyCitizen
User Rank: Apprentice
4/18/2014 | 1:26:34 PM
Re: Too many features, not enough benefits
Using containers is the only way of putting Dev, QA, SIT, and Production on the same box. There is no configuring of a machine. With Docker it is bare-metal provisioning: one writes a script to provision the machine from bare metal. No hocus-pocus. The company is no longer truck-sensitive. It's infrastructure as code. No heavy hypervisors, no loading of an OS, and no redundant resources. Containers make life simpler for developers and DevOps. It could even be the differentiation Linux needs.
Laurianne
User Rank: Author
4/18/2014 | 11:30:42 AM
Re: Too many features, not enough benefits
Thanks for weighing in, Joe and Jonathan. Sounds like we better explore this subject with further opinion columns.
jfeldman
User Rank: Strategist
4/18/2014 | 10:12:45 AM
Re: Too many features, not enough benefits
To be clear, Heroku does call their dynos "containers." https://devcenter.heroku.com/articles/dynos