Rackspace Carina Promises Simplified Containerized Workloads - InformationWeek

News
11/16/2015
10:05 AM

Rackspace Carina Promises Simplified Containerized Workloads

Rackspace opens Carina service, which promises to make it simpler and faster to move containerized workloads to Rackspace and manage them once they are there.


Rackspace has launched in beta a container service called Carina that creates clusters for containerized workloads that customers bring to it. Rackspace systems and system operators then manage the cluster so the customer doesn't have to.

"Carina is a way we can make the container resource available very quickly. It's up and running much more quickly than a virtual machine," said Adrian Otto, distinguished architect at Rackspace and leader of the team that created Carina, in an interview.

Rackspace publicly announced Carina at the OpenStack Summit in Tokyo on Oct. 27.

Generating a virtual machine cluster and launching virtual machines onto it generally takes at least several minutes, Otto said. Carina can create a cluster in 45 seconds and launch a containerized workload a few seconds later.

The containers run on bare-metal Linux hosts. That means the Carina service can be used for workloads that need fast startups and shutdowns, perhaps faster than a VM-based public cloud can deliver.

Otto gave one example of a developer using an O'Reilly Media online computer-language course. The course presents the developer with a sample of code, which he tweaks as he wants, then clicks a button to run his creation. Since the course is hosted on Rackspace, Carina generates a Docker container with the code in it and launches it on a server. The developer sees the results of his coding almost immediately.

(Image: masterzphotois/iStockphoto)

It's a much more effective way to teach, but if it relied on pauses of several minutes, it would lose its immediacy and run up significantly higher costs, Otto said.

"The code executes in a few seconds, then the container goes away. There's no VMs sitting around idle, fully provisioned, waiting to start up," and incurring public cloud charges by doing so, he pointed out. A video demonstrating O'Reilly's use of Carina is posted on YouTube.

In an online development setting, containers can provide quick setup and teardown of dev/test environments.
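The ephemeral pattern Otto describes can be sketched with the standard Docker CLI (the image and command here are illustrative, not from the O'Reilly course):

```shell
# --rm removes the container as soon as its process exits,
# so nothing sits idle and fully provisioned afterward.
docker run --rm python:3 python -c "print('ran and vanished')"

# Afterward, listing all containers from that image shows no leftovers:
docker ps -a --filter ancestor=python:3
```

The `--rm` flag is what keeps the billing model tight: compute exists only for the seconds the code actually runs.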

In a blog post announcing Carina on Oct. 27, Otto noted that most public clouds running container services, which would include Amazon Web Services, launch them into virtual machines. "Virtual machines have both advantages and disadvantages. They cause application code to execute more slowly than it would on a bare-metal server environment. That's bad. They are easier to secure for multi-tenancy. That's good," he wrote.

[Want to learn about the latest in container security? See CoreOS Service Scans Containers For Vulnerabilities.]

Without containerization, it's very expensive to keep one customer's workload separated from another's by continually launching bare-metal servers dedicated to particular customers. "You end up adding a full server at a time. That's a lot of hardware!"

The Carina service, however, can provide isolation for applications in containers, which is sufficient for some types of workloads, such as the O'Reilly Media online courses or for dev and test of non-mission-critical code. Where isolation doesn't have to be the top priority, Carina provides "a way to scale up using smaller increments (of bare-metal capacity) that more closely tracks the needs of your dynamic workload," he wrote in his post.

Carina is integrated with Docker operations, so a customer with a containerized application stored as an image on Docker Hub can pull that image from a Carina account and launch it into the Rackspace Managed Cloud. Carina detects and builds the cluster needed to run the container. Such a move would constitute a normal workflow under the Carina service, Otto said in the interview.
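As a sketch of that workflow: Carina shipped a small command-line client alongside the standard Docker tools, so pulling a Docker Hub image into a Carina cluster looked roughly like the session below (cluster and image names are illustrative, and the exact `carina` subcommands are an assumption, not confirmed by this article):

```shell
# Ask Carina to build a cluster (well under a minute, per
# Rackspace), then point the regular Docker client at it.
carina create mycluster
eval "$(carina env mycluster)"    # export DOCKER_HOST and TLS settings

# From here, ordinary Docker commands run against the cluster:
docker pull nginx
docker run --detach --publish 80:80 nginx
```

The point of the design is in the last two lines: once the environment variables are set, the customer's existing Docker habits carry over unchanged.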

Amazon offers its EC2 Container Service and Google, an early expert in container use, offers Google Container Engine on its public cloud. But Otto said Carina is designed for ease of use by container newcomers through its integration with Docker tools and the Docker Hub.

"One thing we showed in Tokyo was my 10-year-old son, Jackson, demo-ing the (Carina) software. If my 10-year-old son can do it, any developer can do it," said Otto.

A Google customer becomes a user of Kubernetes, the basis for Google's internal container management and now an open source project. Otto suggested that not every container user would necessarily want to become a long-term Kubernetes user.

In addition to being a distinguished architect at Rackspace, Otto is project team lead of Magnum, the container-management module that's part of OpenStack. Because Carina is based on the concepts behind Project Magnum, it can offer its own container-cluster launching mechanism. It can offer Docker Swarm cluster management and other Docker tools, Apache Mesos cluster management, and open source Kubernetes.

Magnum is OpenStack's container service for public cloud operators. It was designed not to favor one operator's approach over another's, Otto said. Different management systems may be integrated into it.
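Since Magnum was designed to be neutral among cluster managers, the choice of Swarm, Kubernetes, or Mesos was just a flag in the client of that era. A rough sketch, assuming the 2015-era "baymodel"/"bay" terminology (command names and flags here are an assumption, abbreviated for illustration):

```shell
# A "baymodel" is a cluster template; --coe selects the
# container orchestration engine: swarm, kubernetes, or mesos.
magnum baymodel-create --name swarm-template --coe swarm \
    --image-id fedora-atomic --keypair-id mykey \
    --external-network-id public

# A "bay" is a running cluster built from that template.
magnum bay-create --name swarm-bay --baymodel swarm-template --node-count 2
```

Swapping `--coe swarm` for `--coe kubernetes` is what "not favoring one operator's approach over another's" means in practice.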


Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld, and former technology editor of Interactive ...
