Containerization Can Render Apps More Agile Painlessly

Containerization is expanding in IT architectures, and so is IT teams' need to be trained on it.

Mary E. Shacklett, President of Transworld Data

August 6, 2024

4 Min Read

Software-based containers were first commercialized in the early 2000s. The idea of containerizing applications has grown ever since because software-based containers can speed application deployment and, in many cases, minimize IT time. 

The market reflects this. The percentage of companies adopting application containerization is annually growing by double digits. And why wouldn’t it, with its compelling value proposition of virtualizing both hardware and operating system integrations? 

A container built to run on Linux packages all the executables, binary code, libraries, and configuration files needed to run an application, while sharing the host's OS kernel rather than bundling its own. All a software developer needs to do is load an application into the container and, in an ideal world, no further "hand integration" or tweaking is needed. In May, IBM cited Forrester research showing that 74% of companies were adopting containers for use in both on-premises and cloud environments. 
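As an illustration, a Dockerfile declares everything a container image bundles on top of that shared kernel. This is a minimal, hypothetical sketch; the base image, file names, and commands are placeholders, not a prescription:

```dockerfile
# The base image supplies the Linux userland (libraries, binaries),
# not the kernel, which the container shares with its host.
FROM python:3.12-slim

# The application's dependencies and configuration travel inside the image.
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# The application itself.
COPY app.py .
CMD ["python", "app.py"]
```

Because the image carries its own libraries and configuration, the same image runs unchanged on any host with a compatible kernel and container runtime.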

That said, there are challenges in using containers that IT should address. 

While containerization can render applications more agile and eliminate painful infrastructure integrations, the integration challenges move elsewhere.

These are the issues that IT must address: 


  • Do I have the right tools to manage containers, and if not, what new tools do I need, and how do I integrate knowledge of these tools into current IT skillsets? 

  • What network and system best practices do I need to modify for managing containers, and how do I integrate these new practices into our present methodology? 

  • Are there ways to automate container operations, and how does this new automation impact current practice? 

  • How do containers change my philosophy about saving and recovering data? 

  • Do we risk creating more IT resource waste? 

  • What is the impact on our existing security and zero-trust network schemes? 

That’s a lot to digest, and it suggests that IT organizations should be planning exactly how they will integrate widespread use of containers into their existing IT infrastructure and practices. 

Developing a Successful Container Environment 

The key to container deployment is to think holistically about the process. 

How does containerization impact application development and deployment, staff skillsets, IT infrastructure, governance, and choice of tools? 

Application development and deployment methods will change because the app developer no longer has to think about integrating an app with an underlying operating system and its associated infrastructure; the container image already carries the correct configuration of these elements. A developer who wants to deploy the same containerized app immediately across on-premises and cloud environments can do so, provided the hosts run a compatible OS. 


The tougher job belongs to the system and network staff members who must support the containers. 

For instance, how do you maintain consistency within each container when there are so many dynamic interactions occurring among the many bits and pieces of software that have been configured within each container? Can you do it with your existing tools? 

Most IT staff have found that they need specialized tools for container management and that they can't rely on the tools they are accustomed to. Orchestration platforms such as Kubernetes and Docker, and vendors such as Dynatrace, provide container management and monitoring tools, but mastering them requires IT staff to be trained. 

Security and governance also present challenges in the container environment because each container bundles its own copy of system libraries and binaries (while sharing the host's OS kernel). If a security vulnerability is discovered in a base image, every container image built from it must be rebuilt with the patched base and redeployed to resolve the vulnerability. 

In cases like this, it’s ideal to have a means of automating the fix process, but it might be necessary to do it manually at first. 
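Even the manual version of that fix process is usually a short sequence of commands. The sketch below assumes a Docker registry and a Kubernetes deployment; the registry host, image tag, and deployment name are all placeholders:

```shell
# Rebuild the image, forcing a fresh pull of the patched base image.
docker build --pull -t registry.example.com/app:patched .
docker push registry.example.com/app:patched

# Roll the patched image out across the cluster, then wait for completion.
kubectl set image deployment/app app=registry.example.com/app:patched
kubectl rollout status deployment/app
```

Wrapping these steps in a CI pipeline that triggers on base-image updates is one common way to automate the process later.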


Additionally, the more containers you deploy in your infrastructure, the broader the attack surface you expose to intruders. One way to address this is a zero-trust network approach that places boundaries around certain containers or clusters of containers, restricting access to authorized users and workloads only and thereby reducing security risk. 
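In Kubernetes, for example, such a boundary can be expressed as a NetworkPolicy. This is a hedged sketch; the namespace, labels, and port are hypothetical:

```yaml
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: restrict-payments
  namespace: payments           # hypothetical namespace
spec:
  podSelector:
    matchLabels:
      app: payments-api         # the containers inside the boundary
  policyTypes:
    - Ingress
  ingress:
    - from:
        - podSelector:
            matchLabels:
              role: frontend    # only these workloads may connect
      ports:
        - protocol: TCP
          port: 8443
```

All other traffic to the selected pods is denied by default once the policy is in place, which is the zero-trust posture described above.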

Finally, there are the issues of data preservation and recovery, along with the prevention of IT resource waste. 

Eliminate IT resource waste by removing containers when they are no longer needed. However, IT should bear in mind that when a container is deleted, any data stored within it disappears, too. To prevent this, data backup and recovery routines should be developed. Containerization vendors have thought about this: many now offer persistent storage options that keep data outside the container so it survives the container's deletion. 
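In Kubernetes, for instance, persistent storage is requested with a PersistentVolumeClaim and mounted into the container; the data then outlives any individual container. The names, sizes, and paths below are hypothetical:

```yaml
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: app-data
spec:
  accessModes: ["ReadWriteOnce"]
  resources:
    requests:
      storage: 10Gi             # hypothetical size
---
apiVersion: v1
kind: Pod
metadata:
  name: app
spec:
  containers:
    - name: app
      image: registry.example.com/app:latest   # hypothetical image
      volumeMounts:
        - name: data
          mountPath: /var/lib/app              # data here survives container deletion
  volumes:
    - name: data
      persistentVolumeClaim:
        claimName: app-data
```

Deleting the pod removes the container, but the claim and its data remain until the claim itself is deleted.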

Wrapping It Up 

Containerization has the potential to transform work in application development and IT infrastructure support. It will fit well with faster time-to-market methodologies like Agile, and both the end business and IT should reap the benefits.  

The job at hand is to equip IT staff so it can optimize this innovative technology and ready itself for the IT infrastructure shift that containerization will undoubtedly drive. 

About the Author

Mary E. Shacklett

President of Transworld Data

Mary E. Shacklett is an internationally recognized technology commentator and President of Transworld Data, a marketing and technology services firm. Prior to founding her own company, she was Vice President of Product Research and Software Development for Summit Information Systems, a computer software company; and Vice President of Strategic Planning and Technology at FSI International, a multinational manufacturer in the semiconductor industry.

Mary has business experience in Europe, Japan, and the Pacific Rim. She has a BS degree from the University of Wisconsin and an MA from the University of Southern California, where she taught for several years. She is listed in Who's Who Worldwide and in Who's Who in the Computer Industry.
