The role of containers in DevOps

25 Feb 2025 - 6 min read

DevOps relies on containers more than any other tool or technology.

Interestingly, containers were not created for DevOps. With a history going back half a century, containerization was developed to share computing resources across users and applications.

A brief history of containers

Created in 1979, chroot is one of the pioneering implementations of container-like isolation in Unix. chroot could create separate running environments for applications by segregating the file system.

Subsequently, containerization implementations like Solaris Containers, Linux Containers (LXC), and LMCTFY (Let Me Contain That For You, created by Google) came to light. All these container implementations enabled isolation of applications at various levels inside the OS.

In 2013, Docker, Inc. released an open-source tool set for packaging containers to run on LXC. These tools could package the software runtime and all dependencies of an application into a single artifact called a Docker image.

A Docker image did not include the operating system, so it was lighter than the OS images used for creating virtual servers. On the other hand, unlike a standalone binary package, a Docker image was self-sufficient: it included all the dependencies needed to run the software on the target platform. It solved one of the notorious problems in the software community - "But it works on my machine".

The Docker tool set quickly became popular thanks to this efficiency of Docker images.

In 2014, one year after releasing their tool set, Docker released libcontainer - a new open-source container runtime that was more efficient and flexible than LXC.

Later, other organizations like Google abandoned their own container implementations and started contributing to libcontainer.

The release of Kubernetes (open-sourced in 2014, reaching 1.0 in 2015) fueled the adoption of containers as the standard mechanism for running software applications at large scale.

Docker also created the containerd project and donated it to the CNCF to streamline the development and evolution of the container runtime; libcontainer lives on in runc, the low-level runtime that containerd drives.

Today, containerd powers Docker Engine as well as Kubernetes, creating a platform that runs millions of software applications.

How containers support DevOps

Containerization is an integral part of today's DevOps. Here are 4 DevOps practices that explicitly rely on containers.

DevOps practice #1: Architect applications in microservices

Microservices architecture splits a complex application into smaller manageable modules called microservices.

Think of a big ecommerce application. It has to authenticate users, manage inventory, manage shopping carts for individual users, do payment processing, and a lot more.

Instead of bundling all these functions into one big application, you can build each function as a separate module, or microservice. Developers can then work on different microservices independently, without worrying about what the other developers are doing. One developer can add a new feature to the shopping cart while another fixes a bug in payment processing.

An application architected as microservices needs a mechanism for shipping each microservice independently of the others, and containers serve this purpose perfectly.
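As an illustration, each microservice gets its own image definition. The Dockerfile below is a minimal sketch for a hypothetical Python-based cart service (the file names and registry path are invented for illustration):

```dockerfile
# Hypothetical Dockerfile for a standalone cart microservice.
# The cart service builds and ships on its own schedule,
# independent of the payment or inventory services.
FROM python:3.12-slim

WORKDIR /app

# Install only this service's dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the service code and define how it starts
COPY . .
CMD ["python", "cart_service.py"]
```

Because each microservice carries its own Dockerfile, a change to the cart service triggers a rebuild of only the cart image - the other services' images stay untouched.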

DevOps practice #2: Continuous integration

Continuous integration is the practice of merging and building software in small, frequent increments.

Going back to our example of the big ecommerce application, assume a bunch of developers are working on different features and bug fixes in parallel. Developers complete their tasks and individually submit code to the central code repository.

In the traditional software development model, you would wait for a fixed time period (one month, three months, etc.) to merge all these changes and release a new version of the software.

When practicing DevOps with continuous integration, you don't wait that long. Instead, you merge the changes, build, and release a new version of the software on every commit.

Continuous integration depends on your ability to build the software quickly and efficiently.

Containers, with their small image sizes, are quick to build and easy to move around. So containers have become the most popular - or rather the standard - medium for releasing software when practicing DevOps.
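A CI pipeline that builds and publishes an image on every commit can be sketched in a few lines. The example below uses GitHub Actions syntax; the registry URL and image name are assumptions, not references to a real project:

```yaml
# Hypothetical GitHub Actions workflow: build and push a
# container image for every commit to the main branch.
name: ci
on:
  push:
    branches: [main]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Tag the image with the commit SHA so every build is traceable
      - name: Build the container image
        run: docker build -t registry.example.com/shop/cart:${{ github.sha }} .

      - name: Push the image to the registry
        run: docker push registry.example.com/shop/cart:${{ github.sha }}
```

Tagging each image with the commit SHA ties every artifact in the registry back to the exact code change that produced it.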

DevOps practice #3: Continuous testing

DevOps practices emphasize automated testing to ensure software quality.

A software application has to go through multiple testing stages - unit testing, smoke testing, regression testing, integration testing, etc. - before being released to production. To support this type of testing, you need to deploy the software several times across many testing environments.

Successful continuous testing relies on software that can be deployed reliably and repeatedly in these testing environments.

Because containers are self-contained, you can replicate containerized software across multiple testing environments with the guarantee that the software instances in all these environments are identical. There are no configuration or version mismatches to cause chaos during testing.

Once testing is completed, you can use the same container images to deploy the software in production. What you deploy to production would be identical to what you tested.
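One common way to guarantee this is to reference the exact same image tag in every environment's deployment config, varying only environment-specific settings. A minimal docker-compose sketch, with an invented registry, image names, and versions:

```yaml
# Hypothetical compose file used for the QA environment.
# Production references the identical image tags, so the
# tested artifact and the shipped artifact are the same bytes.
services:
  cart:
    image: registry.example.com/shop/cart:1.4.2
    environment:
      APP_ENV: qa        # only environment-specific settings differ
  payments:
    image: registry.example.com/shop/payments:2.0.1
    environment:
      APP_ENV: qa
```

Swapping `qa` for `staging` or `prod` changes the configuration, never the software itself.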

DevOps practice #4: Continuous delivery

Continuous integration builds a new version of the software every time a developer commits code to the main repository. Continuous testing ensures that each new version is thoroughly tested before being released to production. Continuous delivery deploys each new release to production as soon as the software passes the testing.

When practicing continuous delivery on a large and complex application, it's quite common to release new versions several times a day - or even hundreds of times a day.

Containers really shine here.

Containers are quick to start, with startup times measured in seconds. You can deploy hundreds of containers to production in a few minutes. If required, rolling back is simple: just terminate the new containers and spin up the previous version.

Platforms like Kubernetes, coupled with deployment tools like Argo CD, implement deployment strategies like canary and blue-green deployments, taking container deployment to a more advanced level. Using these strategies, you can migrate traffic incrementally to the new version with minimum disruption.
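As one illustration, a canary strategy can be expressed declaratively with Argo Rollouts, the Argo project's progressive-delivery controller for Kubernetes. The service name, image, and traffic weights below are invented; this is a sketch, not a production manifest:

```yaml
# Hypothetical Argo Rollouts manifest: shift traffic to the
# new version in steps instead of all at once.
apiVersion: argoproj.io/v1alpha1
kind: Rollout
metadata:
  name: cart
spec:
  replicas: 5
  strategy:
    canary:
      steps:
        - setWeight: 20            # send 20% of traffic to the new version
        - pause: {duration: 10m}   # watch metrics before continuing
        - setWeight: 50
        - pause: {duration: 10m}
        # completing all steps promotes the new version to 100%
  selector:
    matchLabels:
      app: cart
  template:
    metadata:
      labels:
        app: cart
    spec:
      containers:
        - name: cart
          image: registry.example.com/shop/cart:1.4.3
```

If metrics degrade during a pause, the rollout can be aborted and traffic returns to the stable version - the container-level equivalent of the simple rollback described above.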

Wrapping Up: Containers and the CI/CD pipeline

Continuous delivery, together with continuous integration and continuous testing, makes up the CI/CD (Continuous Integration and Continuous Delivery) pipeline - a key enabler for DevOps practices.

What makes CICD work?
It's containers.
