
Want efficient CI and CD? Better start using containers

Containers have become popular tools for DevOps teams pursuing CI and CD because they make application delivery more consistent and efficient.

What does continuous integration mean, and what does it have to do with containers? Those are questions worth asking if you're seeking to understand how two of the hottest buzzwords in IT today fit together.

This article explains the role containers play in a software delivery chain that facilitates continuous integration (CI) and continuous delivery (CD), the big buzzwords popularized by the DevOps movement.

Defining CI and CD

Continuous integration refers to the practice of integrating code changes into an application on a continuous basis. CI is defined in opposition to waterfall software development techniques, in which developers would wait until a large set of code updates was complete before merging it into the main codebase of an application.

The primary motivation for CI is that it allows developers to find problems when they are still easy to fix. Without CI, you run the risk of writing a bunch of new code then merging it into your app's codebase, only to find that the code changes create a problem and need to be rewritten. With CI, however, small changes are merged on a frequent basis into the application and tested immediately.

That means that if something breaks as the result of a code change, it's easy to pinpoint exactly what caused the problem. It's also easy to roll the application back without losing a lot of other work, because only one small change has to be reverted in order to return the codebase to a functional state. Last, but not least, CI enables multiple developers to work in parallel on the same codebase -- one developer does not have to wait on another to integrate and test new code before other new code can be written.


In practice, CI is usually not fully continuous. Developers tend to merge changes into an application and test them once a day, or sometimes more frequently than that, but they rarely do so every single time they make a change. That would not make much sense, since you can't rebuild and test an entire application each time you write a single line of new code. So, in the context of CI, "continuous" should be understood as a relative term.
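To make the idea concrete, here is a minimal sketch of what a CI pipeline definition might look like. The syntax follows GitHub Actions, and the workflow name, build command and test command are all hypothetical assumptions, not anything prescribed by a particular team:

```yaml
# Hypothetical CI workflow: build and test the application on every push,
# so each small change is integrated and verified right away.
name: ci
on: [push]

jobs:
  build-and-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build the application
        run: make build     # assumed build command
      - name: Run the test suite
        run: make test      # assumed test command
```

Because every push triggers the same build-and-test job, a breaking change is caught within minutes of being merged -- the property described above.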

Continuous delivery extends the CI concept beyond the development and testing sections of the software delivery pipeline. Whereas CI entails merging and testing code changes on a frequent basis, CD entails delivering those changes to users according to the same near-continuous rhythm.

In other words, under the CD model, you don't wait until you have written and tested a substantial number of changes to your application before declaring a new version and telling users to update. Instead, you allow them to update each time any meaningful change is introduced, even a relatively small one.

In practice, CD doesn't usually mean that users get a new version of an application every day. But they might get a new build every week or every month. That rate is much quicker than the pace of waterfall development, in which new software releases can take years to appear. (For example, think of how long it took Microsoft to get from Windows XP, which debuted in 2001, to Windows Vista, which came out in 2006.)

Containers, CI and CD

Now, here's the big question: What do containers have to do with CI and CD? Historically, very little.

It's worth noting that the concepts of CI and CD emerged before Docker was introduced in 2013. And while other types of container platforms existed prior to Docker's debut, very few people were using containers before that time. If they did use them, it was usually as an alternative to hypervisor-based virtualization, not as a part of a software delivery chain.

However, once Docker brought container-based application deployment into the mainstream, DevOps teams quickly realized how containers could help them achieve CI and CD.

Why? Because containers -- specifically, Docker application containers -- simplify the process of testing and deploying software. If you design your application to run inside a container, you can build it inside a container, test it inside a container and deliver it to users inside a container. This reduces complexity and adds consistency to the delivery chain. The result is more stability and predictability and, ultimately, fewer unexpected disruptions that could stop CI or CD processes from being continuous.
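As a rough illustration of building, testing and delivering in the same container environment, here is a hypothetical multi-stage Dockerfile for a Node.js application. The base images, file names and commands are assumptions for the sake of the sketch:

```dockerfile
# Stage 1: build and test inside the container environment
FROM node:20 AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci            # install dependencies
COPY . .
RUN npm test          # run the test suite in the same environment

# Stage 2: the slimmer image that is actually delivered
FROM node:20-slim
WORKDIR /app
COPY --from=build /app .
CMD ["node", "server.js"]
```

Running `docker build -t myapp:1.0 .` then produces the same image whether it is built on a developer laptop or a CI server, which is where the consistency benefit comes from.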

Containers also make it easier to update an existing application whenever new code changes are introduced. When an application runs inside a container, updating it entails simply updating the container image, then spinning up new containers based on the updated image. That's easier and less disruptive to users than requiring an older version of the application to be uninstalled and replaced with a new one.
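The update flow described above can be sketched with Docker Compose; the service name, image tag and port are hypothetical:

```yaml
# docker-compose.yml: to ship a new version, change the image tag
# (e.g. myapp:1.0 -> myapp:1.1) and run `docker compose up -d`.
# Compose replaces the running container with one created from the new image.
services:
  web:
    image: myapp:1.1
    ports:
      - "8080:8080"
```

Users keep connecting to the same service endpoint; nothing has to be uninstalled on their side.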

Other types of technologies could help do similar things. For example, traditional virtual machines also simplify development and delivery by providing more consistency between testing and deployment environments. But virtual machines don't offer as much consistency as containers. There are multiple types of virtual machine formats, and virtual machines can run a host of different operating systems.

With containers, in contrast, the application environment is essentially identical. A Docker container is a Docker container, no matter what kind of server it runs on or how the application itself is configured. To be sure, there could be small differences between Docker formats depending on which version of Docker is being used, and environment-specific factors such as networking configuration can also vary between Docker environments. But the degree of inconsistency in these regards is still an order of magnitude lower than what you have when you run virtual machines.

The role of containers in CI and CD has become even more pronounced in recent years, as continuous integration servers -- which automate the process of integrating code changes into a codebase and testing them -- have added direct support for containers, and as production environments have become containerized. These changes mean that containers facilitate CI and CD not only in theory, but also in practice, in real-world delivery chains.

Conclusion

Containers have become the most effective building blocks for modern software delivery chains. They're not perfect, but they introduce a new level of efficiency for organizations seeking to write, test and deliver code to users as continuously as possible.


This was last published in February 2017
