What are Containers?

“Containers” and “containerization” are two buzzwords that have been springing up everywhere in technology discussions over the past few years, and the movement towards containers has been rapidly gaining momentum in enterprise-level implementations. The effect that containerization is having on the application development landscape is significant, and there are a lot of new concepts to understand in the container stack; with that in mind, let’s take a high-level look at what containers are and how they work.

The most common metaphor used to describe application containers, shockingly enough, is that of a standard shipping container. Just as shipping containers standardize the footprint and interface for cargo to be packaged, loaded, and moved around a freighter, an application container wraps your custom application in a package with a consistent structure and interface. You package everything your application needs to run into a single atomic container, which can then be easily started, stopped, copied, moved, or otherwise manipulated.

What makes this concept so compelling is that when you run a container, you know exactly how it will run; a container is predictable, easily duplicated, and immutable. It behaves the same way no matter which machine you deploy it on, so there are no nasty surprises. As organizations move towards microservices architectures, containers are becoming irreplaceable components of overall enterprise architecture.

Docker vs Kubernetes

On the surface, Docker and Kubernetes appear to be similar technologies: they both help you run applications within containers. Dig a little deeper, however, and you will find that they work at different levels of the stack and actually complement each other. A good understanding of both Docker and Kubernetes is essential in today’s development landscape, especially if you want to build and run a modern, microservices-based containerized application.

Build and run a container with Docker

Docker is an open source tool that performs operating-system-level virtualization, also known as “containerization”, and helps you create, deploy, and run applications using containers. The key aspect of containerization is that a container can run anywhere: on your laptop, on a server in your data center, on a friend’s machine, or on a server in the cloud. With Docker, you create a file called a Dockerfile that defines how the container is provisioned. When you run the docker build command from a directory containing a valid Dockerfile, Docker executes the instructions in the Dockerfile and produces a Docker image; this image can be treated as a versioned artifact. You can then start that image with the docker run command on any platform where the Docker daemon is supported and running.
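To make that concrete, here is a minimal sketch of what a Dockerfile and the corresponding build and run commands might look like for a small Node.js service; the base image, port, and file names are assumptions for illustration, not taken from any particular project:

    # Dockerfile - defines how the image is provisioned
    FROM node:18-alpine            # start from a public base image
    WORKDIR /app                   # working directory inside the container
    COPY package*.json ./          # copy dependency manifests first for better layer caching
    RUN npm install                # install dependencies into the image
    COPY . .                       # copy the rest of the application source
    EXPOSE 8080                    # document the port the application listens on
    CMD ["node", "server.js"]      # command executed when the container starts

Building and running the image then comes down to two commands:

    docker build -t my-app:1.0 .            # produce a versioned image from the Dockerfile
    docker run -d -p 8080:8080 my-app:1.0   # start a container from that image, mapping port 8080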

You can also download pre-built images from the shared Docker Hub registry, which provides images for a myriad of applications, and images that you create yourself can be distributed via Docker Hub as well.
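For example, pulling a public image and publishing one of your own takes only a few commands; the myorg/my-app name below is a placeholder for your own Docker Hub namespace:

    docker pull nginx:latest                  # download a pre-built image from Docker Hub
    docker tag my-app:1.0 myorg/my-app:1.0    # re-tag a local image under your Docker Hub namespace
    docker push myorg/my-app:1.0              # publish it (requires docker login) so any host can pull it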


Docker also has its own native container orchestration tool, Docker Swarm (or simply Swarm). It lets you deploy a group of containers as a swarm that you can interact with as a single unit, and any software, services, or tools that run in Docker containers run equally well in Swarm. Swarm turns a pool of Docker hosts into a single virtual host; it is especially useful for teams that are getting comfortable with an orchestrated environment, or that want a simple deployment technique but still need to run multiple Docker containers across more than one host.
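As a rough sketch, standing up a small swarm and deploying a replicated service could look like the following; the service name, image, and replica count are illustrative assumptions:

    docker swarm init                  # turn the current Docker host into a swarm manager
    docker swarm join-token worker     # print the command other hosts run to join the swarm as workers
    docker service create --name web --replicas 3 -p 80:80 nginx   # run three replicas of a service across the swarm
    docker service ls                  # list services and check that all replicas are running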

Manage Docker Containers with Kubernetes

Spinning up containers is a great way to scale your enterprise architecture. However, once you run many containers across your infrastructure, you will find that Docker alone isn’t enough. Especially in production and production-like environments, where you need to scale your application up and down quickly, you need something to manage that process for you: starting the right containers at the right time, setting up communication channels between containers, handling resource allocation, and dealing with failed containers or hardware. Doing all of this manually would be a Herculean task; that’s where Kubernetes comes in.


Kubernetes is an open source container orchestration platform originally created by Google. It addresses many of the challenges of managing large numbers of containerized applications, including scaling up and down, managing inter- and intra-container traffic, and provisioning and automation across clusters of container hosts. This production-ready, enterprise-grade, self-healing (auto-scaling, auto-replication, auto-restart, auto-placement) platform is modular, so it can be adapted to almost any deployment architecture. Kubernetes also distributes application load amongst containers, and it aims to alleviate many of the problems of running large-scale, distributed applications in private and public clouds by placing related containers into groups and managing them as logical units.
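As a minimal, hedged sketch of what this looks like in practice, a Kubernetes Deployment manifest declares the desired state (image, replica count) and Kubernetes keeps the cluster in line with it; the names and image below are illustrative assumptions:

    # deployment.yaml - desired state: three replicas of a containerized web application
    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: web
    spec:
      replicas: 3
      selector:
        matchLabels:
          app: web
      template:
        metadata:
          labels:
            app: web
        spec:
          containers:
          - name: web
            image: myorg/my-app:1.0    # placeholder image name
            ports:
            - containerPort: 8080

Applying the manifest and adjusting the scale is then a matter of a few kubectl commands:

    kubectl apply -f deployment.yaml                                       # create or update the Deployment
    kubectl scale deployment web --replicas=5                              # scale up manually...
    kubectl autoscale deployment web --min=3 --max=10 --cpu-percent=80     # ...or autoscale on CPU utilization

If a container or host fails, Kubernetes notices the drift from the declared replica count and reschedules containers to restore it.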

Summary

As outlined in this blog post, Docker and Kubernetes are excellent tools for packaging, deploying, and managing containerized applications while reducing uncertainty about how an application will behave, regardless of the deployment topology. While they are related, in that both tackle challenges presented by the advent of containerization, they operate at fundamentally different levels of the stack. Given the very large feature set and ease of deployment, it is no wonder that enterprises are moving towards leveraging containers for their applications.

If you have questions on how you can best leverage our experience, would like further examples, or need help with your cloud-based environment, check out our blog post on 5 Common Gotchas with Enterprise Serverless Development. Please engage with us via the comments on this blog post, or reach out to us.

Additional Resources

You can also continue to explore our Cloud Expertise by checking out the following posts.

Creating a Web Application with Azure App Service
Serverless and PaaS, FaaS, SaaS: Same, Similar or Not Even Close?

Troubleshoot HTTP 502.5 Process Failure Error in Azure API App
Azure Cosmos DB Features and Benefits
