What is a container?
Standardized, portable packaging for your applications.
Containers explained
Just as the shipping industry uses physical containers to isolate different cargos for transport on ships and trains, software development increasingly uses an approach called containerization.
A standard package of software—known as a container—bundles an application’s code together with the related configuration files and libraries, and with the dependencies required for the app to run. This allows developers and IT pros to deploy applications seamlessly across environments.
Why you should care about containers
The problem of an application failing to run correctly when moved from one environment to another is as old as software development itself. Such problems typically arise from differences in configuration, underlying library requirements, and other dependencies.
Containers address this problem by providing a lightweight, immutable infrastructure for application packaging and deployment. An application or service, its dependencies, and its configuration are packaged together as a container image. The containerized application can be tested as a unit and deployed as a container image instance to the host operating system.
This way, containers enable developers and IT professionals to deploy applications across environments with little or no modification.
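As a rough sketch of what "packaged together as a container image" looks like in practice, the snippet below uses the Docker SDK for Python to build an image from an application directory and run it as a container instance. Docker and this SDK are not named in the article, and the ./app path and myapp:1.0 tag are hypothetical; it is one possible workflow, not the only one.

```python
# Minimal sketch, assuming Docker Engine is running locally and ./app contains
# a Dockerfile that bundles the application code, configuration, and
# dependencies. The image tag "myapp:1.0" is hypothetical.
import docker

client = docker.from_env()

# Build a container image from the application directory.
image, build_logs = client.images.build(path="./app", tag="myapp:1.0")

# Run the image as a container instance on this host; the same image can be
# pushed to a registry and run unchanged in other environments.
container = client.containers.run("myapp:1.0", detach=True)
print(container.short_id)
```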
Container vs. virtual machine
When people think about virtualization, virtual machines (VMs) often come to mind. In fact, virtualization can take many forms, and containers are one of those. So what's the difference between VMs and containers?
At a high level, VMs virtualize the underlying hardware so that multiple operating system (OS) instances can run on the hardware. Each VM runs an OS and has access to virtualized resources representing the underlying hardware.
VMs have many benefits. These include the ability to run different operating systems on the same server, more efficient and cost-effective utilization of physical resources, and faster server provisioning. On the flip side, each VM contains an OS image, libraries, applications, and more, and therefore can become quite large.
A container virtualizes the underlying OS and causes the containerized app to perceive that it has the OS—including CPU, memory, file storage, and network connections—all to itself. Because the differences in underlying OS and infrastructure are abstracted, as long as the base image is consistent, the container can be deployed and run anywhere. For developers, this is incredibly attractive.
Since containers share the host OS, they don’t need to boot an OS or load libraries. This enables containers to be much more efficient and lightweight. Containerized applications can start in seconds, and many more instances of the application can fit onto the machine as compared to a VM scenario. The shared OS approach has the added benefit of reduced overhead when it comes to maintenance, such as patching and updates.
Though containers are portable, they’re constrained to the operating system they’re defined for. For example, a container for Linux can’t run on Windows, and vice versa.
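One way to see the shared-OS model is to compare the kernel reported inside a container with the kernel of the host it runs on. The sketch below, again assuming the Docker SDK for Python on a Linux host and the public alpine image, shows that the container reports the host's kernel because it virtualizes the OS rather than the hardware.

```python
# Sketch, assuming a Linux host with Docker Engine and the docker Python SDK.
import platform

import docker

client = docker.from_env()

host_kernel = platform.release()
container_kernel = client.containers.run("alpine", "uname -r", remove=True)

# Both lines print the same kernel version: the container shares the host OS.
print("host kernel:     ", host_kernel)
print("container kernel:", container_kernel.decode().strip())
```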
Why containers
Agility
When developers build and package their applications into containers and hand them to IT to run on a standardized platform, the overall effort to deploy applications drops and the whole dev and test cycle is streamlined. It also improves collaboration and efficiency between development and operations teams, helping them ship apps faster.
Portability
Containers provide a standardized format for packaging and holding all the components necessary to run the desired application. This solves the typical problem of “It works on my machine” and allows for portability between OS platforms and between clouds. Any time a container is deployed anywhere, it executes in a consistent environment that remains unchanged from one deployment to another. You now have a consistent format, from dev box to production.
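The consistency from dev box to production comes from the fact that every environment pulls and runs the same image. A brief sketch, assuming the Docker SDK for Python and a hypothetical registry path registry.example.com/myapp:1.0:

```python
# Sketch: the same image pulled from a registry runs identically on a dev box,
# a build agent, or a production host. The registry path is hypothetical, and
# the image is assumed to define a default command.
import docker

client = docker.from_env()

client.images.pull("registry.example.com/myapp", tag="1.0")
output = client.containers.run("registry.example.com/myapp:1.0", remove=True)
print(output.decode())
```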
Rapid scalability
Since containers do not have the overhead typical of VMs, including separate OS instances, many more containers can be supported on the same infrastructure. The lightweight nature of containers means they can be started and stopped quickly, unlocking rapid scale-up and scale-down scenarios.
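Because no OS boot is involved, scaling out is just starting more instances of the same image and scaling in is stopping them. A small sketch, assuming the Docker SDK for Python and a hypothetical myapp:1.0 image:

```python
# Sketch: start several instances of the same image for a scale-out burst,
# then stop and remove them when demand drops. "myapp:1.0" is hypothetical.
import docker

client = docker.from_env()

# Scale out: each container starts in seconds because no OS boot is needed.
instances = [client.containers.run("myapp:1.0", detach=True) for _ in range(5)]

# Scale back in: stop and remove the extra instances.
for c in instances:
    c.stop()
    c.remove()
```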
Use cases
Cloud-native applications
Cloud-native applications rely on containers for a common operational model across public, private, and hybrid environments. The low overhead and high density of containers allow many of them to be hosted inside the same virtual machine and make them ideal for delivering cloud-native applications.
Lift and shift
An organization can gain significant benefits by migrating to the cloud, but may not want to rewrite an existing application. Using containers, you can potentially migrate your applications to the cloud without changing any code.
Batch
Batch processing refers to activities that can be performed without human intervention or on a resource-available basis. Examples include generating reports, resizing images, and converting files from one format to another. Containers provide an easy way to run batch jobs without having to manage an environment and dependencies. Dynamic compute options, such as Azure Container Instances (ACI), can be used to efficiently ingest source data, process it, and place it in a durable store such as Azure Blob storage. Using such an approach instead of statically provisioned virtual machines can achieve significant cost savings through per-second billing.
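To make the ingest-process-store pattern concrete, here is a minimal sketch of the kind of script a containerized batch job might run. It assumes the azure-storage-blob package, an AZURE_STORAGE_CONNECTION_STRING environment variable, and hypothetical blob container and file names; the "processing" step is a trivial stand-in for real batch work. Packaged into an image, a job like this can be launched on demand with ACI and billed per second.

```python
# Sketch of a containerized batch job: ingest source data, process it, and
# place the result in Azure Blob storage. Names and credentials are assumed.
import os

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)

# Ingest the source data.
source = service.get_container_client("incoming").download_blob("report-input.csv")
data = source.readall().decode()

# Process it (a trivial transformation standing in for real batch work).
result = data.upper()

# Place the output in a durable store.
service.get_container_client("processed").upload_blob(
    "report-output.csv", result, overwrite=True
)
```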
Machine learning
Machine learning applies algorithms to data and makes predictions based on patterns found in the data. Containers can make machine learning applications self-contained and easily scalable in any environment.
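A common shape for a "self-contained" machine learning application is a small prediction service packaged into an image together with its model file and dependencies, so it can be scaled by running more container instances. The sketch below assumes Flask, joblib, and a pre-trained model.joblib file, none of which come from the article.

```python
# Sketch of a prediction service intended to be packaged into a container
# image alongside its model file and dependencies.
import joblib
from flask import Flask, jsonify, request

app = Flask(__name__)
model = joblib.load("model.joblib")  # model file baked into the image

@app.route("/predict", methods=["POST"])
def predict():
    features = request.get_json()["features"]
    prediction = model.predict([features])[0]
    return jsonify({"prediction": float(prediction)})

if __name__ == "__main__":
    # Listen on all interfaces so the container's published port is reachable.
    app.run(host="0.0.0.0", port=5000)
```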
Beyond containers
Orchestration
Running containers at scale requires orchestration and management of distributed, containerized applications via an orchestration platform such as Kubernetes.
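For a feel of what "management of distributed, containerized applications" looks like programmatically, the sketch below uses the official Kubernetes Python client (an assumption; the article names Kubernetes but not this library) to connect with the local kubeconfig and list the workloads the cluster is currently running.

```python
# Sketch: list the pods a Kubernetes cluster is managing, using the official
# Python client and the current kubectl context.
from kubernetes import client, config

config.load_kube_config()  # use the current kubectl context
core = client.CoreV1Api()

for pod in core.list_pod_for_all_namespaces().items:
    print(pod.metadata.namespace, pod.metadata.name, pod.status.phase)
```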
Security
Securing containers requires a layered approach, from the container image to cluster isolation. These guardrails are best configured as part of your CI/CD pipelines.
Serverless containers
You can further increase agility with containers on demand. Use serverless container technologies to easily run containers without managing servers, and burst from your Kubernetes clusters when traffic spikes.
DevOps
Containers allow developers to easily share software and dependencies across IT and production environments. Combined with DevOps practices, containers help you ship code faster and shorten software development cycles.