Docker 101
Introduction
In the ever-evolving landscape of software development and deployment, Docker has emerged as a game-changer. Docker is a powerful containerization platform that allows developers to package applications and their dependencies into lightweight, portable containers. These containers run consistently across different environments, making it easier to develop, test, and deploy applications. In this technical blog, we'll explore Docker's key concepts, benefits, and best practices.
Docker is an open platform for creating, delivering, and executing programs. Docker allows you to rapidly release software by separating your apps from your infrastructure. You can use Docker to manage your infrastructure in the same manner that you do your apps. By taking advantage of Docker's methodologies for shipping, testing, and deploying code, you can significantly reduce the delay between writing code and running it in production.
What are Containers?
To understand Docker, it helps to first understand what a container is. A container is a self-contained environment that includes everything needed to run a piece of software. Unlike virtual machines (VMs), which rely on hardware-level virtualization, containers are isolated using virtualization at the operating system (OS) level.
Containers and Virtual Machines
Virtual machines (VMs) run on hypervisors, which allow multiple VMs to operate concurrently on a single physical host, each with its own dedicated operating system. As a result, VMs have a comparatively large resource footprint and slower boot times, but they provide strong hardware-level isolation.
A container is an executable, standalone, lightweight software package that contains all the necessary components to run a program, such as libraries, system tools, runtime, and code.
What is Docker?
Docker is an open-source platform for developing, shipping, and running applications. It lets you separate your applications from your infrastructure so you can deliver software quickly.
Containers are executable, standalone, lightweight packages that contain all the code, runtime, system tools, libraries, and settings required for a program to function. Docker containers solve the famous "It works on my machine" problem by running consistently across environments, including development, testing, and production.
Why Docker?
Docker provides us with containers. Containerization bundles an application, its entire runtime environment, and all the dependencies, libraries, binaries, and configuration files it needs into a single package. Each application runs in isolation from the others, and Docker solves the dependency problem by keeping each application's dependencies inside its container.
Docker has gained popularity for several reasons:
- Portability: Docker containers can run on any system that supports Docker, regardless of the underlying infrastructure. This means you can develop and test applications on your local machine and then deploy them to various environments, such as on-premises servers, cloud providers, or hybrid setups.
- Isolation: Containers provide isolation for applications and their dependencies, so changes or issues in one container won't affect others, enhancing security and stability.
- Scalability: Docker makes it easier to scale applications horizontally by creating and managing multiple instances of a container (see the short sketch after this list). This is crucial for handling increased workloads and improving application performance.
- Efficiency: Containers are lightweight and use system resources more efficiently than traditional virtual machines (VMs). You can run more containers on a single host, which can lead to cost savings and improved resource utilization.
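As a minimal sketch of that horizontal scaling, assuming an image named myapp has already been built (the image name, container names, and ports here are hypothetical), you could start several identical containers from the same image:
# Start three instances of the same image on different host ports
docker run -d --name myapp-1 -p 8081:80 myapp
docker run -d --name myapp-2 -p 8082:80 myapp
docker run -d --name myapp-3 -p 8083:80 myapp
# List the running containers
docker ps
Each instance is isolated from the others, and a load balancer or reverse proxy could then spread traffic across the three host ports.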
Docker Architecture
Docker uses a client-server architecture. The Docker client talks to the Docker daemon, which does the heavy lifting of building, running, and distributing your Docker containers. The Docker client and daemon can run on the same system, or you can connect a Docker client to a remote Docker daemon. Another Docker client is Docker Compose, which lets you work with applications consisting of a set of containers.
Docker daemon: The purpose of the Docker daemon is to receive requests from the Docker API and manage Docker objects such as images, containers, networks, and volumes. A daemon can also communicate with other daemons to manage Docker services.
Docker client: The Docker client (docker) is the primary way that many Docker users interact with Docker. When you run commands such as docker run, the client sends them to the Docker daemon, which carries them out.
Docker registries: A Docker registry stores Docker images. Docker Hub is a public registry that anyone can use, and Docker looks for images on Docker Hub by default. You can also run your own registry or use a managed one such as Amazon ECR.
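A typical interaction touches all three pieces: the client sends commands to the daemon, the daemon pulls images from a registry and runs containers from them, and images you build can be pushed back to a registry. A minimal sketch (the repository name under your account is hypothetical):
# The client asks the daemon to pull an image from Docker Hub (the default registry)
docker pull nginx:latest
# The daemon creates and starts a container from that image
docker run -d --name web -p 8080:80 nginx:latest
# Tag the image and push it to your own repository (hypothetical account name)
docker tag nginx:latest your-dockerhub-user/nginx:latest
docker push your-dockerhub-user/nginx:latest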
Docker Components
Dockerfile is a text document that contains instructions for building a Docker image. It defines a base image, sets up the environment, copies application code, and configures the container. Dockerfiles are essential for creating custom images tailored to specific applications. In effect, a Dockerfile is the sequence of instructions the Docker Engine executes to assemble the image.
A Docker image is a read-only template with instructions for creating a Docker container. It contains the application code and its dependencies. Docker images are typically hosted in registries like Docker Hub, and you can also create custom images using Dockerfiles.
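For example, you can list the images stored on your machine and inspect the read-only layers any one of them is built from. A minimal sketch using the python:3.9-alpine image that appears later in this post:
# List images available on the local machine
docker images
# Show the layers (build steps) that make up an image
docker history python:3.9-alpine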
Creating Containerized Application
Given a Python application, we want to deploy it to our staging or production server. First, we make sure the Docker configuration is included in the root directory of the application.
Create a file named Dockerfile at the root of your application's repository and include the code below to tell Docker how to build and run the application in the production or staging environment.
The following is a simple Dockerfile that containerizes a FastAPI application:
# This line specifies the base image for the Docker container. In this case, it's using an Alpine Linux-based image with Python 3.9
FROM python:3.9-alpine
# This line sets the working directory within the container to /app. This means that all subsequent commands will be executed in the /app directory.
WORKDIR /app
# This line copies the contents of the local ./app directory (the Python source files) into the /app directory within the Docker container.
COPY ./app /app
# This line copies the requirements.txt file from the local directory to the /app directory within the container. The requirements.txt file typically contains a list of Python packages and their versions required for the application to run.
COPY requirements.txt /app/requirements.txt
# This line executes a pip install command within the container. It reads the requirements.txt file and installs the Python packages inside the container.
RUN pip install --no-cache-dir --upgrade -r requirements.txt
# This line defines the default command to run when a container based on this image is started.
# It runs the FastAPI application with the Uvicorn ASGI server, listening on port 80 on all interfaces (0.0.0.0)
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "80"]
Finally, build and push the Docker image to Docker Hub with a GitHub Actions workflow:
jobs:
  docker:
    runs-on: ubuntu-latest
    steps:
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v3
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
      - name: Login to Docker Hub
        uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - name: Build and push
        uses: docker/build-push-action@v5
        with:
          push: true
          tags:
Docker Best Practices
- Keep Containers Single-Purpose: Each container should have a single responsibility. This approach simplifies management and makes it easier to scale and troubleshoot applications.
- Use Docker Compose for Multi-Container Applications: Docker Compose streamlines the definition and orchestration of services for applications made up of multiple containers.
- Optimize Image Sizes: Using multi-stage builds and reducing unnecessary dependencies can help create lean images (see the sketch after this list). Smaller images deploy more quickly and use less storage.
- Update and Maintain Containers Frequently: Keep your containers and base images up to date to patch security vulnerabilities and ensure optimal performance.
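To illustrate the image-size point above, a multi-stage build can install dependencies in one stage and copy only the results into a slim final image. This is a minimal, hypothetical sketch for a Python application; the stage name and the /install prefix are assumptions, not part of the example earlier in this post:
# Build stage: install dependencies into an isolated prefix
FROM python:3.9-alpine AS builder
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir --prefix=/install -r requirements.txt
# Final stage: copy only the installed packages and the application code
FROM python:3.9-alpine
WORKDIR /app
COPY --from=builder /install /usr/local
COPY ./app /app
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "80"]
The build tools and pip cache used in the first stage never end up in the final image, which keeps it small.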
Conclusion
Docker has become an integral part of modern software development and deployment. It offers benefits like portability, isolation, and scalability, making it a valuable tool for developers, sysadmins, and DevOps professionals. By understanding Docker's core concepts and following best practices, you can harness the power of containerization to streamline your development and deployment processes. Docker is not just technology; it's a paradigm shift that simplifies how we build, ship, and run software.
In this blog, we've reviewed just a few of Docker's capabilities. If you want to leverage Docker's full potential, dive in and experience the future of software containers.
About the Author
Mohammed Hassan - Cloud Consultant at Cloud Softway