Deploying from a CI/CD Pipeline with Docker - Time Complexity
When deploying applications with Docker in a CI/CD pipeline, it is important to understand how deployment time changes as the project grows: how do the pipeline's steps scale as more services or containers are added?
Let's analyze the time complexity of the following Docker deployment snippet.
```sh
#!/bin/sh
set -e  # abort the pipeline if any build, push, or update fails

# Build, push, and roll out the image for each service in turn
for service in service1 service2 service3; do
  docker build -t "myrepo/$service:latest" "./services/$service"
  docker push "myrepo/$service:latest"
  docker service update --image "myrepo/$service:latest" "$service"
  sleep 5  # brief pause between rollouts
done
```
This script builds, pushes, and updates Docker services for each microservice in the list.
Look for repeated steps that affect total time.
- Primary operation: Loop over each service to build, push, and update.
- How many times: Once per service (3 times in this example, but can grow with more services).
As the number of services increases, the total deployment time grows roughly in direct proportion.
| Number of Services (n) | Approx. Operations |
|---|---|
| 3 | 3 builds + 3 pushes + 3 updates |
| 10 | 10 builds + 10 pushes + 10 updates |
| 100 | 100 builds + 100 pushes + 100 updates |
Pattern observation: Doubling the number of services roughly doubles the total deployment time.
Time Complexity: O(n)
This means the deployment time grows linearly with the number of services being deployed.
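To make the linear growth concrete, here is a small sketch of a cost model for the sequential pipeline. The per-service timings are hypothetical placeholders (real builds vary widely with image size and cache state); the point is that total time is n times a roughly constant per-service cost.

```sh
#!/bin/sh
# Hypothetical per-service costs in seconds (assumed, not measured):
BUILD=30; PUSH=10; UPDATE=5; PAUSE=5
per_service=$((BUILD + PUSH + UPDATE + PAUSE))

# Total sequential time scales linearly: T(n) = n * per_service
for n in 3 10 100; do
  echo "n=$n services -> ~$((n * per_service))s total"
done
```

Measuring your own pipeline (for example with `time` around the deploy script) gives real values for these constants, but the O(n) shape stays the same.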
[X] Wrong: "Adding more services won't affect deployment time much because each step is fast."
[OK] Correct: Each service requires separate build, push, and update steps, so total time adds up with more services.
Understanding how deployment time scales helps you design efficient pipelines and shows you can think about real-world system growth.
What if we ran the builds and pushes in parallel instead of sequentially? How would the time complexity change?
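With enough CPU and network capacity, running the per-service work in parallel means wall-clock time is bounded by the slowest single service rather than the sum of all of them: total work is still O(n), but elapsed time becomes roughly constant in n (up to resource limits). Here is a runnable sketch using `sleep 1` as a stand-in for each service's build and push (the real `docker` commands would go in its place):

```sh
#!/bin/sh
# Sketch of parallel deployment, with `sleep 1` standing in for each
# service's build+push work.
start=$(date +%s)
for service in service1 service2 service3; do
  sleep 1 &        # launch each "build" in the background
done
wait               # block until every background job finishes
end=$(date +%s)
elapsed=$((end - start))

# Run sequentially, three 1-second tasks would take ~3s; in parallel,
# elapsed time is bounded by the slowest task, not the sum.
echo "elapsed: ${elapsed}s"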