GitHub Actions with Docker - Time & Space Complexity
When using GitHub Actions with Docker, it's important to understand how workflow run time grows as you add more steps or build larger images.
Here we examine how execution time scales with the input size, taken as the number of instructions Docker must process during a build.
Analyze the time complexity of the following GitHub Actions workflow snippet that builds a Docker image.
```yaml
name: Build Docker Image
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Build Docker image
        run: docker build -t my-app .
```
This workflow checks out the code and builds a Docker image from the Dockerfile in the repository.
Look for repeated operations that drive the run time:
- Primary operation: Docker processes each instruction in the Dockerfile sequentially, producing one layer per step.
- How many times: the workflow builds once per push, but within that build Docker runs one step per Dockerfile instruction.
The time to build grows roughly with the number of instructions in the Dockerfile.
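For illustration, here is a hypothetical Dockerfile for a small Node.js app (file names and base image are invented for this sketch). Each instruction is one sequential build step, so a file with n instructions produces n steps:

```dockerfile
# Hypothetical app image; each instruction below runs in order
# and contributes one step (and one layer) to the build.
FROM node:20-alpine
WORKDIR /app
COPY package.json .
# Installing dependencies is typically the slowest single step.
RUN npm install
COPY . .
CMD ["node", "index.js"]
```

Adding another `RUN` or `COPY` instruction adds another step to every uncached build, which is exactly the linear growth described above.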
| Input Size (n = Dockerfile instructions) | Approx. Operations (build steps) |
|---|---|
| 10 | 10 steps |
| 100 | 100 steps |
| 1000 | 1000 steps |
Pattern observation: The build time increases linearly as you add more instructions.
Time Complexity: O(n)
This means the build time grows directly in proportion to the number of Dockerfile instructions.
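The table above can be expressed as a toy model (not real build timing): assuming each Dockerfile instruction costs a roughly constant amount of time, the total build time is that constant times the number of instructions.

```python
def estimated_build_time(num_instructions: int, cost_per_step: float = 2.0) -> float:
    """Toy O(n) model: total time is the sum of a roughly constant
    per-instruction cost. The 2.0-second cost is illustrative only."""
    return sum(cost_per_step for _ in range(num_instructions))

# Doubling the instruction count roughly doubles the estimated time.
print(estimated_build_time(10))    # 20.0
print(estimated_build_time(100))   # 200.0
print(estimated_build_time(1000))  # 2000.0
```

Real builds vary per step (a `RUN npm install` costs far more than a `WORKDIR`), but the growth trend with respect to n is still linear.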
[X] Wrong: "Adding more Dockerfile instructions won't affect build time much because Docker caches layers."
[OK] Correct: Caching only helps on rebuilds; the first build, and any build where an instruction changes, must process that step and every step after it, so more instructions still mean more work.
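A common mitigation, sketched below with hypothetical file names, is to order instructions so that rarely-changing layers (dependency installs) come before frequently-changing ones (source code). A source edit then invalidates only the later layers, keeping the expensive install step cached:

```dockerfile
FROM python:3.12-slim
WORKDIR /app
# Copy only the dependency manifest first: this layer and the install
# below stay cached as long as requirements.txt is unchanged.
COPY requirements.txt .
RUN pip install -r requirements.txt
# Source-code changes invalidate only the layers from here down.
COPY . .
CMD ["python", "main.py"]
```

This does not change the worst-case O(n) cost of a cold build, but it shrinks the number of steps actually re-executed on a typical rebuild.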
Understanding how build time scales with Dockerfile size helps you reason about workflow efficiency and resource use, a useful skill in real projects.
What if we split the Docker build into multiple smaller images and build them separately? How would the time complexity change?