Consider this Dockerfile snippet:
FROM python:3.12-slim
COPY requirements.txt /app/
RUN pip install -r /app/requirements.txt
COPY . /app/
CMD ["python", "/app/app.py"]
If you change the order of the COPY instructions to copy the entire app directory before copying requirements.txt, what will happen when you rebuild the image after only changing app.py?
FROM python:3.12-slim
COPY . /app/
COPY requirements.txt /app/
RUN pip install -r /app/requirements.txt
CMD ["python", "/app/app.py"]
Think about how Docker caches layers based on the exact content of files copied before a RUN command.
Docker caches layers based on a checksum of the files each COPY adds, together with the instruction text. If you copy the entire app directory before installing dependencies, any change to app.py changes the checksum of that COPY layer, which invalidates it and every layer after it, so the pip install layer is rebuilt even though requirements.txt is unchanged.
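To make the boundary concrete, here is the original (cache-friendly) ordering annotated with how the cache behaves when only app.py changes (a sketch; paths follow the snippet above):

```dockerfile
FROM python:3.12-slim
# requirements.txt unchanged -> this COPY layer is reused from cache
COPY requirements.txt /app/
# Inputs unchanged -> the pip install layer is reused from cache
RUN pip install -r /app/requirements.txt
# app.py changed -> the cache is invalidated from this layer onward
COPY . /app/
CMD ["python", "/app/app.py"]
```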
Which statement best explains why ordering instructions in a Dockerfile affects build speed?
Think about what triggers cache invalidation in Docker builds.
Docker caches each layer and reuses it only if the instruction and its inputs (files copied, commands run) are unchanged; once one layer's cache is invalidated, every subsequent layer must be rebuilt. Placing stable instructions (base image, dependency installation) early lets Docker reuse their cached layers and speeds up builds.
You notice that your Docker build always reruns RUN npm install even though package.json has not changed. Your Dockerfile snippet is:
FROM node:20
COPY . /app
WORKDIR /app
RUN npm install
CMD ["node", "server.js"]
What is the most likely cause?
Consider what files affect the cache for the RUN npm install layer.
Copying the entire directory before RUN npm install means any file change invalidates the cache for that layer and every layer after it. To optimize, copy only package.json (and package-lock.json) first, run npm install, then copy the rest of the source.
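A common fix, sketched under the assumption that the project has a package-lock.json alongside package.json:

```dockerfile
FROM node:20
WORKDIR /app
# Copy only the dependency manifests so this layer's cache
# survives changes to the rest of the source tree
COPY package*.json ./
RUN npm install
# Source changes invalidate the cache only from this point on
COPY . .
CMD ["node", "server.js"]
```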
You have a Python app with dependencies listed in requirements.txt. You frequently change app.py but rarely change dependencies. Which Dockerfile instruction order best optimizes build caching?
Think about which files change often and which do not, and how Docker caches layers.
Copying requirements.txt and running pip install first caches dependencies. Then copying the rest means changes to app.py don't invalidate the dependency layer, speeding rebuilds.
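One way this order might look, sketched using a WORKDIR to shorten the paths (file names follow the question):

```dockerfile
FROM python:3.12-slim
WORKDIR /app
# Rarely-changing dependency list first: this layer and the
# pip install layer below stay cached across app.py edits
COPY requirements.txt .
RUN pip install -r requirements.txt
# Frequently-changing source code last
COPY . .
CMD ["python", "app.py"]
```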
Which Dockerfile layering strategy best minimizes rebuild time when source code changes frequently but dependencies rarely change?
Consider how Docker cache invalidation works with file changes and layer order.
Copying dependency files first and installing dependencies before copying source code allows Docker to reuse the cached dependency layer when only source code changes, minimizing rebuild time.