Challenge - 5 Problems
Docker ML Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
💻 Command Output
Difficulty: intermediate · Time: 2:00
Output of Docker command with ML model container
You run this command to list running Docker containers for your ML model deployment:
docker ps --filter "ancestor=ml-model:latest" --format "{{.Names}}: {{.Status}}"
What is the expected output if one container named model-serving-1 has been running for 5 minutes?
💡 Hint
The command filters containers by image and formats output to show container names and status.
✗ Incorrect
The command lists running containers started from the image 'ml-model:latest' and prints each container's name and status. A container that has been up for five minutes reports the status 'Up 5 minutes', so the expected output is 'model-serving-1: Up 5 minutes'.
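The command and its output can be sketched as follows. This assumes a running Docker daemon with exactly one container started from ml-model:latest, so it is illustrative rather than directly runnable here:

```shell
# List running containers started from the ml-model:latest image,
# printing only each container's name and its status.
docker ps --filter "ancestor=ml-model:latest" --format "{{.Names}}: {{.Status}}"
# Expected output for one container up for five minutes:
# model-serving-1: Up 5 minutes
```

The `--filter "ancestor=..."` flag matches containers whose image is (or descends from) the given image, and `--format` uses Go template syntax over the container fields.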
❓ Configuration
Difficulty: intermediate · Time: 2:00
Correct Dockerfile snippet for ML dependencies
Which Dockerfile snippet correctly installs Python 3.12 and the ML package scikit-learn in a Debian-based image?
💡 Hint
Use official Python images and pip for Python packages.
✗ Incorrect
Option C uses the official Python 3.12 slim image and installs scikit-learn via pip, which is the correct way to install Python ML packages in Docker.
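The option text itself is not shown on this page, so the following is a minimal sketch of what the correct snippet likely looks like, based on the explanation above:

```dockerfile
# Official Python 3.12 image on a slim Debian base
FROM python:3.12-slim

# Install scikit-learn with pip, the standard tool for Python packages;
# --no-cache-dir keeps the image layer smaller
RUN pip install --no-cache-dir scikit-learn
```

Using the official `python:3.12-slim` image avoids installing Python via apt-get, and pip is the correct installer for Python packages such as scikit-learn.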
❓ Troubleshoot
Difficulty: advanced · Time: 3:00
Diagnosing Docker container crash for ML model
Your ML model container crashes immediately after starting. The Docker logs show:
ModuleNotFoundError: No module named 'tensorflow'
Which Dockerfile change will fix this error?
💡 Hint
The error means the Python package tensorflow is missing.
✗ Incorrect
Installing tensorflow via pip inside the Dockerfile ensures the container's Python environment has the required package. Changing the base image might happen to include it but is not guaranteed, and apt-get installs system packages, not Python packages.
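The fix can be sketched as a Dockerfile change; the base image, application layout, and the `serve.py` entrypoint are assumptions for illustration:

```dockerfile
FROM python:3.12-slim

# Install the missing Python package so `import tensorflow` succeeds
RUN pip install --no-cache-dir tensorflow

# Hypothetical application layout
COPY . /app
WORKDIR /app
CMD ["python", "serve.py"]
```

With tensorflow installed in the image's Python environment, the container no longer crashes at startup with ModuleNotFoundError.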
🔀 Workflow
Difficulty: advanced · Time: 3:00
Correct order to build and deploy ML Docker container
Arrange these steps in the correct order to build and deploy an ML model Docker container:
1. Push image to Docker registry
2. Write Dockerfile with ML dependencies
3. Run container on cloud server
4. Build Docker image locally
💡 Hint
You must write the Dockerfile before building the image.
✗ Incorrect
First write the Dockerfile, then build the image locally, push it to a registry, and finally run the container on the cloud server.
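The correct order (2 → 4 → 1 → 3) can be sketched as a command sequence; the image name, registry host, and container name here are hypothetical:

```shell
# Step 1: write a Dockerfile declaring the ML dependencies (pip installs, etc.)

# Step 2: build the Docker image locally from that Dockerfile
docker build -t registry.example.com/ml-model:latest .

# Step 3: push the image to a Docker registry
docker push registry.example.com/ml-model:latest

# Step 4: on the cloud server, run a container from the pushed image
docker run -d --name model-serving-1 registry.example.com/ml-model:latest
```

Tagging the image with the registry hostname up front lets the same tag be used for both `docker push` and `docker run`.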
✅ Best Practice
Difficulty: expert · Time: 4:00
Best practice for managing large ML model files in Docker
You have a large ML model file (2GB) that changes frequently. What is the best practice to handle this in your Docker workflow?
💡 Hint
Think about avoiding rebuilding large images when the model changes.
✗ Incorrect
Mounting the model as a volume allows updating the model without rebuilding the Docker image, saving time and storage.
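The volume approach can be sketched as follows; the host path, container path, and image name are assumptions:

```shell
# Keep the 2GB model file on the host, outside the image,
# and bind-mount it into the container read-only at run time.
docker run -d \
  --name model-serving-1 \
  -v /srv/models/model.bin:/app/model.bin:ro \
  ml-model:latest
```

Replacing /srv/models/model.bin on the host (and restarting the container) swaps in a new model without rebuilding or re-pushing the image, which matters when the file changes frequently.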