
Docker for ML workloads in MLOps - Practice Problems & Coding Challenges

Challenge - 5 Problems
💻 Command Output
intermediate
Output of Docker command with ML model container
You run this command to list running Docker containers for your ML model deployment:
docker ps --filter "ancestor=ml-model:latest" --format "{{.Names}}: {{.Status}}"
What is the expected output if one container named model-serving-1 is running for 5 minutes?
A. model-serving-1: Created 5 minutes ago
B. No containers found
C. ml-model:latest: Up 5 minutes
D. model-serving-1: Up 5 minutes
💡 Hint
The command filters containers by image and formats output to show container names and status.
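For reference, `docker ps --format` takes a Go template whose placeholders (`{{.Names}}`, `{{.Status}}`, and others) are filled in per container. A sketch of the command from the question, with illustrative output (requires a running Docker daemon; the container name is an example):

```shell
# List only containers built from the ml-model:latest image,
# printing each container's name and status.
docker ps \
  --filter "ancestor=ml-model:latest" \
  --format "{{.Names}}: {{.Status}}"
# Illustrative output for one container up for five minutes:
# model-serving-1: Up 5 minutes
```

Note that `docker ps` reports running containers with an "Up ..." status; "Created" appears only for containers that were created but never started.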
Configuration
intermediate
Correct Dockerfile snippet for ML dependencies
Which Dockerfile snippet correctly installs Python 3.12 and the ML package scikit-learn in a Debian-based image?
A.
FROM python:3.12
RUN pip install sklearn
B.
FROM debian
RUN apt-get install python3.12 scikit-learn
C.
FROM python:3.12-slim
RUN pip install scikit-learn
D.
FROM python:3.12
RUN apt-get install scikit-learn
💡 Hint
Use official Python images and pip for Python packages.
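To make the idea concrete, a minimal Dockerfile in the style of the official-image-plus-pip approach (the `--no-cache-dir` flag and unpinned version are illustrative choices, not requirements):

```dockerfile
# The official slim Python image ships Python 3.12 and pip,
# so no apt-get package installation is needed for Python itself.
FROM python:3.12-slim

# Install the ML dependency with pip. The PyPI package name is
# scikit-learn; --no-cache-dir keeps the image layer smaller.
RUN pip install --no-cache-dir scikit-learn
```

The `sklearn` PyPI name is a deprecated alias, and `apt-get install scikit-learn` fails because that is not a Debian package name, which is why the other options are wrong.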
Troubleshoot
advanced
Diagnosing Docker container crash for ML model
Your ML model container crashes immediately after starting. The Docker logs show:
ModuleNotFoundError: No module named 'tensorflow'

Which Dockerfile change will fix this error?
A. Add ENV TENSORFLOW_HOME=/usr/local/tensorflow
B. Add RUN pip install tensorflow before the command that runs the model
C. Change the base image to tensorflow/tensorflow:latest
D. Add RUN apt-get install tensorflow
💡 Hint
The error means the Python package tensorflow is missing.
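A sketch of the kind of Dockerfile fix the pip-install option describes; `serve.py` is a hypothetical entrypoint script that imports tensorflow:

```dockerfile
FROM python:3.12-slim

# Install the missing Python package before the container's command
# runs, so the import succeeds at startup instead of raising
# ModuleNotFoundError.
RUN pip install --no-cache-dir tensorflow

# serve.py is a hypothetical model-serving script for illustration.
COPY serve.py /app/serve.py
CMD ["python", "/app/serve.py"]
```

Switching to a `tensorflow/tensorflow` base image would also make the import succeed, but it replaces the whole environment rather than fixing the missing dependency in the existing one.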
🔀 Workflow
advanced
Correct order to build and deploy ML Docker container
Arrange these steps in the correct order to build and deploy an ML model Docker container:
1. Push image to Docker registry
2. Write Dockerfile with ML dependencies
3. Run container on cloud server
4. Build Docker image locally
A. 1, 2, 3, 4
B. 2, 4, 1, 3
C. 1, 3, 2, 4
D. 2, 3, 1, 4
💡 Hint
You must write the Dockerfile before building the image.
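The four steps, in their working order (write the Dockerfile, build, push, then run), might look like this on the command line. The registry host, image tag, server name, and SSH deployment are all illustrative; any registry and deployment mechanism would do:

```shell
# Step: write a Dockerfile with the ML dependencies (see the
# earlier Dockerfile examples), then:

# Build the Docker image locally from that Dockerfile.
docker build -t registry.example.com/ml-model:latest .

# Push the image to a Docker registry.
docker push registry.example.com/ml-model:latest

# Run the container on a cloud server (SSH shown for illustration).
ssh user@cloud-server \
  "docker run -d --name model-serving-1 registry.example.com/ml-model:latest"
```

The ordering constraint is that each step consumes the previous step's artifact: the build needs the Dockerfile, the push needs the built image, and the remote run pulls the pushed image.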
Best Practice
expert
Best practice for managing large ML model files in Docker
You have a large ML model file (2 GB) that changes frequently. What is the best practice for handling this in your Docker workflow?
A. Mount the model file as a volume at container runtime to avoid rebuilding the image
B. Include the model file inside the Docker image to keep everything together
C. Download the model file inside the container during build time
D. Store the model file in environment variables for easy access
💡 Hint
Think about avoiding rebuilding large images when the model changes.
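A hedged sketch of the volume-mount approach, keeping the large model outside the image so the image never needs rebuilding when the model changes. Host and container paths are illustrative, and the serving code is assumed to read its model from `/models`:

```shell
# Bind-mount the host directory holding the current model into the
# container read-only; the 2 GB file never enters an image layer.
docker run -d \
  --name model-serving-1 \
  -v /data/models/current:/models:ro \
  ml-model:latest
# Updating the model is then a file swap on the host plus a container
# restart; no image rebuild or registry push is required.
```

Baking the model into the image (or downloading it at build time) forces a 2 GB layer rebuild and re-push on every model update, and environment variables cannot hold binary data of that size at all.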