ML Python · ~20 mins

Docker containerization in ML Python - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️
Docker Mastery for ML
Get all challenges correct to earn this badge!
Test your skills under time pressure!
Predict Output (intermediate)
Output of Dockerfile RUN command
What will be the output when building this Dockerfile snippet?
Dockerfile:
FROM python:3.8-slim
RUN echo "Hello from Docker" > /message.txt
RUN cat /message.txt
A) Hello from Docker
B) Hello from Docker\nHello from Docker
C) No output
D) Error: /message.txt not found
💡 Hint
RUN commands execute during build and output their results in the build logs.
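To see this for yourself, here is a minimal sketch you could build locally (assuming Docker is installed). Note that the first RUN redirects its output into a file, so only the second RUN prints to the build log:

```dockerfile
FROM python:3.8-slim
# Output is redirected into /message.txt, so nothing is printed here
RUN echo "Hello from Docker" > /message.txt
# This prints "Hello from Docker" once in the build log
RUN cat /message.txt
```

One caveat: with BuildKit (the default builder in recent Docker releases), RUN output is collapsed in the progress view; pass `--progress=plain` to `docker build` to see it in full.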
Model Choice (intermediate)
Choosing the best Docker base image for ML model deployment
You want to deploy a TensorFlow model with GPU support in a Docker container. Which base image is the best choice?
A) ubuntu:20.04
B) python:3.9-slim
C) tensorflow/tensorflow:latest-gpu
D) node:16-alpine
💡 Hint
Look for images that include TensorFlow and GPU support.
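As a rough sketch of how such a deployment might look (the `requirements.txt` and `serve.py` file names are hypothetical placeholders, not part of the question): the GPU-enabled TensorFlow image already bundles TensorFlow with CUDA/cuDNN, so the Dockerfile only needs to add application code.

```dockerfile
# Sketch: the base image ships TensorFlow + CUDA/cuDNN preinstalled
FROM tensorflow/tensorflow:latest-gpu
WORKDIR /app
# Install any extra dependencies beyond TensorFlow itself
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Hypothetical inference entry point
COPY serve.py .
CMD ["python", "serve.py"]
```

Remember that the image alone is not enough: the container must also be started with GPU access (e.g. `docker run --gpus all …`) on a host with NVIDIA drivers installed.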
Hyperparameter (advanced)
Optimizing Docker container for faster ML training
Which Dockerfile practice helps reduce image size and speed up ML training container startup?
A) Combining RUN commands with && to reduce layers
B) Installing all packages in separate RUN commands
C) Using multiple RUN commands separately
D) Using COPY after RUN commands
💡 Hint
Fewer layers in Docker images reduce size and improve startup.
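A minimal sketch of the layer-combining pattern: installing and cleaning up in the same RUN keeps the package cache out of the committed layer, whereas a cleanup in a later RUN cannot shrink a layer that was already committed.

```dockerfile
FROM python:3.9-slim
# One RUN = one layer; the apt cache is removed in the same layer,
# so the deleted files never end up in the image
RUN apt-get update && \
    apt-get install -y --no-install-recommends build-essential && \
    rm -rf /var/lib/apt/lists/*
```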
🔧 Debug (advanced)
Debugging Docker container failing to access GPU
You built a Docker container for ML training with GPU support, but inside the container, the GPU is not detected. What is the most likely cause?
A) The host machine does not have a GPU
B) The container was not run with the --gpus flag
C) The Docker image does not have CUDA libraries installed
D) The Dockerfile uses python:3.8-slim base image
💡 Hint
GPU access requires special runtime flags when running containers.
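For illustration, assuming the NVIDIA Container Toolkit is installed on the host, the difference shows up directly on the command line:

```shell
# Without --gpus, the GPU devices are not exposed inside the container,
# so nvidia-smi fails even if the image has CUDA libraries:
docker run --rm tensorflow/tensorflow:latest-gpu nvidia-smi

# With --gpus all, the container can see the host GPUs:
docker run --rm --gpus all tensorflow/tensorflow:latest-gpu nvidia-smi
```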
Metrics (expert)
Measuring Docker container startup time impact on ML inference latency
You want to measure how Docker container startup time affects ML model inference latency in production. Which approach is best?
A) Ignore container startup time as it does not affect inference latency
B) Include container startup time in latency measurement for each inference request
C) Measure container startup time separately and add average to inference latency
D) Measure inference latency only after container is fully started
💡 Hint
Inference latency should reflect only the model prediction time during normal operation.
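The measurement pattern can be sketched in plain Python. This is a minimal illustration, not production code: `fake_model` is a hypothetical stand-in for real inference, and the one-off "cold start" timing stands in for container/model initialization, kept separate from per-request latency.

```python
import time

def fake_model(x):
    """Hypothetical stand-in for a real inference call."""
    return x * 2

# Cold start (container/model initialization) is timed once, separately
start = time.perf_counter()
model = fake_model  # imagine loading weights / warming up here
cold_start_s = time.perf_counter() - start

# Steady-state inference latency is timed per request, after startup
latencies = []
for request in range(100):
    t0 = time.perf_counter()
    model(request)
    latencies.append(time.perf_counter() - t0)

p50 = sorted(latencies)[len(latencies) // 2]
print(f"cold start: {cold_start_s:.6f}s, median inference: {p50:.6f}s")
```

Reporting the median (or a high percentile) of the steady-state timings reflects what users experience during normal operation, while the cold-start figure is tracked as its own metric.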