
Docker containerization in ML Python

Introduction

Docker packages your machine learning code together with all of its dependencies so it runs the same way everywhere. This makes sharing, running, and deploying models easy and reliable.

You want to share your ML model with a teammate who uses a different computer setup.
You need to run your ML training on a cloud server without worrying about missing software.
You want to keep your ML project organized with all dependencies in one place.
You want to test your ML model in different environments without changing your main system.
You want to deploy your ML model as a service that others can access easily.
Syntax
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "train.py"]

FROM sets the base image with Python installed.

WORKDIR sets the folder inside the container where subsequent commands run.

COPY copies files from your project into the image; copying requirements.txt on its own first lets Docker cache the installed packages.

RUN executes a command at build time, here installing the listed Python dependencies.

CMD sets the default command the container runs when it starts.

Examples
Basic Dockerfile to run a Python ML training script with dependencies.
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "train.py"]
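
With this Dockerfile and a requirements.txt in the project folder, you build and run the image from that same directory. The image name ml-train is just an example:

```
# Build the image from the Dockerfile in the current directory
docker build -t ml-train .

# Run the training script in a throwaway container (removed on exit)
docker run --rm ml-train
```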
Dockerfile using TensorFlow image to serve a trained model.
FROM tensorflow/tensorflow:latest
WORKDIR /model
COPY . .
CMD ["python", "serve_model.py"]
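
The lesson does not show serve_model.py itself. As a rough sketch of what such a script could look like, here is a minimal prediction endpoint built only on the Python standard library; the predict function is a stand-in for a real trained model, and the route and payload shape are illustrative assumptions:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for a trained model (y = 2x); a real serve_model.py would
# load the saved TensorFlow model here instead.
def predict(x):
    return 2.0 * x

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON request body, e.g. {"x": 6}
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        body = json.dumps({"prediction": predict(payload["x"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

def serve(port=8000):
    # Run the server on a background thread and return it for shutdown.
    server = HTTPServer(("127.0.0.1", port), PredictHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A client would POST {"x": 6} to the container's published port and get back {"prediction": 12.0}.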
Dockerfile installing specific ML libraries for a prediction script. Installing the libraries before copying the code lets Docker reuse the cached pip layer when only your code changes.
FROM python:3.12
WORKDIR /app
RUN pip install scikit-learn pandas
COPY . .
CMD ["python", "predict.py"]
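
The predict.py script is also not shown in the lesson. A plausible sketch, using the two libraries the image installs (the dataset and column names here are made up for illustration):

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

# Toy dataset in a DataFrame, since the image installs pandas as well.
df = pd.DataFrame({"hours": [1, 2, 3, 4, 5],
                   "score": [2, 4, 6, 8, 10]})

# Fit a simple linear model: score = 2 * hours.
model = LinearRegression()
model.fit(df[["hours"]], df["score"])

# Predict for unseen inputs.
new_data = np.array([[6], [7]])
preds = model.predict(new_data)
for x, p in zip(new_data.ravel(), preds):
    print(f"hours={x} -> predicted score {p:.2f}")
```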
Sample Model

This simple Python script trains a linear model and predicts output for input 6. You can put this script in a Docker container to run anywhere with the same result.

import numpy as np
from sklearn.linear_model import LinearRegression

# Sample data
X = np.array([[1], [2], [3], [4], [5]])
y = np.array([2, 4, 6, 8, 10])

# Train model
model = LinearRegression()
model.fit(X, y)

# Predict
pred = model.predict(np.array([[6]]))
print(f"Prediction for input 6: {pred[0]:.2f}")
Output
Prediction for input 6: 12.00
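
In practice you would train in one container run and predict in another, which means persisting the model to a file. A minimal sketch using the standard library's pickle module (joblib is the more common choice for scikit-learn models, but pickle keeps this self-contained):

```python
import pickle

import numpy as np
from sklearn.linear_model import LinearRegression

# Train the same model as above.
X = np.array([[1], [2], [3], [4], [5]])
y = np.array([2, 4, 6, 8, 10])
model = LinearRegression().fit(X, y)

# Save the trained model to a file the image (or a mounted volume) can ship.
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)

# Later, e.g. inside a serving script, load it back and predict.
with open("model.pkl", "rb") as f:
    loaded = pickle.load(f)
print(f"Prediction for input 6: {loaded.predict([[6]])[0]:.2f}")
```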
Important Notes

Always include a requirements.txt file listing your Python packages (ideally with pinned versions) so installs inside Docker are reproducible.
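
For example, a requirements.txt for the sample model above might look like this (the version numbers are illustrative):

```
numpy==1.26.4
scikit-learn==1.4.2
```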

Keep your Docker images small by using slim or minimal base images, and prefer pinned image tags over latest so builds stay reproducible.

Test your container locally before deploying to cloud or sharing.

Summary

Docker packages your ML code and environment together for easy sharing and running.

Use Dockerfiles to specify how to build your ML container step-by-step.

Running ML code in Docker ensures consistent results on any machine.