ML · Python · How-To · Beginner · 4 min read

How to Use Azure ML for MLOps: Setup, Deployment, and Monitoring

Use Azure ML to manage the full MLOps lifecycle by creating workspaces, training models, deploying them as web services, and monitoring performance. Azure ML provides tools like Azure ML Pipelines for automation and Model Registry for version control to streamline continuous integration and delivery of ML models.
📝

Syntax

Azure ML uses Python SDK commands to manage MLOps workflows. Key parts include:

  • Workspace: Connects to your Azure ML environment.
  • Experiment: Tracks model training runs.
  • Environment: Defines dependencies for training and deployment.
  • Pipeline: Automates workflows like data prep, training, and deployment.
  • Model: Registers and versions trained models.
  • Endpoint and Deployment: Configure how models are served as web services (ManagedOnlineEndpoint and ManagedOnlineDeployment in SDK v2).
python
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

# Connect to workspace
ml_client = MLClient(DefaultAzureCredential(), subscription_id, resource_group, workspace_name)

# Experiments in SDK v2 are created implicitly: pass an experiment_name
# when you define or submit a job (see the training step below)

# Register model (create_or_update takes a Model entity)
from azure.ai.ml.entities import Model
model = ml_client.models.create_or_update(Model(name="my-model", path="./model.pkl"))

# Define environment (a conda_file must be paired with a base image)
from azure.ai.ml.entities import Environment
env = Environment(
    name="my-env",
    conda_file="env.yml",
    image="mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04",
)

# Create a training step (simplified)
from azure.ai.ml import command
train_step = command(
    code="./src",                 # folder containing train.py
    command="python train.py",
    environment=env,
    compute="cpu-cluster",
    experiment_name="my-experiment",
)

# Deploy model (endpoint and deployment operations are long-running)
from azure.ai.ml.entities import ManagedOnlineEndpoint, ManagedOnlineDeployment
endpoint = ManagedOnlineEndpoint(name="my-endpoint", auth_mode="key")
ml_client.online_endpoints.begin_create_or_update(endpoint).result()

deployment = ManagedOnlineDeployment(
    name="blue",
    endpoint_name="my-endpoint",
    model=model,
    environment=env,
    instance_type="Standard_DS3_v2",
    instance_count=1,
)
ml_client.online_deployments.begin_create_or_update(deployment).result()

# Invoke endpoint (the request body is read from a file)
response = ml_client.online_endpoints.invoke(
    endpoint_name="my-endpoint", request_file="sample-request.json"
)
💻

Example

This example shows how to train a simple model, register it, deploy it as a web service, and test the deployment using Azure ML SDK.

python
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential
from azure.ai.ml.entities import Environment, Model, ManagedOnlineEndpoint, ManagedOnlineDeployment
import json

# Connect to Azure ML workspace
ml_client = MLClient(DefaultAzureCredential(), "your-subscription-id", "your-resource-group", "your-workspace")

# Register a model (keep the returned, versioned model entity)
model = Model(path="./model.pkl", name="sample-model")
model = ml_client.models.create_or_update(model)

# Define environment (a conda_file must be paired with a base image)
env = Environment(
    name="sample-env",
    conda_file="./conda.yml",
    image="mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04",
)
env = ml_client.environments.create_or_update(env)

# Create or update endpoint (names must be unique within a region)
endpoint = ManagedOnlineEndpoint(name="sample-endpoint", auth_mode="key")
ml_client.online_endpoints.begin_create_or_update(endpoint).result()

# Deploy model to endpoint; a plain pickled model needs a scoring script
from azure.ai.ml.entities import CodeConfiguration
deployment = ManagedOnlineDeployment(
    name="blue",
    endpoint_name="sample-endpoint",
    model=model,
    environment=env,
    code_configuration=CodeConfiguration(code="./src", scoring_script="score.py"),
    instance_type="Standard_DS3_v2",
    instance_count=1
)
ml_client.online_deployments.begin_create_or_update(deployment).result()

# Route all traffic to the new deployment
endpoint.traffic = {"blue": 100}
ml_client.online_endpoints.begin_create_or_update(endpoint).result()

# Invoke endpoint (invoke reads the request body from a file)
with open("sample-request.json", "w") as f:
    json.dump({"data": [[1.0, 2.0, 3.0, 4.0]]}, f)

response = ml_client.online_endpoints.invoke(
    endpoint_name="sample-endpoint", request_file="sample-request.json"
)
print("Prediction response:", response)
Output
Prediction response: {"result": [0.75]}
⚠️

Common Pitfalls

  • Not setting up authentication correctly: Always use DefaultAzureCredential or configure service principals properly.
  • Forgetting to register models: Without registration, deployment and version control are difficult.
  • Ignoring environment dependencies: Define all packages in conda.yml to avoid runtime errors.
  • Not managing endpoint traffic: Deployments need traffic routing to serve requests correctly.
  • Skipping monitoring: Use Azure ML's monitoring tools to track model performance and data drift.
python
# Wrong: deploying without an environment
from azure.ai.ml.entities import ManagedOnlineDeployment

deployment = ManagedOnlineDeployment(
    name="bad-deploy",
    endpoint_name="my-endpoint",
    model=model
    # Missing environment causes runtime errors
)

# Right: include the environment
deployment = ManagedOnlineDeployment(
    name="good-deploy",
    endpoint_name="my-endpoint",
    model=model,
    environment=env
)
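The `conda.yml` referenced throughout is never shown. A minimal sketch of what it might contain (package choices are illustrative; `azureml-inference-server-http` is the package Azure ML's managed online endpoints expect in custom inference environments):

```yaml
name: sample-env
channels:
  - conda-forge
dependencies:
  - python=3.10
  - pip
  - pip:
      - scikit-learn
      - azureml-inference-server-http
```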
📊

Quick Reference

Key Azure ML components for MLOps:

  • Workspace: Central place for all ML assets.
  • Experiment: Tracks training runs.
  • Model Registry: Stores and versions models.
  • Pipeline: Automates workflows.
  • Endpoint: Deploys models as REST services.
  • Monitoring: Tracks model health and data drift.
✅

Key Takeaways

  • Azure ML manages the full MLOps lifecycle from training to deployment and monitoring.
  • Always register models and define environments to ensure reproducibility and smooth deployment.
  • Use Azure ML Pipelines to automate workflows and improve collaboration.
  • Deploy models as managed endpoints to serve predictions via REST APIs.
  • Monitor deployed models regularly to detect performance degradation or data drift.