
Performance metric tracking in MLOps - Step-by-Step Execution

Process Flow - Performance metric tracking
- Start Training Model
- Calculate Metrics
- Log Metrics to Tracking System
- Visualize Metrics
- Evaluate Model Performance
- Decide: Improve or Deploy?
  - Yes (improve): Adjust Model or Data, then return to Start Training Model
  - No (deploy): End
This flow shows how model training metrics are calculated, logged, visualized, and used to decide next steps.
Execution Sample
Python
# log_metrics, visualize_metrics, and deploy_model are assumed to be
# provided by the project's metric-tracking utilities
metrics = {'accuracy': 0.85, 'loss': 0.35}
log_metrics(metrics)
visualize_metrics()
if metrics['accuracy'] > 0.8:
    deploy_model()
This code logs the computed metrics, visualizes them, and deploys the model if accuracy exceeds the 0.8 threshold.
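The helper calls above are left undefined in the sample. As a minimal, self-contained sketch, they can be stubbed out in plain Python; the in-memory list standing in for a tracking database and the function bodies here are illustrative assumptions, not part of any specific MLOps library.

```python
# Minimal sketch of the metric-tracking flow. The in-memory store and
# function bodies are illustrative assumptions, not a real tracking API.

tracking_store = []  # stands in for a real tracking database


def log_metrics(metrics):
    """Persist a copy of the metrics so later steps see a stable record."""
    tracking_store.append(dict(metrics))


def visualize_metrics():
    """Stand-in for a dashboard: render the latest logged metrics as text."""
    latest = tracking_store[-1]
    return ", ".join(f"{name}={value}" for name, value in latest.items())


def deploy_model():
    """Stand-in for a real deployment step."""
    return "deployed"


metrics = {'accuracy': 0.85, 'loss': 0.35}
log_metrics(metrics)
print(visualize_metrics())  # accuracy=0.85, loss=0.35

status = deploy_model() if metrics['accuracy'] > 0.8 else "needs improvement"
print(status)  # deployed, since 0.85 > 0.8
```

Because logging copies the dictionary before storing it, later mutations to `metrics` cannot silently rewrite the tracked history.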
Process Table
| Step | Action | Metrics State | System State | Output/Result |
| 1 | Calculate metrics after training | {'accuracy': 0.85, 'loss': 0.35} | Metrics ready | Metrics dictionary created |
| 2 | Log metrics to tracking system | {'accuracy': 0.85, 'loss': 0.35} | Metrics logged | Metrics stored in tracking DB |
| 3 | Visualize metrics | {'accuracy': 0.85, 'loss': 0.35} | Metrics visualized | Graphs displayed on dashboard |
| 4 | Evaluate if accuracy > 0.8 | {'accuracy': 0.85, 'loss': 0.35} | Decision point | Condition True |
| 5 | Deploy model | {'accuracy': 0.85, 'loss': 0.35} | Model deployed | Model available for use |
| 6 | End process | {'accuracy': 0.85, 'loss': 0.35} | Process complete | No further action |
💡 Accuracy condition met, model deployed, process ends
Status Tracker
| Variable | Start | After Step 1 | After Step 2 | After Step 3 | After Step 4 | Final |
| metrics | {} | {'accuracy': 0.85, 'loss': 0.35} | {'accuracy': 0.85, 'loss': 0.35} | {'accuracy': 0.85, 'loss': 0.35} | {'accuracy': 0.85, 'loss': 0.35} | {'accuracy': 0.85, 'loss': 0.35} |
| system_state | Idle | Metrics ready | Metrics logged | Metrics visualized | Decision point | Model deployed |
Key Moments - 2 Insights
Why do we log metrics before visualizing them?
Logging stores metrics persistently so visualization tools can access accurate data, as shown in steps 2 and 3 of the process table.
What happens if the accuracy is below 0.8?
The condition in step 4 would be False, so deployment would not happen and the process would loop back to improve the model.
Visual Quiz - 3 Questions
Test your understanding
Looking at the process table, what is the system state after logging metrics?
AMetrics ready
BMetrics logged
CMetrics visualized
DDecision point
💡 Hint
Check the 'System State' column at step 2 in the process table.
At which step does the model get deployed?
AStep 5
BStep 3
CStep 4
DStep 6
💡 Hint
Look for the 'Action' column mentioning deployment in the process table.
If accuracy was 0.75 instead of 0.85, what would happen at step 4?
AMetrics would not be logged
BCondition would be True and model deploys
CCondition would be False and deployment skipped
DVisualization would fail
💡 Hint
Refer to the 'Evaluate if accuracy > 0.8' condition in step 4 of the process table.
Concept Snapshot
Performance Metric Tracking:
- Calculate metrics after model training
- Log metrics to a tracking system
- Visualize metrics on dashboards
- Use metrics to decide deployment
- Loop back if performance is insufficient
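The snapshot above can be tied together in one small tracking helper. The class name, method signatures, and threshold default here are assumptions made for illustration, not the API of any real tracking system.

```python
# Hypothetical MetricTracker tying the snapshot steps together:
# log metrics per training run, then gate deployment on the latest run.

class MetricTracker:
    def __init__(self, accuracy_threshold=0.8):
        self.accuracy_threshold = accuracy_threshold
        self.history = []  # one logged entry per training run

    def log(self, run, metrics):
        """Record a run's metrics so every run stays auditable."""
        self.history.append({'run': run, **metrics})

    def should_deploy(self):
        """Deploy only if the most recent run beats the accuracy threshold."""
        return bool(self.history) and \
            self.history[-1]['accuracy'] > self.accuracy_threshold


tracker = MetricTracker()
tracker.log(run=1, metrics={'accuracy': 0.75, 'loss': 0.5})   # loop back
tracker.log(run=2, metrics={'accuracy': 0.85, 'loss': 0.35})  # deploy
```

Keeping the full history, rather than only the latest metrics, is what lets dashboards show trends across runs and makes the deploy decision reproducible after the fact.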
Full Transcript
Performance metric tracking in MLOps involves calculating key metrics like accuracy and loss after training a model. These metrics are then logged into a tracking system to keep a record. Visualization tools read these logged metrics to display graphs and charts for easy understanding. Based on the metrics, a decision is made whether to deploy the model or improve it further. If the accuracy is above a threshold, the model is deployed; otherwise, the process repeats with adjustments. This flow ensures continuous monitoring and improvement of model performance.