Precision-recall curves help us see how well a model finds positive cases without too many mistakes. They show the balance between catching true positives and avoiding false alarms.
Precision-recall curves in TensorFlow
Introduction
Precision-recall curves are most useful in situations like these:
When you want to check how well a model finds rare events, such as fraud detection.
When false positives are costly, such as in medical screening, where a false alarm triggers unnecessary follow-up.
When you want to compare models on how well they identify positive cases.
When accuracy is misleading because the classes are imbalanced.
When tuning a model to find the best trade-off between precision and recall.
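To see why accuracy can mislead on imbalanced data while precision and recall do not, here is a minimal sketch in plain Python with hypothetical labels and predictions (assumed data, not from a real model):

```python
# Hypothetical predictions at one decision threshold (assumed data):
# 10 samples, only 2 of them positive.
y_true = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1]
y_pred = [0, 0, 0, 0, 0, 0, 1, 0, 1, 0]

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

precision = tp / (tp + fp)  # fraction of flagged samples that are truly positive
recall = tp / (tp + fn)     # fraction of actual positives the model found
accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

print(precision, recall, accuracy)  # 0.5 0.5 0.8
```

Accuracy looks respectable at 0.8 because the negatives dominate, yet the model only finds half the positives and half of its alarms are false.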
Syntax
Python
from sklearn.metrics import precision_recall_curve

precision, recall, thresholds = precision_recall_curve(y_true, y_scores)
y_true is the list of true labels (0 or 1).
y_scores is the list of predicted probabilities or scores for the positive class.
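A quick sketch of the call and the shapes it returns. Note that precision and recall each contain one more entry than thresholds, because scikit-learn appends a final (precision=1, recall=0) point that has no corresponding threshold:

```python
from sklearn.metrics import precision_recall_curve

# Toy data: true labels and predicted probabilities for the positive class
y_true = [0, 1, 1, 0]
y_scores = [0.1, 0.9, 0.8, 0.3]

precision, recall, thresholds = precision_recall_curve(y_true, y_scores)

# precision and recall have one more entry than thresholds
print(len(precision), len(recall), len(thresholds))
```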
Examples
Calculate precision and recall for a small example with 4 samples.
Python
precision, recall, thresholds = precision_recall_curve([0, 1, 1, 0], [0.1, 0.9, 0.8, 0.3])
Plot the precision-recall curve to visualize the trade-off.
Python
import matplotlib.pyplot as plt

plt.plot(recall, precision)
plt.xlabel('Recall')
plt.ylabel('Precision')
plt.title('Precision-Recall Curve')
plt.show()
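The area under this curve can be summarized as a single number with scikit-learn's average_precision_score. A sketch using the same toy data as above; here every positive sample is scored higher than every negative one, so the score is a perfect 1.0:

```python
from sklearn.metrics import average_precision_score

# Same toy data as the curve above
y_true = [0, 1, 1, 0]
y_scores = [0.1, 0.9, 0.8, 0.3]

# Average precision summarizes the precision-recall curve as one number
ap = average_precision_score(y_true, y_scores)
print('Average precision:', ap)  # 1.0: every positive outranks every negative
```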
Sample Model
This code calculates and prints precision and recall values for different thresholds, then plots the precision-recall curve.
Python
from sklearn.metrics import precision_recall_curve
import matplotlib.pyplot as plt

# True labels (0 = negative, 1 = positive)
y_true = [0, 0, 1, 1]

# Model-predicted probabilities for the positive class
y_scores = [0.1, 0.4, 0.35, 0.8]

# Calculate precision, recall, and the thresholds they correspond to
precision, recall, thresholds = precision_recall_curve(y_true, y_scores)

# Print the values
print('Precision:', precision)
print('Recall:', recall)
print('Thresholds:', thresholds)

# Plot the precision-recall curve
plt.plot(recall, precision, marker='.')
plt.xlabel('Recall')
plt.ylabel('Precision')
plt.title('Precision-Recall Curve')
plt.grid(True)
plt.show()
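Once you have the curve, a common next step is to pick the threshold that best balances the two metrics. A sketch that selects the threshold maximizing the F1 score (the toy data here is assumed; the model separates the classes cleanly, so the best F1 is 1.0):

```python
import numpy as np
from sklearn.metrics import precision_recall_curve

# Assumed toy data where the model ranks the classes cleanly
y_true = [0, 0, 0, 1, 1]
y_scores = [0.1, 0.2, 0.6, 0.7, 0.9]

precision, recall, thresholds = precision_recall_curve(y_true, y_scores)

# F1 per threshold; drop the final appended point, which has no threshold.
# A tiny epsilon avoids division by zero when precision + recall == 0.
f1 = 2 * precision[:-1] * recall[:-1] / (precision[:-1] + recall[:-1] + 1e-12)
best = np.argmax(f1)
print('Best threshold:', thresholds[best], 'F1:', f1[best])
```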
Important Notes
Precision-recall curves are especially useful when the positive class is rare.
The curve shows how precision and recall change as you change the decision threshold.
A larger area under the precision-recall curve (summarized by average precision) indicates better performance on the positive class.
Summary
Precision-recall curves help evaluate models on positive class detection.
They show the trade-off between precision (correct positive predictions) and recall (found positives).
Use them when classes are imbalanced or false positives matter.