TensorFlow · ~20 mins

ROC and AUC curves in TensorFlow - Practice Problems & Coding Challenges

Challenge - 5 Problems
🧠 Conceptual
intermediate
Understanding ROC Curve Basics

What does the ROC curve represent in a binary classification model?

A. The relationship between precision and recall for different classes
B. The trade-off between true positive rate and false positive rate at different thresholds
C. The distribution of predicted probabilities for the positive class only
D. The accuracy of the model on the training dataset
💡 Hint

Think about what happens when you change the decision threshold in classification.
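To see how the ROC curve is built, here is a minimal sketch that sweeps the decision threshold over a toy set of scores and collects one (FPR, TPR) point per threshold; these points are what the ROC curve plots. The labels and scores are illustrative, not from any particular dataset.

```python
import numpy as np

# Toy labels and predicted scores (illustrative values)
labels = np.array([0, 0, 1, 1])
scores = np.array([0.1, 0.4, 0.35, 0.8])

# Sweep thresholds from strict to lenient; each threshold
# yields one (false positive rate, true positive rate) point
points = []
for t in sorted(set(scores), reverse=True):
    preds = (scores >= t).astype(int)
    tpr = np.mean(preds[labels == 1])  # true positive rate
    fpr = np.mean(preds[labels == 0])  # false positive rate
    points.append((fpr, tpr))

print(points)
```

Lowering the threshold never decreases either rate, so the points march from (0, 0) toward (1, 1), tracing the curve.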

Predict Output
intermediate
Output of TensorFlow AUC Metric Calculation

What is the output of the following TensorFlow code snippet?

TensorFlow
import tensorflow as tf

labels = [0, 0, 1, 1]
predictions = [0.1, 0.4, 0.35, 0.8]
auc_metric = tf.keras.metrics.AUC()
auc_metric.update_state(labels, predictions)
result = auc_metric.result().numpy()
print(round(result, 2))
A. 0.95
B. 0.85
C. 0.65
D. 0.75
💡 Hint

Calculate the area under the ROC curve for the given predictions and labels.
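A useful way to check the answer by hand: AUC equals the probability that a randomly chosen positive example is scored above a randomly chosen negative one (ties counting as half). This pairwise sketch computes the exact value for the labels and predictions in the snippet; note that `tf.keras.metrics.AUC` approximates the same quantity by binning scores into thresholds (200 by default), so its result can differ slightly from the exact figure.

```python
labels = [0, 0, 1, 1]
predictions = [0.1, 0.4, 0.35, 0.8]

# Split scores by class
pos = [p for p, y in zip(predictions, labels) if y == 1]
neg = [p for p, y in zip(predictions, labels) if y == 0]

# Count positive-vs-negative pairs where the positive wins (ties = 0.5)
wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
auc = wins / (len(pos) * len(neg))

print(auc)  # 0.75
```

Of the four positive/negative pairs, only (0.35, 0.4) is ranked incorrectly, giving 3/4 = 0.75.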

Model Choice
advanced
Choosing the Best Model Based on ROC AUC

You trained three binary classifiers and obtained these ROC AUC scores on the validation set: Model A: 0.82, Model B: 0.91, Model C: 0.88. Which model should you select if you want the best overall ability to distinguish classes?

A. Model B
B. All models perform equally well
C. Model C
D. Model A
💡 Hint

Higher AUC means better class separation ability.

Hyperparameter
advanced
Effect of Threshold on ROC Curve

How does changing the classification threshold affect the ROC curve?

A. It moves the point along the ROC curve, changing the true positive and false positive rates
B. It changes the shape of the ROC curve itself
C. It only affects precision, not the ROC curve
D. It has no effect on ROC curve metrics
💡 Hint

Think about what happens when you decide to be more or less strict in classifying positives.
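The following sketch makes the hint concrete: evaluating the same toy classifier at a strict and a lenient threshold yields two different operating points, but both lie on the same ROC curve. The helper name `roc_point` and the data are illustrative.

```python
import numpy as np

# Toy labels and scores (illustrative values)
labels = np.array([0, 0, 1, 1])
scores = np.array([0.1, 0.4, 0.35, 0.8])

def roc_point(threshold):
    """Return (TPR, FPR) when classifying scores >= threshold as positive."""
    preds = (scores >= threshold).astype(int)
    tp = np.sum((preds == 1) & (labels == 1))
    fp = np.sum((preds == 1) & (labels == 0))
    fn = np.sum((preds == 0) & (labels == 1))
    tn = np.sum((preds == 0) & (labels == 0))
    return tp / (tp + fn), fp / (fp + tn)

print(roc_point(0.5))  # strict: fewer positives flagged, low TPR and FPR
print(roc_point(0.3))  # lenient: more positives flagged, high TPR and FPR
```

Changing the threshold slides the operating point along the curve; only changing the model's scores can change the curve's shape.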

🔧 Debug
expert
Identifying the Bug in AUC Calculation Code

What error does the following TensorFlow code produce?

import tensorflow as tf

labels = [0, 1, 0, 1]
predictions = [0.2, 0.8, 0.4]
auc_metric = tf.keras.metrics.AUC()
auc_metric.update_state(labels, predictions)
print(auc_metric.result().numpy())
A. No error; it outputs a float value
B. TypeError, because the predictions are not tensors
C. ValueError, due to mismatched lengths of labels and predictions
D. AttributeError, because the AUC metric is not imported correctly
💡 Hint

Check if labels and predictions have the same number of elements.
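`tf.keras.metrics.AUC.update_state` expects `y_true` and `y_pred` to have matching shapes, so the four labels against three predictions above fail. A lightweight, framework-free sketch of the kind of up-front check that surfaces this mismatch (the helper `check_lengths` and its message are hypothetical, not part of TensorFlow):

```python
labels = [0, 1, 0, 1]
predictions = [0.2, 0.8, 0.4]

def check_lengths(y_true, y_pred):
    """Raise ValueError if labels and predictions disagree in length."""
    if len(y_true) != len(y_pred):
        raise ValueError(
            f"length mismatch: {len(y_true)} labels vs {len(y_pred)} predictions"
        )

try:
    check_lengths(labels, predictions)
except ValueError as e:
    print(e)  # length mismatch: 4 labels vs 3 predictions
```

Validating shapes before calling `update_state` turns a framework error deep in the metric into an immediate, readable message.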