Challenge - 5 Problems
Confusion Matrix Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
intermediate · 2:00 remaining
Output of confusion matrix plot code
What will be the output of the following code snippet that plots a confusion matrix using TensorFlow and Matplotlib?
TensorFlow
import tensorflow as tf
import matplotlib.pyplot as plt
import numpy as np

true_labels = [0, 1, 2, 2, 0, 1]
pred_labels = [0, 2, 2, 2, 0, 0]

cm = tf.math.confusion_matrix(true_labels, pred_labels, num_classes=3).numpy()

plt.imshow(cm, interpolation='nearest', cmap=plt.cm.Blues)
plt.title('Confusion Matrix')
plt.colorbar()
plt.xlabel('Predicted label')
plt.ylabel('True label')
plt.xticks(np.arange(3), ['Class 0', 'Class 1', 'Class 2'])
plt.yticks(np.arange(3), ['Class 0', 'Class 1', 'Class 2'])
plt.show()
Attempts: 2 left
💡 Hint
Think about how confusion matrices represent counts of true vs predicted labels.
✗ Incorrect
The code computes a 3-class confusion matrix and plots it as a heatmap: rows are true labels, columns are predicted labels, and each cell counts how many samples fall in that combination. Here both class-1 samples are misclassified (one as class 0, one as class 2), so the diagonal entry for class 1 is 0 and the matrix is not purely diagonal.
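As a check, the matrix the snippet computes can be reproduced in plain Python (no TensorFlow needed), since a confusion matrix is just a tally of (true, predicted) pairs:

```python
# Pure-Python equivalent of tf.math.confusion_matrix for the snippet's data.
true_labels = [0, 1, 2, 2, 0, 1]
pred_labels = [0, 2, 2, 2, 0, 0]
num_classes = 3

# cm[i][j] counts samples whose true label is i and predicted label is j.
cm = [[0] * num_classes for _ in range(num_classes)]
for t, p in zip(true_labels, pred_labels):
    cm[t][p] += 1

print(cm)  # [[2, 0, 0], [1, 0, 1], [0, 0, 2]]
```

Row 1 is [1, 0, 1]: both true class-1 samples were predicted as some other class, which is why the plotted diagonal is broken.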
❓ Model Choice
intermediate · 1:30 remaining
Best model output for confusion matrix visualization
You have a multi-class classification problem with 4 classes. Which TensorFlow function will correctly compute the confusion matrix to visualize model performance?
Attempts: 2 left
💡 Hint
Check the TensorFlow API for confusion matrix functions.
✗ Incorrect
The correct function is tf.math.confusion_matrix, which takes labels and predictions plus an optional num_classes argument (here, 4).
❓ Metrics
advanced · 2:00 remaining
Interpreting confusion matrix metrics
Given this confusion matrix for a 3-class problem:
[[5, 2, 0],
[1, 7, 1],
[0, 2, 6]]
What is the precision for class 1 (index 1)?
Attempts: 2 left
💡 Hint
Precision = True Positives / (True Positives + False Positives). Look at the predicted column for class 1.
✗ Incorrect
Precision for class 1 is TP / (TP + FP). TP is cm[1][1] = 7. FP is the sum of column 1 excluding the TP cell: 2 + 2 = 4. So precision = 7 / (7 + 4) = 7/11 ≈ 0.636, matching option C.
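The same arithmetic in NumPy, using the matrix from the problem (rows = true labels, columns = predictions):

```python
import numpy as np

# Confusion matrix from the problem statement.
cm = np.array([[5, 2, 0],
               [1, 7, 1],
               [0, 2, 6]])

k = 1                       # class of interest
tp = cm[k, k]               # 7: true class k predicted as k
fp = cm[:, k].sum() - tp    # 2 + 2 = 4: other classes predicted as k
precision = tp / (tp + fp)  # 7 / 11 ≈ 0.636

print(round(precision, 3))  # 0.636
```

Reading down a column gives everything the model *predicted* as that class, which is exactly the denominator of precision.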
🔧 Debug
advanced · 1:30 remaining
Debugging confusion matrix visualization code
What error will this code raise?
import tensorflow as tf
true = [0, 1, 2]
pred = [0, 1]
cm = tf.math.confusion_matrix(true, pred)
print(cm.numpy())
Attempts: 2 left
💡 Hint
Check if true and pred lists have the same length.
✗ Incorrect
The true and pred lists have different lengths (3 vs. 2), so TensorFlow raises a ValueError complaining that the shapes of labels and predictions are incompatible.
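A plain-Python sketch of the same length check (the function name and exact error message here are illustrative; TensorFlow's wording differs by version):

```python
true = [0, 1, 2]
pred = [0, 1]

def checked_pairs(labels, predictions):
    """Refuse to tally (true, predicted) pairs of mismatched length."""
    if len(labels) != len(predictions):
        raise ValueError(
            f"labels and predictions must have the same length "
            f"({len(labels)} vs {len(predictions)})")
    return list(zip(labels, predictions))

try:
    checked_pairs(true, pred)
except ValueError as e:
    print("Raised:", e)
```

Without such a check, zip() would silently drop the unmatched label, producing a wrong matrix instead of an error.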
🧠 Conceptual
expert · 2:30 remaining
Choosing visualization method for confusion matrix
You want to visualize a confusion matrix for a 10-class classification problem with highly imbalanced classes. Which approach is best to clearly show the model's performance?
Attempts: 2 left
💡 Hint
Normalization helps compare classes with different sample sizes.
✗ Incorrect
Normalized confusion matrices show per-true-class proportions (each row divided by its sum), so a rare class and a common class are plotted on the same 0–1 scale, making performance easier to interpret when classes are imbalanced.
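A minimal NumPy sketch of row normalization, using made-up counts for illustration (the raw numbers are assumptions, not from the problem):

```python
import numpy as np

# Hypothetical raw counts for an imbalanced 3-class problem:
# class 0 has 100 samples, class 1 has 20, class 2 has only 5.
cm = np.array([[90,  5,  5],
               [ 2, 10,  8],
               [ 1,  0,  4]], dtype=float)

# Divide each row by its total so entries become per-true-class proportions.
row_sums = cm.sum(axis=1, keepdims=True)
cm_norm = cm / row_sums

print(cm_norm)
```

In the raw matrix, class 2's 4 correct predictions are visually dwarfed by class 0's 90; after normalization they read as 0.80 vs. 0.90 recall, which is the comparison that actually matters. The normalized array can be passed to plt.imshow exactly as in the first challenge.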