Challenge - 5 Problems
Confusion Matrix Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
Difficulty: intermediate
Confusion matrix shape from TensorFlow predictions
Given a binary classification model in TensorFlow, what is the shape of the confusion matrix returned by tf.math.confusion_matrix when comparing true labels and predicted labels?
TensorFlow
import tensorflow as tf

true_labels = tf.constant([0, 1, 0, 1, 1])
pred_labels = tf.constant([0, 0, 0, 1, 1])
cm = tf.math.confusion_matrix(true_labels, pred_labels)
cm_shape = cm.shape
print(cm_shape)
💡 Hint
The confusion matrix size depends on the number of classes, not the number of samples.
Explanation:
For binary classification, the confusion matrix is 2x2 because there are two classes: 0 and 1.
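The shape can also be seen by building the matrix by hand. A minimal plain-Python sketch (not TensorFlow's implementation) for the labels in the problem, showing that the matrix size is num_classes × num_classes, not num_samples × num_samples:

```python
true_labels = [0, 1, 0, 1, 1]
pred_labels = [0, 0, 0, 1, 1]
num_classes = 2

# cm[i][j] counts samples whose true class is i and predicted class is j
cm = [[0] * num_classes for _ in range(num_classes)]
for t, p in zip(true_labels, pred_labels):
    cm[t][p] += 1

print(cm)                    # [[2, 0], [1, 2]]
print(len(cm), len(cm[0]))   # 2 2 — a 2x2 matrix, regardless of the 5 samples
```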
❓ Metrics
Difficulty: intermediate
Calculating accuracy from confusion matrix
Given the confusion matrix below, what is the accuracy of the model?
[[50, 10],
[5, 35]]
💡 Hint
Accuracy = (True Positives + True Negatives) / Total samples
Explanation:
Accuracy = (50 + 35) / (50 + 10 + 5 + 35) = 85 / 100 = 0.85
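The arithmetic can be checked in a few lines of Python: correct predictions sit on the diagonal, and the total is the sum of all cells.

```python
cm = [[50, 10],
      [5, 35]]

correct = sum(cm[i][i] for i in range(len(cm)))   # TP + TN = 85
total = sum(sum(row) for row in cm)               # all samples = 100
accuracy = correct / total
print(accuracy)  # 0.85
```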
❓ Model Choice
Difficulty: advanced
Choosing model type based on confusion matrix imbalance
You have a confusion matrix with many false negatives but few false positives. Which model adjustment is best to reduce false negatives?
💡 Hint
Recall focuses on reducing false negatives.
Explanation:
Lowering the classification threshold increases recall, reducing false negatives.
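A small sketch with made-up scores (the values here are illustrative, not from any real model) shows the effect: lowering the decision threshold turns some false negatives into true positives, raising recall.

```python
scores = [0.9, 0.6, 0.4, 0.3, 0.2]   # predicted probability of class 1
labels = [1,   1,   1,   0,   0]     # ground truth

def recall_at(threshold):
    preds = [1 if s >= threshold else 0 for s in scores]
    tp = sum(p == 1 and y == 1 for p, y in zip(preds, labels))
    fn = sum(p == 0 and y == 1 for p, y in zip(preds, labels))
    return tp / (tp + fn)

print(recall_at(0.5))   # 2/3: the positive with score 0.4 is missed
print(recall_at(0.25))  # 1.0: the lower threshold recovers it
```

The trade-off is that the lower threshold also admits more false positives (here, the negative with score 0.3), which is acceptable in this scenario because false positives are already few.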
🔧 Debug
Difficulty: advanced
Identifying error in confusion matrix calculation code
What error will this TensorFlow code raise?
import tensorflow as tf
true = tf.constant([0, 1, 2])
pred = tf.constant([0, 1])
cm = tf.math.confusion_matrix(true, pred)
print(cm)
💡 Hint
True and predicted labels must have the same length.
Explanation:
The true and predicted label tensors have different lengths, causing an InvalidArgumentError.
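The failure mode can be made explicit with a manual helper. This is a plain-Python sketch that raises ValueError on mismatched lengths; TensorFlow itself performs the equivalent check internally and reports an InvalidArgumentError.

```python
def confusion_matrix(true, pred, num_classes):
    # Element-wise pairing only makes sense when both sequences align
    if len(true) != len(pred):
        raise ValueError(
            f"labels and predictions must have the same length, "
            f"got {len(true)} and {len(pred)}")
    cm = [[0] * num_classes for _ in range(num_classes)]
    for t, p in zip(true, pred):
        cm[t][p] += 1
    return cm

try:
    confusion_matrix([0, 1, 2], [0, 1], num_classes=3)
except ValueError as e:
    print(e)  # reports the mismatched lengths 3 and 2
```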
🧠 Conceptual
Difficulty: expert
Interpreting confusion matrix for multi-class classification
In a 3-class classification problem, the confusion matrix is:
[[30, 2, 3],
[4, 25, 1],
[5, 0, 35]]
Which class has the highest precision?
💡 Hint
Precision = True Positives / (True Positives + False Positives) for each class.
Explanation:
Class 1 precision = 25 / (2 + 25 + 0) = 25 / 27 ≈ 0.926, which is higher than others.
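The per-class precisions can be computed by summing each column (all predictions of that class) and dividing into the diagonal entry (the true positives):

```python
cm = [[30, 2, 3],
      [4, 25, 1],
      [5, 0, 35]]

precisions = []
for j in range(3):
    col_sum = sum(cm[i][j] for i in range(3))  # everything predicted as class j
    precisions.append(cm[j][j] / col_sum)      # true positives / predicted positives

print([round(p, 3) for p in precisions])  # [0.769, 0.926, 0.897]
# Class 1 wins: 25 / 27, because only 2 samples were wrongly predicted as class 1.
```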