Computer Vision · ~20 mins

Evaluation and confusion matrix in Computer Vision - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️ Confusion Matrix Master: get all challenges correct to earn this badge!
Predict Output
intermediate
Output of confusion matrix calculation
What is the output of the following Python code that computes a confusion matrix for a binary classification?
from sklearn.metrics import confusion_matrix
true_labels = [0, 1, 0, 1, 0, 1, 1]
pred_labels = [0, 0, 0, 1, 0, 1, 1]
cm = confusion_matrix(true_labels, pred_labels)
print(cm)
A
[[3 0]
 [1 3]]
B
[[2 1]
 [1 3]]
C
[[3 1]
 [0 3]]
D
[[3 0]
 [0 4]]
💡 Hint
Recall that confusion matrix rows represent true classes and columns represent predicted classes.
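The hint can be made concrete with a hand-rolled count that mirrors sklearn's layout (a toy example with different labels, so the challenge answer is not spoiled):

```python
# Toy data (not the quiz data): row index = true class, column = predicted class.
y_true = [0, 0, 1, 1, 1]
y_pred = [0, 1, 1, 1, 0]

# Count each (true, predicted) pair into a 2x2 grid.
cm = [[0, 0], [0, 0]]
for t, p in zip(y_true, y_pred):
    cm[t][p] += 1

# cm[0][0] = TN, cm[0][1] = FP, cm[1][0] = FN, cm[1][1] = TP
print(cm)  # → [[1, 1], [1, 2]]
```

Tallying each position of the output against the quiz's lists gives the matrix the challenge asks for.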
🧠 Conceptual
intermediate
Understanding precision and recall from confusion matrix
Given a confusion matrix for a binary classifier: [[50, 10], [5, 35]], what is the recall value?
A 0.778
B 0.875
C 0.833
D 0.909
💡 Hint
Recall = True Positives / (True Positives + False Negatives)
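As a worked sketch of the hint's formula, applied to a different matrix so the question above stays a challenge (the `recall` helper is illustrative, not from sklearn):

```python
# Recall from a 2x2 confusion matrix laid out as [[TN, FP], [FN, TP]].
def recall(cm):
    fn, tp = cm[1]          # second row holds the true-positive class
    return tp / (tp + fn)

print(recall([[40, 10], [20, 30]]))  # → 0.6, i.e. 30 / (30 + 20)
```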
Metrics
advanced
Choosing the best metric for imbalanced data
Which metric is most appropriate to evaluate a model on a highly imbalanced dataset where the positive class is rare?
A F1-score
B Accuracy
C Recall
D Precision
💡 Hint
Consider a metric that balances precision and recall.
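A quick illustration of why plain accuracy misleads on imbalanced data, using toy data and a degenerate always-negative baseline:

```python
# 95 negatives, 5 rare positives; the "model" always predicts negative.
y_true = [0] * 95 + [1] * 5
y_pred = [0] * 100

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))

print(accuracy)  # → 0.95, despite catching zero positives
print(tp, fn)    # → 0 5: recall is 0, so F1 is 0 as well
```

High accuracy here hides total failure on the rare class, which is why a metric that combines precision and recall is preferred.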
🔧 Debug
advanced
Identify the error in confusion matrix code
What error will this code raise?
from sklearn.metrics import confusion_matrix
true = [1, 0, 1]
pred = [0, 1]
cm = confusion_matrix(true, pred)
print(cm)
A No error, prints confusion matrix
B TypeError: unsupported operand type(s) for +: 'int' and 'str'
C IndexError: list index out of range
D ValueError: Found input variables with inconsistent numbers of samples
💡 Hint
Check if true and pred lists have the same length.
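The hint's length check can be sketched with a simplified stand-in for sklearn's input validation (`check_consistent_length` below is a hypothetical reimplementation, not sklearn's real code):

```python
# Simplified sketch of the length validation sklearn performs before
# computing a confusion matrix (hypothetical stand-in, not sklearn's code).
def check_consistent_length(y_true, y_pred):
    if len(y_true) != len(y_pred):
        raise ValueError(
            "Found input variables with inconsistent numbers of samples: "
            f"[{len(y_true)}, {len(y_pred)}]"
        )

check_consistent_length([1, 0, 1], [1, 1, 0])  # same length: no error raised
```

Passing lists of lengths 3 and 2, as in the challenge, would trip this check before any matrix is computed.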
Model Choice
expert
Best model choice based on confusion matrix analysis
You have two models evaluated on the same test set with confusion matrices:
Model A: [[90, 10], [30, 70]]
Model B: [[80, 20], [10, 90]]
Which model has better recall for the positive class?
A Model A has better recall
B Both have the same recall
C Model B has better recall
D Recall cannot be determined from confusion matrix
💡 Hint
Recall = TP / (TP + FN)
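After attempting the challenge, the hint's formula can be applied to both matrices directly; note that running this reveals the answer (the `recall` helper is illustrative, assuming the [[TN, FP], [FN, TP]] layout):

```python
# Positive-class recall for each model, assuming [[TN, FP], [FN, TP]] layout.
def recall(cm):
    fn, tp = cm[1]
    return tp / (tp + fn)

model_a = [[90, 10], [30, 70]]
model_b = [[80, 20], [10, 90]]

print(recall(model_a))  # → 0.7, i.e. 70 / (70 + 30)
print(recall(model_b))  # → 0.9, i.e. 90 / (90 + 10)
```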