
Confusion matrix in ML Python - Cheat Sheet & Quick Revision

Recall & Review
beginner
What is a confusion matrix in machine learning?
A confusion matrix is a table that shows the performance of a classification model by comparing actual labels with predicted labels. It helps us see how many predictions were correct or wrong for each class.
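A minimal sketch of building a confusion matrix, assuming scikit-learn is installed; the labels below are illustrative:

```python
# Build a confusion matrix from actual vs. predicted labels
# (assumes scikit-learn is available).
from sklearn.metrics import confusion_matrix

y_true = [1, 0, 1, 1, 0, 0, 1, 0]  # actual labels
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]  # model predictions

cm = confusion_matrix(y_true, y_pred)
print(cm)
# [[3 1]    row 0 = actual negatives: TN, FP
#  [1 3]]   row 1 = actual positives: FN, TP
```

By scikit-learn's convention, rows are actual classes and columns are predicted classes.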
beginner
What do the terms True Positive (TP), False Positive (FP), True Negative (TN), and False Negative (FN) mean in a confusion matrix?
TP: Model correctly predicted positive.
FP: Model predicted positive but it was negative.
TN: Model correctly predicted negative.
FN: Model predicted negative but it was positive.
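The four counts can be tallied directly from paired labels; a pure-Python sketch with illustrative data:

```python
# Count TP, FP, TN, FN for binary labels (1 = positive, 0 = negative).
y_true = [1, 0, 1, 1, 0, 0, 1, 0]  # actual labels
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]  # model predictions

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

print(f"TP={tp} FP={fp} TN={tn} FN={fn}")  # TP=3 FP=1 TN=3 FN=1
```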
intermediate
How can a confusion matrix help in understanding model errors?
It shows exactly where the model makes mistakes by counting wrong predictions for each class. This helps us know if the model confuses certain classes more than others.
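One way to see which classes get mixed up is to count (actual, predicted) pairs; a small sketch with a hypothetical three-class example:

```python
# Count each (actual, predicted) pair; off-diagonal pairs are mistakes.
from collections import Counter

y_true = ["cat", "dog", "cat", "bird", "dog", "cat", "bird", "cat"]
y_pred = ["cat", "cat", "cat", "bird", "dog", "dog", "bird", "cat"]

pairs = Counter(zip(y_true, y_pred))
mistakes = {pair: n for pair, n in pairs.items() if pair[0] != pair[1]}
print(mistakes)  # {('dog', 'cat'): 1, ('cat', 'dog'): 1}
```

Here the model confuses "cat" and "dog" with each other but never misclassifies "bird".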
intermediate
What is the difference between accuracy and precision using confusion matrix terms?
Accuracy = (TP + TN) / Total predictions, shows overall correctness.
Precision = TP / (TP + FP), shows how many predicted positives are actually positive.
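Both formulas can be computed directly from the four counts; a sketch using illustrative numbers:

```python
# Illustrative counts for a binary classifier.
tp, fp, tn, fn = 40, 10, 45, 5

accuracy = (tp + tn) / (tp + tn + fp + fn)  # overall correctness
precision = tp / (tp + fp)                  # how many predicted positives were right

print(accuracy)   # 0.85
print(precision)  # 0.8
```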
advanced
Why is a confusion matrix especially useful for imbalanced datasets?
Because accuracy can be misleading when classes are imbalanced, the confusion matrix shows detailed counts of each type of error, helping us understand model performance on minority classes.
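A small sketch of why accuracy misleads on imbalanced data: with 95 negatives and 5 positives, a model that always predicts "negative" still scores 95% accuracy while missing every positive case.

```python
# 95 negatives, 5 positives; the model predicts "negative" for everything.
y_true = [0] * 95 + [1] * 5
y_pred = [0] * 100

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))

print(accuracy)  # 0.95 -- looks impressive
print(tp, fn)    # 0 5  -- but every positive was missed
```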
What does the cell in the confusion matrix representing False Negative (FN) indicate?
A. Model predicted negative but actual was positive
B. Model predicted negative and actual was negative
C. Model predicted positive and actual was positive
D. Model predicted positive but actual was negative
Which metric can be directly calculated from a confusion matrix?
A. Learning rate
B. Accuracy
C. Number of features
D. Epoch count
If a confusion matrix has high False Positives, what does it mean?
A. Model predicts all cases correctly
B. Model often misses positive cases
C. Model often wrongly predicts positive when it is negative
D. Model has no errors
Why might accuracy be misleading on imbalanced datasets?
A. Because it ignores the number of features
B. Because it only counts True Positives
C. Because it depends on the learning rate
D. Because it can be high even if the model ignores the minority class
Which of these is NOT part of a confusion matrix?
A. Learning Rate
B. True Positive
C. False Negative
D. True Negative
Explain what a confusion matrix is and how it helps evaluate a classification model.
Describe the four parts of a confusion matrix and what each part means.