TensorFlow · ML · ~5 mins

Confusion matrix analysis in TensorFlow - Cheat Sheet & Quick Revision

Recall & Review
beginner
What is a confusion matrix in machine learning?
A confusion matrix is a table that shows how well a classification model performs by comparing actual labels with predicted labels. It helps us see where the model makes correct and wrong predictions.
beginner
What do the terms True Positive (TP), False Positive (FP), True Negative (TN), and False Negative (FN) mean in a confusion matrix?
TP: Model correctly predicts positive (actual is positive).
FP: Model wrongly predicts positive (actual is negative).
TN: Model correctly predicts negative (actual is negative).
FN: Model wrongly predicts negative (actual is positive).
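The four cells can be counted directly from paired labels. A minimal sketch with hypothetical binary labels (1 = positive, 0 = negative):

```python
# Hypothetical true labels and model predictions (1 = positive, 0 = negative)
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 1, 0, 1, 0]

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # correct positive
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # wrong positive
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # correct negative
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # wrong negative

print(tp, fp, tn, fn)  # 3 1 3 1
```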
beginner
How do you calculate accuracy from a confusion matrix?
Accuracy = (TP + TN) / (TP + TN + FP + FN). It shows the overall correctness of the model's predictions.
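Plugging hypothetical counts into the formula above:

```python
# Hypothetical cell counts from a binary confusion matrix
tp, tn, fp, fn = 3, 3, 1, 1

# Accuracy = (TP + TN) / (TP + TN + FP + FN)
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(accuracy)  # 0.75
```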
intermediate
What is precision and why is it important?
Precision = TP / (TP + FP). It tells us how many of the predicted positives are actually correct. Important when false positives are costly.
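A quick sketch, using hypothetical counts, showing how precision and recall can differ on the same matrix:

```python
# Hypothetical counts: 8 true positives, 2 false positives, 4 false negatives
tp, fp, fn = 8, 2, 4

precision = tp / (tp + fp)  # of all predicted positives, how many were right
recall = tp / (tp + fn)     # of all actual positives, how many were found

print(precision)  # 0.8
print(recall)     # ~0.667
```

High precision here (few false alarms) with lower recall (some positives missed) is exactly the trade-off the cards describe.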
intermediate
How can you compute a confusion matrix using TensorFlow?
Use tf.math.confusion_matrix(labels, predictions), where labels are the true classes and predictions are the predicted classes. It returns a matrix of counts with rows indexed by true class and columns by predicted class, rather than TP, FP, TN, FN directly.
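A minimal sketch with hypothetical 3-class labels (rows are true classes, columns are predicted classes, so the diagonal holds the correct predictions):

```python
import tensorflow as tf

# Hypothetical true labels and predictions for a 3-class problem
labels = tf.constant([0, 1, 2, 2, 1, 0])
predictions = tf.constant([0, 1, 2, 1, 1, 2])

# Rows = true class, columns = predicted class
cm = tf.math.confusion_matrix(labels, predictions)
print(cm.numpy())
# [[1 0 1]
#  [0 2 0]
#  [0 1 1]]
```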
What does a high number of False Negatives (FN) indicate?
A. Model misses many positive cases
B. Model predicts too many positives
C. Model predicts negatives correctly
D. Model has high precision
Which metric is calculated as TP / (TP + FP)?
A. Recall
B. Accuracy
C. F1 Score
D. Precision
In TensorFlow, which function computes the confusion matrix?
A. tf.math.confusion_matrix()
B. tf.metrics.confusion_matrix()
C. tf.confusion_matrix()
D. tf.linalg.confusion_matrix()
What does the diagonal of a confusion matrix represent?
A. Wrong predictions
B. False negatives
C. Correct predictions
D. False positives
Which metric is best to use when false positives are very costly?
A. Recall
B. Precision
C. Accuracy
D. Loss
Explain what a confusion matrix is and how it helps evaluate a classification model.
Think about how you check if your model guesses right or wrong.
Describe how to compute precision and recall from a confusion matrix and when each is important.
Precision is about correct positive predictions; recall is about finding all positives.