
Evaluation metrics (accuracy, F1, confusion matrix) in NLP - Cheat Sheet & Quick Revision

Recall & Review
beginner
What does accuracy measure in a classification model?
Accuracy measures the percentage of correct predictions out of all predictions made by the model. It tells us how often the model is right.
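Accuracy is simple to compute by hand. A minimal sketch in plain Python, using hypothetical binary sentiment labels (1 = positive, 0 = negative):

```python
# Hypothetical labels for eight examples (1 = positive, 0 = negative).
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

# Accuracy: fraction of predictions that match the true labels.
correct = sum(t == p for t, p in zip(y_true, y_pred))
accuracy = correct / len(y_true)
print(accuracy)  # 6 of 8 predictions match -> 0.75
```

In practice you would use a library helper such as scikit-learn's `accuracy_score`, but the arithmetic is exactly this ratio.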
intermediate
Explain the F1 score in simple terms.
The F1 score is the harmonic mean of precision (how many selected items are relevant) and recall (how many relevant items are selected). It is useful when you want to balance false positives and false negatives.
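The harmonic mean can be sketched directly from the true-positive, false-positive, and false-negative counts (the example counts below are hypothetical):

```python
def f1_score(tp, fp, fn):
    # Precision: of the items predicted positive, how many are truly positive.
    precision = tp / (tp + fp)
    # Recall: of the truly positive items, how many the model found.
    recall = tp / (tp + fn)
    # F1: harmonic mean of precision and recall.
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts: 8 true positives, 2 false positives, 4 false negatives.
# Precision = 0.8, recall = 2/3, F1 = 8/11 ~ 0.727.
print(f1_score(8, 2, 4))
```

Note that the harmonic mean punishes imbalance: a model with precision 1.0 but recall 0.1 gets an F1 of only about 0.18, not the arithmetic average of 0.55.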
beginner
What is a confusion matrix?
A confusion matrix is a table that shows the number of correct and incorrect predictions broken down by each class. It helps us see where the model makes mistakes.
intermediate
How do false positives and false negatives relate to the confusion matrix?
False positives are cases where the model predicted positive but the true label is negative. False negatives are cases where the model predicted negative but the true label is positive. Both appear in the confusion matrix.
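A binary confusion matrix is just a tally of (true label, predicted label) pairs. A small sketch with hypothetical labels, showing where each of the four cells comes from:

```python
from collections import Counter

# Hypothetical binary labels (1 = positive, 0 = negative).
y_true = [1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 1, 0, 1, 0]

# Count each (true, predicted) combination.
counts = Counter(zip(y_true, y_pred))
tp = counts[(1, 1)]  # true positives: predicted positive, actually positive
fn = counts[(1, 0)]  # false negatives: predicted negative, actually positive
fp = counts[(0, 1)]  # false positives: predicted positive, actually negative
tn = counts[(0, 0)]  # true negatives: predicted negative, actually negative

# Rows = true class, columns = predicted class (the usual convention).
matrix = [[tn, fp],
          [fn, tp]]
print(matrix)  # [[3, 1], [1, 3]]
```

The correct predictions (tn and tp) sit on the diagonal; the off-diagonal cells are exactly the false positives and false negatives described above.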
intermediate
Why might accuracy be misleading in some cases?
Accuracy can be misleading when classes are imbalanced. For example, if 95% of data is one class, a model that always predicts that class will have high accuracy but poor real performance.
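The 95% example above is easy to reproduce. A sketch with a synthetic imbalanced dataset and a degenerate model that always predicts the majority class:

```python
# Imbalanced data: 95 negatives, 5 positives.
y_true = [0] * 95 + [1] * 5
# A degenerate "model" that always predicts the majority class.
y_pred = [0] * 100

# Accuracy looks great...
accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
print(accuracy)  # 0.95

# ...but recall on the positive class is zero: the model never finds a positive.
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
recall = tp / (tp + fn)
print(recall)  # 0.0
```

With zero true positives the F1 score is also 0 (by the usual convention of treating the undefined precision as 0), which is why F1 is the better report card here.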
Which metric balances precision and recall?
A. F1 score
B. Accuracy
C. Confusion matrix
D. Loss function
What does the diagonal of a confusion matrix represent?
A. False positives
B. Correct predictions
C. False negatives
D. Total samples
If a model has high accuracy but a low F1 score, what might be true?
A. The model is perfect
B. The model has no false positives
C. The confusion matrix is empty
D. The data is imbalanced
Which of these is NOT part of a confusion matrix?
A. True positives
B. True negatives
C. Loss values
D. False positives
What does recall measure?
A. How many relevant items are selected
B. How many selected items are relevant
C. Overall accuracy
D. Number of false positives
Describe what a confusion matrix is and how it helps evaluate a classification model.
Think of it as a table showing correct and wrong predictions for each class.
Explain why accuracy alone might not be enough to judge a model's performance and when the F1 score is more useful.
Consider a case where one class is much bigger than the others.