Recall & Review
beginner
What is accuracy in classification evaluation?
Accuracy is the ratio of correct predictions to the total number of predictions made. It tells us how often the model is right overall.
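As a quick sketch in plain Python (the labels here are hypothetical), accuracy is simply correct predictions divided by total predictions:

```python
def accuracy(y_true, y_pred):
    # Fraction of predictions that match the true labels.
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

# Hypothetical labels: 4 of the 5 predictions are correct.
print(accuracy([1, 0, 1, 1, 0], [1, 0, 1, 0, 0]))  # 0.8
```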
beginner
Define precision in classification tasks.
Precision measures how many of the items predicted as positive are actually positive. It shows the model's exactness or quality of positive predictions.
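A minimal sketch of the same idea in code (labels are made up for illustration): precision is true positives divided by everything the model predicted as positive.

```python
def precision(y_true, y_pred, positive=1):
    # True positives / all predicted positives.
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    predicted_pos = sum(p == positive for p in y_pred)
    return tp / predicted_pos if predicted_pos else 0.0

# The model predicts 2 positives; only 1 is actually positive.
print(precision([1, 0, 0, 1], [1, 1, 0, 0]))  # 0.5
```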
beginner
What does recall tell us in classification?
Recall tells us how many of the actual positive cases the model correctly found. It measures the model's completeness, i.e. its ability to find all positive examples.
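Mirroring the precision sketch above (again with hypothetical labels), recall is true positives divided by all *actual* positives:

```python
def recall(y_true, y_pred, positive=1):
    # True positives / all actual positives.
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    actual_pos = sum(t == positive for t in y_true)
    return tp / actual_pos if actual_pos else 0.0

# There are 2 actual positives; the model finds 1 of them.
print(recall([1, 0, 0, 1], [1, 1, 0, 0]))  # 0.5
```

Note the only change from precision is the denominator: actual positives instead of predicted positives.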
intermediate
Explain the F1 score and why it is useful.
The F1 score is the harmonic mean of precision and recall. It balances both, giving a single score that considers both false positives and false negatives.
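The harmonic mean can be sketched directly from the definition (the input values below are illustrative, not from any real model):

```python
def f1_score(precision, recall):
    # Harmonic mean of precision and recall.
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

print(f1_score(0.5, 0.5))  # 0.5
print(f1_score(1.0, 0.0))  # 0.0 -- the harmonic mean punishes extreme imbalance
```

Unlike a simple average (which would give 0.5 for precision 1.0 and recall 0.0), the harmonic mean drops to 0 when either metric collapses, which is why F1 is a stricter single-number summary.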
intermediate
Why might accuracy be misleading in some classification problems?
Accuracy can be misleading when classes are imbalanced. For example, if 95% of the data belongs to one class, a model that always predicts that class scores 95% accuracy yet never detects the other class.
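The 95% example above can be reproduced in a few lines (the dataset is synthetic, built just to show the effect):

```python
# Hypothetical imbalanced dataset: 95 negatives, 5 positives.
y_true = [0] * 95 + [1] * 5
y_pred = [0] * 100  # a model that always predicts the majority class

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
recall = tp / sum(y_true)

print(accuracy)  # 0.95 -- looks impressive
print(recall)    # 0.0  -- but it never finds a single positive case
```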
Which metric tells you the proportion of true positives out of all predicted positives?
If a model has high recall but low precision, what does it mean?
What is the F1 score a balance of?
Why is accuracy not always the best metric for classification?
Recall is also known as:
Describe in your own words what precision and recall measure in classification.
Explain why the F1 score is useful when evaluating a classification model.