Recall & Review
beginner
What is a random forest in machine learning?
A random forest is an ensemble of decision trees that work together. Each tree makes its own prediction, and the forest combines them: majority vote for classification, or the average for regression. This makes predictions more accurate and more stable than a single tree's.
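The "most common answer" idea can be shown in a few lines. This is a toy sketch, not a real trained forest: the tree predictions below are hypothetical labels, and the majority vote is what a forest's classification step boils down to.

```python
from collections import Counter

def forest_predict(tree_predictions):
    """Majority vote: the label predicted by the most trees wins.
    tree_predictions holds one predicted label per tree."""
    return Counter(tree_predictions).most_common(1)[0][0]

# Three of five hypothetical trees say "spam", so the forest says "spam".
print(forest_predict(["spam", "ham", "spam", "spam", "ham"]))  # → spam
```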
beginner
Why does random forest use many decision trees instead of one?
Using many trees reduces the impact of any single tree's mistakes. Combining many slightly different opinions averages out individual errors, so the final prediction is more accurate and more stable.
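Here is a small numerical sketch of why averaging helps, using simulated noise rather than real trees: each hypothetical "tree" predicts the true value plus its own random error, and the averaged prediction ends up much closer to the truth than a typical single tree.

```python
import random

random.seed(0)
true_value = 10.0

# 100 hypothetical "trees": each predicts the true value plus its own noise.
tree_preds = [true_value + random.gauss(0, 2.0) for _ in range(100)]

# Typical error of a lone tree vs. error of the averaged (forest) prediction.
avg_single_error = sum(abs(p - true_value) for p in tree_preds) / len(tree_preds)
forest_error = abs(sum(tree_preds) / len(tree_preds) - true_value)

print(avg_single_error, forest_error)  # the averaged prediction errs far less
```

Individual errors partly cancel when averaged, which is the variance reduction the answer above describes.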
intermediate
What is 'bagging' in the context of random forests?
Bagging means building each tree from a different random sample of the data, drawn with replacement (a bootstrap sample). Each tree sees a slightly different dataset, which helps the forest generalize and avoid overfitting.
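A bootstrap sample can be sketched directly. The ten-row "dataset" here is a toy placeholder; the key point is sampling *with replacement*, so some rows repeat and others are left out entirely (the left-out rows are the "out-of-bag" examples).

```python
import random

random.seed(42)
data = list(range(10))  # ten toy training examples

def bootstrap_sample(rows):
    """Draw len(rows) examples with replacement (a bootstrap sample)."""
    return [random.choice(rows) for _ in rows]

sample = bootstrap_sample(data)
print(sample)                   # same size as the data, but some rows repeat
print(set(data) - set(sample))  # rows this tree never sees ("out-of-bag")
```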
intermediate
How does random forest select features when splitting nodes?
At each split, a random forest considers only a small random subset of the features and chooses the best split from that subset. This randomness keeps the trees different from one another, which makes their combined prediction stronger.
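A sketch of that per-split feature sampling, with made-up feature names. The square-root rule used here is a common default for classification, but the real split-quality search is omitted: this only shows which features a split would even get to consider.

```python
import math
import random

random.seed(1)
features = ["age", "income", "height", "weight", "city", "score"]

def candidate_features(all_features):
    """Pick the random subset of features one split is allowed to consider.
    sqrt(n_features) is a common default for classification forests."""
    k = max(1, int(math.sqrt(len(all_features))))
    return random.sample(all_features, k)

# Two different splits see different candidate sets, so trees diverge.
print(candidate_features(features))
print(candidate_features(features))
```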
beginner
What metrics can we use to check a random forest's performance?
We can use accuracy, precision, recall, and F1 score for classification tasks, and mean squared error or R-squared for regression tasks. These metrics tell us how well the forest predicts.
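The classification metrics can be computed by hand on a tiny example. The labels below are hypothetical true values and forest predictions, chosen only to make the counts easy to follow.

```python
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]  # hypothetical forest predictions

# Count true positives, false positives, and false negatives.
tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
precision = tp / (tp + fp)          # of predicted positives, how many were right
recall = tp / (tp + fn)             # of actual positives, how many were found
f1 = 2 * precision * recall / (precision + recall)

print(accuracy, precision, recall, f1)  # → 0.75 0.75 0.75 0.75
```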
What does each tree in a random forest use to make splits?
Random forest selects a random subset of features at each split to create diverse trees.
What is the main benefit of using many trees in a random forest?
Many trees help reduce overfitting and improve the model's accuracy by averaging predictions.
What is 'bagging' short for in random forests?
Bagging stands for bootstrap aggregating, which means training trees on random samples with replacement.
Which metric is NOT typically used to evaluate a random forest classifier?
Mean squared error is used for regression, not classification.
How does random forest help prevent overfitting?
Random forest averages many trees trained on random samples and features, reducing overfitting.
Explain how random forest builds its model and why it is more reliable than a single decision tree.
Think about how many opinions together make a better decision.
Describe the role of randomness in random forest and how it improves model performance.
Randomness helps trees learn different things.