Recall & Review
beginner
What is the main idea behind Gradient Boosting for regression?
Gradient Boosting builds a strong prediction model by combining many weak models, usually decision trees, where each new model corrects the errors of the previous ones.
beginner
How does Gradient Boosting improve the model step-by-step?
It fits a new model to the residual errors (differences between actual and predicted values) of the previous model, gradually reducing the overall error.
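The residual-fitting loop described above can be sketched in a few lines. This is a minimal illustration, not the deck's own material: it assumes scikit-learn's DecisionTreeRegressor is available and uses made-up synthetic data.

```python
# Minimal gradient-boosting loop for regression: each round fits a
# shallow tree to the current residuals and adds a shrunken copy of
# its predictions to the ensemble.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

learning_rate = 0.1
pred = np.full_like(y, y.mean())   # start from the mean prediction
trees = []
for _ in range(100):
    residuals = y - pred           # errors of the current ensemble
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residuals)
    pred += learning_rate * tree.predict(X)  # small corrective step
    trees.append(tree)

mse = np.mean((y - pred) ** 2)     # training error of the ensemble
```

After 100 rounds the training MSE is well below the error of predicting the mean alone, which is exactly the step-by-step error reduction the answer describes.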
intermediate
What role does the learning rate play in Gradient Boosting for regression?
The learning rate scales how much each new model's correction is added to the overall prediction. A smaller learning rate means each tree contributes less, so training is slower, but combined with more trees it often generalizes better.
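One way to see the learning-rate trade-off: with the same number of boosting rounds, a smaller learning rate leaves more training error behind, because each step corrects less. A hedged sketch using scikit-learn's GradientBoostingRegressor on synthetic data (the parameter values here are illustrative, not from the card):

```python
# Compare two learning rates with a fixed budget of 20 trees: the
# larger rate drives training error down faster per round.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=300)

results = {}
for lr in (0.05, 0.5):
    model = GradientBoostingRegressor(
        n_estimators=20, learning_rate=lr, max_depth=2, random_state=0
    ).fit(X, y)
    results[lr] = np.mean((y - model.predict(X)) ** 2)
    print(f"learning_rate={lr}: training MSE={results[lr]:.4f}")
```

Faster training-error reduction is not automatically better: the smaller rate paired with more trees is the usual recipe for accuracy on held-out data.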
beginner
Why are decision trees commonly used as weak learners in Gradient Boosting?
Decision trees are simple, fast to train, and can capture non-linear relationships, making them effective weak learners to be combined in Gradient Boosting.
beginner
What metric is commonly used to evaluate Gradient Boosting regression models?
Common metrics include Mean Squared Error (MSE) and Root Mean Squared Error (RMSE), which measure how close the predicted values are to the actual values.
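The two metrics are quick to compute by hand. A small worked example with made-up numbers (not from the card):

```python
# MSE averages the squared errors; RMSE is its square root, which puts
# the error back in the same units as the target.
import numpy as np

y_true = np.array([3.0, 5.0, 2.5, 7.0])
y_pred = np.array([2.5, 5.0, 3.0, 8.0])

mse = np.mean((y_true - y_pred) ** 2)   # (0.25 + 0 + 0.25 + 1) / 4 = 0.375
rmse = np.sqrt(mse)                     # ≈ 0.612
```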
In Gradient Boosting for regression, what does each new model try to predict?
Each new model is trained to predict the residual errors (differences between actual and predicted values) from the previous model.
What happens if the learning rate in Gradient Boosting is set too high?
A learning rate that is too high makes each tree's correction too large, so the ensemble can overshoot and fit noise instead of true patterns, leading to overfitting or unstable training.
Which of the following is NOT a typical characteristic of weak learners in Gradient Boosting?
Weak learners have limited predictive power individually; their strength comes from being combined.
Which metric would you use to measure the accuracy of a Gradient Boosting regression model?
MSE is a common metric for regression tasks, measuring the average squared difference between predicted and actual values.
What is the main benefit of combining many weak models in Gradient Boosting?
Combining weak models helps reduce errors and improves overall prediction accuracy.
Explain how Gradient Boosting builds a regression model step-by-step.
Think about how each new model fixes mistakes from before.
Describe the role of learning rate and weak learners in Gradient Boosting for regression.
Consider how small steps and simple models work together.