Recall & Review
beginner
What is Gradient Boosting in simple terms?
Gradient Boosting is a way to build a strong prediction model by combining many weak models, usually decision trees, one after another. Each new model tries to fix the mistakes of the models before it.
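Here is a minimal sketch of that idea using scikit-learn's GradientBoostingRegressor; the synthetic dataset and all parameter values are illustrative assumptions, not part of the answer above.

```python
# Minimal sketch: many shallow trees combined one after another
# (synthetic data and parameters are illustrative assumptions).
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=5, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 100 small trees, each added after the previous ones
model = GradientBoostingRegressor(n_estimators=100, max_depth=3, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```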
beginner
How does Gradient Boosting improve model predictions step-by-step?
It builds models one at a time. Each new model is fit to the errors (residuals) left by the combined previous models, so adding its prediction reduces the overall error step by step.
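For readers who want to see the mechanism itself, here is a hand-rolled sketch of the residual-fitting loop for squared error, where the negative gradient is exactly the plain residual; all names and values are illustrative.

```python
# Hand-rolled boosting loop (squared-error case: gradient == residual).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=500, n_features=5, noise=10.0, random_state=0)

learning_rate = 0.1
prediction = np.full_like(y, y.mean())  # start from the mean prediction
trees = []

for _ in range(100):
    residuals = y - prediction           # errors of the current ensemble
    tree = DecisionTreeRegressor(max_depth=3, random_state=0)
    tree.fit(X, residuals)               # each tree learns the remaining errors
    prediction += learning_rate * tree.predict(X)
    trees.append(tree)

print("training MSE:", np.mean((y - prediction) ** 2))
```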
intermediate
What role does the learning rate play in Gradient Boosting?
The learning rate controls how much each new tree contributes to the final prediction. A smaller learning rate takes more cautious steps, so training needs more trees but usually generalizes better; a larger rate trains faster but can overshoot and fit noise.
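A quick sketch of that trade-off, again with illustrative values: the same data and the same number of trees, compared across several learning rates.

```python
# Sketch: same data, same number of trees, different learning rates
# (values are illustrative; tune them on a validation set in practice).
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=5, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for lr in (0.01, 0.1, 0.5):
    model = GradientBoostingRegressor(learning_rate=lr, n_estimators=200,
                                      random_state=0)
    model.fit(X_train, y_train)
    print(f"learning_rate={lr}: test R^2 = {model.score(X_test, y_test):.3f}")
```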
beginner
Why are decision trees commonly used as weak learners in Gradient Boosting?
Decision trees are simple and fast to build. Shallow trees capture basic patterns quickly, and their remaining errors can be corrected step-by-step by later trees, which makes them well suited to boosting.
intermediate
What is overfitting in Gradient Boosting and how can it be controlled?
Overfitting happens when the model learns the training data too well, including noise, and performs poorly on new data. It can be controlled by limiting tree depth, using a small learning rate, and stopping training early.
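The three controls named above map directly onto scikit-learn parameters; this sketch assumes scikit-learn and uses illustrative values.

```python
# Sketch of the three overfitting controls named above:
# shallow trees, a small learning rate, and early stopping.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

X, y = make_regression(n_samples=500, n_features=5, noise=10.0, random_state=0)

model = GradientBoostingRegressor(
    max_depth=2,             # limit tree depth
    learning_rate=0.05,      # small learning rate
    n_estimators=1000,       # upper bound on the number of trees
    n_iter_no_change=10,     # stop early when the validation score stalls
    validation_fraction=0.1, # held-out fraction used for early stopping
    random_state=0,
)
model.fit(X, y)
print("trees actually fitted:", model.n_estimators_)
```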
What does each new model in Gradient Boosting try to predict?
Each new model focuses on predicting the errors (residuals) from the combined previous models to improve accuracy.
Which of these is a common weak learner used in Gradient Boosting?
Decision trees are simple models that work well as weak learners in Gradient Boosting.
What happens if the learning rate in Gradient Boosting is set too high?
A high learning rate can cause the model to fit noise and overfit the training data.
How can overfitting be reduced in Gradient Boosting?
Controlling tree complexity and the learning rate helps prevent overfitting.
What is the main goal of Gradient Boosting?
Gradient Boosting builds many weak models that together make strong predictions.
Explain how Gradient Boosting builds a model step-by-step and why it focuses on errors.
Think about fixing mistakes one at a time.
Describe what overfitting means in Gradient Boosting and list two ways to prevent it.
Overfitting is like memorizing answers instead of understanding.