ML Python · ~5 mins

Gradient Boosting for regression in ML Python - Cheat Sheet & Quick Revision

Recall & Review
beginner
What is the main idea behind Gradient Boosting for regression?
Gradient Boosting builds a strong prediction model by combining many weak models, usually decision trees, where each new model corrects the errors of the previous ones.
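The idea above can be sketched in a few lines. This is a minimal example assuming scikit-learn; the dataset is synthetic and the parameter values are illustrative, not recommendations.

```python
# Minimal sketch: a Gradient Boosting regressor (scikit-learn assumed).
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Synthetic regression data for illustration
X, y = make_regression(n_samples=200, n_features=5, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An ensemble of many weak trees, each one correcting the previous ones' errors
model = GradientBoostingRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print(model.score(X_test, y_test))  # R^2 on held-out data
```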
beginner
How does Gradient Boosting improve the model step-by-step?
It fits a new model to the residual errors (differences between actual and predicted values) of the previous model, gradually reducing the overall error.
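The residual-fitting loop can be written by hand to make the step-by-step process concrete. This is an illustrative sketch (not a library API): each shallow tree is fit to the current residuals, and its scaled predictions are added to the running total.

```python
# Illustrative sketch: boosting by hand with shallow trees on residuals.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=300)

learning_rate = 0.1
pred = np.full_like(y, y.mean())   # start from a constant (mean) prediction
for _ in range(100):
    residuals = y - pred                          # errors of the current ensemble
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residuals)
    pred += learning_rate * tree.predict(X)       # nudge predictions toward the target

print(np.mean((y - pred) ** 2))  # training MSE shrinks as trees are added
```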
intermediate
What role does the learning rate play in Gradient Boosting for regression?
The learning rate controls how much each new model influences the overall prediction. A smaller learning rate means slower learning but can lead to better accuracy.
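The effect of the learning rate can be seen by training the same model with different values. This sketch assumes scikit-learn; the learning rates and other parameters are chosen purely for illustration.

```python
# Sketch: comparing learning rates on held-out error (scikit-learn assumed).
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=400, noise=15.0, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

# Smaller learning rates take smaller steps per tree; with enough trees
# they often generalize better than a large, aggressive learning rate.
for lr in (1.0, 0.1, 0.01):
    m = GradientBoostingRegressor(learning_rate=lr, n_estimators=200,
                                  random_state=1).fit(X_tr, y_tr)
    print(lr, mean_squared_error(y_te, m.predict(X_te)))
```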
beginner
Why are decision trees commonly used as weak learners in Gradient Boosting?
Decision trees are simple, fast to train, and can capture non-linear relationships, making them effective weak learners to be combined in Gradient Boosting.
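In scikit-learn (assumed here), the weak learners are shallow regression trees whose depth is controlled by `max_depth`; the parameter values below are illustrative.

```python
# Sketch: shallow trees as weak learners, complexity capped by max_depth.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

X, y = make_regression(n_samples=300, noise=10.0, random_state=2)

# max_depth=3 keeps each individual tree simple and fast to train
model = GradientBoostingRegressor(max_depth=3, n_estimators=50, random_state=2)
model.fit(X, y)
print(model.estimators_.shape)  # fifty shallow trees, applied in sequence
```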
beginner
What metric is commonly used to evaluate Gradient Boosting regression models?
Common metrics include Mean Squared Error (MSE) and Root Mean Squared Error (RMSE), which measure how close the predicted values are to the actual values.
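Both metrics are a one-liner with NumPy. The numbers below are toy values for illustration only.

```python
# Sketch: computing MSE and RMSE for a set of predictions.
import numpy as np

y_true = np.array([3.0, 5.0, 7.0, 9.0])
y_pred = np.array([2.5, 5.5, 6.0, 9.5])

mse = np.mean((y_true - y_pred) ** 2)  # average squared error
rmse = np.sqrt(mse)                    # back in the units of the target
print(mse, rmse)
```

RMSE is often preferred for reporting because it is in the same units as the target variable.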
In Gradient Boosting for regression, what does each new model try to predict?
A. The residual errors of the previous model
B. The original target values directly
C. Random noise in the data
D. The average of all previous predictions
What happens if the learning rate in Gradient Boosting is set too high?
A. The model will ignore residuals
B. The model will learn too slowly
C. The model may overfit and learn too quickly
D. The model will stop training
Which of the following is NOT a typical characteristic of weak learners in Gradient Boosting?
A. Strong individual predictive power
B. Simple and shallow decision trees
C. Fast to train
D. Able to capture some patterns
Which metric would you use to measure the accuracy of a Gradient Boosting regression model?
A. Confusion matrix
B. Accuracy score
C. Precision
D. Mean Squared Error (MSE)
What is the main benefit of combining many weak models in Gradient Boosting?
A. To avoid using decision trees
B. To create a strong model with better predictions
C. To make the model run faster
D. To reduce the size of the dataset
Explain how Gradient Boosting builds a regression model step-by-step.
Think about how each new model fixes mistakes from before.
Describe the role of the learning rate and weak learners in Gradient Boosting for regression.
Consider how small steps and simple models work together.