ML · Python · ~5 mins

Gradient Boosting (GBM) in ML Python - Cheat Sheet & Quick Revision

Recall & Review
beginner
What is Gradient Boosting in simple terms?
Gradient Boosting is a way to build a strong prediction model by combining many weak models, usually decision trees, one after another. Each new model tries to fix the mistakes of the models before it.
beginner
How does Gradient Boosting improve model predictions step-by-step?
It builds models one at a time. Each model looks at the errors (residuals) from the previous models and learns to predict those errors better, reducing overall mistakes gradually.
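The step-by-step error-fixing idea above can be sketched from scratch. This is a minimal illustration, not a production implementation: it assumes a squared-error loss (where the "errors to fix" are just the residuals), synthetic data, and shallow scikit-learn trees as the weak learners.

```python
# Minimal gradient-boosting sketch for squared-error regression:
# each new shallow tree is fit to the residuals (mistakes) left by
# the ensemble built so far.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

learning_rate = 0.1
pred = np.full_like(y, y.mean())      # stage 0: predict the mean everywhere
trees = []
for _ in range(50):
    residuals = y - pred              # what the ensemble still gets wrong
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residuals)
    pred += learning_rate * tree.predict(X)   # small corrective step
    trees.append(tree)

print("MSE of mean-only model:", np.mean((y - y.mean()) ** 2))
print("MSE after 50 boosting rounds:", np.mean((y - pred) ** 2))
```

Each loop iteration is one boosting round: the training error shrinks gradually because every tree nudges the predictions toward the remaining residuals.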
intermediate
What role does the learning rate play in Gradient Boosting?
The learning rate controls how much each new model influences the final prediction. A smaller learning rate means the model learns slowly but is often more accurate, while a larger rate makes bigger corrections per tree and can overshoot or overfit.
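To see the learning-rate trade-off in practice, one option is to compare a few values with scikit-learn's `GradientBoostingRegressor` at a fixed number of trees. The dataset here is synthetic and the exact scores depend on it; this is a sketch of the experiment, not a definitive result.

```python
# Sketch: smaller learning rates take smaller corrective steps per tree,
# so they usually need more trees but often generalize better.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for lr in (1.0, 0.1, 0.05):
    gbm = GradientBoostingRegressor(
        learning_rate=lr, n_estimators=200, random_state=0
    ).fit(X_tr, y_tr)
    # score() reports R^2 on held-out data: higher is better, 1.0 is perfect.
    print(f"learning_rate={lr}: test R^2 = {gbm.score(X_te, y_te):.3f}")
```

A common rule of thumb is to pick a small learning rate and tune the number of trees with early stopping or cross-validation.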
beginner
Why are decision trees commonly used as weak learners in Gradient Boosting?
Decision trees are simple and fast to fit. Shallow trees can capture basic patterns in the data, and their remaining errors can be corrected step-by-step by later trees, making them well suited to boosting.
intermediate
What is overfitting in Gradient Boosting and how can it be controlled?
Overfitting happens when the model learns the training data too well, including noise, and performs poorly on new data. It can be controlled by limiting tree depth, using a small learning rate, and stopping training early.
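The three controls mentioned above (limited tree depth, a small learning rate, and early stopping) all map directly onto `GradientBoostingRegressor` parameters. A minimal sketch, assuming synthetic data; the specific values are illustrative, not recommendations:

```python
# Sketch: common overfitting controls in scikit-learn's GBM.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

X, y = make_regression(n_samples=400, n_features=8, noise=15.0, random_state=1)

gbm = GradientBoostingRegressor(
    max_depth=2,             # shallow trees: weak learners that memorize less
    learning_rate=0.05,      # small steps: smoother, more robust fit
    n_estimators=1000,       # upper bound; early stopping may use far fewer
    validation_fraction=0.2, # hold out 20% of training data internally
    n_iter_no_change=10,     # stop once validation score stalls for 10 rounds
    random_state=1,
).fit(X, y)

print("trees actually fitted:", gbm.n_estimators_)
```

With early stopping enabled, `n_estimators_` reports how many trees were actually built before validation performance stopped improving.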
What does each new model in Gradient Boosting try to predict?
A. The original target values directly
B. Random noise in the data
C. The errors made by previous models
D. The average of all previous predictions
Answer: C
Which of these is a common weak learner used in Gradient Boosting?
A. Decision tree
B. Neural network
C. Support vector machine
D. K-means clustering
Answer: A
What happens if the learning rate in Gradient Boosting is set too high?
A. The model ignores previous errors
B. The model will never learn anything
C. The model will be very slow to train
D. The model may learn too quickly and overfit
Answer: D
How can overfitting be reduced in Gradient Boosting?
A. By limiting tree depth and using a small learning rate
B. By increasing the number of trees indefinitely
C. By ignoring the residuals
D. By using only one tree
Answer: A
What is the main goal of Gradient Boosting?
A. To reduce the size of the dataset
B. To combine weak models to create a strong model
C. To cluster data points into groups
D. To randomly guess target values
Answer: B
Explain how Gradient Boosting builds a model step-by-step and why it focuses on errors.
Hint: think about fixing mistakes one at a time.
Describe what overfitting means in Gradient Boosting and list two ways to prevent it.
Hint: overfitting is like memorizing answers instead of understanding.