
Gradient Boosting (GBM) in ML Python - Practice Problems & Coding Challenges

Challenge - 5 Problems
🧠 Conceptual
intermediate
How does Gradient Boosting improve model predictions?

Gradient Boosting builds a strong model by combining many weak models. What is the main way it improves predictions at each step?

A. It fits a new model to the residual errors of the previous model to reduce overall error.
B. It randomly selects features to build each new model independently.
C. It averages the predictions of all previous models without adjustment.
D. It increases the depth of the decision trees in each iteration.
💡 Hint

Think about how the model learns from mistakes made before.
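
The residual-fitting idea behind this question can be sketched by hand. The following is a minimal illustration (not the graded answer): the depth-1 stumps, the learning rate of 0.5, and the 10-step loop are all assumptions chosen for demonstration, mirroring what `GradientBoostingRegressor` does internally for squared-error loss.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Tiny dataset, same shape as the snippet later in this page
X = np.array([[1], [2], [3], [4], [5]])
y = np.array([1.5, 3.5, 3.0, 5.0, 7.5])

prediction = np.full_like(y, y.mean())  # start from the mean prediction
learning_rate = 0.5                     # illustrative shrinkage factor

for _ in range(10):
    residuals = y - prediction                            # errors of the current ensemble
    stump = DecisionTreeRegressor(max_depth=1).fit(X, residuals)  # fit a weak learner to the residuals
    prediction += learning_rate * stump.predict(X)        # take a scaled corrective step

print(round(float(np.mean((y - prediction) ** 2)), 3))    # training MSE shrinks step by step
```

Each iteration targets what the current ensemble still gets wrong, which is exactly the mechanism the question asks about.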

Predict Output
intermediate
Output of training loss during Gradient Boosting

Consider this Python snippet training a Gradient Boosting Regressor on a simple dataset. What is the printed training loss after 3 iterations?

ML Python
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
import numpy as np

X = np.array([[1], [2], [3], [4], [5]])
y = np.array([1.5, 3.5, 3.0, 5.0, 7.5])

model = GradientBoostingRegressor(n_estimators=3, learning_rate=1.0, max_depth=1, random_state=42)
model.fit(X, y)
pred = model.predict(X)
loss = mean_squared_error(y, pred)
print(round(loss, 2))
A. 0.50
B. 0.00
C. 0.12
D. 1.25
💡 Hint

Check how well the model fits the small dataset after 3 boosting steps.

Model Choice
advanced
Choosing the best base learner for Gradient Boosting

You want to use Gradient Boosting for a classification task with many categorical features. Which base learner is most suitable?

A. Linear regression models
B. Decision trees with limited depth
C. K-nearest neighbors
D. Support vector machines
💡 Hint

Think about what base learners are commonly used in Gradient Boosting and handle categorical data well.

Hyperparameter
advanced
Effect of learning rate in Gradient Boosting

What happens if you set the learning rate too high in a Gradient Boosting model?

A. The model ignores residuals and stops learning.
B. The model automatically prunes trees to prevent overfitting.
C. The model will train slower but generalize better.
D. The model may overfit quickly and have unstable training.
💡 Hint

Consider how a large step size affects the model updates.
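
The effect of the learning rate can be seen directly in training behavior. Below is a rough sketch, not a benchmark: the synthetic sine data, noise level, and the two learning-rate values are assumptions chosen to contrast a small step size against a large one.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(80, 1))
y = np.sin(X.ravel()) + rng.normal(scale=0.3, size=80)  # noisy target
X_train, X_test = X[:60], X[60:]
y_train, y_test = y[:60], y[60:]

train_mse = {}
for lr in (0.05, 1.0):
    gbm = GradientBoostingRegressor(n_estimators=200, learning_rate=lr,
                                    max_depth=2, random_state=0)
    gbm.fit(X_train, y_train)
    train_mse[lr] = float(np.mean((y_train - gbm.predict(X_train)) ** 2))
    test_mse = float(np.mean((y_test - gbm.predict(X_test)) ** 2))
    print(f"lr={lr}: train MSE={train_mse[lr]:.3f}, test MSE={test_mse:.3f}")
```

With the large learning rate the ensemble takes big corrective steps and drives training error toward zero very quickly, effectively memorizing the noise in the training data, which is the instability and overfitting risk the question points at.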

Metrics
expert
Interpreting feature importance in Gradient Boosting

After training a Gradient Boosting model, you get these feature importances: {'age': 0.6, 'income': 0.3, 'gender': 0.1}. What does this mean?

A. The 'age' feature contributes most to reducing prediction error in the model.
B. All features contribute equally to the model's predictions.
C. The 'gender' feature is the most important for predictions.
D. Feature importance values indicate the correlation between features.
💡 Hint

Think about what feature importance measures in Gradient Boosting.
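
In scikit-learn, the importances in question come from the fitted model's `feature_importances_` attribute: normalized, impurity-based scores measuring how much each feature reduces error across all trees (not correlations). Here is a small sketch on synthetic data; the column names and the data-generating formula are made up for illustration.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(1)
n = 300
age = rng.uniform(20, 70, n)        # synthetic "age" column
income = rng.uniform(20, 120, n)    # synthetic "income" column
noise = rng.normal(size=n)          # irrelevant feature
X = np.column_stack([age, income, noise])
# Label driven mostly by age, a little by income, not at all by noise
y = (age + 0.3 * income + 5 * rng.normal(size=n) > 65).astype(int)

clf = GradientBoostingClassifier(n_estimators=50, random_state=0).fit(X, y)
for name, imp in zip(["age", "income", "noise"], clf.feature_importances_):
    print(name, round(float(imp), 2))
```

The importances sum to 1, and the feature that drives the label ("age" here) gets the largest share, matching the interpretation the question is testing.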