
CatBoost in ML Python - Practice Problems & Coding Challenges

Challenge - 5 Problems
🧠 Conceptual
intermediate
What is the main advantage of CatBoost's handling of categorical features?

CatBoost is known for its special way of dealing with categorical data. What is the main advantage of this approach compared to traditional methods like one-hot encoding?

A. It ignores categorical features to speed up training.
B. It converts categories into numerical values using target statistics without data leakage.
C. It converts categories into random numbers to increase model randomness.
D. It requires manual one-hot encoding before training.
💡 Hint

Think about how CatBoost prevents information from leaking from the target variable into the feature transformation.
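The idea behind the leakage-free encoding can be sketched without the library itself: encode each row's category using target statistics computed only from rows that came before it, so the row's own label never leaks into its feature value. Below is a simplified, pure-Python approximation of CatBoost's ordered target encoding on a single permutation; the `prior` and `weight` values are illustrative defaults, not CatBoost's exact internals.

```python
def ordered_target_stats(categories, targets, prior=0.5, weight=1.0):
    """Encode each categorical value with target statistics from EARLIER
    rows only, so a row's own label never leaks into its own feature.
    A simplified sketch of CatBoost-style ordered target encoding."""
    sums, counts = {}, {}
    encoded = []
    for cat, y in zip(categories, targets):
        s = sums.get(cat, 0.0)
        n = counts.get(cat, 0)
        # Smoothed mean of PAST targets for this category.
        encoded.append((s + prior * weight) / (n + weight))
        # Only after encoding do we fold this row's label into the stats.
        sums[cat] = s + y
        counts[cat] = n + 1
    return encoded

cats = ["red", "red", "blue", "red", "blue"]
ys = [1, 0, 1, 1, 0]
print(ordered_target_stats(cats, ys))  # → [0.5, 0.75, 0.5, 0.5, 0.75]
```

Note that the first occurrence of every category gets only the prior, and later occurrences never see their own row's label; that ordering constraint is what distinguishes this from naive (leaky) target encoding.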

Model Choice
intermediate
Choosing CatBoost for a dataset with many categorical features

You have a dataset with 50 categorical features and 10 numerical features. You want a model that handles categorical data well without extensive preprocessing. Which model is the best choice?

A. Linear regression, because it is simple and fast.
B. Support Vector Machine, because it works well with categorical data.
C. K-Nearest Neighbors, because it handles categorical data well by default.
D. CatBoost, because it natively supports categorical features.
💡 Hint

Consider which model can directly use categorical features without manual encoding.
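One way to see why native categorical support matters at this scale: one-hot encoding multiplies the column count by each feature's cardinality, while a model with native categorical handling keeps the original 60 columns. A back-of-the-envelope sketch (the average cardinality of 30 is a made-up assumption for illustration):

```python
# Hypothetical dataset from the question: 50 categorical + 10 numerical.
n_categorical, n_numerical = 50, 10
levels_per_feature = 30  # assumed average cardinality, for illustration

# One-hot encoding expands every categorical feature into one column
# per level; native categorical support leaves the columns as-is.
one_hot_columns = n_categorical * levels_per_feature + n_numerical
native_columns = n_categorical + n_numerical

print(one_hot_columns)  # → 1510
print(native_columns)   # → 60
```

The 25x blow-up in feature count is what makes manual encoding painful here, and high-cardinality features make it worse.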

Metrics
advanced
Interpreting CatBoost training output metrics

During CatBoost training, you see the following output for a binary classification task:

Iteration 100: train Logloss = 0.25, validation Logloss = 0.30

What does this tell you about the model's performance?

A. The model perfectly fits both training and validation data.
B. The model is underfitting because both losses are low.
C. The model fits training data well but may be overfitting since validation loss is higher.
D. The model is not learning because validation loss is higher than training loss.
💡 Hint

Compare training and validation losses to understand model fit.
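For reference, the Logloss metric in the output above is plain binary cross-entropy, and the quantity to watch is the gap between validation and training loss. A small pure-Python sketch (the toy predictions are illustrative):

```python
import math

def logloss(y_true, p_pred, eps=1e-15):
    """Binary cross-entropy, the 'Logloss' metric CatBoost reports."""
    total = 0.0
    for y, p in zip(y_true, p_pred):
        p = min(max(p, eps), 1 - eps)  # clip to avoid log(0)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(y_true)

# Confident, correct predictions give a low loss:
print(logloss([1, 0], [0.9, 0.1]))  # → ~0.105

# The quiz numbers: validation loss above training loss is a
# generalization gap, i.e. a possible sign of overfitting.
train_ll, val_ll = 0.25, 0.30
print(f"gap = {val_ll - train_ll:.2f}")  # → gap = 0.05
```

A modest gap like this is common and not necessarily fatal; it becomes a red flag when validation loss starts rising while training loss keeps falling.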

🔧 Debug
advanced
Why does CatBoost training raise a 'CatBoostError: Categorical feature is not converted'?

You try to train a CatBoost model with categorical features but get this error:

CatBoostError: Categorical feature is not converted

What is the most likely cause?

A. You did not specify which columns are categorical in the CatBoost Pool or model parameters.
B. Your dataset contains missing values in numerical columns.
C. You used a regression loss function for classification.
D. You did not normalize the numerical features before training.
💡 Hint

Think about how CatBoost knows which features are categorical.

Hyperparameter
expert
Optimizing CatBoost with the 'depth' hyperparameter

You want to improve your CatBoost model's performance on a complex dataset. You consider increasing the 'depth' parameter from 6 to 10. What is the most likely effect?

A. The model may capture more complex patterns but risks overfitting and longer training time.
B. The model will train faster and generalize better with higher depth.
C. The model's performance will not change because depth only affects numerical features.
D. The model will ignore categorical features with higher depth.
💡 Hint

Think about how tree depth affects model complexity and training time.
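A quick way to see the trade-off: CatBoost grows symmetric (oblivious) trees, where every level applies one split condition, so a tree of depth d has 2**d leaves. A one-liner sketch of the quiz's depth change:

```python
# Symmetric trees: leaf count doubles with every extra level of depth,
# so capacity (and per-tree training cost) grows exponentially.
for depth in (6, 10):
    print(f"depth={depth}: {2 ** depth} leaves per tree")
# depth=6: 64 leaves, depth=10: 1024 leaves — a 16x jump in capacity
```

That 16x jump in leaves per tree is why raising depth from 6 to 10 can capture richer interactions but also raises overfitting risk and training time, matching option A.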