ML Python · ~20 mins

Feature importance explanation in ML Python - Practice Problems & Coding Challenges

Challenge - 5 Problems
🧠 Conceptual · intermediate
Understanding feature importance in decision trees

Which statement best describes how feature importance is calculated in a decision tree model?

A. It counts how many times a feature appears in the dataset.
B. It measures the average value of each feature across all samples.
C. It calculates the correlation between each feature and the target variable.
D. It measures how much each feature decreases the weighted impurity in the tree's splits.
💡 Hint

Think about how decision trees decide where to split data.
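As the hint suggests, splits are chosen to reduce impurity, and scikit-learn's `feature_importances_` reports exactly that accumulated reduction. A minimal sketch on the iris dataset (exact values will vary with the tree's structure):

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

iris = load_iris()
tree = DecisionTreeClassifier(random_state=0).fit(iris.data, iris.target)

# feature_importances_ is the total weighted-impurity decrease contributed
# by each feature's splits, normalized across all features.
for name, imp in zip(iris.feature_names, tree.feature_importances_):
    print(f"{name}: {imp:.2f}")
```

Because the values are normalized, they always sum to 1 for a fitted tree with at least one split.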

Predict Output · intermediate
Output of feature importance extraction code

What is the output of the following Python code using scikit-learn's RandomForestClassifier?

ML Python
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import load_iris

iris = load_iris()
X, y = iris.data, iris.target
model = RandomForestClassifier(random_state=0)
model.fit(X, y)
importances = model.feature_importances_
print([round(i, 2) for i in importances])
A. [0.11, 0.03, 0.44, 0.42]
B. [0.25, 0.25, 0.25, 0.25]
C. [0.0, 0.0, 0.0, 1.0]
D. [0.5, 0.2, 0.2, 0.1]
💡 Hint

Random forests usually assign higher importance to features that better split the data.
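One way to check an answer like this yourself is to rank the features by importance; the exact rounded values depend on the scikit-learn version, but the ranking on iris is stable. A sketch:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

iris = load_iris()
model = RandomForestClassifier(random_state=0).fit(iris.data, iris.target)

# Sort features from most to least important.
order = np.argsort(model.feature_importances_)[::-1]
for i in order:
    print(f"{iris.feature_names[i]}: {model.feature_importances_[i]:.2f}")
```

On iris, the petal measurements separate the classes far better than the sepal measurements, so they typically receive most of the importance.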

Hyperparameter · advanced
Effect of max_features on feature importance in Random Forest

How does setting the max_features parameter to a low value in a Random Forest affect the computed feature importance?

A. It has no effect on feature importance values.
B. It increases the importance of all features equally.
C. It can reduce the importance of some features because fewer features are considered at each split.
D. It causes the model to ignore the target variable.
💡 Hint

Think about how limiting features at splits changes the model's view of data.
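A quick experiment makes the effect visible: fit the same forest twice, once allowing every feature at each split and once allowing only one randomly chosen feature. The exact numbers depend on the randomness, but restricting `max_features` tends to spread importance away from the dominant features, since weaker features are sometimes the only ones available. A sketch:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

iris = load_iris()

# Same forest, two settings for how many features each split may consider.
full = RandomForestClassifier(max_features=None, random_state=0).fit(iris.data, iris.target)
low = RandomForestClassifier(max_features=1, random_state=0).fit(iris.data, iris.target)

for name, a, b in zip(iris.feature_names, full.feature_importances_, low.feature_importances_):
    print(f"{name}: all features {a:.2f} vs max_features=1 {b:.2f}")
```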

Metrics · advanced
Interpreting SHAP values for feature importance

Which statement correctly describes what SHAP values represent in feature importance analysis?

A. SHAP values are random numbers assigned to features for visualization.
B. SHAP values show how much each feature contributes to pushing a prediction from the average prediction to the actual prediction.
C. SHAP values measure the correlation between features and the target variable.
D. SHAP values count how many times a feature appears in the training data.
💡 Hint

SHAP values explain individual predictions by comparing to a baseline.
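The baseline-versus-actual idea can be shown without the `shap` library by computing exact Shapley values for a tiny hand-rolled model (all weights and data below are invented for illustration). Absent features take the background "average" values, and the contributions add up exactly to the gap between the baseline and the actual prediction:

```python
from itertools import combinations
from math import factorial

# Toy linear model f(x) = 2*x0 + 3*x1 with a background (average) input.
weights = [2.0, 3.0]
background = [1.0, 2.0]

def f(x):
    return sum(w * xi for w, xi in zip(weights, x))

def coalition_value(x, present):
    """Model output when only 'present' features take their real values;
    absent features fall back to the background (baseline) values."""
    z = [x[j] if j in present else background[j] for j in range(len(x))]
    return f(z)

def shapley_values(x):
    """Exact Shapley values by enumerating all coalitions of other features."""
    n = len(x)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for size in range(n):
            for subset in combinations(others, size):
                s = set(subset)
                # Weighted marginal contribution of feature i to coalition s.
                w = factorial(size) * factorial(n - size - 1) / factorial(n)
                phi[i] += w * (coalition_value(x, s | {i}) - coalition_value(x, s))
    return phi

x = [3.0, 4.0]
phi = shapley_values(x)
print(phi)                             # [4.0, 6.0]
# Additivity: baseline prediction + sum of contributions = actual prediction.
print(f(background) + sum(phi), f(x))  # 18.0 18.0
```

This additivity property is exactly what answer explanations for SHAP rely on: each value is a feature's share of the distance from the average prediction to this prediction.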

🔧 Debug · expert
Identifying error in feature importance extraction code

What error will the following code raise when trying to get feature importances from a trained model?

ML Python
from sklearn.linear_model import LogisticRegression
from sklearn.datasets import load_iris

iris = load_iris()
X, y = iris.data, iris.target
model = LogisticRegression(max_iter=200)
model.fit(X, y)
print(model.feature_importances_)
A. AttributeError: 'LogisticRegression' object has no attribute 'feature_importances_'
B. ValueError: Invalid parameter 'feature_importances_'
C. TypeError: 'feature_importances_' is not callable
D. No error; it prints the feature importances as a list
💡 Hint

Not all models provide feature importance attributes.
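Only tree-based estimators expose `feature_importances_`. For a linear model the analogue is `coef_`, and scikit-learn's model-agnostic `permutation_importance` works with any fitted estimator. A sketch of both alternatives:

```python
from sklearn.datasets import load_iris
from sklearn.inspection import permutation_importance
from sklearn.linear_model import LogisticRegression

iris = load_iris()
model = LogisticRegression(max_iter=200).fit(iris.data, iris.target)

# Linear models expose coefficients, not impurity-based importances.
print(hasattr(model, "feature_importances_"))  # False
print(model.coef_.shape)                       # (3, 4): one weight per class per feature

# Model-agnostic alternative that works for any fitted estimator:
result = permutation_importance(model, iris.data, iris.target,
                                n_repeats=5, random_state=0)
print(result.importances_mean.round(2))
```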