ML Python programming · ~20 mins

Feature Importance in Regression (ML Python) - Practice Problems & Coding Challenges

Challenge - 5 Problems
🧠 Conceptual · intermediate
Understanding feature importance methods

Which method directly measures the impact of each feature on the prediction error by randomly shuffling its values and observing the change in model performance?

A. Feature scaling
B. Permutation importance
C. Correlation coefficient with target variable
D. Coefficient magnitude in linear regression
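For reference, scikit-learn exposes this shuffling-based method directly as `sklearn.inspection.permutation_importance`. A minimal sketch on made-up data (the dataset, coefficients, and noise scale below are illustrative, not part of the challenge):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

rng = np.random.RandomState(0)
X = rng.normal(size=(200, 3))
# Illustrative target: only the first two columns carry signal; the third is noise.
y = 4.0 * X[:, 0] + 2.0 * X[:, 1] + rng.normal(scale=0.1, size=200)

model = RandomForestRegressor(random_state=0).fit(X, y)

# Shuffle each column in turn and record the mean drop in model score.
result = permutation_importance(model, X, y, n_repeats=5, random_state=0)
print(result.importances_mean)  # one mean importance per feature
```

The noise column should receive a much smaller importance than the signal columns, since shuffling it barely changes the predictions.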
Predict Output · intermediate
Output of feature importance extraction

What is the output of the following Python code using scikit-learn's RandomForestRegressor to get feature importances?

Python
from sklearn.ensemble import RandomForestRegressor
import numpy as np

X = np.array([[1, 2], [3, 4], [5, 6], [7, 8]])
y = np.array([1, 3, 5, 7])
model = RandomForestRegressor(random_state=0)
model.fit(X, y)
importances = model.feature_importances_
print(np.round(importances, 2))
A. [0.3 0.7]
B. [0.0 1.0]
C. [0.5 0.5]
D. [0.7 0.3]
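Whatever split pattern the trees happen to learn on this tiny dataset, scikit-learn's `feature_importances_` are impurity-based (mean decrease in impurity), non-negative, and normalized to sum to 1 — a useful sanity check when predicting outputs like the one above. A sketch using the same data as the question:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

X = np.array([[1, 2], [3, 4], [5, 6], [7, 8]])
y = np.array([1, 3, 5, 7])

model = RandomForestRegressor(random_state=0).fit(X, y)
importances = model.feature_importances_

# Impurity-based importances are non-negative and normalized to sum to 1.
print(np.round(importances, 2))
```

Any candidate answer whose entries do not sum to 1 (or are negative) can be ruled out immediately.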
Model Choice · advanced
Choosing a model for interpretable feature importance

You want a regression model that provides clear, direct coefficients to interpret feature importance easily. Which model should you choose?

A. Support vector regression
B. Random forest regression
C. Neural network regression
D. Linear regression
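A quick sketch of why direct coefficients are so interpretable: on a noise-free synthetic target, `LinearRegression.coef_` recovers each feature's contribution exactly (assuming features are on comparable scales; the data and weights below are made up for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.RandomState(0)
X = rng.normal(size=(100, 2))
# Illustrative noise-free target: y = 2*x0 + 3*x1.
y = 2.0 * X[:, 0] + 3.0 * X[:, 1]

model = LinearRegression().fit(X, y)
print(model.coef_)  # recovers the weights [2. 3.] to numerical precision
```

Each coefficient reads directly as "change in prediction per unit change in that feature", which tree ensembles, SVR, and neural networks do not offer.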
Metrics · advanced
Evaluating feature importance with model metrics

Which metric change, observed after removing a feature from a regression model, best indicates that the feature was important?

A. Significant increase in mean squared error (MSE)
B. Increase in training time
C. Decrease in R-squared value
D. Decrease in number of features
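This drop-one-feature check can be sketched directly: refit without a feature and compare the error. The dataset and model choice below are illustrative assumptions, not from the challenge:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.RandomState(0)
X = rng.normal(size=(200, 2))
# Illustrative target: x0 carries most of the signal, x1 much less.
y = 5.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

mse_full = mean_squared_error(y, LinearRegression().fit(X, y).predict(X))

X_drop = X[:, 1:]  # refit after removing x0, the informative feature
mse_drop = mean_squared_error(y, LinearRegression().fit(X_drop, y).predict(X_drop))

# Removing an important feature should increase the error noticeably.
print(mse_full, mse_drop)
```

A large jump in MSE (equivalently, a drop in R-squared) after removal is the signal that the feature mattered; training time and feature count say nothing about importance.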
🔧 Debug · expert
Debugging incorrect feature importance values

You trained a linear regression model, but its coefficients (the feature importances) are all zero. What is the most likely cause?

A. The learning rate was set too high
B. The features are perfectly correlated with each other
C. The target variable has zero variance (all values are the same)
D. The model was trained with too many epochs
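The zero-variance-target scenario is easy to reproduce: when every target value is identical, ordinary least squares puts everything into the intercept and leaves the coefficients at zero. A sketch on made-up data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.RandomState(0)
X = rng.normal(size=(50, 3))
y = np.full(50, 7.0)  # constant target: zero variance

model = LinearRegression().fit(X, y)
# With nothing to explain, the best fit is the constant y itself:
# all coefficients zero, intercept equal to the constant.
print(model.coef_, model.intercept_)
```

Checking `y.var()` is therefore a cheap first debugging step before suspecting optimizer settings or feature collinearity.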