
Feature importance in regression in ML Python - Cheat Sheet & Quick Revision

Recall & Review
beginner
What is feature importance in regression?
Feature importance in regression tells us how much each input feature (variable) helps the model predict the target value. It shows which features matter most.
beginner
Name one simple method to calculate feature importance in regression models.
One simple method is to look at the coefficients of a linear regression model. If the features are on comparable scales (e.g. standardized), a larger absolute coefficient means the feature has more influence on the prediction.
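A minimal sketch of this idea, using scikit-learn with synthetic data (the coefficients 3.0 and 0.5 are made up for illustration). Features are standardized first so the coefficient magnitudes are comparable:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))  # 3 features; only the first two affect y
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

model = LinearRegression().fit(StandardScaler().fit_transform(X), y)
importance = np.abs(model.coef_)  # larger |coefficient| = more influence
print(importance)  # feature 0 should dominate, feature 2 should be near zero
```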
intermediate
How does permutation importance work for regression models?
Permutation importance measures feature importance by randomly shuffling one feature's values and seeing how much the model's prediction error increases. A big increase means the feature is important.
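Scikit-learn implements exactly this shuffling procedure in `sklearn.inspection.permutation_importance`. A small sketch on synthetic data (the single informative feature is an assumption of the example):

```python
import numpy as np
from sklearn.inspection import permutation_importance
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 3))
y = 4.0 * X[:, 0] + rng.normal(scale=0.2, size=300)  # only feature 0 matters

model = LinearRegression().fit(X, y)
# Shuffle each feature n_repeats times and record the drop in score (R^2)
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
print(result.importances_mean)  # shuffling feature 0 hurts the score most
```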
intermediate
Why might feature importance from tree-based models be more reliable than from linear regression?
Tree-based models capture complex relationships and interactions between features, so their importance scores reflect more realistic influence, while linear regression assumes simple linear effects.
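A sketch of the point above: the target below is a pure interaction, which a linear model cannot attribute to either feature, yet a random forest's impurity-based importances pick both up (the interaction target is an illustrative assumption):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
X = rng.normal(size=(400, 3))
y = X[:, 0] * X[:, 1]  # pure interaction: no linear effect for either feature

forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
print(forest.feature_importances_)  # features 0 and 1 share importance; feature 2 is low
```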
advanced
What is a limitation of using feature importance scores directly for feature selection?
Feature importance scores can be biased if features are correlated or if the model is complex. Removing features based only on importance might hurt model performance.
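The correlation caveat can be demonstrated directly: below, two near-duplicate columns split the credit for the same signal, so neither score alone reflects the feature's true influence (the near-duplicate construction is an assumption of the sketch):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
x = rng.normal(size=500)
X = np.column_stack([
    x,
    x + rng.normal(scale=0.01, size=500),  # near-duplicate of column 0
    rng.normal(size=500),                  # irrelevant feature
])
y = 2.0 * x + rng.normal(scale=0.1, size=500)

forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
fi = forest.feature_importances_
print(fi)  # columns 0 and 1 split the credit, so each looks less important than it is
```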
Which method uses model coefficients to determine feature importance in regression?
A. Linear regression coefficients
B. Permutation importance
C. SHAP values
D. PCA
What does a large increase in error after shuffling a feature indicate in permutation importance?
A. The feature is redundant
B. The feature is not important
C. The model is overfitting
D. The feature is important
Which model type often provides feature importance by measuring splits and gains?
A. Decision trees
B. K-means clustering
C. Linear regression
D. Naive Bayes
Why can correlated features cause problems in feature importance?
A. They always increase importance scores
B. They can share importance, making scores unreliable
C. They reduce model accuracy
D. They are ignored by models
Which of these is NOT a direct way to get feature importance in regression?
A. Model coefficients
B. Permutation importance
C. Feature scaling
D. Tree-based importance
Explain how permutation importance helps identify important features in regression models.
Describe why feature importance from tree-based regression models might be more informative than from linear regression.