ML Python · ~10 mins

Recursive feature elimination in ML Python - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
1. Fill in the blank — easy

Complete the code to import the Recursive Feature Elimination class from scikit-learn.

ML Python
from sklearn.feature_selection import [1]
Options:
A. RFE
B. PCA
C. SelectKBest
D. VarianceThreshold
Common Mistakes
Confusing RFE with PCA which is for dimensionality reduction.
Using SelectKBest which is a different feature selection method.
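For reference, a minimal sketch of the correct import. RFE lives in `sklearn.feature_selection`, alongside the distractor selectors `SelectKBest` and `VarianceThreshold`:

```python
# RFE (Recursive Feature Elimination) is part of sklearn.feature_selection;
# PCA, by contrast, lives in sklearn.decomposition and reduces dimensionality
# by projection rather than by selecting original features.
from sklearn.feature_selection import RFE

print(RFE.__name__)  # -> RFE
```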
2. Fill in the blank — medium

Complete the code to create an RFE object using a logistic regression model as the estimator.

from sklearn.linear_model import LogisticRegression
rfe = RFE(estimator=[1], n_features_to_select=3)
Options:
A. LogisticRegression()
B. DecisionTreeClassifier()
C. KNeighborsClassifier()
D. RandomForestClassifier()
Common Mistakes
Using a classifier other than logistic regression when the instruction specifies logistic regression.
Passing the class instead of an instance (missing parentheses).
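A minimal sketch of the completed line. The key detail, as the second common mistake notes, is that RFE expects an estimator *instance* (with parentheses), not the class itself:

```python
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression

# Pass an instance of the estimator. Writing estimator=LogisticRegression
# (no parentheses) would hand RFE the class object and fail at fit time.
rfe = RFE(estimator=LogisticRegression(), n_features_to_select=3)

print(rfe.n_features_to_select)  # -> 3
```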
3. Fill in the blank — hard

Complete the code to fit the RFE selector to data X and target y.

rfe = RFE(estimator=LogisticRegression(), n_features_to_select=2)
rfe.[1](X, y)
Options:
A. predict
B. transform
C. fit
D. score
Common Mistakes
Using transform before fitting the selector.
Using predict or score which are for models, not selectors.
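A runnable sketch of this step on illustrative synthetic data (the `make_classification` call and its parameters are assumptions, not part of the task):

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression

# Illustrative synthetic dataset: 100 samples, 5 features.
X, y = make_classification(n_samples=100, n_features=5, random_state=0)

rfe = RFE(estimator=LogisticRegression(max_iter=1000), n_features_to_select=2)
rfe.fit(X, y)  # fit must come first; transform()/support_ are only valid afterwards

print(rfe.n_features_)  # number of features kept -> 2
```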
4. Fill in the blank — hard

Fill both blanks to select features and transform the dataset using the fitted RFE selector.

X_selected = rfe.[1](X)
selected_features = rfe.[2]_
Options:
A. transform
B. support
C. fit
D. predict
Common Mistakes
Using fit instead of transform to reduce features.
Using predict instead of support_ to get selected features.
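A sketch of both blanks on assumed synthetic data: `transform` drops the eliminated columns, while `support_` is a boolean mask over the original columns:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression

# Illustrative data: 100 samples, 6 features (values are assumptions).
X, y = make_classification(n_samples=100, n_features=6, random_state=0)

rfe = RFE(estimator=LogisticRegression(max_iter=1000), n_features_to_select=3)
rfe.fit(X, y)

# transform() returns X with only the selected columns;
# support_ marks which original columns survived (True = kept).
X_selected = rfe.transform(X)
mask = rfe.support_

print(X_selected.shape)  # -> (100, 3)
```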
5. Fill in the blank — hard

Fill all three blanks to create an RFE selector, fit it, and print the ranking of features.

selector = RFE(estimator=[1], n_features_to_select=4)
selector.[2](X, y)
print(selector.[3]_)
Options:
A. LogisticRegression()
B. fit
C. ranking
D. transform
Common Mistakes
Using transform instead of fit to train the selector.
Trying to print support_ instead of ranking_ for feature ranks.
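A full sketch of the three-step solution on assumed synthetic data. Unlike `support_` (a True/False mask), `ranking_` gives integer ranks: every selected feature gets rank 1, and eliminated features get 2, 3, … in reverse order of elimination:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression

# Illustrative data: 200 samples, 8 features (values are assumptions).
X, y = make_classification(n_samples=200, n_features=8, random_state=0)

# Create the selector, fit it, and print the feature ranking.
selector = RFE(estimator=LogisticRegression(max_iter=1000), n_features_to_select=4)
selector.fit(X, y)

# With the default step=1, ranks run from 1 (selected) up to
# n_features - n_features_to_select + 1 = 5 (first feature eliminated).
print(selector.ranking_)
```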