ML Python · ~10 mins

Mutual information for feature selection in ML Python - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: fill in the blank (easy)

Complete the code to import the function for mutual information classification.

from sklearn.feature_selection import [1]
Options:
A. mutual_info_regression
B. mutual_info_classif
C. SelectKBest
D. chi2
Common Mistakes
Choosing mutual_info_regression which is for regression tasks.
Choosing SelectKBest which is a selector, not the mutual information function.
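The completed import (answer B) can be checked end-to-end. The toy dataset below is illustrative, not part of the exercise:

```python
# Completed Task 1 (answer B): import the classification MI scorer.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

# Hypothetical toy data: 100 samples, 4 features; y depends only on feature 0.
rng = np.random.RandomState(0)
X = rng.rand(100, 4)
y = (X[:, 0] > 0.5).astype(int)

scores = mutual_info_classif(X, y, random_state=0)
print(scores.shape)  # one MI score per feature -> (4,)
```

Note that `mutual_info_classif` is for categorical targets; `mutual_info_regression` is its counterpart for continuous targets.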
Task 2: fill in the blank (medium)

Complete the code to calculate mutual information scores for features X and target y.

mi_scores = [1](X, y)
Options:
A. mutual_info_classif
B. f_classif
C. chi2
D. mutual_info_regression
Common Mistakes
Using mutual_info_regression which is for regression problems.
Using chi2 or f_classif which are different feature scoring methods.
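A minimal sketch of the completed call (answer A) on a synthetic dataset, where only one feature actually carries information about the target:

```python
# Completed Task 2 (answer A): score features against a classification target.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

# Hypothetical toy data: only feature 1 determines y.
rng = np.random.RandomState(0)
X = rng.rand(200, 3)
y = (X[:, 1] > 0.5).astype(int)

mi_scores = mutual_info_classif(X, y, random_state=0)
ranking = np.argsort(mi_scores)[::-1]  # feature indices, highest MI first
print(ranking[0])  # the informative feature should rank first
```

Unlike `chi2` and `f_classif`, mutual information also captures nonlinear dependence between a feature and the target.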
Task 3: fill in the blank (hard)

Complete the code to select the top 3 features based on mutual information scores.

from sklearn.feature_selection import SelectKBest
selector = SelectKBest(score_func=[1], k=3)
X_new = selector.fit_transform(X, y)
Options:
A. mutual_info_regression
B. chi2
C. mutual_info_classif
D. f_regression
Common Mistakes
Using chi2 which requires non-negative features.
Using mutual_info_regression which is for regression.
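The completed selector (answer C) can be verified on a synthetic dataset; `SelectKBest` keeps only the `k` highest-scoring columns:

```python
# Completed Task 3 (answer C): keep the 3 features with the highest MI scores.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif

# Hypothetical toy data: 150 samples, 6 candidate features.
rng = np.random.RandomState(0)
X = rng.rand(150, 6)
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)

selector = SelectKBest(score_func=mutual_info_classif, k=3)
X_new = selector.fit_transform(X, y)
print(X_new.shape)  # (150, 3): same rows, only 3 columns kept
```

`mutual_info_classif` works here even though the features can be negative, which is exactly where `chi2` would raise an error.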
Task 4: fill in the blank (hard)

Fill both blanks to create a dictionary of feature names and their mutual information scores, filtering scores greater than 0.1.

mi_dict = {feature: [1] for feature, score in zip(feature_names, mi_scores) if score [2] 0.1}
Options:
A. score
B. >
C. <
D. feature
Common Mistakes
Using feature instead of score as dictionary value.
Using '<' instead of '>' in the condition.
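The completed comprehension (answers: [1] = score, [2] = >) can be checked with hypothetical feature names and scores:

```python
# Completed Task 4. Feature names and MI scores below are made up
# purely to exercise the comprehension.
feature_names = ["age", "income", "height", "noise"]
mi_scores = [0.42, 0.31, 0.05, 0.0]

# Keep only features whose mutual information score exceeds 0.1.
mi_dict = {feature: score
           for feature, score in zip(feature_names, mi_scores)
           if score > 0.1}
print(mi_dict)  # {'age': 0.42, 'income': 0.31}
```

Each feature maps to its score, and low-information features are filtered out by the `if` clause.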
Task 5: fill in the blank (hard)

Fill both blanks to select top 5 features using mutual information and transform the dataset.

selector = SelectKBest(score_func=[1], k=[2])
selected_features = selector.fit_transform(X, y)
selected_feature_names = [name for i, name in enumerate(feature_names) if selector.get_support()[i]]
Options:
A. mutual_info_classif
B. 3
C. index
D. 5
Common Mistakes
Using a regression scoring function such as mutual_info_regression instead of mutual_info_classif.
Setting k=3 (or another value) instead of the requested k=5.
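A sketch of the completed pipeline (answers: [1] = mutual_info_classif, [2] = 5), using hypothetical feature names; `get_support()` returns a boolean mask over the original columns:

```python
# Completed Task 5: select the top 5 features and recover their names.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif

# Hypothetical toy data: 100 samples, 8 features named f0..f7.
rng = np.random.RandomState(0)
X = rng.rand(100, 8)
y = (X[:, 0] > 0.5).astype(int)
feature_names = [f"f{i}" for i in range(8)]

selector = SelectKBest(score_func=mutual_info_classif, k=5)
selected_features = selector.fit_transform(X, y)

# get_support() marks which original columns survived selection.
selected_feature_names = [name for i, name in enumerate(feature_names)
                          if selector.get_support()[i]]
print(selected_features.shape)       # (100, 5)
print(len(selected_feature_names))   # 5 names, in original column order
```

The mask-based lookup keeps names aligned with columns, which is safer than guessing positions from sorted scores.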