ML Python · ~20 mins

Mutual information for feature selection in ML Python - Practice Problems & Coding Challenges

Challenge - 5 Problems
🧠 Conceptual · intermediate
Understanding Mutual Information in Feature Selection

Which statement best describes the role of mutual information in feature selection?

A. It quantifies the amount of shared information between a feature and the target variable, capturing any kind of dependency.
B. It measures the linear correlation between features and the target variable.
C. It ranks features based on their variance across the dataset.
D. It removes features that have missing values in the dataset.
💡 Hint

Think about how mutual information captures relationships beyond just linear ones.
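As a quick illustration (a minimal sketch on synthetic data, using scikit-learn's `mutual_info_regression`), mutual information picks up a purely nonlinear relationship that Pearson correlation misses entirely:

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 500)
y = x ** 2  # deterministic but purely nonlinear dependence

# Pearson correlation is near 0 because the relationship is symmetric,
# yet mutual information is clearly positive.
corr = np.corrcoef(x, y)[0, 1]
mi = mutual_info_regression(x.reshape(-1, 1), y, random_state=0)[0]
print(f"Pearson r: {corr:.3f}, MI: {mi:.3f}")
```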

Predict Output · intermediate
Output of Mutual Information Calculation

What is the output of the following Python code that calculates mutual information between features and a binary target?

ML Python
from sklearn.feature_selection import mutual_info_classif
import numpy as np

X = np.array([[1, 2, 3], [1, 3, 3], [0, 2, 1], [0, 3, 1]])
y = np.array([0, 1, 0, 1])

mi = mutual_info_classif(X, y, discrete_features=[True, True, True], random_state=0)
print([round(v, 2) for v in mi])
A. [0.0, 0.0, 0.0]
B. [0.69, 0.0, 0.0]
C. [0.0, 0.69, 0.0]
D. [0.69, 0.69, 0.0]
💡 Hint

Mutual information is non-negative and measures dependency; check which feature varies with the target.
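For a discrete feature and a discrete target, scikit-learn's estimate reduces to the plugin estimate in `sklearn.metrics.mutual_info_score` (natural log). A small sketch of the key case, a binary feature whose values track the target exactly:

```python
import numpy as np
from sklearn.metrics import mutual_info_score

x = np.array([2, 3, 2, 3])  # feature values mirror the target pattern
y = np.array([0, 1, 0, 1])

# Perfect dependence between two balanced binary variables gives
# MI = ln 2 ≈ 0.69 nats.
mi = mutual_info_score(x, y)
print(round(mi, 2))  # 0.69
```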

Model Choice · advanced
Choosing Features Based on Mutual Information

You have 10 features and their mutual information scores with the target. Which approach best selects features to improve model performance?

A. Select the top 3 features with the highest mutual information scores only.
B. Select features with mutual information scores above zero, regardless of redundancy between features.
C. Select features randomly to avoid bias from mutual information scores.
D. Select features with high mutual information scores and low mutual information among themselves to reduce redundancy.
💡 Hint

Consider both relevance to target and redundancy among features.
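The idea can be sketched with hypothetical synthetic features (an informative column, an exact duplicate, and noise): the duplicate scores just as high on relevance to the target, but its high MI with the first feature shows it adds nothing new.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(0)
f0 = rng.integers(0, 2, 200)   # informative feature
f1 = f0.copy()                 # exact duplicate: relevant but redundant
f2 = rng.integers(0, 2, 200)   # pure noise
X = np.column_stack([f0, f1, f2])
y = f0                         # target driven entirely by f0

# Relevance: MI of each feature with the target
relevance = mutual_info_classif(X, y, discrete_features=True, random_state=0)
# Redundancy: MI between the two candidate features themselves
redundancy = mutual_info_score(X[:, 0], X[:, 1])

print([round(v, 2) for v in relevance])  # f0 and f1 tie high, f2 near 0
print(round(redundancy, 2))              # high: f1 duplicates f0
```

A redundancy-aware scheme (e.g. mRMR) would keep f0, skip f1, and discard f2.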

Hyperparameter · advanced
Hyperparameter Impact on Mutual Information Estimation

When using mutual_info_classif from scikit-learn, which hyperparameter affects the smoothness of the mutual information estimate for continuous features?

A. `n_neighbors`
B. `random_state`
C. `discrete_features`
D. `copy`
💡 Hint

Think about parameters controlling neighborhood size in nearest neighbor estimation.
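A minimal sketch (synthetic continuous feature, binary target) showing how the neighborhood size changes the estimate:

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 1))  # one continuous feature
y = (X[:, 0] + rng.normal(scale=0.5, size=300) > 0).astype(int)

# n_neighbors sets the neighborhood size of the k-NN estimator used
# for continuous features; larger k gives a smoother estimate
# (lower variance, more bias).
mis = []
for k in (3, 10, 50):
    mi = mutual_info_classif(X, y, n_neighbors=k, random_state=0)[0]
    mis.append(mi)
    print(f"n_neighbors={k}: MI = {mi:.3f}")
```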

🔧 Debug · expert
Debugging Mutual Information Calculation Error

What error will the following code raise when calculating mutual information, and why?

from sklearn.feature_selection import mutual_info_classif
import numpy as np

X = np.array([[1.5, 2.3], [3.1, 4.7], [5.2, 6.8]])
y = np.array([0, 1, 0])

mi = mutual_info_classif(X, y, discrete_features=True)
print(mi)
A. RuntimeWarning due to division by zero in the mutual information calculation.
B. TypeError because `discrete_features` must be a boolean or array-like, not a single boolean when X has continuous features.
C. No error; the code runs and outputs mutual information scores.
D. ValueError because the number of samples in X and y do not match.
💡 Hint

Check the type and shape of discrete_features parameter relative to input data.
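One way to settle a question like this is simply to run the snippet (a quick check, assuming scikit-learn is installed):

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

X = np.array([[1.5, 2.3], [3.1, 4.7], [5.2, 6.8]])
y = np.array([0, 1, 0])

# discrete_features accepts a single boolean: True means "treat every
# column as discrete", so the float values are simply used as labels.
mi = mutual_info_classif(X, y, discrete_features=True)
print(mi)  # executes without raising; two non-negative scores
```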