Recall & Review
beginner
What is the main goal of feature selection in machine learning?
The main goal of feature selection is to choose the most important features from the data to improve model performance, reduce overfitting, and make the model simpler and faster.
beginner
Name three common types of feature selection methods.
The three common types are: 1) Filter methods, 2) Wrapper methods, and 3) Embedded methods.
intermediate
How do filter methods select features?
Filter methods select features based on their statistical relationship with the target variable, such as correlation or mutual information, without involving any machine learning model.
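As a minimal sketch of a filter method on synthetic data (the variable names and the toy dataset below are illustrative assumptions, not from any particular library): score each feature by the absolute value of its Pearson correlation with the target, then keep the top k. No model is ever trained.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Two informative features and one pure-noise feature (toy data).
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
noise = rng.normal(size=n)
X = np.column_stack([x1, x2, noise])
y = 3 * x1 - 2 * x2 + rng.normal(scale=0.1, size=n)

# Filter method: rank features by |Pearson correlation| with the target
# and keep the k highest-scoring columns -- no model involved.
scores = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
k = 2
selected = np.argsort(scores)[-k:]
print(sorted(selected.tolist()))  # the two informative columns score highest
```

In practice you would replace the correlation score with mutual information or a chi-squared statistic for non-linear or categorical relationships, but the pattern is the same: score each feature independently, then threshold or take the top k.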
intermediate
What is the difference between wrapper and embedded methods?
Wrapper methods use a machine learning model to evaluate feature subsets by training and testing repeatedly, while embedded methods perform feature selection during the model training process itself.
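The wrapper idea can be sketched with greedy forward selection on synthetic data (the helper name `val_mse` and the toy setup are assumptions for illustration): at each step, train a model on every candidate subset and keep the feature whose addition gives the lowest validation error.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
X = rng.normal(size=(n, 4))
# Only features 0 and 2 carry signal (toy data).
y = 2 * X[:, 0] + 1.5 * X[:, 2] + rng.normal(scale=0.1, size=n)
X_tr, X_va = X[:200], X[200:]
y_tr, y_va = y[:200], y[200:]

def val_mse(cols):
    """Fit ordinary least squares on a feature subset, score on validation."""
    w, *_ = np.linalg.lstsq(X_tr[:, cols], y_tr, rcond=None)
    pred = X_va[:, cols] @ w
    return np.mean((y_va - pred) ** 2)

# Wrapper method: greedy forward selection -- repeatedly add the feature
# whose inclusion yields the lowest validation error for the trained model.
selected = []
for _ in range(2):
    remaining = [j for j in range(X.shape[1]) if j not in selected]
    best = min(remaining, key=lambda j: val_mse(selected + [j]))
    selected.append(best)
print(sorted(selected))
```

Note the cost: each step retrains the model once per remaining feature, which is exactly why wrapper methods are more expensive than filter methods.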
beginner
Why can feature selection help reduce overfitting?
Feature selection removes irrelevant or noisy features, which reduces the chance that the model learns patterns from noise, thus helping the model generalize better to new data.
Which feature selection method evaluates features using a model repeatedly?
Wrapper methods select features by training and testing a model on different feature subsets repeatedly.
Which method selects features based on correlation with the target variable?
Filter methods use statistical measures like correlation to select features without involving a model.
What is a key advantage of embedded feature selection methods?
Embedded methods perform feature selection as part of the model training process.
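A classic embedded example is L1 (lasso) regularization, which drives the weights of irrelevant features to exactly zero as a by-product of training. Below is a minimal coordinate-descent sketch on synthetic data (the regularization strength `lam` and the toy dataset are assumed values for illustration, not a tuned recipe):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 300, 5
X = rng.normal(size=(n, p))
# Only features 0 and 3 carry signal (toy data).
y = 4 * X[:, 0] + 3 * X[:, 3] + rng.normal(scale=0.1, size=n)

# Embedded method: L1 regularization zeroes out weak features *during*
# training, so the nonzero weights are the selected features.
lam = 1.0  # regularization strength (assumed, untuned)
w = np.zeros(p)
for _ in range(100):  # cyclic coordinate descent
    for j in range(p):
        r = y - X @ w + X[:, j] * w[j]   # residual excluding feature j
        rho = X[:, j] @ r / n
        z = X[:, j] @ X[:, j] / n
        # soft-thresholding update: small |rho| snaps the weight to zero
        w[j] = np.sign(rho) * max(abs(rho) - lam, 0) / z
selected = np.nonzero(w)[0]
print(selected.tolist())
```

Because selection happens inside the training loop, there is no separate subset search: one fit yields both the model and the chosen features.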
Feature selection helps reduce overfitting by:
Removing irrelevant or noisy features helps the model avoid learning noise, reducing overfitting.
Which of these is NOT a feature selection method?
Principal Component Analysis is a dimensionality reduction technique, not a feature selection method: it creates new features as linear combinations of the originals rather than selecting a subset of them.
Explain the differences between filter, wrapper, and embedded feature selection methods.
Think about whether the method uses a model and when feature selection happens.
Describe why feature selection is important in building machine learning models.
Consider how fewer features affect model learning and performance.