Recall & Review
beginner
What is feature scaling in machine learning?
Feature scaling is the process of adjusting the range of features (input variables) so they have similar scales. This helps many algorithms work better and faster.
beginner
What does StandardScaler do to the data?
StandardScaler subtracts the mean and divides by the standard deviation for each feature. This makes the data have a mean of 0 and a standard deviation of 1.
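The formula behind StandardScaler can be sketched in plain Python (the helper name `standardize` and the sample heights are illustrative, not part of scikit-learn, which additionally uses vectorized NumPy operations):

```python
def standardize(column):
    """Standardize one feature column: z = (x - mean) / std.

    Uses the population standard deviation (ddof=0), matching
    StandardScaler's default behavior.
    """
    n = len(column)
    mean = sum(column) / n
    std = (sum((x - mean) ** 2 for x in column) / n) ** 0.5
    return [(x - mean) / std for x in column]

# Hypothetical feature: heights in centimeters.
heights_cm = [150.0, 160.0, 170.0, 180.0, 190.0]
scaled = standardize(heights_cm)
# The result has mean 0 and standard deviation 1.
```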
beginner
How does MinMaxScaler transform features?
MinMaxScaler scales features to a fixed range, usually 0 to 1, by subtracting the minimum value and dividing by the range (max - min).
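The same idea in a short sketch (the helper name `min_max_scale` and the sample ages are made up for illustration; scikit-learn's MinMaxScaler implements this per column):

```python
def min_max_scale(column, new_min=0.0, new_max=1.0):
    """Rescale one feature column into [new_min, new_max]."""
    lo, hi = min(column), max(column)
    return [new_min + (x - lo) / (hi - lo) * (new_max - new_min)
            for x in column]

# Hypothetical feature: customer ages.
ages = [18, 30, 45, 60]
scaled = min_max_scale(ages)
# The minimum (18) maps to 0.0 and the maximum (60) maps to 1.0.
```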
intermediate
Why is feature scaling important for algorithms like K-Nearest Neighbors or SVM?
Because these algorithms use distances or dot products, features with larger scales can dominate the results. Scaling ensures all features contribute equally.
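A small worked example of that domination effect (the features, values, and assumed ranges are hypothetical):

```python
import math

# Two customers described by (income in dollars, age in years).
a = (50_000.0, 25.0)
b = (51_000.0, 60.0)

# Unscaled Euclidean distance: the $1,000 income gap swamps the
# 35-year age gap, so age barely affects the distance at all.
unscaled = math.hypot(a[0] - b[0], a[1] - b[1])

# After min-max scaling each feature to [0, 1] (assuming ranges of
# 0-100,000 for income and 0-100 for age), both features contribute
# on comparable terms, and the large age difference now shows up.
a_s = (a[0] / 100_000, a[1] / 100)
b_s = (b[0] / 100_000, b[1] / 100)
scaled = math.hypot(a_s[0] - b_s[0], a_s[1] - b_s[1])
```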
intermediate
When might you prefer MinMaxScaler over StandardScaler?
Use MinMaxScaler when you want to keep all features within a specific range, like 0 to 1, especially if the data is not normally distributed or you want to preserve zero values.
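A quick sketch of the zero-preservation point (the purchase-count feature is a made-up example): when a feature's minimum is 0, min-max scaling leaves zeros exactly at 0, while standardization shifts them to a negative value.

```python
# Hypothetical feature where 0 means "absent": purchase counts.
counts = [0, 0, 2, 5, 10]

# Min-max scaling: the minimum (0) maps to 0, so zeros are preserved.
lo, hi = min(counts), max(counts)
minmax = [(x - lo) / (hi - lo) for x in counts]

# Standardization: subtracting the mean moves every zero to a
# negative value, destroying the "absent" interpretation.
n = len(counts)
mean = sum(counts) / n
std = (sum((x - mean) ** 2 for x in counts) / n) ** 0.5
standard = [(x - mean) / std for x in counts]
```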
What mean and standard deviation does StandardScaler produce for each feature?
Which scaler transforms data to a fixed range like 0 to 1?
Why is feature scaling important for distance-based algorithms?
If your data has many outliers, which scaler might be less sensitive to them?
What happens if you don’t scale features before using SVM?
Explain in your own words what StandardScaler does to a dataset and why it is useful.
Describe a situation where MinMaxScaler would be a better choice than StandardScaler.