ML Python · ~20 mins

Why engineered features improve models in ML Python - Challenge Your Understanding

Challenge - 5 Problems
🎖️ Feature Engineering Mastery
Get all challenges correct to earn this badge!
Test your skills under time pressure!
🧠 Conceptual
intermediate
2:00 remaining
Why do engineered features help machine learning models?

Imagine you want to predict house prices. You have raw data like size in square feet and number of rooms. Why might creating new features like 'price per room' help the model?

A. Because models only work with features created by humans, not raw data.
B. Because new features can reveal hidden patterns that raw data alone might not show.
C. Because raw data is always noisy and engineered features remove all noise.
D. Because adding more features always makes the model more complex and accurate.
Attempts: 2 left
💡 Hint

Think about how combining simple data points can create more meaningful information.
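To make the hint concrete, here is a minimal sketch of deriving a 'price per room' feature from raw columns. The data values are hypothetical, chosen only to illustrate how two raw fields combine into one more informative signal.

```python
import pandas as pd

# Hypothetical housing data (illustrative values only)
houses = pd.DataFrame({
    "size_sqft": [1000, 1500, 2000],
    "rooms": [4, 5, 8],
    "price": [200_000, 300_000, 320_000],
})

# Engineered feature: price per room combines two raw columns
# into a single, potentially more informative signal.
houses["price_per_room"] = houses["price"] / houses["rooms"]
print(houses["price_per_room"].tolist())  # [50000.0, 60000.0, 40000.0]
```

Note that the largest house is not the most expensive per room, a pattern the raw columns only show indirectly.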

Predict Output
intermediate
2:00 remaining
Output of feature scaling on data

What is the output of the following Python code that scales a feature using min-max scaling?

ML Python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

X = np.array([[10], [20], [30], [40], [50]])
scaler = MinMaxScaler()
X_scaled = scaler.fit_transform(X)
print(X_scaled.flatten())
A. [0.0 0.25 0.5 0.75 1.0]
B. [1.0 0.75 0.5 0.25 0.0]
C. [10 20 30 40 50]
D. [0.0 0.2 0.4 0.6 0.8]
Attempts: 2 left
💡 Hint

Min-max scaling transforms values to a range between 0 and 1 based on min and max values.
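You can check your answer by applying the min-max formula (x − min) / (max − min) by hand, without the scaler:

```python
import numpy as np

# Min-max scaling by hand: (x - min) / (max - min)
X = np.array([10, 20, 30, 40, 50], dtype=float)
X_scaled = (X - X.min()) / (X.max() - X.min())
print(X_scaled)
```

Running this reproduces exactly what `MinMaxScaler.fit_transform` computes for a single column.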

Model Choice
advanced
2:00 remaining
Choosing a model for engineered polynomial features

You created polynomial features (squares and cubes) from your original data. Which model below is best suited to use these features effectively?

A. Linear regression with regularization (like Ridge or Lasso)
B. Decision tree classifier
C. Linear regression without regularization
D. K-means clustering
Attempts: 2 left
💡 Hint

Think about how adding many polynomial features can cause overfitting and how regularization helps.
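A minimal sketch of the idea behind the hint: expand a feature into polynomial terms, then fit a regularized linear model so the extra correlated terms are shrunk rather than overfit. The toy cubic data here is an assumption for illustration.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Toy 1-D data following a cubic trend plus noise (illustrative)
rng = np.random.default_rng(0)
X = np.linspace(-2, 2, 40).reshape(-1, 1)
y = X.ravel() ** 3 - X.ravel() + rng.normal(scale=0.1, size=40)

# Ridge penalizes large coefficients on the many polynomial terms,
# limiting the overfitting that unregularized least squares would allow.
model = make_pipeline(PolynomialFeatures(degree=3), Ridge(alpha=1.0))
model.fit(X, y)
print(round(model.score(X, y), 2))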

Metrics
advanced
2:00 remaining
Effect of engineered features on model accuracy

A model trained on raw features has 75% accuracy. After adding engineered features, accuracy rises to 85%. What does this improvement most likely indicate?

A. The training data was too small to learn from raw features.
B. The model is overfitting and will perform worse on new data.
C. The model's hyperparameters were changed to increase accuracy.
D. Engineered features helped the model capture more useful information.
Attempts: 2 left
💡 Hint

Think about what adding meaningful features does to model learning.
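A sketch of the effect the hint describes, on synthetic data (the dataset and the interaction label rule are assumptions for illustration): a linear model on the raw features misses a pattern that an engineered interaction term exposes.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)  # label depends on an interaction

# Engineered feature: the product x1 * x2, appended as a third column
X_eng = np.column_stack([X, X[:, 0] * X[:, 1]])

idx_tr, idx_te = train_test_split(np.arange(500), random_state=0)

raw = LogisticRegression(max_iter=1000).fit(X[idx_tr], y[idx_tr])
eng = LogisticRegression(max_iter=1000).fit(X_eng[idx_tr], y[idx_tr])

# Raw features: near-chance accuracy; with the engineered term: near-perfect
print(raw.score(X[idx_te], y[idx_te]), eng.score(X_eng[idx_te], y[idx_te]))
```

The jump in test accuracy comes from the extra column carrying information the model could not express from the raw columns alone, which mirrors the 75% → 85% scenario in the question.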

🔧 Debug
expert
3:00 remaining
Why does adding engineered features sometimes hurt model performance?

Consider a model where adding many engineered features caused test accuracy to drop. Which reason below best explains this?

A. Engineered features always reduce model performance if not normalized.
B. The model cannot handle more than 10 features due to algorithm limits.
C. The new features introduced noise or irrelevant information causing overfitting.
D. The training data was too large, confusing the model.
Attempts: 2 left
💡 Hint

Think about how adding many features can sometimes confuse the model.
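The failure mode in the hint can be demonstrated on synthetic data (the dataset and the choice of k-nearest neighbors are assumptions for illustration): padding an informative dataset with irrelevant "engineered" columns degrades test accuracy, because the irrelevant dimensions drown out the signal.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X_signal = rng.normal(size=(300, 2))
y = (X_signal[:, 0] + X_signal[:, 1] > 0).astype(int)

# 50 extra columns of pure noise, standing in for uninformative
# engineered features that carry no signal about y
X_noisy = np.column_stack([X_signal, rng.normal(size=(300, 50))])

idx_tr, idx_te = train_test_split(np.arange(300), random_state=0)

knn = KNeighborsClassifier(n_neighbors=5)
acc_signal = knn.fit(X_signal[idx_tr], y[idx_tr]).score(X_signal[idx_te], y[idx_te])
acc_noisy = knn.fit(X_noisy[idx_tr], y[idx_tr]).score(X_noisy[idx_te], y[idx_te])

# Accuracy drops once noise features dominate the distance metric
print(acc_signal, acc_noisy)
```

Distance-based models are especially sensitive to this, but the broader point holds generally: features that add noise rather than information make generalization worse.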