Challenge - 5 Problems
Polynomial Features Mastery
Get all challenges correct to earn this badge!
Test your skills under time pressure!
🧠 Conceptual
intermediate · 1:30 remaining
Understanding Polynomial Feature Expansion
Which of the following best describes what polynomial features do to the original input data in machine learning?
Attempts: 2 left
💡 Hint
Think about how polynomial features help linear models capture curves.
✗ Incorrect
Polynomial features generate new features by taking powers and products of the original features, enabling linear models to fit nonlinear patterns.
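The idea can be seen in a toy sketch (data and names here are illustrative, not part of the challenge): a model that is linear in the expanded features [x, x²] fits a purely quadratic target exactly, something a plain linear model in x cannot do.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Quadratic data: y = x^2, which no straight line in x can fit
X = np.arange(-3, 4).reshape(-1, 1).astype(float)
y = (X ** 2).ravel()

# Expand x into [x, x^2]; the regression is still linear in these columns
X_poly = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X)
model = LinearRegression().fit(X_poly, y)

print(model.score(X_poly, y))  # near-perfect R^2, since x^2 is now a feature
```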
❓ Predict Output
intermediate · 2:00 remaining
Output of Polynomial Feature Transformation
What is the output of the following Python code using sklearn's PolynomialFeatures with degree=2 on input [[2, 3]]?
ML Python
from sklearn.preprocessing import PolynomialFeatures
import numpy as np

poly = PolynomialFeatures(degree=2, include_bias=False)
X = np.array([[2, 3]])
X_poly = poly.fit_transform(X)
print(X_poly)
Attempts: 2 left
💡 Hint
Remember include_bias=False means no constant 1 column.
✗ Incorrect
PolynomialFeatures with degree=2 and include_bias=False creates the original features, their squares, and their pairwise product. For input [2, 3] these are [2, 3, 2²=4, 2·3=6, 3²=9], so the code prints [[2. 3. 4. 6. 9.]].
❓ Model Choice
advanced · 2:00 remaining
Choosing a Model for Polynomial Features
You have expanded your dataset with polynomial features of degree 3. Which model below is most suitable to avoid overfitting and handle the increased feature space?
Attempts: 2 left
💡 Hint
Think about how to control complexity when features increase.
✗ Incorrect
Ridge Regression adds L2 regularization which helps control overfitting caused by many polynomial features by shrinking coefficients.
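A minimal sketch of this setup (the data here is synthetic, chosen only for illustration): chain the degree-3 expansion, scaling, and Ridge into one pipeline so the L2 penalty acts on comparably scaled polynomial columns.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(100, 2))
y = X[:, 0] ** 3 - X[:, 1] + rng.normal(scale=0.1, size=100)  # cubic target + noise

# Degree-3 expansion, then scaling so the L2 penalty treats columns fairly,
# then Ridge to shrink coefficients and control overfitting
model = make_pipeline(
    PolynomialFeatures(degree=3, include_bias=False),
    StandardScaler(),
    Ridge(alpha=1.0),
)
model.fit(X, y)
print(model.score(X, y))
```

Tuning `alpha` (e.g. with `RidgeCV` or a grid search) is the usual next step, since it directly trades off fit against coefficient shrinkage.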
❓ Hyperparameter
advanced · 1:30 remaining
Effect of Degree Parameter in Polynomial Features
What is the main effect of increasing the degree parameter in PolynomialFeatures on the dataset?
Attempts: 2 left
💡 Hint
Think about how many new features are created as degree grows.
✗ Incorrect
Increasing degree creates many new polynomial combinations: for n features and degree d, the number of output columns grows combinatorially, C(n+d, d) − 1 without the bias term. This can cause overfitting if not controlled.
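The growth is easy to measure directly via the fitted transformer's `n_output_features_` attribute; for 5 input features the column count goes 5 → 20 → 55 → 125 as degree goes 1 → 4:

```python
from sklearn.preprocessing import PolynomialFeatures
import numpy as np

X = np.zeros((1, 5))  # 5 original features; values don't matter for the count

counts = {}
for degree in (1, 2, 3, 4):
    poly = PolynomialFeatures(degree=degree, include_bias=False).fit(X)
    counts[degree] = poly.n_output_features_  # equals C(5 + degree, degree) - 1
    print(degree, counts[degree])
```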
❓ Metrics
expert · 2:30 remaining
Evaluating Model Performance with Polynomial Features
You trained a polynomial regression model with degree 4 and got training R² = 0.95 but test R² = 0.60. What does this indicate and which metric would best help diagnose the problem?
Attempts: 2 left
💡 Hint
High train score but low test score usually means overfitting.
✗ Incorrect
A high training R² paired with a much lower test R² indicates overfitting: the model fits the training data too closely but fails to generalize. Cross-validated R² gives a more reliable estimate of out-of-sample performance than a single train score.
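The diagnosis can be sketched with synthetic data (illustrative only, not the challenge's dataset): compare the in-sample R² of a degree-4 polynomial fit against its 5-fold cross-validated R², and the gap exposes the overfitting.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 1))
y = np.sin(3 * X).ravel() + rng.normal(scale=0.3, size=30)  # noisy nonlinear target

model = make_pipeline(PolynomialFeatures(degree=4), LinearRegression())
train_r2 = model.fit(X, y).score(X, y)          # optimistic: scored on its own training data
cv_r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()  # held-out estimate

# The CV score is typically well below the training score, revealing the gap
print(train_r2, cv_r2)
```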