ML Python · ~20 mins

Polynomial regression pipeline in ML Python - Practice Problems & Coding Challenges

Challenge - 5 Problems
Predict Output
intermediate
Time limit: 2:00
Output of polynomial regression prediction
Given the following code that fits a polynomial regression model and predicts a value, what is the output?
ML Python
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
import numpy as np

X = np.array([[1], [2], [3], [4], [5]])
y = np.array([1, 4, 9, 16, 25])  # y = x^2

poly = PolynomialFeatures(degree=2)
X_poly = poly.fit_transform(X)

model = LinearRegression()
model.fit(X_poly, y)

pred = model.predict(poly.transform([[6]]))
print(round(pred[0], 2))
A. 36.0
B. 35.0
C. 30.0
D. 42.0
💡 Hint
Remember that the model fits y = x^2, so predicting for x=6 should be close to 36.
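To see why the hint holds, rerun the question's code and inspect the learned coefficients: the five training points lie exactly on y = x², so a degree-2 fit recovers the quadratic and predicts 6² = 36 at x = 6.

```python
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
import numpy as np

X = np.array([[1], [2], [3], [4], [5]])
y = np.array([1, 4, 9, 16, 25])  # the points lie exactly on y = x^2

poly = PolynomialFeatures(degree=2)  # expands x into [1, x, x^2]
X_poly = poly.fit_transform(X)

model = LinearRegression()
model.fit(X_poly, y)

# The fit is exact: the x^2 coefficient is ~1 and the rest ~0
# (the bias column's weight is absorbed by the intercept),
# so predicting at x = 6 gives 36.
pred = model.predict(poly.transform([[6]]))
print(round(pred[0], 2))  # -> 36.0
```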
Model Choice
intermediate
Time limit: 1:30
Best model choice for polynomial regression pipeline
You want to build a pipeline that fits a polynomial regression model to data. Which of the following pipeline components is the correct choice to transform the input features before fitting a linear regression?
A. MinMaxScaler followed by LinearRegression
B. StandardScaler followed by LinearRegression
C. PolynomialFeatures followed by LinearRegression
D. PCA followed by LinearRegression
💡 Hint
Polynomial regression requires creating polynomial features before fitting a linear model.
Hyperparameter
advanced
Time limit: 1:30
Choosing the polynomial degree hyperparameter
In a polynomial regression pipeline, which effect does increasing the degree hyperparameter have on the model?
A. It has no effect on model complexity
B. It increases model complexity and may cause overfitting
C. It decreases model complexity and reduces overfitting
D. It always improves model generalization
💡 Hint
Higher degree polynomials can fit data more closely but risk fitting noise.
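A quick illustration of the hint on synthetic data: because a higher-degree feature set contains the lower-degree one, raising the degree never increases the training error, and that extra flexibility is exactly what lets the model start fitting noise.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = np.linspace(0, 1, 20).reshape(-1, 1)
y = X.ravel() ** 2 + rng.normal(scale=0.05, size=20)  # noisy quadratic

# Training error shrinks monotonically as degree grows --
# model complexity increases, and the fit chases the noise.
train_mse = {}
for degree in (1, 2, 9):
    pipe = make_pipeline(PolynomialFeatures(degree=degree),
                         LinearRegression())
    pipe.fit(X, y)
    train_mse[degree] = mean_squared_error(y, pipe.predict(X))

print(train_mse)
```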
Metrics
advanced
Time limit: 1:30
Evaluating polynomial regression with R² score
After fitting a polynomial regression model, you compute the R² score on test data and get 0.95. What does this value indicate?
A. The model explains 95% of the variance in the test data
B. The model has 95% accuracy in classification
C. The model's predictions are 95% correct on average
D. The model has 95% error rate
💡 Hint
R² score measures how well the model explains variance in continuous data.
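A small sketch of the metric itself (with made-up predictions): R² = 1 − SS_res/SS_tot, the fraction of variance in the true values that the predictions explain, so 1.0 is a perfect fit and 0.0 is no better than predicting the mean.

```python
import numpy as np
from sklearn.metrics import r2_score

y_true = np.array([1.0, 4.0, 9.0, 16.0, 25.0])
y_pred = np.array([1.1, 3.9, 9.2, 15.8, 25.1])  # close, but not perfect

# R^2 = 1 - sum((y_true - y_pred)^2) / sum((y_true - mean(y_true))^2)
score = r2_score(y_true, y_pred)
print(score)  # just under 1.0: nearly all the variance is explained
```

Note that R² is a regression metric; "accuracy" (options B and C) only makes sense for classification.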
🔧 Debug
expert
Time limit: 2:30
Debugging pipeline with incorrect feature transformation
You build a pipeline with PolynomialFeatures(degree=3) and LinearRegression. After training, predictions are constant and do not change with input. What is the most likely cause?
A. PolynomialFeatures was not fitted before transforming the input
B. LinearRegression was not fitted with the transformed features
C. Degree parameter was set to 1 instead of 3
D. Input data was not reshaped to 2D array before transformation
💡 Hint
PolynomialFeatures expects 2D input; wrong shape can cause incorrect transformations.
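A sketch of the shape requirement behind the hint: in recent scikit-learn versions a 1D array is rejected outright rather than silently mis-transformed, and the fix is to reshape it into a column of samples before transforming.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

poly = PolynomialFeatures(degree=3)

x = np.array([1, 2, 3])  # 1D: shape (3,)
try:
    poly.fit_transform(x)  # scikit-learn requires a 2D array here
    failed_1d = False
except ValueError:
    failed_1d = True

X = x.reshape(-1, 1)            # 2D: shape (3, 1), one feature column
X_poly = poly.fit_transform(X)  # columns: [1, x, x^2, x^3]
print(X_poly.shape)             # -> (3, 4)
```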