ML Python programming · ~20 mins

Polynomial regression in ML Python - Practice Problems & Coding Challenges

Challenge - 5 Problems
🧠 Conceptual · intermediate
Understanding Polynomial Regression Degree

Which statement best describes the role of the degree in polynomial regression?

A. The degree controls the highest power of the input features, allowing the model to fit more complex curves.
B. The degree determines the number of input features used in the model.
C. The degree sets the number of output variables the model predicts.
D. The degree limits the number of training samples used during fitting.
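For context on what the degree parameter actually does, here is a minimal sketch using scikit-learn's PolynomialFeatures on a single feature (the sample value 3.0 is arbitrary): raising the degree adds higher powers of the same input, rather than more input features or more outputs.

```python
from sklearn.preprocessing import PolynomialFeatures
import numpy as np

X = np.array([[3.0]])  # one sample, one feature

# Each higher degree appends one more power of the same feature.
for degree in (1, 2, 3):
    poly = PolynomialFeatures(degree=degree, include_bias=False)
    print(degree, poly.fit_transform(X))
```

With degree 3 this yields the columns x, x², x³, i.e. [[3. 9. 27.]].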
Predict Output · intermediate
Output of Polynomial Feature Transformation

What is the output of the following code snippet?

ML Python
from sklearn.preprocessing import PolynomialFeatures
import numpy as np
X = np.array([[2]])
poly = PolynomialFeatures(degree=3, include_bias=False)
result = poly.fit_transform(X)
print(result)
A. [[2. 4. 8.]]
B. [[2. 4.]]
C. [[2. 3. 4.]]
D. [[1. 2. 4. 8.]]
Hyperparameter · advanced
Choosing Degree to Avoid Overfitting

You have a dataset with noisy data points. Which degree choice for polynomial regression is most likely to avoid overfitting?

A. Degree 10, because a higher degree fits the data perfectly.
B. Degree 1, because it fits a simple straight line.
C. Degree 5, as a moderate complexity balance.
D. Degree 0, which fits a constant value.
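To see the trade-off concretely, here is a hedged sketch on synthetic noisy data (the sine target, noise level, seed, and the degrees 1/3/10 are all arbitrary choices for illustration): held-out error typically rises again once the degree is large enough to chase the noise.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-1, 1, 30)).reshape(-1, 1)
y = np.sin(np.pi * X).ravel() + rng.normal(0, 0.2, 30)  # noisy targets

# Alternate samples into train and test splits.
X_train, X_test = X[::2], X[1::2]
y_train, y_test = y[::2], y[1::2]

for degree in (1, 3, 10):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_train, y_train)
    print(degree, mean_squared_error(y_test, model.predict(X_test)))
```

Comparing the printed test MSEs across degrees makes the underfit/overfit balance visible for this particular dataset.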
Metrics · advanced
Interpreting Polynomial Regression Metrics

After training a polynomial regression model, you get these metrics on test data: Mean Squared Error (MSE) = 0.02, R² = 0.95. What does this tell you?

A. The model has high error and a poor fit to the data.
B. The model fits the data well, with low error, and explains 95% of the variance.
C. The model is underfitting because the MSE is too low.
D. The model explains only 2% of the variance.
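For reference, both metrics can be computed directly with sklearn.metrics; the toy y_true/y_pred values below are made up purely for illustration.

```python
from sklearn.metrics import mean_squared_error, r2_score

y_true = [1.0, 2.0, 3.0, 4.0]
y_pred = [1.1, 1.9, 3.0, 4.1]  # hypothetical predictions, close to the truth

mse = mean_squared_error(y_true, y_pred)  # mean of squared residuals
r2 = r2_score(y_true, y_pred)             # 1 - SS_res / SS_tot
print(mse, r2)
```

Low MSE paired with R² near 1 is the signature of a good fit; R² = 0.95 reads as "95% of the variance explained", not 2%.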
🔧 Debug · expert
Debugging Polynomial Regression Prediction Error

Consider this code snippet for polynomial regression prediction. What is the main bug that causes the prediction step to fail?

ML Python
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
import numpy as np

X_train = np.array([[1], [2], [3], [4]])
y_train = np.array([1, 4, 9, 16])

poly = PolynomialFeatures(degree=2)
X_poly = poly.fit_transform(X_train)

model = LinearRegression()
model.fit(X_train, y_train)

X_test = np.array([[5], [6]])
X_test_poly = poly.transform(X_test)
predictions = model.predict(X_test_poly)
print(predictions)
A. The training labels y_train are incorrect for polynomial regression.
B. The degree of PolynomialFeatures is too low to capture the data pattern.
C. The test data X_test is not transformed before prediction.
D. The model was trained on the original features, but predictions use polynomial features, causing a mismatch.
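After attempting the challenge, compare with this corrected sketch, in which the model is fitted on the same transformed features it will see at prediction time. Since y_train here is exactly x², a degree-2 fit recovers the quadratic almost exactly.

```python
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
import numpy as np

X_train = np.array([[1], [2], [3], [4]])
y_train = np.array([1, 4, 9, 16])  # y = x**2

poly = PolynomialFeatures(degree=2)
X_poly = poly.fit_transform(X_train)

model = LinearRegression()
model.fit(X_poly, y_train)  # fit on the transformed features, not X_train

X_test_poly = poly.transform(np.array([[5], [6]]))
predictions = model.predict(X_test_poly)
print(predictions)
```

Fitting on X_train while predicting on X_test_poly gives the model 1 feature at fit time but 3 at predict time, which is exactly the mismatch the question probes.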