ML Python · ~20 mins

Why advanced regression handles non-linearity in ML Python - Challenge Your Understanding

Challenge - 5 Problems
🎖️ Advanced Regression Mastery — get all five challenges correct to earn this badge!
🧠 Conceptual · intermediate · 2:00 limit
Why can polynomial regression model non-linear data?

Polynomial regression can fit curves instead of straight lines. Why does this help with non-linear data?

A. Because it adds powers of input features, allowing the model to capture curves.
B. Because it removes noise from data, making it linear.
C. Because it uses only linear combinations of features without transformation.
D. Because it reduces the number of features to avoid overfitting.
💡 Hint

Think about how adding squared or cubic terms changes the shape of the prediction.
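As a concrete sketch of the hint: scikit-learn's PolynomialFeatures expands a single input x into the columns [x, x², x³] (for degree=3 with the bias dropped), so an ordinary linear model fit over those columns can trace a curve. The data values below are made up for illustration.

```python
from sklearn.preprocessing import PolynomialFeatures
import numpy as np

# A single feature x becomes [x, x^2, x^3]; a linear model over
# these columns can then follow a cubic shape.
X = np.array([[1.0], [2.0], [3.0]])
poly = PolynomialFeatures(degree=3, include_bias=False)
X_poly = poly.fit_transform(X)
print(X_poly)  # row for x=2 is [2., 4., 8.]
```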

Model Choice · intermediate · 2:00 limit
Which regression model best handles complex non-linear patterns?

You have data with complex curves and interactions. Which regression model is best to capture these patterns?

A. Linear regression
B. Polynomial regression with degree 3
C. Simple mean prediction
D. Decision tree regression
💡 Hint

Think about models that split data into regions and fit different values in each.
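To illustrate the idea in the hint, here is a small sketch (with made-up data) of a model that partitions the input range into regions and predicts a constant in each region, which lets it follow non-linear shapes:

```python
from sklearn.tree import DecisionTreeRegressor
import numpy as np

# A regression tree splits the input range into regions and predicts
# the mean target value within each region.
X = np.arange(10).reshape(-1, 1).astype(float)
y = np.sin(X).ravel()  # a non-linear target

tree = DecisionTreeRegressor(max_depth=3).fit(X, y)
pred = tree.predict(X)
```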

Predict Output · advanced · 2:00 limit
Output of polynomial regression prediction

What is the predicted value from this polynomial regression model?

Python
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
import numpy as np

X = np.array([[2]])
poly = PolynomialFeatures(degree=2, include_bias=False)
X_poly = poly.fit_transform(X)  # expands x into [x, x^2]
model = LinearRegression()
# weights and intercept are set by hand instead of calling fit()
model.coef_ = np.array([3.0, 4.0])
model.intercept_ = 5.0
prediction = model.predict(X_poly)
print(prediction[0])
A. 19.0
B. 17.0
C. 27.0
D. 23.0
💡 Hint

Calculate 3*x + 4*x^2 + 5 for x=2.
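Working the hint's arithmetic by hand (x = 2, features [x, x²], weights [3, 4], intercept 5):

```python
# Evaluate 3*x + 4*x^2 + 5 at x = 2, mirroring the hint above.
x = 2
prediction = 3 * x + 4 * x**2 + 5
print(prediction)  # 27
```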

Metrics · advanced · 2:00 limit
Which metric best shows improvement in non-linear regression?

You compare linear regression and polynomial regression on the same dataset. Which metric best shows that polynomial regression fits non-linear data better?

A. Higher Mean Squared Error (MSE)
B. Lower Mean Squared Error (MSE)
C. Higher training loss
D. Lower number of features
💡 Hint

Think about what a good fit means for error values.
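A small sketch of such a comparison, using a made-up quadratic dataset: a straight line leaves large errors on curved data, while polynomial features can drive the error down.

```python
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.metrics import mean_squared_error
import numpy as np

# Made-up quadratic data: y = x^2.
X = np.arange(1, 8).reshape(-1, 1).astype(float)
y = (X ** 2).ravel()

# Plain linear regression on the raw feature.
lin = LinearRegression().fit(X, y)
mse_lin = mean_squared_error(y, lin.predict(X))

# Linear regression on [x, x^2] features.
X_poly = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X)
poly = LinearRegression().fit(X_poly, y)
mse_poly = mean_squared_error(y, poly.predict(X_poly))

print(mse_lin, mse_poly)  # polynomial MSE is far lower
```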

🔧 Debug · expert · 3:00 limit
Why does this kernel ridge regression code fail to capture non-linearity?

Given this code snippet, why does the kernel ridge regression model fail to capture non-linear patterns?

Python
from sklearn.kernel_ridge import KernelRidge
import numpy as np

X = np.array([[1], [2], [3], [4], [5]])
y = np.array([1, 4, 9, 16, 25])  # y = x^2

model = KernelRidge(alpha=1.0, kernel='linear')
model.fit(X, y)
pred = model.predict(np.array([[6]]))
print(round(pred[0], 2))
A. Because the kernel is linear, it cannot model the quadratic relationship.
B. Because alpha is too high, causing overfitting.
C. Because the input X is not scaled.
D. Because the model is missing the intercept term.
💡 Hint

Think about what the kernel function does in kernel ridge regression.
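One way to explore the hint: refit the same made-up quadratic data with a non-linear kernel (here RBF, with an illustrative gamma) and compare training errors against the linear kernel. The exact hyperparameter values below are assumptions for the sketch, not tuned choices.

```python
from sklearn.kernel_ridge import KernelRidge
from sklearn.metrics import mean_squared_error
import numpy as np

X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([1.0, 4.0, 9.0, 16.0, 25.0])  # y = x^2

# Linear kernel: predictions are a linear function of x.
linear = KernelRidge(alpha=0.1, kernel='linear').fit(X, y)
mse_linear = mean_squared_error(y, linear.predict(X))

# RBF kernel: predictions can bend with the data.
rbf = KernelRidge(alpha=0.1, kernel='rbf', gamma=0.5).fit(X, y)
mse_rbf = mean_squared_error(y, rbf.predict(X))

print(mse_linear, mse_rbf)
```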