R Programming · ~20 mins

Linear regression (lm) in R Programming - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️ Linear Regression Mastery: get all five challenges correct to earn this badge. Test your skills under time pressure!
Predict Output
intermediate
Time limit: 2:00
Output of simple linear regression coefficients
What is the output of the following R code that fits a linear model and extracts coefficients?
R Programming
data <- data.frame(x = 1:5, y = c(2, 4, 6, 8, 10))
model <- lm(y ~ x, data = data)
coef(model)
A. Intercept = 1, x = 2
B. Intercept = 2, x = 0
C. Intercept = 0, x = 2
D. Intercept = 0, x = 1
💡 Hint
Think about the relationship between y and x in the data.
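To experiment beyond the multiple choice, here is a minimal sketch of how coef() returns the fitted intercept and slope as a named numeric vector. It uses a different exact relationship, y = 3 + 5x, so it does not give away the answer:

```r
# Exact linear relationship y = 3 + 5x, so lm() recovers the
# coefficients without any noise.
d <- data.frame(x = 1:5, y = 3 + 5 * (1:5))
m <- lm(y ~ x, data = d)
coef(m)  # named vector: (Intercept) = 3, x = 5
```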
Data Output
intermediate
Time limit: 1:30
Number of residuals after fitting lm
After fitting a linear model with 10 observations, how many residuals does the model produce?
R Programming
df <- data.frame(x = 1:10, y = rnorm(10))
model <- lm(y ~ x, data = df)
length(residuals(model))
A. 10
B. 9
C. 11
D. 1
💡 Hint
Residuals correspond to each observation.
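A quick sketch you can run to check this yourself: residuals(), fitted(), and the rows of the data frame all line up one-to-one:

```r
set.seed(1)  # reproducible random response
df <- data.frame(x = 1:10, y = rnorm(10))
m <- lm(y ~ x, data = df)
length(residuals(m))  # one residual per observation
length(fitted(m))     # fitted values have the same length
```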
🔧 Debug
advanced
Time limit: 2:00
Identify the error in lm formula usage
What error does this R code produce when fitting a linear model?
R Programming
data <- data.frame(x = 1:5, y = c(2, 4, 6, 8, 10))
lm(y = x, data = data)
A. No error, model fits successfully
B. Error: object 'y' not found
C. Error: argument is missing, with no default
D. Error: invalid formula
💡 Hint
Check the formula argument syntax in lm.
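For reference, a sketch contrasting the broken call with the corrected one (the exact error text is worth confirming in your own session):

```r
d <- data.frame(x = 1:5, y = c(2, 4, 6, 8, 10))

# 'y = x' is parsed as a named argument (lm() has a 'y' argument that
# controls whether the response vector is returned), so the required
# 'formula' argument is never supplied and lm() stops with an
# "argument ... is missing, with no default" error.
err <- tryCatch(lm(y = x, data = d), error = conditionMessage)

# The correct call builds a formula with the ~ operator:
m <- lm(y ~ x, data = d)
```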
🚀 Application
advanced
Time limit: 1:30
Interpreting adjusted R-squared
You fit a linear model and get an adjusted R-squared of 0.85. What does this tell you about the model?
A. 85% of the variance in the response is explained by the predictors, adjusted for the number of predictors
B. The model predictions are 85% accurate
C. The model has an 85% chance of predicting new data correctly
D. The residuals sum to 0.85
💡 Hint
Adjusted R-squared measures explained variance accounting for predictors.
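The adjustment can be verified by hand. A sketch using the built-in mtcars data (an illustrative choice, not part of the quiz):

```r
m <- lm(mpg ~ wt + hp, data = mtcars)
s <- summary(m)
s$r.squared      # share of variance in mpg explained by the predictors
s$adj.r.squared  # same share, penalized for the number of predictors

# Manual check: adj R^2 = 1 - (1 - R^2) * (n - 1) / (n - p - 1)
n <- nrow(mtcars); p <- 2
1 - (1 - s$r.squared) * (n - 1) / (n - p - 1)  # matches s$adj.r.squared
```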
🧠 Conceptual
expert
Time limit: 2:30
Effect of multicollinearity on lm coefficients
What is the main effect of strong multicollinearity among predictors in a linear regression model?
A. Model residuals become zero
B. Coefficients become unstable and standard errors increase
C. Adjusted R-squared becomes negative
D. The model cannot be fit at all
💡 Hint
Think about how correlated predictors affect coefficient estimates.
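A simulation sketch (with assumed toy data) makes the effect visible: when two predictors are nearly identical, their standard errors blow up even though the overall fit remains fine:

```r
set.seed(42)
x1 <- rnorm(100)
x2 <- x1 + rnorm(100, sd = 0.01)  # x2 is nearly collinear with x1
y  <- 2 * x1 + rnorm(100)
m  <- lm(y ~ x1 + x2)

# Standard errors for x1 and x2 are inflated far beyond what a model
# with x1 alone would give: the individual coefficient estimates are
# unstable even though the predictions stay reasonable.
summary(m)$coefficients[, "Std. Error"]
summary(lm(y ~ x1))$coefficients["x1", "Std. Error"]  # much smaller
```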