SciPy · ~20 mins

Least squares optimization in SciPy - Practice Problems & Coding Challenges

Challenge - 5 Problems
Predict Output (intermediate)
Output of simple least squares fit
What is the output of the following code that fits a line y = mx + c to data points using scipy.optimize.least_squares?
import numpy as np
from scipy.optimize import least_squares

def fun(params, x, y):
    m, c = params
    return m * x + c - y

x = np.array([0, 1, 2, 3])
y = np.array([1, 3, 5, 7])

res = least_squares(fun, x0=[0, 0], args=(x, y))
print(res.x.round(2))
A. [0.5 1.0]
B. [1.0 2.0]
C. [2.0 1.0]
D. [2.0 0.0]
💡 Hint
Think about the line that best fits points (0,1), (1,3), (2,5), (3,7).
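As a warm-up outside the challenge, here is a minimal sketch of the same API on a different, made-up dataset (points generated exactly from y = 3x + 2), showing how least_squares recovers a slope and intercept:

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical dataset: points generated exactly from y = 3x + 2
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 3.0 * x + 2.0

def residuals(params, x, y):
    m, c = params
    return m * x + c - y  # model prediction minus observed data

res = least_squares(residuals, x0=[0.0, 0.0], args=(x, y))
print(res.x.round(2))  # recovers the slope and intercept: [3. 2.]
```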
Data Output (intermediate)
Number of function evaluations in least squares optimization
How many residual-function evaluations (res.nfev) does least_squares report for this problem?
import numpy as np
from scipy.optimize import least_squares

def fun(params, x, y):
    m, c = params
    return m * x + c - y

x = np.array([0, 1, 2, 3])
y = np.array([1, 3, 5, 7])

res = least_squares(fun, x0=[0, 0], args=(x, y))
print(res.nfev)
A. 3
B. 6
C. 5
D. 4
💡 Hint
Check the attribute that counts function evaluations in the result object.
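For reference, the returned OptimizeResult exposes several diagnostics beyond the fitted parameters. A sketch on assumed toy data (points exactly on y = 2x + 1):

```python
import numpy as np
from scipy.optimize import least_squares

# Assumed toy data: points exactly on y = 2x + 1
x = np.array([0.0, 1.0, 2.0])
y = 2.0 * x + 1.0

def residuals(params, x, y):
    m, c = params
    return m * x + c - y

res = least_squares(residuals, x0=[0.0, 0.0], args=(x, y))
print(res.nfev)        # number of residual-function evaluations
print(res.cost)        # 0.5 * sum of squared residuals at the solution
print(res.optimality)  # first-order optimality measure (should be tiny)
```

The exact nfev count depends on the solver and tolerances, which is why it has to be read off the result object rather than predicted from the model alone.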
🔧 Debug (advanced)
Identify the error in least squares residual function
What error does this code raise when run?
import numpy as np
from scipy.optimize import least_squares

def fun(params, x, y):
    m, c = params
    return m * x + c + y

x = np.array([0, 1, 2])
y = np.array([1, 2, 3])

res = least_squares(fun, x0=[0, 0], args=(x, y))
A. ValueError: residuals must be difference between model and data
B. No error, runs successfully
C. RuntimeWarning: overflow encountered in add
D. The optimization converges to wrong parameters but no error
💡 Hint
Check if the function returns an array of residuals as expected.
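A quick way to sanity-check a residual function is to inspect res.fun, the residual vector at the solution: for a correct residual and a perfect fit it should be near zero. A minimal sketch on assumed data:

```python
import numpy as np
from scipy.optimize import least_squares

# Assumed data: points exactly on y = x + 1
x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 2.0, 3.0])

def residuals(params, x, y):
    m, c = params
    return m * x + c - y  # one residual per data point

res = least_squares(residuals, x0=[0.0, 0.0], args=(x, y))
# res.fun holds the residual vector at the solution; for a correct
# residual function and a perfect fit it should be near zero.
print(np.abs(res.fun).max())
```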
🧠 Conceptual (advanced)
Effect of initial guess on least squares result
Which statement about the initial guess in scipy.optimize.least_squares is true?
A. The initial guess affects the speed but not the final solution for convex problems.
B. A poor initial guess always causes the optimizer to fail.
C. The initial guess does not affect the final result for linear problems.
D. The initial guess must be exactly the true parameters to converge.
💡 Hint
Think about convex problems and local minima.
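The convexity point can be checked directly: for a linear model the sum-of-squares cost is a convex quadratic with a unique minimum, so different starting points should land on the same answer (possibly after a different number of iterations). A sketch using the linear data from the first problem:

```python
import numpy as np
from scipy.optimize import least_squares

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])

def residuals(params, x, y):
    m, c = params
    return m * x + c - y

# Convex quadratic cost: both starting points should reach the
# same unique minimum.
res_a = least_squares(residuals, x0=[0.0, 0.0], args=(x, y))
res_b = least_squares(residuals, x0=[100.0, -50.0], args=(x, y))
print(np.allclose(res_a.x, res_b.x, atol=1e-6))  # True
```

For nonconvex problems (e.g. sums of exponentials), by contrast, different initial guesses can settle in different local minima.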
🚀 Application (expert)
Fitting a nonlinear model with least squares
Given data points x and y, which option fits the nonlinear model y = a * exp(b * x) with scipy.optimize.least_squares, defining the residual as model prediction minus observed data, and prints the optimized parameters?
import numpy as np
from scipy.optimize import least_squares

x = np.array([0, 1, 2, 3])
y = np.array([1, 2.7, 7.4, 20.1])
A.
def fun(params, x, y):
    a, b = params
    return a * np.exp(b * x) - y
res = least_squares(fun, x0=[1, 0.5], args=(x, y))
print(res.x.round(2))
B.
def fun(params, x, y):
    a, b = params
    return y - a * np.exp(b * x)
res = least_squares(fun, x0=[1, 0.5], args=(x, y))
print(res.x.round(2))
C.
def fun(params, x, y):
    a, b = params
    return np.log(y) - (a + b * x)
res = least_squares(fun, x0=[1, 0.5], args=(x, y))
print(res.x.round(2))
D.
def fun(params, x, y):
    a, b = params
    return a * b * x - y
res = least_squares(fun, x0=[1, 0.5], args=(x, y))
print(res.x.round(2))
💡 Hint
Residuals should be model prediction minus observed data.
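As a sanity check of the technique itself (not of the challenge answer), here is a sketch fitting the same exponential form to synthetic, noise-free data generated from assumed values a = 2, b = 0.5:

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic, noise-free data generated from assumed a = 2, b = 0.5
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * np.exp(0.5 * x)

def residuals(params, x, y):
    a, b = params
    return a * np.exp(b * x) - y  # model prediction minus observed data

res = least_squares(residuals, x0=[1.0, 0.1], args=(x, y))
print(res.x.round(2))  # should recover roughly a = 2, b = 0.5
```

Because the model is nonlinear in b, the problem is no longer convex, so a reasonable initial guess matters more here than in the linear examples above.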