SciPy · ~20 mins

Non-linear curve fitting in SciPy - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️ Non-linear Curve Fitting Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
Predict Output
intermediate
Output of curve fitting with exponential decay
What is the output of the following code snippet that fits an exponential decay model to data?
SciPy
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(-b * x)

xdata = np.array([0, 1, 2, 3, 4, 5])
ydata = np.array([5, 3, 2, 1.2, 0.7, 0.4])

params, _ = curve_fit(model, xdata, ydata, p0=[5, 0.5])
print(np.round(params, 2))
A. [5.00 0.50]
B. [4.98 0.48]
C. [5.10 0.60]
D. [4.00 0.30]
💡 Hint
Look at how curve_fit estimates parameters close to initial guesses but adjusted to data.
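To check an answer like this yourself, the snippet from the question can be run and sanity-checked: the fitted `a` should sit near `ydata[0]` (the value at x = 0), and the residual sum of squares should be small. A sketch:

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(-b * x)

xdata = np.array([0, 1, 2, 3, 4, 5])
ydata = np.array([5, 3, 2, 1.2, 0.7, 0.4])

# Fit, then verify the result is plausible rather than trusting it blindly:
# a should be near the y-intercept of 5, b near the decay rate implied by
# the ratios of successive y-values.
params, _ = curve_fit(model, xdata, ydata, p0=[5, 0.5])
a, b = params
residuals = ydata - model(xdata, a, b)
print(np.round(params, 2), np.round(np.sum(residuals**2), 4))
```

The exact decimals depend on the solver's stopping tolerances, but the fitted values land close to the initial guess because the guess already describes the data well.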
Data Output
intermediate
Number of function evaluations in curve fitting
After fitting a non-linear model with scipy.optimize.curve_fit, how can you find how many times the optimizer evaluated the model function? What is the value of the 'nfev' key in the returned info dictionary?
SciPy
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(-b * x)

xdata = np.linspace(0, 4, 50)
ydata = model(xdata, 3, 1.5) + 0.1 * np.random.normal(size=xdata.size)

# curve_fit normally returns only (params, cov). Passing full_output=True
# also returns an info dictionary whose 'nfev' entry records how many times
# the model function was evaluated during the fit.
params, cov, info, mesg, ier = curve_fit(model, xdata, ydata, full_output=True)
print(info['nfev'])
A. 50
B. 100
C. 500
D. 200
💡 Hint
The 'nfev' key counts how many times the model function was evaluated during fitting.
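A minimal way to see this mechanism at work (with a seeded generator so the run is reproducible; the exact count still depends on the SciPy version and the noise):

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(-b * x)

rng = np.random.default_rng(0)  # seeded so the example is reproducible
xdata = np.linspace(0, 4, 50)
ydata = model(xdata, 3, 1.5) + 0.1 * rng.normal(size=xdata.size)

# full_output=True adds an info dict, a status message, and a status flag
params, cov, info, mesg, ier = curve_fit(model, xdata, ydata, full_output=True)
print(info['nfev'])  # number of model-function evaluations
```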
🔧 Debug
advanced
Identify the error in curve fitting code
What error does the following code raise when trying to fit a quadratic model to data?
SciPy
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b, c):
    return a * x**2 + b * x + c

xdata = np.array([1, 2, 3, 4, 5])
ydata = np.array([2, 5, 10, 17, 26])

params, cov = curve_fit(model, xdata, ydata, p0=[1, 1])
A. RuntimeError: Optimal parameters not found
B. TypeError: curve_fit() got an unexpected keyword argument 'p0'
C. ValueError: Expected 3 initial parameters but got 2
D. No error, outputs parameters
💡 Hint
Check the number of parameters in the model and the length of p0.
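For reference, once p0 supplies one value per model parameter, the fit works: this particular ydata is exactly x**2 + 1, so the recovered parameters should be very close to [1, 0, 1]. A sketch of the corrected call:

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b, c):
    return a * x**2 + b * x + c

xdata = np.array([1, 2, 3, 4, 5], dtype=float)
ydata = np.array([2, 5, 10, 17, 26], dtype=float)

# p0 now has one entry per model parameter (a, b, c)
params, cov = curve_fit(model, xdata, ydata, p0=[1, 1, 1])
print(np.round(params, 4))  # close to [1. 0. 1.]: the data is x**2 + 1
```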
Visualization
advanced
Plotting fitted curve and data points
Which code snippet correctly plots the original data points and the fitted non-linear curve using matplotlib?
SciPy
import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(-b * x)

xdata = np.linspace(0, 5, 50)
ydata = model(xdata, 3, 1.2) + 0.2 * np.random.normal(size=xdata.size)

params, _ = curve_fit(model, xdata, ydata)

# Plotting code here
A.
plt.scatter(xdata, ydata, label='Data')
plt.plot(xdata, model(xdata, *params), 'r-', label='Fit')
plt.legend()
plt.show()
B.
plt.plot(xdata, ydata, 'ro', label='Data')
plt.scatter(xdata, model(xdata, *params), label='Fit')
plt.legend()
plt.show()
C.
plt.plot(xdata, ydata, label='Data')
plt.plot(xdata, model(xdata, params[0], params[1]), 'g--', label='Fit')
plt.legend()
plt.show()
D.
plt.scatter(xdata, ydata)
plt.plot(xdata, model(xdata, params), 'b-', label='Fit')
plt.legend()
plt.show()
💡 Hint
Use scatter for data points and plot for the fitted curve. Unpack params with *params.
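A runnable sketch of the pattern the hint describes: scatter for the raw points, plot for the smooth fitted curve, and `*params` to unpack the fitted (a, b) into the model call. The 'Agg' backend is used here only so the example also runs headless.

```python
import matplotlib
matplotlib.use('Agg')  # non-interactive backend; no display window needed
import matplotlib.pyplot as plt
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(-b * x)

rng = np.random.default_rng(1)
xdata = np.linspace(0, 5, 50)
ydata = model(xdata, 3, 1.2) + 0.2 * rng.normal(size=xdata.size)
params, _ = curve_fit(model, xdata, ydata)

fig, ax = plt.subplots()
ax.scatter(xdata, ydata, label='Data')                    # raw data points
ax.plot(xdata, model(xdata, *params), 'r-', label='Fit')  # *params unpacks (a, b)
ax.legend()
```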
🧠 Conceptual
expert
Understanding residuals in non-linear curve fitting
In non-linear curve fitting, what does the residual sum of squares (RSS) represent and why is minimizing it important?
A. RSS measures the total squared difference between observed and predicted values; minimizing it finds the best-fit parameters.
B. RSS counts the number of data points; minimizing it reduces dataset size.
C. RSS is the sum of absolute differences between parameters; minimizing it ensures parameters are small.
D. RSS is the product of residuals; minimizing it maximizes the error.
💡 Hint
Think about how fitting tries to reduce the difference between model and data.
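The idea can be made concrete: compute the RSS at the fitted parameters and at a slightly perturbed set; since least-squares fitting minimizes RSS, the fitted parameters should give the smaller value. A sketch with seeded noise:

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(-b * x)

rng = np.random.default_rng(42)
xdata = np.linspace(0, 4, 60)
ydata = model(xdata, 3, 1.5) + 0.1 * rng.normal(size=xdata.size)

params, _ = curve_fit(model, xdata, ydata)

def rss(p):
    # residual sum of squares: total squared gap between data and model
    return np.sum((ydata - model(xdata, *p)) ** 2)

rss_fit = rss(params)
rss_perturbed = rss(params + 0.3)  # nudge both parameters away from the optimum
print(rss_fit < rss_perturbed)  # the fit minimizes RSS, so this prints True
```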