Challenge - 5 Problems
Non-linear Curve Fitting Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
Intermediate
Output of curve fitting with exponential decay
What is the output of the following code snippet that fits an exponential decay model to data?
SciPy
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(-b * x)

xdata = np.array([0, 1, 2, 3, 4, 5])
ydata = np.array([5, 3, 2, 1.2, 0.7, 0.4])

params, _ = curve_fit(model, xdata, ydata, p0=[5, 0.5])
print(np.round(params, 2))
💡 Hint
Look at how curve_fit estimates parameters close to initial guesses but adjusted to data.
✅ Explanation
The curve_fit function finds parameters close to the initial guess but adjusted to best fit the data. The output parameters are approximately [4.98, 0.48].
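A sketch of that behavior: running the same fit from two different starting points (the second p0 below is an illustrative choice, not part of the quiz) should converge to the same least-squares minimum near [4.98, 0.48].

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(-b * x)

xdata = np.array([0, 1, 2, 3, 4, 5], dtype=float)
ydata = np.array([5, 3, 2, 1.2, 0.7, 0.4])

# Fit with the quiz's initial guess and with a deliberately rough one;
# both should land on the same least-squares minimum.
p1, _ = curve_fit(model, xdata, ydata, p0=[5, 0.5])
p2, _ = curve_fit(model, xdata, ydata, p0=[1, 1])

print(np.round(p1, 2))  # ≈ [4.98 0.48]
print(np.round(p2, 2))
```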
❓ Data Output
Intermediate
Number of function evaluations in curve fitting
After fitting a non-linear model with scipy.optimize.curve_fit, how can you find how many times the optimizer evaluated the model function? What does the 'nfev' key in the returned info dictionary hold?
SciPy
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(-b * x)

xdata = np.linspace(0, 4, 50)
ydata = model(xdata, 3, 1.5) + 0.1 * np.random.normal(size=xdata.size)

# curve_fit normally returns only (params, cov); pass full_output=True
# to also get an info dictionary whose 'nfev' entry counts function evaluations.
params, cov, info, mesg, ier = curve_fit(model, xdata, ydata, full_output=True)
print(info['nfev'])
💡 Hint
The 'nfev' key counts how many times the model function was evaluated during fitting.
✅ Explanation
The 'nfev' key in the info dictionary gives the number of times the model function was evaluated. The exact count varies with the random noise, the default starting guess, and the SciPy version; for this two-parameter model it is typically a few dozen evaluations.
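A reproducible sketch of reading nfev (the fixed seed is my addition for reproducibility and is not part of the quiz snippet; the printed count still depends on the SciPy version):

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(-b * x)

rng = np.random.default_rng(0)  # fixed seed, added here for reproducibility
xdata = np.linspace(0, 4, 50)
ydata = model(xdata, 3, 1.5) + 0.1 * rng.normal(size=xdata.size)

# full_output=True makes curve_fit also return an info dict, a message, and a status flag.
params, cov, info, mesg, ier = curve_fit(model, xdata, ydata, full_output=True)
print(info['nfev'])  # number of model-function evaluations during the fit
```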
🔧 Debug
Advanced
Identify the error in curve fitting code
What error does the following code raise when trying to fit a quadratic model to data?
SciPy
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b, c):
    return a * x**2 + b * x + c

xdata = np.array([1, 2, 3, 4, 5])
ydata = np.array([2, 5, 10, 17, 26])

params, cov = curve_fit(model, xdata, ydata, p0=[1, 1])
💡 Hint
Check the number of parameters in the model and the length of p0.
✅ Explanation
The model has 3 parameters (a, b, c) but p0 supplies only 2 initial guesses. curve_fit infers the parameter count from len(p0) and then calls the model with too few arguments, so the call raises a TypeError.
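A minimal fix, assuming the intent was simply to supply one initial guess per parameter: extend p0 to three entries. Since ydata here is exactly x**2 + 1, the fit should recover a ≈ 1, b ≈ 0, c ≈ 1.

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b, c):
    return a * x**2 + b * x + c

xdata = np.array([1, 2, 3, 4, 5], dtype=float)
ydata = np.array([2, 5, 10, 17, 26], dtype=float)

# One initial guess per model parameter (a, b, c).
params, cov = curve_fit(model, xdata, ydata, p0=[1, 1, 1])
print(np.round(params, 2))  # ≈ [1, 0, 1], since ydata is exactly x**2 + 1
```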
❓ Visualization
Advanced
Plotting fitted curve and data points
Which code snippet correctly plots the original data points and the fitted non-linear curve using matplotlib?
SciPy
import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(-b * x)

xdata = np.linspace(0, 5, 50)
ydata = model(xdata, 3, 1.2) + 0.2 * np.random.normal(size=xdata.size)

params, _ = curve_fit(model, xdata, ydata)
# Plotting code here
💡 Hint
Use scatter for data points and plot for the fitted curve. Unpack params with *params.
✅ Explanation
Option A is correct: it draws the raw data points with scatter and the fitted curve with plot, unpacking the fitted parameters via *params. The other options misuse plot/scatter or pass params without unpacking.
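Since the answer options themselves are not reproduced above, here is a sketch of the pattern the correct option follows (the Agg backend, seed, labels, and filename are my own illustrative choices so the script runs headless and reproducibly):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the script runs without a display
import matplotlib.pyplot as plt
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(-b * x)

rng = np.random.default_rng(1)  # fixed seed, added for reproducibility
xdata = np.linspace(0, 5, 50)
ydata = model(xdata, 3, 1.2) + 0.2 * rng.normal(size=xdata.size)
params, _ = curve_fit(model, xdata, ydata)

fig, ax = plt.subplots()
ax.scatter(xdata, ydata, label="data")                   # raw points: scatter
xfit = np.linspace(0, 5, 200)
ax.plot(xfit, model(xfit, *params), "r-", label="fit")   # fitted curve: plot, params unpacked
ax.legend()
fig.savefig("fit.png")
```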
🧠 Conceptual
Expert
Understanding residuals in non-linear curve fitting
In non-linear curve fitting, what does the residual sum of squares (RSS) represent and why is minimizing it important?
💡 Hint
Think about how fitting tries to reduce the difference between model and data.
✅ Explanation
The residual sum of squares, RSS = Σ(y_i − f(x_i, θ))², sums the squared differences between the observed values and the model's predictions. Minimizing RSS selects the parameters θ whose curve deviates least from the data; under independent Gaussian noise, this least-squares solution is also the maximum-likelihood estimate.
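As a sketch, the RSS that curve_fit minimizes can be computed by hand from the fitted parameters; on the fitted sample it should be no larger than the RSS evaluated at the true parameters, since the optimizer minimizes RSS over (a, b). The example data and seed below are my own illustrative choices.

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(-b * x)

rng = np.random.default_rng(2)  # fixed seed, illustrative
xdata = np.linspace(0, 5, 50)
ytrue = model(xdata, 3, 1.2)
ydata = ytrue + 0.2 * rng.normal(size=xdata.size)

params, _ = curve_fit(model, xdata, ydata)

residuals = ydata - model(xdata, *params)  # observed minus predicted
rss = np.sum(residuals**2)                 # residual sum of squares

# On this sample, the fitted RSS cannot exceed the RSS at the true
# parameters, because (3, 1.2) is one of the candidates the fit considered.
rss_true = np.sum((ydata - ytrue) ** 2)
print(rss, rss_true)
```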