SciPy · ~10 mins

Least squares (least_squares) in SciPy - Step-by-Step Execution

Concept Flow - Least squares (least_squares)
Define model function f(x, params)
Provide data points (x, y)
Define residuals: difference between f(x, params) and y
Call scipy.optimize.least_squares with residuals
Algorithm iteratively adjusts params
Stop when residuals minimized
Return best-fit parameters
The process fits a model to data by minimizing the difference between predicted and actual values using iterative optimization.
Execution Sample
SciPy
import numpy as np
from scipy.optimize import least_squares

# Linear model: y = p[0]*x + p[1]
def model(x, p):
    return p[0] * x + p[1]

x = np.array([0, 1, 2, 3])
y = np.array([1, 3, 5, 7])

# Residuals = model predictions minus observed data; x0 is the initial guess
res = least_squares(lambda p: model(x, p) - y, x0=[0, 0])
print(res.x)  # best-fit parameters, close to [2, 1]
Fits a line y = p0*x + p1 to the points (x, y) by minimizing the sum of squared residuals.
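Beyond res.x, the OptimizeResult returned by least_squares exposes other useful fields; a quick sketch using the same model and data (attribute names are from SciPy's documented OptimizeResult):

```python
import numpy as np
from scipy.optimize import least_squares

def model(x, p):
    return p[0] * x + p[1]

x = np.array([0, 1, 2, 3])
y = np.array([1, 3, 5, 7])

res = least_squares(lambda p: model(x, p) - y, x0=[0, 0])
print(res.x)        # best-fit parameters, close to [2, 1]
print(res.cost)     # 0.5 * sum of squared residuals at the solution
print(res.fun)      # residual vector at the solution
print(res.success)  # True if a convergence criterion was met
```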
Execution Table
| Step | Parameters p | Residuals (model(x, p) - y) | Cost (sum of squares) | Action |
|------|--------------|-----------------------------|-----------------------|--------|
| 1 | [0, 0] | [-1, -3, -5, -7] | 84 | Initial guess; compute residuals and cost |
| 2 | [1, 0] | [-1, -2, -3, -4] | 30 | Update p0 to 1; cost decreases |
| 3 | [1, 1] | [0, -1, -2, -3] | 14 | Update p1 to 1; cost decreases |
| 4 | [1, -1] | [-2, -3, -4, -5] | 54 | Try p1 = -1; cost increases, so revert to [1, 1] |
| 5 | [2, 1] | [0, 0, 0, 0] | 0 | Update p0 to 2; perfect fit |
| 6 | [2, 1] | [0, 0, 0, 0] | 0 | No improvement; minimal cost reached |
| 7 | [2, 1] | [0, 0, 0, 0] | 0 | Algorithm converges to the best parameters |
| Exit | [2, 1] | [0, 0, 0, 0] | 0 | Stop: minimum cost found or max iterations reached |
💡 The algorithm stops when the cost no longer decreases significantly or the maximum number of iterations is reached. (The steps above are a simplified illustration; SciPy's default Trust Region Reflective method chooses updates using gradient information rather than trial-and-error.)
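The cost column can be recomputed by hand; a minimal sketch that evaluates the sum of squared residuals at each parameter guess from the walkthrough:

```python
import numpy as np

x = np.array([0, 1, 2, 3])
y = np.array([1, 3, 5, 7])

def cost(p):
    # Sum of squared residuals for the linear model p[0]*x + p[1]
    r = p[0] * x + p[1] - y
    return int(np.sum(r ** 2))

for p in ([0, 0], [1, 0], [1, 1], [1, -1], [2, 1]):
    print(p, cost(p))
```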
Variable Tracker
| Variable | Start | After 1 | After 2 | After 3 | After 4 | After 5 | Final |
|----------|-------|---------|---------|---------|---------|---------|-------|
| p | [0, 0] | [1, 0] | [1, 1] | [1, 1] | [2, 1] | [2, 1] | [2, 1] |
| Cost | 84 | 30 | 14 | 14 | 0 | 0 | 0 |

(The trial [1, -1] at step 4 is rejected, so p and Cost are unchanged after that step.)
Key Moments - 2 Insights
Why does the cost increase when the parameters change from [1, 1] to [1, -1]?
Because the residuals get larger, increasing the sum-of-squares cost, as shown in row 4 of the execution table, where the cost jumps from 14 to 54.
Why does the algorithm revert parameter changes that increase cost?
The algorithm tries to minimize cost, so if a parameter update increases cost, it rejects that update and tries a different direction, as seen in row 4.
Visual Quiz - 3 Questions
Test your understanding
In the execution table at step 2, what is the cost value?
A54
B84
C30
D14
💡 Hint
Check the 'Cost (sum of squares)' column at step 2 in the execution table.
At which step does the algorithm first try parameters [1, 1]?
AStep 4
BStep 5
CStep 3
DStep 6
💡 Hint
Look at the 'Parameters p' column in the execution table to find when [1, 1] first appears.
If the initial guess was [2, 1], what would be the initial cost?
A30
B0
C84
D14
💡 Hint
Refer to the cost value when the parameters are [2, 1], shown in rows 5 and 6 of the execution table.
Concept Snapshot
Use scipy.optimize.least_squares to fit model parameters by minimizing residuals.
Define a model function and residuals (model - data).
Provide initial guess for parameters.
Algorithm iteratively updates parameters to reduce sum of squared residuals.
Stops when improvement is minimal or max iterations reached.
Returns best-fit parameters in res.x.
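The same recipe extends to nonlinear models; a sketch fitting an exponential to synthetic, noiseless data (the data and starting guess here are invented for illustration):

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic data generated from y = 2 * exp(0.5 * x) (illustrative only)
x = np.linspace(0, 3, 20)
y = 2.0 * np.exp(0.5 * x)

def residuals(p):
    # Residuals for the model y = p[0] * exp(p[1] * x)
    return p[0] * np.exp(p[1] * x) - y

res = least_squares(residuals, x0=[1.0, 0.1])
print(res.x)  # should recover parameters near [2.0, 0.5]
```

The only changes from the linear case are the model and the starting guess; the call signature is identical.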
Full Transcript
This visual execution shows how least-squares fitting works with scipy.optimize.least_squares. We start with a model function and data points, and define the residuals as the difference between model predictions and observed values. The algorithm begins with an initial parameter guess and computes the residuals and cost (sum of squared residuals). It then updates the parameters to reduce the cost; if a trial step increases the cost, it rejects that step and tries another direction. This repeats until the cost stops decreasing significantly or the maximum number of iterations is reached. The final parameters minimize the mismatch between model and data.
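The stopping behavior described above is controlled by documented tolerances of least_squares: ftol (change in cost), xtol (change in parameters), gtol (gradient norm), and max_nfev (cap on residual evaluations). A small sketch using the walkthrough's data:

```python
import numpy as np
from scipy.optimize import least_squares

x = np.array([0, 1, 2, 3])
y = np.array([1, 3, 5, 7])

# Tighter tolerances mean more iterations before "no significant improvement";
# max_nfev bounds the number of residual-function evaluations.
res = least_squares(lambda p: p[0] * x + p[1] - y, x0=[0, 0],
                    ftol=1e-12, xtol=1e-12, gtol=1e-12, max_nfev=100)
print(res.x, res.status)  # status encodes which stopping criterion fired
```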