SciPy · ~10 mins

Least squares optimization in SciPy - Step-by-Step Execution

Concept Flow - Least squares optimization
Define model function
Provide data points
Define residuals function
Call least_squares optimizer
Optimizer adjusts parameters
Calculate residuals
Minimize sum of squares of residuals
Return optimized parameters
The process starts by defining a model and residuals, then the optimizer adjusts parameters to minimize the sum of squared residuals, returning the best fit.
Execution Sample
SciPy
import numpy as np
from scipy.optimize import least_squares

def model(x, t):
    return x[0] * t + x[1]

def residuals(x, t, y):
    return model(x, t) - y

# Data points
t = np.array([0, 1, 2, 3])
y = np.array([1, 3, 5, 7])

# Initial guess
x0 = np.array([0, 0])

# Run optimizer
result = least_squares(residuals, x0, args=(t, y))
print(result.x)
This code fits a line y = m*t + b to the data points by minimizing squared residuals with SciPy's least_squares.
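Beyond result.x, the object returned by least_squares exposes several other documented fields worth inspecting. A short sketch using the same model and data:

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(x, t, y):
    # Residuals of the linear model x[0]*t + x[1] against the data y
    return x[0] * t + x[1] - y

t = np.array([0, 1, 2, 3])
y = np.array([1, 3, 5, 7])

result = least_squares(residuals, np.array([0.0, 0.0]), args=(t, y))

# result.x: optimized parameters
# result.cost: half the sum of squared residuals at the solution
# result.fun: the residual vector at the solution
# result.success: whether a convergence criterion was satisfied
print(result.x)        # close to [2. 1.]
print(result.cost)     # close to 0, since this data is exactly linear
print(result.success)
```

Note that result.cost is defined as 0.5 * sum(residuals**2), not the raw sum of squares.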
Execution Table
| Step | Parameters (x) | Residuals | Sum of Squares | Action |
|------|----------------|-----------|----------------|--------|
| 1 | [0.0, 0.0] | [-1, -3, -5, -7] | 84 | Initial guess; compute residuals and sum of squares |
| 2 | [1.0, 0.0] | [-1, -2, -3, -4] | 30 | Optimizer updates slope to 1.0 |
| 3 | [2.0, 0.0] | [-1, -1, -1, -1] | 4 | Optimizer updates slope to 2.0 |
| 4 | [2.0, 1.0] | [0, 0, 0, 0] | 0 | Optimizer updates intercept to 1.0 |
| 5 | [2.0, 1.0] | [0, 0, 0, 0] | 0 | No further improvement; optimization stops |
💡 When the sum of squares stops decreasing significantly, the optimizer converges at parameters [2.0, 1.0].
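Each row's residuals and sum of squares follow directly from the model definition, so they can be recomputed by hand. A quick check with NumPy over the same sequence of parameter guesses:

```python
import numpy as np

t = np.array([0, 1, 2, 3])
y = np.array([1, 3, 5, 7])

def residuals(x):
    # Residuals of the linear model x[0]*t + x[1] against the data y
    return x[0] * t + x[1] - y

# Recompute residuals and sum of squares for each parameter guess
for x in ([0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [2.0, 1.0]):
    r = residuals(np.array(x))
    print(x, r, np.sum(r**2))
# Sums of squares: 84, 30, 4, 0
```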
Variable Tracker
| Variable | Start | After 1 | After 2 | After 3 | After 4 | Final |
|----------|-------|---------|---------|---------|---------|-------|
| x (parameters) | [0.0, 0.0] | [0.0, 0.0] | [1.0, 0.0] | [2.0, 0.0] | [2.0, 1.0] | [2.0, 1.0] |
| Residuals | N/A | [-1, -3, -5, -7] | [-1, -2, -3, -4] | [-1, -1, -1, -1] | [0, 0, 0, 0] | [0, 0, 0, 0] |
| Sum of Squares | N/A | 84 | 30 | 4 | 0 | 0 |
Key Moments - 2 Insights
Why do residuals change when parameters change?
Residuals are the differences between model predictions and actual data. When parameters change (see the execution table, steps 1 to 4), the model predictions change, so the residuals update accordingly.
Why does the optimizer stop at step 5?
The optimizer stops when it can no longer reduce the sum of squares significantly (step 5). In this example the residuals happen to reach exactly zero; with real, noisy data, perfectly zero residuals are rare, and the goal is simply to minimize the error as far as possible.
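The "cannot reduce significantly anymore" condition is controlled by tolerance arguments. A minimal sketch using least_squares' documented ftol, xtol, and gtol parameters (the values below are illustrative, not recommendations):

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(x, t, y):
    # Residuals of the linear model x[0]*t + x[1] against the data y
    return x[0] * t + x[1] - y

t = np.array([0, 1, 2, 3])
y = np.array([1, 3, 5, 7])

# ftol: stop when the relative decrease in cost falls below this value
# xtol: stop when the parameter step becomes small
# gtol: stop when the gradient norm becomes small
result = least_squares(residuals, np.array([0.0, 0.0]), args=(t, y),
                       ftol=1e-12, xtol=1e-12, gtol=1e-12)
print(result.status)   # positive status codes indicate which criterion fired
print(result.message)  # human-readable termination reason
```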
Visual Quiz - 3 Questions
Test your understanding
Looking at the execution table, what are the parameters after step 3?
A. [1.0, 0.0]
B. [2.0, 0.0]
C. [0.0, 0.0]
D. [2.0, 1.0]
💡 Hint
Check the 'Parameters (x)' column at step 3 in the execution table.
At which step does the sum of squares first drop below 40?
A. Step 3
B. Step 4
C. Step 2
D. Step 5
💡 Hint
Look at the 'Sum of Squares' column in the execution table for values below 40.
If the initial guess was [1, 1], how would the sum of squares at step 1 change?
A. It would be exactly 84
B. It would be larger than 84
C. It would be smaller than 84
D. It would be zero
💡 Hint
Initial residuals depend on how close the guess is to the data; [1, 1] is closer than [0, 0], so the sum of squares decreases.
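This last question can be checked numerically. Using the same model, the sum of squares for each candidate starting guess is:

```python
import numpy as np

t = np.array([0, 1, 2, 3])
y = np.array([1, 3, 5, 7])

def sum_of_squares(x):
    # Sum of squared residuals of the linear model x[0]*t + x[1]
    r = x[0] * t + x[1] - y
    return np.sum(r**2)

print(sum_of_squares(np.array([0.0, 0.0])))  # 84
print(sum_of_squares(np.array([1.0, 1.0])))  # 14, smaller than 84
```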
Concept Snapshot
Least squares optimization fits model parameters by minimizing the sum of squared residuals.
Define a model and residuals function.
Use scipy.optimize.least_squares with initial guess and data.
Optimizer iteratively updates parameters to reduce error.
Stops when improvements become minimal.
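The same workflow extends with optional arguments. A hedged sketch using two documented least_squares options, parameter bounds and a robust loss function (the specific bounds and loss choice here are illustrative):

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(x, t, y):
    # Residuals of the linear model x[0]*t + x[1] against the data y
    return x[0] * t + x[1] - y

t = np.array([0, 1, 2, 3])
y = np.array([1, 3, 5, 7])

# bounds: constrain slope and intercept to be non-negative
# loss="soft_l1": a robust loss that down-weights large residuals (outliers)
result = least_squares(residuals, np.array([0.5, 0.5]), args=(t, y),
                       bounds=([0.0, 0.0], [np.inf, np.inf]),
                       loss="soft_l1")
print(result.x)   # still close to [2. 1.] for this clean data
```

With clean data the answer is unchanged; the robust loss matters when some data points are corrupted by outliers.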
Full Transcript
Least squares optimization finds the best parameters for a model by minimizing the squared differences between predicted and actual data. We start by defining a model function and a residuals function that calculates the errors. Using SciPy's least_squares, we provide an initial guess and the data points. The optimizer adjusts the parameters step by step, recalculating the residuals and their sum of squares. When the sum of squares stops decreasing significantly, the optimizer stops and returns the best parameters found. This process fits models such as lines to data in a way that minimizes overall error.