Start by defining the function to minimize, then pick a method: Nelder-Mead, BFGS, or Powell. Run the optimization and read off the best solution found.
Execution Sample
SciPy
from scipy.optimize import minimize
def f(x):
    return (x[0] - 1)**2 + (x[1] - 2)**2

res = minimize(f, [0, 0], method='Nelder-Mead')
This code finds the minimum of a simple quadratic function using the Nelder-Mead method, starting from [0, 0].
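The same call can be repeated for each method to compare where they land; this is a minimal sketch using only the public scipy.optimize.minimize API:

```python
from scipy.optimize import minimize

def f(x):
    return (x[0] - 1)**2 + (x[1] - 2)**2

# Run each method from the same starting point and compare the results.
# All three should converge to the true minimizer [1, 2] with value 0.
for method in ("Nelder-Mead", "BFGS", "Powell"):
    res = minimize(f, [0, 0], method=method)
    print(method, res.x, res.fun)
```

For this smooth, well-conditioned objective all three methods succeed; the differences show up in how many function (and gradient) evaluations each needs.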
Execution Table

| Step | Method | Current Point | Function Value | Action | Notes |
|------|-------------|---------------|----------------|--------------------------------|-------------------------|
| 1 | Nelder-Mead | [0, 0] | 5 | Evaluate function at start | Initial guess |
| 2 | Nelder-Mead | [1, 0] | 4 | Reflect point | Improved function value |
| 3 | Nelder-Mead | [1, 1] | 1 | Expand point | Better value found |
| 4 | Nelder-Mead | [1, 2] | 0 | Expand point | Minimum found |
| 5 | Nelder-Mead | [1, 2] | 0 | Converged | Stop optimization |
| 1 | BFGS | [0, 0] | 5 | Evaluate function and gradient | Initial guess |
| 2 | BFGS | [0.5, 1] | 1.25 | Update point using gradient | Improved value |
| 3 | BFGS | [1, 2] | 0 | Update point using gradient | Minimum found |
| 4 | BFGS | [1, 2] | 0 | Converged | Stop optimization |
| 1 | Powell | [0, 0] | 5 | Evaluate function at start | Initial guess |
| 2 | Powell | [1, 0] | 4 | Line search along x | Improved value |
| 3 | Powell | [1, 2] | 0 | Line search along y | Minimum found |
| 4 | Powell | [1, 2] | 0 | Converged | Stop optimization |
💡 Optimization stops when the function value no longer improves significantly or the maximum number of iterations is reached.
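Both stopping criteria can be controlled explicitly; this sketch uses Nelder-Mead's documented xatol, fatol, and maxiter options:

```python
from scipy.optimize import minimize

def f(x):
    return (x[0] - 1)**2 + (x[1] - 2)**2

# Tighten the convergence tolerances and cap the iteration count.
# xatol/fatol are the Nelder-Mead-specific tolerances on the simplex
# points and function values; maxiter caps the number of iterations.
res = minimize(f, [0, 0], method='Nelder-Mead',
               options={'xatol': 1e-8, 'fatol': 1e-8, 'maxiter': 200})
print(res.nit, res.fun)
```

Tighter tolerances cost more iterations; loosening them (or lowering maxiter) trades accuracy for speed.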
Variable Tracker

| Variable | Start | After 1 | After 2 | After 3 | Final |
|------------------------------|--------|----------|---------|---------|--------|
| Current Point (Nelder-Mead) | [0, 0] | [1, 0] | [1, 1] | [1, 2] | [1, 2] |
| Function Value (Nelder-Mead) | 5 | 4 | 1 | 0 | 0 |
| Current Point (BFGS) | [0, 0] | [0.5, 1] | [1, 2] | [1, 2] | [1, 2] |
| Function Value (BFGS) | 5 | 1.25 | 0 | 0 | 0 |
| Current Point (Powell) | [0, 0] | [1, 0] | [1, 2] | [1, 2] | [1, 2] |
| Function Value (Powell) | 5 | 4 | 0 | 0 | 0 |
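A trace like the tracker above can be produced with minimize's callback argument, which is invoked once per iteration with the current point; note that the iterates of a real run will differ from the illustrative values in the table:

```python
from scipy.optimize import minimize

def f(x):
    return (x[0] - 1)**2 + (x[1] - 2)**2

history = []  # record (point, function value) after each iteration

def record(xk):
    history.append((list(xk), f(xk)))

res = minimize(f, [0, 0], method='BFGS', callback=record)
for point, value in history:
    print(point, value)
```

The same callback works with the other methods, so you can build your own per-step tracker for any of them.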
Key Moments - 3 Insights
Why does Nelder-Mead not use gradients like BFGS?
Nelder-Mead uses only function values, exploring the space by reflecting and expanding simplex points, as shown in the execution table rows 1-4 for Nelder-Mead. It works well when gradients are unavailable.
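Because it needs only function values, Nelder-Mead can also handle objectives whose gradient does not exist everywhere; a small sketch (the absolute-value objective g is an illustrative choice, not part of the original example):

```python
from scipy.optimize import minimize

# This objective is not differentiable at its minimizer [1, 2],
# so gradient-based methods have no exact gradient to use there.
def g(x):
    return abs(x[0] - 1) + abs(x[1] - 2)

# Nelder-Mead only evaluates g, so the missing gradient is no obstacle.
res = minimize(g, [0, 0], method='Nelder-Mead')
print(res.x, res.fun)
```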
How does BFGS improve the point each step?
BFGS uses both function values and gradients to update the point in a well-chosen direction, as seen in the execution table rows 1-3 for BFGS, quickly reaching the minimum.
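When the gradient is known analytically it can be passed via minimize's jac argument, so BFGS does not have to estimate it with finite differences; a minimal sketch:

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    return (x[0] - 1)**2 + (x[1] - 2)**2

# Analytic gradient of f: [2(x0 - 1), 2(x1 - 2)].
def grad(x):
    return np.array([2 * (x[0] - 1), 2 * (x[1] - 2)])

res = minimize(f, [0, 0], method='BFGS', jac=grad)
print(res.x, res.fun, res.njev)  # njev counts gradient evaluations
```

Without jac, BFGS approximates the gradient numerically, which costs extra function evaluations and slightly reduces accuracy.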
What is the main difference in Powell's method steps?
Powell performs line searches along coordinate directions without gradients, moving step by step along the axes, as shown in the execution table rows 1-3 for Powell.
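Powell's coordinate line searches suit this particular objective well, because it is separable in the two coordinates; a minimal sketch (res.nit reports how many outer iterations the line searches took):

```python
from scipy.optimize import minimize

# f is separable: (x0 - 1)^2 depends only on x0, (x1 - 2)^2 only on x1,
# so line searches along the coordinate axes can each be solved exactly.
def f(x):
    return (x[0] - 1)**2 + (x[1] - 2)**2

res = minimize(f, [0, 0], method='Powell')
print(res.x, res.fun, res.nit)
```

On non-separable objectives Powell updates its search directions between iterations, so it is not limited to the raw coordinate axes.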
Visual Quiz - 3 Questions
Test your understanding
Looking at the execution table, what is the function value at step 3 for BFGS?
A. 1.25
B. 0
C. 4
D. 5
💡 Hint
Check the row with Step 3 and Method BFGS in the execution table.
At which step does Nelder-Mead find the minimum function value?
A. Step 2
B. Step 3
C. Step 4
D. Step 5
💡 Hint
Look at the function values for Nelder-Mead in the execution table rows 1-5.
If we start the Powell method at [2, 2], how would the first function value change?
💡 Hint
Plug [2, 2] into f: (2-1)² + (2-2)² = 1, compared with 5 at [0, 0].
Use scipy.optimize.minimize to find function minima.
Choose method: Nelder-Mead (no gradients), BFGS (uses gradients), Powell (line search).
Start from initial guess point.
Each method updates points differently.
Stop when function value stabilizes or max iterations reached.
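These stopping details are reported on the OptimizeResult object that minimize returns; a minimal sketch inspecting its documented fields:

```python
from scipy.optimize import minimize

def f(x):
    return (x[0] - 1)**2 + (x[1] - 2)**2

res = minimize(f, [0, 0], method='BFGS')

print(res.x)        # best point found
print(res.fun)      # function value at that point
print(res.nit)      # number of iterations used
print(res.success)  # whether the stopping criterion was met
print(res.message)  # human-readable reason the solver stopped
```

Checking res.success (and res.message on failure) is good practice before trusting res.x, since the solver also stops when it merely runs out of iterations.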
Full Transcript
This visual execution shows how three optimization methods in scipy.optimize.minimize work step by step. We start with a simple function to minimize. Nelder-Mead moves points by reflecting and expanding a simplex, without gradients. BFGS uses gradients to update points efficiently. Powell searches along coordinate lines without gradients. The execution table traces each step's point and function value, and the variable tracker shows how points and values change. The key moments clarify common confusions about gradients and method differences, and the quiz tests understanding of these steps and values. This helps beginners see how method selection affects optimization progress.