Basin-hopping for global minima in SciPy - Time & Space Complexity
We want to understand how the time basin-hopping needs to find the lowest point of a function changes as the problem grows. How do the number of steps and function evaluations grow as we search for the global minimum?
Analyze the time complexity of the following code snippet.
```python
from scipy.optimize import basinhopping

def func(x):
    return x**4 - 3*x**3 + 2

result = basinhopping(func, x0=[0], niter=100)
print(result.x, result.fun)
```
This code searches for the lowest value of the function by making random jumps and running a local minimization from each new starting point.
Identify the loops, recursion, and array traversals that repeat.
- Primary operation: Running local minimizations repeatedly after random jumps.
- How many times: The local minimization runs once per basin-hopping iteration; here, `niter=100` times.
Each iteration runs a local search that depends on the problem size, and the total steps grow with the number of iterations.
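A minimal sketch (not part of the original snippet) that makes this concrete by counting objective evaluations with a wrapper: the total work is roughly the number of iterations times the evaluations each local search performs.

```python
from scipy.optimize import basinhopping

calls = {"n": 0}  # shared counter incremented on every objective evaluation

def func(x):
    calls["n"] += 1
    return float(x[0]**4 - 3*x[0]**3 + 2)

result = basinhopping(func, x0=[0.0], niter=100)
# Each of the 100 iterations runs a local minimization that evaluates
# func several times, so the total count is well above niter itself.
print("evaluations:", calls["n"])
print("minimum at:", result.x, "value:", result.fun)
```

The returned `OptimizeResult` also reports this directly as `result.nfev`, so the wrapper is only there to make the counting explicit.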
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | ~100 local searches, each cheap |
| 100 | ~100 local searches, each costlier as input grows |
| 1000 | ~100 local searches, each far costlier |
Pattern observation: The total time grows roughly with the number of iterations times the cost of each local search, which grows with input size.
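A hedged sketch of where the per-iteration cost comes from: the local minimizer's work grows with the dimension n (with SciPy's default finite-difference gradient, each gradient estimate alone needs about n+1 function evaluations). The `sphere` test function here is an illustrative stand-in, not from the original.

```python
import numpy as np
from scipy.optimize import minimize

def sphere(x):
    # simple n-dimensional function with its minimum at the origin
    return float(np.sum(x**2))

evals = {}
for n in (10, 100, 1000):
    # default method (BFGS) approximates the gradient by finite differences,
    # so each iteration costs on the order of n function evaluations
    res = minimize(sphere, x0=np.ones(n))
    evals[n] = res.nfev
    print(n, res.nfev)
```

Basin-hopping repeats such a local search every iteration, which is why bigger inputs make every step more expensive, not just the overall loop longer.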
Time Complexity: O(n × m)
This means the total time grows with the number of iterations m multiplied by the cost of one local minimization on a problem of size n.
[X] Wrong: "The time only depends on the number of iterations, not the input size."
[OK] Correct: Each local search inside an iteration depends on the input size, so bigger problems take longer per step.
Understanding how iterative optimization methods scale helps you explain algorithm choices clearly and confidently.
"What if we increase the number of iterations instead of input size? How would the time complexity change?"
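One way to explore that question is a small sketch (assumed setup, not from the original) comparing `niter` values on the same function: holding the problem fixed, each extra iteration adds one more local search of roughly constant cost, so total evaluations grow about linearly with m.

```python
from scipy.optimize import basinhopping

def func(x):
    return float(x[0]**4 - 3*x[0]**3 + 2)

totals = {}
for niter in (10, 100):
    result = basinhopping(func, x0=[0.0], niter=niter)
    totals[niter] = result.nfev  # total objective evaluations across all iterations
    print(niter, result.nfev)
```

So raising m alone changes O(n × m) linearly in m, while the per-step cost set by n stays the same.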