
Global optimization (differential_evolution) in SciPy

Introduction

Global optimization searches for the best solution across the entire allowed region, not just near a starting point. Differential evolution is a population-based method that keeps many candidate solutions at once and improves them step by step until it converges on the best one it can find.

When you want to find the lowest or highest value of a complex function with many peaks and valleys.
When simple methods get stuck in a bad solution and can't find the best answer.
When you have limits on the values your solution can take, like a range for each variable.
When you want a method that works well even if the function is not smooth or has many ups and downs.
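The situations above can be illustrated with a small sketch, assuming a Rastrigin-style test function with many local minima (the function name `bumpy` and the starting point are chosen here for illustration): a local method started at a poor point can settle into a nearby valley, while differential_evolution searches the whole bounded box.

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize

# Rastrigin-style function: many local minima, global minimum 0 at x = 0
def bumpy(x):
    x = np.asarray(x)
    return np.sum(x**2 - 10 * np.cos(2 * np.pi * x) + 10)

bounds = [(-5.12, 5.12), (-5.12, 5.12)]

# A local method started far from the origin can get stuck in a nearby valley
local = minimize(bumpy, x0=[4.5, 4.5])

# Differential evolution searches the entire bounded region instead
best = differential_evolution(bumpy, bounds, seed=1)

print("local method:", local.fun)
print("differential_evolution:", best.fun)
```

The local run typically stops at a local minimum well above zero, while differential evolution reaches a value very close to the global minimum of 0.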
Syntax
SciPy
from scipy.optimize import differential_evolution

result = differential_evolution(func, bounds, strategy='best1bin', maxiter=1000, popsize=15, tol=0.01)

# func: function to minimize
# bounds: list of (min, max) for each variable
# result.x: best solution found
# result.fun: function value at best solution

func must take a 1-D array of variables and return a single number. differential_evolution minimizes, so to maximize a function, return its negative.

bounds define the search space for each variable.
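The returned object carries more information than just `x` and `fun`. A short sketch, using a simple sphere function for illustration, shows the other fields worth checking:

```python
from scipy.optimize import differential_evolution

def sphere(x):
    return sum(v**2 for v in x)

result = differential_evolution(sphere, [(-5, 5), (-5, 5)], seed=0)

# The OptimizeResult carries more than x and fun
print(result.success)   # True if the search converged
print(result.message)   # human-readable termination reason
print(result.nit)       # number of generations performed
print(result.nfev)      # number of function evaluations
```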

Examples
Minimize a simple function with two variables between -5 and 5.
SciPy
from scipy.optimize import differential_evolution

def func(x):
    return x[0]**2 + x[1]**2

bounds = [(-5, 5), (-5, 5)]
result = differential_evolution(func, bounds)
print(result.x, result.fun)
Minimize a shifted parabola with custom iteration and population size.
SciPy
from scipy.optimize import differential_evolution

def func(x):
    return (x[0]-1)**2 + (x[1]+2)**2

bounds = [(-3, 3), (-3, 3)]
result = differential_evolution(func, bounds, maxiter=500, popsize=10)
print(result.x, result.fun)
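When the function depends on fixed parameters, they can be forwarded with the `args` keyword instead of hard-coding them. A small variation on the shifted parabola above (the parameter names `cx` and `cy` are illustrative):

```python
from scipy.optimize import differential_evolution

# Shifted parabola where the shift is passed in as extra arguments
def func(x, cx, cy):
    return (x[0] - cx)**2 + (x[1] - cy)**2

bounds = [(-3, 3), (-3, 3)]

# args forwards extra positional arguments to func
result = differential_evolution(func, bounds, args=(1, -2), seed=3)
print(result.x)  # close to [1, -2]
```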
Sample Program

This program finds the minimum of the Rosenbrock function, a common test problem in optimization. It searches within the given bounds and prints the best solution and its function value.

SciPy
from scipy.optimize import differential_evolution

def rosenbrock(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

bounds = [(-2, 2), (-1, 3)]

result = differential_evolution(rosenbrock, bounds)

print(f"Best solution: {result.x}")
print(f"Function value at best solution: {result.fun}")
Important Notes

Differential evolution works well for many problems, but it evaluates the function many times per generation, so it is usually slower than local methods on easy, smooth problems.

Adjusting maxiter and popsize changes how long the search runs and how thoroughly it explores.

Always check the result to see if the solution makes sense for your problem.
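The notes above can be put into practice together. A sketch on the Rosenbrock function from the sample program: a larger popsize explores more per generation, maxiter caps how long the search runs, a fixed seed makes the stochastic search repeatable, and checking the result tells you whether to trust it (the specific parameter values here are illustrative).

```python
from scipy.optimize import differential_evolution

def rosenbrock(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

bounds = [(-2, 2), (-1, 3)]

# More iterations and a larger population search more thoroughly;
# seed makes the stochastic run repeatable
result = differential_evolution(rosenbrock, bounds,
                                maxiter=2000, popsize=30,
                                tol=1e-8, seed=42)

# Always inspect the result before trusting it
if result.success:
    print("converged:", result.x, result.fun)
else:
    print("did not converge:", result.message)
```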

Summary

Differential evolution is a global optimization method that tries many solutions and improves them.

It works well for complex functions with many variables and tricky shapes.

You provide the function and bounds, and it returns the best solution found.