SciPy · ~10 mins

Global optimization (differential_evolution) in SciPy - Step-by-Step Execution

Concept Flow - Global optimization (differential_evolution)
Define objective function
Set bounds for variables
Initialize population randomly
Mutation and recombination
Selection: choose better solutions
Check stopping criteria
Return the best solution and its function value
The differential evolution algorithm starts with a random population, then improves solutions by mutation, recombination, and selection until stopping criteria are met.
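The loop above can be sketched in plain NumPy. This is an illustrative DE/rand/1/bin implementation written for this page, not SciPy's actual code (which adds more mutation strategies, vectorization, and a proper convergence test); the function name and default parameters are invented for the example.

```python
import numpy as np

def differential_evolution_sketch(f, bounds, popsize=20, mutation=0.8,
                                  recombination=0.7, maxiter=200, seed=0):
    """Minimal DE/rand/1/bin loop mirroring the concept-flow steps."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(bounds)
    # Initialize population randomly within bounds
    pop = lo + rng.random((popsize, dim)) * (hi - lo)
    fitness = np.array([f(ind) for ind in pop])
    for _ in range(maxiter):
        for i in range(popsize):
            # Mutation: combine three distinct members, none equal to i
            idxs = rng.choice([j for j in range(popsize) if j != i],
                              3, replace=False)
            a, b, c = pop[idxs]
            mutant = np.clip(a + mutation * (b - c), lo, hi)
            # Recombination: binomial crossover with the target vector
            cross = rng.random(dim) < recombination
            cross[rng.integers(dim)] = True  # guarantee one mutant gene
            trial = np.where(cross, mutant, pop[i])
            # Selection: keep the trial only if it is better
            f_trial = f(trial)
            if f_trial < fitness[i]:
                pop[i], fitness[i] = trial, f_trial
    best = np.argmin(fitness)
    return pop[best], fitness[best]

# Converges close to the true minimum at (3, -1)
best_x, best_f = differential_evolution_sketch(
    lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2,
    [(-5, 5), (-5, 5)],
)
```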
Execution Sample
SciPy
from scipy.optimize import differential_evolution

# Objective: a paraboloid with its minimum at (3, -1)
def f(x):
    return (x[0] - 3) ** 2 + (x[1] + 1) ** 2

# Search box for each variable
bounds = [(-5, 5), (-5, 5)]
result = differential_evolution(f, bounds)
print(result.x, result.fun)  # approximately [3. -1.] and 0.0
This code finds the minimum of a simple function using differential evolution within given bounds.
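The search is stochastic, so the exact path can vary between runs. SciPy's `seed` argument (newer releases also accept `rng`) makes it reproducible, and the returned `OptimizeResult` carries diagnostics beyond `x` and `fun`:

```python
from scipy.optimize import differential_evolution

def f(x):
    return (x[0] - 3) ** 2 + (x[1] + 1) ** 2

bounds = [(-5, 5), (-5, 5)]
# seed makes the stochastic search reproducible run-to-run
result = differential_evolution(f, bounds, seed=42)
print(result.success)  # True if the solver converged
print(result.nit)      # number of generations performed
print(result.x)        # best solution found, close to [3, -1]
```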
Execution Table
| Step | Population Sample | Mutation Vector | Trial Vector | Selection Result | Best Solution So Far |
|------|-------------------|-----------------|--------------|------------------|----------------------|
| 1 | [[0.1, -4.5], [2.0, 0.0], [-3.0, 1.0], [4.5, -2.0]] | [2.0, 0.0] | [2.0, 0.0] | Selected (better) | [2.0, 0.0] (fun=2) |
| 2 | [[2.0, 0.0], [2.5, -1.0], [-3.0, 1.0], [4.5, -2.0]] | [2.5, -1.0] | [2.5, -1.0] | Selected (better) | [2.5, -1.0] (fun=0.25) |
| 3 | [[2.0, 0.0], [-3.0, 1.0], [3.0, -1.0], [4.5, -2.0]] | [3.0, -1.0] | [3.0, -1.0] | Selected (better) | [3.0, -1.0] (fun=0) |
| 4 | [[3.0, -1.0], [-3.0, 1.0], [3.0, -1.0], [4.5, -2.0]] | [4.5, -2.0] | [4.5, -2.0] | Not selected (worse) | [3.0, -1.0] (fun=0) |
| 5 | [[3.0, -1.0], [-3.0, 1.0], [3.0, -1.0], [4.5, -2.0]] | [2.9, -1.1] | [2.9, -1.1] | Selected (better than its target) | [3.0, -1.0] (fun=0) |
| Exit | - | - | - | - | Converged to best solution [3.0, -1.0] with fun=0 |
💡 Algorithm stops when improvement is minimal or max iterations reached.
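Both stopping criteria are exposed as parameters: `tol` controls the convergence test on the spread of function values across the population, and `maxiter` caps the number of generations. A small sketch:

```python
from scipy.optimize import differential_evolution

def f(x):
    return (x[0] - 3) ** 2 + (x[1] + 1) ** 2

bounds = [(-5, 5), (-5, 5)]
# tol: convergence test on the population's function values;
# maxiter: hard cap on the number of generations.
result = differential_evolution(f, bounds, tol=1e-8, maxiter=50, seed=1)
print(result.message)  # explains which criterion stopped the run
```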
Variable Tracker
| Variable | Start | After 1 | After 2 | After 3 | After 4 | After 5 | Final |
|----------|-------|---------|---------|---------|---------|---------|-------|
| Population | [[0.1, -4.5], [2.0, 0.0], [-3.0, 1.0], [4.5, -2.0]] | [[2.0, 0.0], [2.5, -1.0], [-3.0, 1.0], [4.5, -2.0]] | [[2.0, 0.0], [-3.0, 1.0], [3.0, -1.0], [4.5, -2.0]] | [[3.0, -1.0], [-3.0, 1.0], [3.0, -1.0], [4.5, -2.0]] | [[3.0, -1.0], [-3.0, 1.0], [3.0, -1.0], [4.5, -2.0]] | [[3.0, -1.0], [2.9, -1.1], [3.0, -1.0], [4.5, -2.0]] | [[3.0, -1.0], [2.9, -1.1], [3.0, -1.0], [4.5, -2.0]] |
| Best Solution | None | [2.0, 0.0] | [2.5, -1.0] | [3.0, -1.0] | [3.0, -1.0] | [3.0, -1.0] | [3.0, -1.0] |
| Best Fun Value | None | 2 | 0.25 | 0 | 0 | 0 | 0 |
Key Moments - 3 Insights
Why does the algorithm sometimes keep a worse solution instead of the trial vector?
Because selection only replaces a population member when the trial vector achieves a better (lower) function value; at step 4 of the Execution Table the trial vector was worse than its target, so it was not selected.
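That greedy comparison can be written as a one-line helper (`select` is a hypothetical name used only for illustration):

```python
def select(target, trial, f):
    """Greedy selection: keep the trial only if it improves on the target."""
    return trial if f(trial) < f(target) else target

f = lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2

# Step 4 of the table: trial [4.5, -2.0] loses to the current member
print(select([3.0, -1.0], [4.5, -2.0], f))  # -> [3.0, -1.0]
```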
How does the population change over iterations?
The population updates by replacing individuals with better trial vectors, as the Variable Tracker shows: population members change after each iteration in which a trial wins.
What stops the differential evolution algorithm?
The algorithm stops when improvements become very small or when a maximum number of iterations is reached, as noted in the exit row of the Execution Table.
Visual Quiz - 3 Questions
Test your understanding
Look at the Execution Table at step 3. What is the best solution and its function value?
A. [2.0, 0.0] with fun=2
B. [-3.0, 1.0] with fun=40
C. [3.0, -1.0] with fun=0
D. [4.5, -2.0] with fun=unknown
💡 Hint
Check the 'Best Solution So Far' column at step 3 of the Execution Table.
At which step does the algorithm reject a trial vector because it is worse?
A. Step 4
B. Step 2
C. Step 5
D. Step 1
💡 Hint
Look for 'Not selected (worse)' in the Selection Result column of the Execution Table.
If the bounds were changed to narrower ranges, how would the population likely change?
A. Population would become larger
B. Population values would be limited within the new bounds
C. Population would not change
D. Algorithm would stop immediately
💡 Hint
The Variable Tracker shows population values; bounds restrict the values every candidate may take.
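The effect of narrowing the bounds can be checked directly: rerunning the optimizer with a smaller search box keeps every candidate, and the returned solution, inside the new bounds. The narrower bounds below are chosen just for illustration:

```python
from scipy.optimize import differential_evolution

def f(x):
    return (x[0] - 3) ** 2 + (x[1] + 1) ** 2

# Narrower bounds that exclude the unconstrained minimum (3, -1):
# the best feasible point sits on the boundary, near [2, -1]
narrow = [(0, 2), (-3, 0)]
result = differential_evolution(f, narrow, seed=0)
print(result.x)
```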
Concept Snapshot
differential_evolution(func, bounds)
- Starts with random population within bounds
- Uses mutation and recombination to create trial vectors
- Selects better solutions to form next generation
- Stops when convergence or max iterations reached
- Returns best solution found and its function value
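The per-generation progress summarized above can be observed with the solver's `callback` hook; the `track` function and `history` list are names made up for this sketch:

```python
from scipy.optimize import differential_evolution

def f(x):
    return (x[0] - 3) ** 2 + (x[1] + 1) ** 2

history = []

def track(xk, convergence):
    # xk is the best member so far; convergence grows toward 1 near the end
    history.append((list(xk), convergence))

result = differential_evolution(f, [(-5, 5), (-5, 5)],
                                seed=7, callback=track)
print(len(history), "generations recorded; final best:", result.x)
```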
Full Transcript
Differential evolution is a global optimization method. It starts by defining the function to minimize and setting variable bounds. Then it creates a random population of candidate solutions. Each iteration, it mutates and recombines population members to create trial vectors. If a trial vector improves the function value, it replaces the original member. This process repeats until the solution converges or a maximum number of iterations is reached. The best solution and its function value are returned.