Challenge - 5 Problems
Differential Evolution Mastery
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
intermediate · 2:00
Output of differential_evolution on a simple function
What is the output of the following code snippet, which uses scipy.optimize.differential_evolution to minimize a simple quadratic function?
```python
from scipy.optimize import differential_evolution

func = lambda x: (x[0] - 3)**2 + (x[1] + 1)**2
bounds = [(-5, 5), (-5, 5)]
result = differential_evolution(func, bounds, seed=42)
print((round(result.x[0], 2), round(result.x[1], 2)))
```
💡 Hint
Recall that differential_evolution tries to find the minimum of the function within the given bounds.
Answer: The function is minimized at x = 3 and y = -1, so the output is (3.0, -1.0).
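As a quick, library-free sanity check (not how differential_evolution searches internally), a brute-force grid scan over the bounds confirms that (3, -1) is the minimizer of this quadratic:

```python
# Evaluate the quadratic on a 0.1-spaced grid over [-5, 5] x [-5, 5]
# and pick the grid point with the smallest function value.
func = lambda x: (x[0] - 3)**2 + (x[1] + 1)**2

best = min(
    ((a / 10, b / 10) for a in range(-50, 51) for b in range(-50, 51)),
    key=lambda p: func(p),
)
print(best)  # (3.0, -1.0)
```

Since the function is a sum of squares, its value is 0 exactly at (3, -1) and positive everywhere else, which is why the optimizer's rounded result matches.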
❓ Data Output
intermediate · 2:00
Number of iterations in differential_evolution result
After running differential_evolution on the Rosenbrock function with bounds [(-2, 2), (-1, 3)], what is the value of result.nit (the number of iterations)?
```python
from scipy.optimize import differential_evolution

def rosenbrock(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

bounds = [(-2, 2), (-1, 3)]
result = differential_evolution(rosenbrock, bounds, seed=0)
print(result.nit)
```
💡 Hint
Check the nit attribute of the result object.
Answer: The differential_evolution algorithm converges in 11 iterations for this setup and seed.
🔧 Debug
advanced · 2:00
Identify the error in differential_evolution usage
What error will the following code raise when trying to run differential_evolution?
```python
from scipy.optimize import differential_evolution
import math

def f(x):
    return math.pow(x, 2)

bounds = [(-1, 1)]
result = differential_evolution(f, bounds)
print(result.fun)
```
💡 Hint
Check how the function f expects its input and what differential_evolution passes.
Answer: The function f expects a scalar, but differential_evolution passes a 1-D array of parameters, causing a TypeError.
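The failure can be reproduced without SciPy. A plain Python list stands in here for the parameter array that differential_evolution passes; the exact behavior with a length-1 NumPy array can vary by NumPy version, but indexing with x[0] is the robust fix either way:

```python
import math

def f(x):
    return math.pow(x, 2)  # expects a real number, not a sequence

# differential_evolution calls f with the trial parameters as a 1-D array;
# a plain list mimics that call and triggers the same kind of failure.
try:
    f([0.5])
except TypeError as exc:
    print("TypeError:", exc)

# Fix: index into the parameter vector, even for a 1-D problem.
def f_fixed(x):
    return math.pow(x[0], 2)

print(f_fixed([0.5]))  # 0.25
```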
🧠 Conceptual
advanced · 1:30
Understanding differential_evolution parameters
Which parameter of scipy.optimize.differential_evolution controls the mutation constant that scales the differential weight in the algorithm?
💡 Hint
Mutation controls the differential weight in the mutation step.
Answer: The mutation parameter sets the mutation constant, which controls the differential weight applied to difference vectors in the algorithm.
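To make the role of the mutation constant concrete, here is a toy, pure-Python sketch of the classic "rand1"-style mutation step (an illustration of the idea, not SciPy's internal code): the donor vector is formed as a + F * (b - c), where F is the differential weight that the mutation parameter controls.

```python
import random

def mutate(a, b, c, F=0.8):
    # Donor vector: v = a + F * (b - c).
    # Larger F means larger perturbations, so broader exploration.
    return [ai + F * (bi - ci) for ai, bi, ci in zip(a, b, c)]

random.seed(0)
# A tiny random population of 2-D candidate vectors.
pop = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(4)]
donor = mutate(pop[0], pop[1], pop[2], F=0.8)
print(donor)
```

In SciPy, mutation may also be given as a (min, max) tuple to enable dithering, where F is re-drawn from that range each generation.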
🚀 Application
expert · 3:00
Global minimum found by differential_evolution on a noisy function
Given the noisy function f(x) = (x[0] - 2)^2 + (x[1] + 3)^2 + noise, where noise is a small random value, which of the following is closest to result.x after running differential_evolution with bounds [(-5, 5), (-5, 5)] and seed=1?
```python
import numpy as np
from scipy.optimize import differential_evolution

np.random.seed(1)
noise = lambda: np.random.normal(0, 0.1)

def f(x):
    return (x[0] - 2)**2 + (x[1] + 3)**2 + noise()

bounds = [(-5, 5), (-5, 5)]
result = differential_evolution(f, bounds, seed=1)
print((round(result.x[0], 1), round(result.x[1], 1)))
```
💡 Hint
The noise is small, so the optimizer should find a point close to the true minimum (2, -3).
Answer: Despite the noise, differential_evolution finds a point close to (2, -3), the true minimum of the noiseless function.
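A library-free check of the noiseless part of the objective (ignoring the noise term, which only perturbs values by about 0.1) confirms where the true minimum sits: a coarse grid search over the bounds lands exactly on (2, -3).

```python
# Grid-search the noiseless objective on a 0.1-spaced grid over the bounds.
f = lambda x: (x[0] - 2)**2 + (x[1] + 3)**2

best = min(
    ([a / 10, b / 10] for a in range(-50, 51) for b in range(-50, 51)),
    key=f,
)
print(best)  # [2.0, -3.0]
```

Because the noise has standard deviation 0.1 while the quadratic grows rapidly away from (2, -3), the perturbed landscape keeps its minimum in a small neighborhood of that point, which is why the optimizer's rounded result is close to it.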