SciPy · ~20 mins

Minimizing multivariate functions (minimize) in SciPy - Practice Problems & Coding Challenges

Challenge - 5 Problems
Predict Output
intermediate
Output of scipy minimize with Nelder-Mead
What is the output of the following code snippet, which minimizes a simple quadratic function using the Nelder-Mead method?
SciPy
import numpy as np
from scipy.optimize import minimize

def f(x):
    return (x[0] - 1)**2 + (x[1] + 2)**2

result = minimize(f, x0=[0, 0], method='Nelder-Mead')
print(result.x.round(2))
A. [0.00, 0.00]
B. [-1.00, 2.00]
C. [1.00, 2.00]
D. [1.00, -2.00]
💡 Hint
Think about where the function (x-1)^2 + (y+2)^2 is smallest.
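A quick way to check this for yourself: the function is a shifted paraboloid, so both squared terms vanish at exactly one point, and Nelder-Mead should converge there from any reasonable start.

```python
import numpy as np
from scipy.optimize import minimize

# Shifted paraboloid: both terms are zero only at x = 1, y = -2.
def f(x):
    return (x[0] - 1)**2 + (x[1] + 2)**2

result = minimize(f, x0=[0, 0], method='Nelder-Mead')
# The simplex converges to the analytic minimum at (1, -2).
print(result.x.round(2))
```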
Data Output
intermediate
Number of iterations in BFGS minimization
How many iterations does the BFGS method take to minimize the Rosenbrock function starting at [-1.2, 1]?
SciPy
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    return 100*(x[1] - x[0]**2)**2 + (1 - x[0])**2

result = minimize(rosenbrock, x0=[-1.2, 1], method='BFGS')
print(result.nit)
A. 26
B. 100
C. 50
D. 10
💡 Hint
The Rosenbrock function is tricky; BFGS usually takes a few dozen iterations.
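Note that the exact iteration count can vary with the SciPy version and with how the gradient is obtained (here it is estimated by finite differences, since no `jac` is passed). A sketch to inspect the count on your own install, along with the solution it converges to:

```python
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    # Classic banana-shaped valley; the global minimum is at (1, 1).
    return 100*(x[1] - x[0]**2)**2 + (1 - x[0])**2

result = minimize(rosenbrock, x0=[-1.2, 1], method='BFGS')
# result.nit is the iteration count; result.x should be close to [1, 1].
print(result.nit, result.x.round(2))
```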
🔧 Debug
advanced
Identify the error in minimize call
What happens when this code runs, given how the constraints are passed to minimize?
SciPy
from scipy.optimize import minimize

def f(x):
    return x[0]**2 + x[1]**2

cons = ({'type': 'eq', 'fun': lambda x: x[0] + x[1] - 1})

result = minimize(f, x0=[0, 0], constraints=cons)
print(result.success)
A. TypeError: 'dict' object is not iterable
B. ValueError: Constraints must be a sequence
C. No error, prints True
D. SyntaxError
💡 Hint
Check the type of the constraints argument.
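A sketch worth trying yourself: as documented, minimize accepts constraints either as a single dict or as a sequence of dicts, and the parentheses in the snippet above (with no trailing comma) produce a plain dict, not a tuple. Both forms below should succeed:

```python
from scipy.optimize import minimize

def f(x):
    return x[0]**2 + x[1]**2

eq = {'type': 'eq', 'fun': lambda x: x[0] + x[1] - 1}

# A single constraint dict and a sequence of dicts are both accepted.
r1 = minimize(f, x0=[0, 0], constraints=eq)
r2 = minimize(f, x0=[0, 0], constraints=[eq])
# Minimizing x^2 + y^2 subject to x + y = 1 gives (0.5, 0.5).
print(r1.success, r2.success)
```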
🧠 Conceptual
advanced
Effect of initial guess on minimize
Which statement best describes the effect of the initial guess on the result of scipy.optimize.minimize for non-convex functions?
A. The initial guess can lead to different local minima being found.
B. The initial guess does not affect the result; the global minimum is always found.
C. The initial guess only affects the speed, not the final solution.
D. The initial guess must be the exact minimum to succeed.
💡 Hint
Think about hills and valleys in a bumpy landscape.
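A minimal sketch of the idea, using a hypothetical tilted double-well function with two local minima (one near x = -1, one near x = +1): the same solver, started from different points, settles into different valleys.

```python
from scipy.optimize import minimize

# Tilted double well: two local minima, one on each side of zero.
def f(x):
    return (x[0]**2 - 1)**2 + 0.2*x[0]

left = minimize(f, x0=[-2.0], method='BFGS')   # starts in the left basin
right = minimize(f, x0=[2.0], method='BFGS')   # starts in the right basin
# Each run converges to the local minimum nearest its starting point.
print(left.x.round(2), right.x.round(2))
```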
🚀 Application
expert
Minimize a function with bounds and constraints
Which option correctly minimizes the function f(x,y) = (x-2)^2 + (y-3)^2 with the constraint x + y = 5 and bounds 0 <= x <= 3, 0 <= y <= 4?
SciPy
from scipy.optimize import minimize

def f(x):
    return (x[0] - 2)**2 + (x[1] - 3)**2

cons = {'type': 'eq', 'fun': lambda x: x[0] + x[1] - 5}
bounds = [(0, 3), (0, 4)]

result = minimize(f, x0=[0, 0], constraints=[cons], bounds=bounds, method='SLSQP')
print(result.x.round(2))
A. [2.00, 3.00]
B. [1.00, 4.00]
C. [3.00, 2.00]
D. [0.00, 5.00]
💡 Hint
The constraint forces x + y = 5, and bounds limit x and y.
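A sketch to verify the reasoning: the unconstrained minimum of this objective is (2, 3), which already satisfies x + y = 5 and sits inside both bounds, so SLSQP should return it unchanged.

```python
from scipy.optimize import minimize

def f(x):
    return (x[0] - 2)**2 + (x[1] - 3)**2

cons = {'type': 'eq', 'fun': lambda x: x[0] + x[1] - 5}
bounds = [(0, 3), (0, 4)]

# SLSQP handles the infeasible start at [0, 0] and converges to (2, 3),
# where the constraint is active but not binding the optimum away.
result = minimize(f, x0=[0, 0], constraints=[cons],
                  bounds=bounds, method='SLSQP')
print(result.x.round(2))
```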