Challenge - 5 Problems
Constrained Optimization Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
Intermediate · 2:00 remaining
Output of constrained minimization with inequality
What is the minimum value found by this constrained optimization code using scipy.optimize.minimize?
SciPy
from scipy.optimize import minimize

# Objective function
f = lambda x: (x[0]-1)**2 + (x[1]-2.5)**2

# Constraint: x0 + x1 >= 2
cons = ({'type': 'ineq', 'fun': lambda x: x[0] + x[1] - 2})

# Initial guess
x0 = [2, 0]

res = minimize(f, x0, constraints=cons)
print(round(res.fun, 4))
Attempts:
2 left
💡 Hint
Think about the point closest to (1, 2.5) that satisfies x0 + x1 >= 2.
✗ Incorrect
The unconstrained minimizer (1, 2.5) already satisfies the constraint (1 + 2.5 = 3.5 ≥ 2), so the inequality is inactive and the solver converges to it. The minimum value is f(1, 2.5) = 0, and the code prints 0.0.
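The answer can be checked by re-running the snippet and inspecting both the minimizer and the minimum value:

```python
from scipy.optimize import minimize

# Objective: squared distance to (1, 2.5)
f = lambda x: (x[0] - 1)**2 + (x[1] - 2.5)**2

# Inequality constraint: x0 + x1 - 2 >= 0
cons = ({'type': 'ineq', 'fun': lambda x: x[0] + x[1] - 2},)

res = minimize(f, [2, 0], constraints=cons)

# The unconstrained minimizer (1, 2.5) satisfies 1 + 2.5 = 3.5 >= 2,
# so the constraint is inactive and the minimum value is 0.
print(res.x, round(res.fun, 4))
```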
❓ Data Output
Intermediate · 2:00 remaining
Number of iterations in constrained optimization
How many iterations does the following constrained optimization take to converge?
SciPy
from scipy.optimize import minimize

f = lambda x: (x[0]-3)**2 + (x[1]+1)**2
cons = ({'type': 'eq', 'fun': lambda x: x[0] - 2*x[1]})
x0 = [0, 0]
res = minimize(f, x0, constraints=cons)
print(res.nit)
Attempts:
2 left
💡 Hint
Check the number of iterations attribute in the result object.
✗ Incorrect
With the default SLSQP method, this simple quadratic with a linear equality constraint converges in 5 iterations (the reference answer). Note that the exact value of res.nit can vary with the SciPy version and solver tolerances, so the converged point is the more robust thing to check.
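The converged point itself can be verified analytically. Substituting the constraint x0 = 2*x1 into the objective gives a one-variable quadratic whose stationary point pins down the solution:

```python
from scipy.optimize import minimize

f = lambda x: (x[0] - 3)**2 + (x[1] + 1)**2
cons = ({'type': 'eq', 'fun': lambda x: x[0] - 2*x[1]},)
res = minimize(f, [0, 0], constraints=cons)

# Substituting x0 = 2*x1: g(x1) = (2*x1 - 3)**2 + (x1 + 1)**2,
# so g'(x1) = 10*x1 - 10 = 0 gives x1 = 1 and x0 = 2.
print(res.x, res.nit)
```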
🔧 Debug
Advanced · 2:00 remaining
Identify the error in constraint definition
What happens when this code runs scipy.optimize.minimize from an initial guess that violates the constraints?
SciPy
from scipy.optimize import minimize

f = lambda x: x[0]**2 + x[1]**2
cons = ({'type': 'ineq', 'fun': lambda x: x[0] - 1},
        {'type': 'eq', 'fun': lambda x: x[1] + 2})
x0 = [0, 0]
res = minimize(f, x0, constraints=cons)
print(res.fun)
Attempts:
2 left
💡 Hint
Check if the initial guess satisfies the constraints.
✗ Incorrect
The initial guess x0 = [0, 0] violates both constraints: 0 - 1 = -1 < 0 for the inequality, and 0 + 2 = 2 ≠ 0 for the equality. No exception is raised, however — SLSQP accepts an infeasible starting point and drives the iterates into the feasible set, converging to x = (1, -2) with f = 1 + 4 = 5.
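A quick sketch checking that the run completes normally from the infeasible start (using the same default SLSQP behaviour):

```python
from scipy.optimize import minimize

f = lambda x: x[0]**2 + x[1]**2
cons = ({'type': 'ineq', 'fun': lambda x: x[0] - 1},
        {'type': 'eq', 'fun': lambda x: x[1] + 2})

# Both constraints are violated at [0, 0], but no exception is raised:
# the solver moves the iterates toward the feasible set.
res = minimize(f, [0, 0], constraints=cons)

# Optimum: the smallest feasible x0 is 1; the equality forces x1 = -2,
# so f(1, -2) = 1 + 4 = 5.
print(res.success, round(res.fun, 4))
```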
❓ Visualization
Advanced · 3:00 remaining
Plot of feasible region and minimum point
Which plot correctly shows the feasible region and the minimum point for the problem:
Minimize f(x,y) = (x-2)^2 + (y-3)^2
Subject to constraints: x + y <= 4 and x >= 0, y >= 0?
SciPy
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-1, 5, 400)
y = np.linspace(-1, 5, 400)
X, Y = np.meshgrid(x, y)
Z = (X - 2)**2 + (Y - 3)**2

plt.contour(X, Y, Z, levels=30)
# Shade the triangle bounded by x = 0, y = 0, and x + y = 4
# (the original where=(x <= 4) also shaded the infeasible x < 0 strip)
plt.fill_between(x, 0, 4 - x, where=(x >= 0) & (x <= 4),
                 color='lightblue', alpha=0.5)
plt.xlim(-1, 5)
plt.ylim(-1, 5)
plt.scatter(1.5, 2.5, color='red', label='Minimum')
plt.legend()
plt.show()
Attempts:
2 left
💡 Hint
The feasible region is where x and y are non-negative and their sum is at most 4.
✗ Incorrect
The constraints form a triangle bounded by x = 0, y = 0, and x + y = 4. The unconstrained minimizer (2, 3) lies outside the triangle (2 + 3 = 5 > 4), so the constrained minimum sits on the boundary x + y = 4 at (1.5, 2.5), the orthogonal projection of (2, 3) onto that line.
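The minimum point can be computed directly as an orthogonal projection; a short NumPy sketch:

```python
import numpy as np

# Project the unconstrained minimizer (2, 3) onto the line x + y = 4.
# For a line n.p = c with normal n, the projection of p0 is
# p0 - (n.p0 - c) * n / ||n||^2.
p0 = np.array([2.0, 3.0])
n = np.array([1.0, 1.0])
c = 4.0
proj = p0 - (n @ p0 - c) * n / (n @ n)
print(proj)  # [1.5 2.5]
```

The projected point (1.5, 2.5) has non-negative coordinates, so it is feasible and is the constrained minimum.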
🚀 Application
Expert · 3:00 remaining
Interpreting Lagrange multipliers from constrained optimization
After solving the constrained optimization problem:
Minimize f(x,y) = x^2 + y^2
Subject to x + 2y = 6
The solver returns a Lagrange multiplier of 2.4. What does this multiplier represent?
Attempts:
2 left
💡 Hint
Lagrange multipliers measure sensitivity of the optimal value to constraint changes.
✗ Incorrect
The Lagrange multiplier gives, to first order, how much the minimum value of the objective function would increase if the right-hand side of the constraint (6) increased by one unit.
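The sensitivity interpretation can be checked numerically by differencing the optimal value as the right-hand side varies; a minimal sketch, assuming the default SLSQP solver. For this problem the optimal value is f*(c) = c²/5, so the multiplier at c = 6 is 2c/5 = 2.4:

```python
from scipy.optimize import minimize

def opt_value(c):
    # Minimum of x^2 + y^2 subject to x + 2*y = c
    f = lambda x: x[0]**2 + x[1]**2
    cons = ({'type': 'eq', 'fun': lambda x: x[0] + 2*x[1] - c},)
    return minimize(f, [0, 0], constraints=cons).fun

# Central finite difference of the optimal value at c = 6.
# Analytically f*(c) = c**2 / 5, so d f*/dc = 2*c/5 = 2.4 at c = 6,
# which is exactly the Lagrange multiplier.
h = 1e-4
sensitivity = (opt_value(6 + h) - opt_value(6 - h)) / (2 * h)
print(round(sensitivity, 2))
```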