ML Python · ~20 mins

Gradient descent optimization in ML Python - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️ Gradient Descent Master - get all five challenges correct to earn this badge. Test your skills under time pressure!
🧠 Conceptual · intermediate
What does the learning rate control in gradient descent?

Imagine you are walking down a hill to reach the lowest point. In gradient descent, what does the learning rate control?

A. The size of each step you take down the hill
B. The speed at which the hill moves
C. The total distance from the hill to the valley
D. The direction you walk towards the hilltop
💡 Hint

Think about how big or small your steps are when walking down.
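The step-size idea behind this hint can be sketched in a few lines of Python (the `gd_step` helper and its values are illustrative, not part of the challenge):

```python
def gd_step(w, gradient, learning_rate):
    # One gradient descent update: move opposite the gradient,
    # with the learning rate setting how far the step goes.
    return w - learning_rate * gradient

# Same position and gradient, different learning rates:
# only the size of the step changes, not its direction.
print(gd_step(2.0, 3.0, 0.01))  # small learning rate -> small step
print(gd_step(2.0, 3.0, 0.5))   # large learning rate -> big step
```

With the gradient fixed, scaling the learning rate scales how far each update moves, which is exactly the "step size" in the hill analogy.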

Predict Output · intermediate
Output of gradient descent update step

What is the value of w after one gradient descent update?

ML Python
w = 2.0
learning_rate = 0.1
gradient = 3.0
w = w - learning_rate * gradient
print(round(w, 2))
A. 1.3
B. 1.7
C. 2.3
D. 2.0
💡 Hint

Use the formula: new_w = old_w - learning_rate * gradient
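To check your arithmetic on problems like this, the hint's formula can be wrapped in a small helper (the values below are made up for illustration, not the quiz's):

```python
def update(old_w, learning_rate, gradient):
    # The update rule from the hint: new_w = old_w - learning_rate * gradient
    return old_w - learning_rate * gradient

# Example with arbitrary values (not the ones in the problem):
print(round(update(5.0, 0.2, 4.0), 2))  # 5.0 - 0.2 * 4.0 -> 4.2
```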

Hyperparameter · advanced
Choosing the right learning rate for convergence

Which learning rate is most likely to cause the gradient descent to converge smoothly to the minimum?

A. 10.0
B. 1.0
C. 0.0001
D. 0.1
💡 Hint

Too small means slow progress, too large means overshooting.
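One way to build intuition for this hint is to run gradient descent on the simple loss f(w) = w², whose gradient is 2w, and compare rates; this sketch is illustrative, not part of the challenge:

```python
def final_w(learning_rate, steps=50, w=1.0):
    # Minimize f(w) = w^2 (gradient 2w), starting from w = 1.0.
    for _ in range(steps):
        w -= learning_rate * 2 * w
    return w

print(final_w(0.0001))  # tiny rate: after 50 steps w has barely moved
print(final_w(0.1))     # moderate rate: w shrinks close to the minimum at 0
print(final_w(1.0))     # too-large rate: w flips sign every step and never settles
```

On this toy loss, each update multiplies w by (1 - 2·learning_rate), which makes the slow-progress and overshooting regimes easy to see.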

Metrics · advanced
Effect of learning rate on training loss curve

Which training loss curve shape best represents a too-large learning rate during gradient descent?

A. Loss oscillates up and down without settling
B. Loss steadily decreases and flattens near zero
C. Loss increases steadily over time
D. Loss remains constant throughout training
💡 Hint

Think about what happens if steps are too big and jump around the minimum.
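The hint can be made concrete by recording the loss f(w) = w² after each update (an illustrative sketch, not the challenge's own code):

```python
def loss_curve(learning_rate, steps=5, w=1.0):
    # Record the loss f(w) = w^2 after each gradient descent update.
    curve = []
    for _ in range(steps):
        w -= learning_rate * 2 * w  # gradient of w^2 is 2w
        curve.append(w * w)
    return curve

print(loss_curve(0.1))  # loss shrinks step by step
print(loss_curve(1.2))  # each step overshoots past the minimum; loss fails to settle
```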

🔧 Debug · expert
Identify the error in this gradient descent update code

What error will this code produce when run?

ML Python
w = 1.0
learning_rate = 0.05
gradient = None
w = w - learning_rate * gradient
print(w)
A. SyntaxError: invalid syntax
B. NameError: name 'gradient' is not defined
C. TypeError: unsupported operand type(s) for *: 'float' and 'NoneType'
D. No error; prints 1.0
💡 Hint

Check the type of gradient before multiplication.
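To act on this hint yourself, you can wrap the risky line in a try/except and inspect which exception actually fires (a debugging sketch, not part of the challenge):

```python
w = 1.0
learning_rate = 0.05
gradient = None  # e.g., a gradient that was never actually computed

try:
    w = w - learning_rate * gradient
except TypeError as exc:
    # 'gradient' is defined (so no NameError) and the syntax is valid;
    # multiplying a float by None is what fails at runtime.
    print(type(exc).__name__, exc)
```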