Challenge - 5 Problems
Scheduler Mastery Badge
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
intermediate · 2:00 remaining
Output of StepLR scheduler after 5 epochs
Given the following PyTorch learning rate scheduler code, what is the learning rate after 5 epochs?
PyTorch
import torch

optimizer = torch.optim.SGD(
    [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))], lr=0.1
)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=3, gamma=0.1)
for epoch in range(5):
    scheduler.step()
    current_lr = optimizer.param_groups[0]['lr']
    print(current_lr)
Attempts: 2 left
💡 Hint
StepLR reduces the learning rate by gamma every step_size epochs.
✗ Incorrect
StepLR multiplies the learning rate by gamma every step_size epochs. After the 3rd scheduler step, lr = 0.1 * 0.1 = 0.01. The next decay would not occur until epoch 6, so after 5 epochs only one decay has happened and lr remains 0.01.
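The decay above follows StepLR's closed-form schedule, lr = initial_lr * gamma ** (epoch // step_size). A minimal sketch that replicates that formula without PyTorch (the helper name steplr_lr is illustrative, not part of the torch API):

```python
def steplr_lr(initial_lr, gamma, step_size, epoch):
    # StepLR multiplies the lr by gamma once every step_size epochs, so
    # after `epoch` scheduler steps the decay has fired epoch // step_size times.
    return initial_lr * gamma ** (epoch // step_size)

# Reproduce the quiz: lr=0.1, gamma=0.1, step_size=3, after 5 epochs
print(steplr_lr(0.1, 0.1, 3, 5))  # one decay has fired, so ~0.01
```

Plugging in epoch=6 would show the second decay to ~0.001.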
❓ Predict Output
intermediate · 2:00 remaining
Learning rate after 7 epochs with MultiStepLR
What is the learning rate after 7 epochs using this MultiStepLR scheduler?
PyTorch
import torch

optimizer = torch.optim.SGD(
    [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))], lr=0.2
)
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[3, 6], gamma=0.5)
for epoch in range(7):
    scheduler.step()
    current_lr = optimizer.param_groups[0]['lr']
    print(current_lr)
Attempts: 2 left
💡 Hint
MultiStepLR multiplies lr by gamma at each milestone epoch.
✗ Incorrect
At epoch 3, lr = 0.2 * 0.5 = 0.1. At epoch 6, lr = 0.1 * 0.5 = 0.05. No milestone falls between epochs 6 and 7, so after 7 epochs lr remains 0.05.
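Equivalently, MultiStepLR scales the initial lr by gamma raised to the number of milestones already passed. A small sketch of that rule (the helper name multisteplr_lr is illustrative, not part of the torch API):

```python
import bisect

def multisteplr_lr(initial_lr, gamma, milestones, epoch):
    # MultiStepLR applies gamma once at each milestone epoch, so the lr
    # after `epoch` steps is scaled by gamma to the power of the number
    # of milestones that have already been reached.
    decays = bisect.bisect_right(sorted(milestones), epoch)
    return initial_lr * gamma ** decays

# Reproduce the quiz: lr=0.2, gamma=0.5, milestones=[3, 6], after 7 epochs
print(multisteplr_lr(0.2, 0.5, [3, 6], 7))  # both milestones passed, so 0.05
```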
❓ Model Choice
advanced · 2:00 remaining
Choosing scheduler for gradual learning rate decay
You want a learning rate scheduler that reduces the learning rate smoothly every 2 epochs by a factor of 0.9. Which scheduler and parameters should you choose?
Attempts: 2 left
💡 Hint
StepLR reduces learning rate at fixed intervals.
✗ Incorrect
StepLR with step_size=2 and gamma=0.9 multiplies the learning rate by 0.9 every 2 epochs. Because 0.9 is close to 1, each reduction is small, giving the smooth, gradual decay the requirement asks for.
❓ Hyperparameter
advanced · 2:00 remaining
Effect of gamma in MultiStepLR scheduler
If you set gamma=2 in a MultiStepLR scheduler, what happens to the learning rate at each milestone?
Attempts: 2 left
💡 Hint
Gamma multiplies the learning rate at milestones.
✗ Incorrect
Gamma is a multiplier applied to the learning rate at each milestone. With gamma=2, the learning rate doubles at each milestone instead of decaying; any gamma greater than 1 increases the learning rate, which is rarely what you want.
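A quick sketch of that growth, tracing the lr through a few milestones (the starting lr of 0.001 and milestones {3, 6} are assumed values for illustration):

```python
lr = 0.001            # assumed starting lr
milestones = {3, 6}   # assumed milestones for illustration
history = []
for epoch in range(1, 8):
    if epoch in milestones:
        lr *= 2  # gamma=2: the lr doubles at each milestone instead of decaying
    history.append(lr)
print(history)  # the lr climbs from 0.001 to 0.004 across the two milestones
```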
🔧 Debug
expert · 3:00 remaining
Why does StepLR not reduce learning rate as expected?
A user writes this code but notices that the learning rate does not change as expected across epochs:
import torch

optimizer = torch.optim.Adam([torch.nn.Parameter(torch.randn(1, requires_grad=True))], lr=0.05)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=3, gamma=0.1)
for epoch in range(5):
    # training code here
    scheduler.step()
    print(f'Epoch {epoch+1}, lr: {optimizer.param_groups[0]["lr"]}')
What is the likely cause?
Attempts: 2 left
💡 Hint
Order of calling scheduler.step() affects learning rate update timing.
✗ Incorrect
Since PyTorch 1.1, scheduler.step() must be called after optimizer.step() within each epoch. Calling scheduler.step() before the optimizer has stepped advances the schedule one epoch too early (and triggers a UserWarning), so the printed learning rates do not follow the expected decay schedule.
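Under that diagnosis, a corrected loop might look like the sketch below. The dummy loss param.sum() stands in for real training code; with step_size=3 and gamma=0.1, the lr drops from 0.05 to ~0.005 after the third epoch:

```python
import torch

param = torch.nn.Parameter(torch.randn(1))
optimizer = torch.optim.Adam([param], lr=0.05)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=3, gamma=0.1)

lrs = []
for epoch in range(5):
    optimizer.zero_grad()
    param.sum().backward()  # stand-in for computing a real loss
    optimizer.step()        # update the weights first...
    scheduler.step()        # ...then advance the lr schedule
    lrs.append(optimizer.param_groups[0]['lr'])
print(lrs)  # lr decays once, at the step_size=3 boundary
```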