
StepLR and MultiStepLR in PyTorch - Practice Problems & Coding Challenges

Challenge - 5 Problems
Predict Output (intermediate)
Output of StepLR scheduler after 5 epochs
Given the following PyTorch learning rate scheduler code, what is the learning rate after 5 epochs?
PyTorch
import torch
optimizer = torch.optim.SGD([torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))], lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=3, gamma=0.1)
for epoch in range(5):
    scheduler.step()
current_lr = optimizer.param_groups[0]['lr']
print(current_lr)
A. 0.01
B. 0.1
C. 0.001
D. 0.0001
💡 Hint
StepLR reduces the learning rate by gamma every step_size epochs.
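As a quick sketch of that rule (plain Python, with illustrative numbers deliberately different from the problem above), StepLR's schedule can be written as lr = base_lr * gamma ** (epoch // step_size):

```python
# Plain-Python sketch of StepLR's decay rule (illustrative numbers,
# not the problem's): the lr drops by a factor of gamma once per
# completed step_size interval.
base_lr, step_size, gamma = 1.0, 2, 0.5

def steplr(epoch):
    # epoch // step_size counts completed intervals -> the decay power.
    return base_lr * gamma ** (epoch // step_size)

print([steplr(e) for e in range(6)])  # [1.0, 1.0, 0.5, 0.5, 0.25, 0.25]
```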
Predict Output (intermediate)
Learning rate after 7 epochs with MultiStepLR
What is the learning rate after 7 epochs using this MultiStepLR scheduler?
PyTorch
import torch
optimizer = torch.optim.SGD([torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))], lr=0.2)
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[3, 6], gamma=0.5)
for epoch in range(7):
    scheduler.step()
current_lr = optimizer.param_groups[0]['lr']
print(current_lr)
A. 0.2
B. 0.05
C. 0.1
D. 0.025
💡 Hint
MultiStepLR multiplies lr by gamma at each milestone epoch.
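The same rule can be sketched in plain Python (illustrative numbers, not the problem's): the lr equals the base lr times gamma raised to the count of milestones already reached, which bisect.bisect_right computes directly:

```python
import bisect

# Plain-Python sketch of MultiStepLR's rule (illustrative numbers):
# lr = base_lr * gamma ** (number of milestones reached by `epoch`).
base_lr, milestones, gamma = 1.0, [2, 5], 0.5

def multisteplr(epoch):
    # bisect_right counts how many milestones are <= epoch.
    return base_lr * gamma ** bisect.bisect_right(milestones, epoch)

print([multisteplr(e) for e in range(7)])
# [1.0, 1.0, 0.5, 0.5, 0.5, 0.25, 0.25]
```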
Model Choice (advanced)
Choosing scheduler for gradual learning rate decay
You want a learning rate scheduler that reduces the learning rate smoothly every 2 epochs by a factor of 0.9. Which scheduler and parameters should you choose?
A. StepLR with step_size=2 and gamma=0.9
B. MultiStepLR with milestones=[2,4,6] and gamma=0.9
C. StepLR with step_size=1 and gamma=0.9
D. MultiStepLR with milestones=[1,3,5] and gamma=0.9
💡 Hint
StepLR reduces learning rate at fixed intervals.
Hyperparameter (advanced)
Effect of gamma in MultiStepLR scheduler
If you set gamma=2 in a MultiStepLR scheduler, what happens to the learning rate at each milestone?
A. The learning rate becomes zero
B. The learning rate halves at each milestone
C. The learning rate doubles at each milestone
D. The learning rate stays the same
💡 Hint
Gamma multiplies the learning rate at milestones.
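Because the milestone update is always lr = lr * gamma, the sign of the effect depends only on whether gamma is below or above 1 (a quick sketch with hypothetical numbers):

```python
# Hypothetical numbers: at each milestone the scheduler applies
# lr <- lr * gamma, so gamma < 1 shrinks the lr and gamma > 1 grows it.
lr = 0.01
print(lr * 0.5)  # 0.005 -> a decaying schedule
print(lr * 2.0)  # 0.02  -> an increasing schedule
```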
🔧 Debug (expert)
Why does StepLR not reduce learning rate as expected?
A user writes this code but notices the learning rate does not change after epochs:
PyTorch
import torch
optimizer = torch.optim.Adam([torch.nn.Parameter(torch.randn(1, requires_grad=True))], lr=0.05)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=3, gamma=0.1)
for epoch in range(5):
    # training code here
    scheduler.step()
    print(f'Epoch {epoch+1}, lr: {optimizer.param_groups[0]["lr"]}')
What is the likely cause?
A. StepLR requires step_size to be 1 to work properly
B. The initial learning rate is too low to observe changes
C. scheduler.step() should be called before optimizer.step() in each epoch
D. scheduler.step() should be called after optimizer.step() in each epoch
💡 Hint
Order of calling scheduler.step() affects learning rate update timing.
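A toy sketch of why the order matters (plain Python, illustrative numbers, no PyTorch): whichever of the two step() calls runs first determines which lr the epoch's weight update actually consumes.

```python
# Toy model of one training loop: the optimizer's weight update uses
# whatever lr is current at that moment, so stepping the scheduler
# before vs. after the optimizer shifts the whole schedule by one epoch.
def run(order):
    lr, gamma = 1.0, 0.5          # step_size=1 for simplicity
    used = []                      # lr applied at each epoch's update
    for epoch in range(3):
        if order == "scheduler_first":
            lr *= gamma            # scheduler.step() before optimizer.step()
        used.append(lr)            # optimizer.step() consumes the current lr
        if order == "optimizer_first":
            lr *= gamma            # scheduler.step() after optimizer.step()
    return used

print(run("scheduler_first"))  # [0.5, 0.25, 0.125] -- base lr never used
print(run("optimizer_first"))  # [1.0, 0.5, 0.25]
```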