
CosineAnnealingLR in PyTorch - Practice Problems & Coding Challenges

Challenge - 5 Problems
🧠 Conceptual
intermediate
What does CosineAnnealingLR scheduler do during training?
Imagine you are training a model and using the CosineAnnealingLR scheduler. What best describes how the learning rate changes over time?
A. The learning rate oscillates between a maximum and minimum value following a cosine curve over a set period.
B. The learning rate increases exponentially during training to speed up convergence.
C. The learning rate stays constant until a certain epoch, then drops suddenly to a lower value.
D. The learning rate decreases linearly from the initial value to zero over the total number of epochs.
💡 Hint
Think about how cosine functions behave between 0 and pi.
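The hint can be made concrete with a short sketch (the lr=0.1, T_max=10, eta_min=0.01 settings are illustrative): over one T_max period the rate traces half a cosine wave from the initial learning rate down to eta_min.

```python
import torch
import torch.optim as optim

# Illustrative sketch: record the learning rate over one full T_max cycle.
params = [torch.nn.Parameter(torch.randn(2, 2))]
optimizer = optim.SGD(params, lr=0.1)
scheduler = optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=10, eta_min=0.01)

lrs = []
for epoch in range(11):
    lrs.append(optimizer.param_groups[0]['lr'])  # rate in effect this epoch
    scheduler.step()                             # advance the cosine schedule

print([round(lr, 3) for lr in lrs])  # smooth half-cosine decay from 0.1 to 0.01
```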
Predict Output
intermediate
What is the learning rate after 5 epochs?
Given this PyTorch code snippet using CosineAnnealingLR, what is the learning rate at epoch 5?
PyTorch
import torch
import torch.optim as optim

model_params = [torch.nn.Parameter(torch.randn(2, 2))]
optimizer = optim.SGD(model_params, lr=0.1)
scheduler = optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=10, eta_min=0.01)

lrs = []
for epoch in range(6):
    lrs.append(optimizer.param_groups[0]['lr'])
    scheduler.step()

print(lrs[-1])
A. 0.1
B. 0.01
C. 0.075
D. 0.055
💡 Hint
CosineAnnealingLR formula: lr = eta_min + (initial_lr - eta_min) * (1 + cos(pi * epoch / T_max)) / 2
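Once you have made your guess, the hint's formula can be evaluated directly (a plain-math check using the snippet's values):

```python
import math

# Plug the snippet's values into the closed-form schedule from the hint:
# lr = eta_min + (initial_lr - eta_min) * (1 + cos(pi * epoch / T_max)) / 2
initial_lr, eta_min, T_max = 0.1, 0.01, 10
epoch = 5
lr = eta_min + (initial_lr - eta_min) * (1 + math.cos(math.pi * epoch / T_max)) / 2
print(round(lr, 6))  # cos(pi/2) = 0, so lr = 0.01 + 0.09 * 0.5 = 0.055
```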
Model Choice
advanced
Which scenario benefits most from using CosineAnnealingLR?
You want to train a deep neural network that tends to get stuck in local minima. Which training setup is best suited for CosineAnnealingLR?
A. Training a small linear regression model with a fixed learning rate.
B. Training a deep convolutional network where you want the learning rate to restart periodically to escape local minima.
C. Training a model with very few epochs where learning rate decay is not needed.
D. Training a model with a learning rate that should increase steadily during training.
💡 Hint
CosineAnnealingLR can be combined with restarts to help escape local minima.
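A quick sketch of the restart behaviour the hint refers to, using PyTorch's CosineAnnealingWarmRestarts (the T_0=5 cycle length and other values here are illustrative):

```python
import torch
import torch.optim as optim

# CosineAnnealingWarmRestarts resets the rate to its maximum every T_0 epochs:
# that periodic jump back up is the "kick" that can help escape local minima.
params = [torch.nn.Parameter(torch.randn(2, 2))]
optimizer = optim.SGD(params, lr=0.1)
scheduler = optim.lr_scheduler.CosineAnnealingWarmRestarts(optimizer, T_0=5, eta_min=0.01)

lrs = []
for epoch in range(10):
    lrs.append(optimizer.param_groups[0]['lr'])
    scheduler.step()

# The rate decays within each 5-epoch cycle, then restarts at 0.1.
print([round(lr, 3) for lr in lrs])
```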
Hyperparameter
advanced
What effect does increasing T_max have in CosineAnnealingLR?
In the CosineAnnealingLR scheduler, what happens if you increase the T_max parameter while keeping other settings constant?
A. The learning rate decreases faster and reaches eta_min sooner.
B. The learning rate oscillates more frequently between max and min values.
C. The learning rate decreases more slowly and takes longer to reach eta_min.
D. The learning rate stays constant for longer before decreasing.
💡 Hint
T_max controls the period of the cosine cycle.
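The hint can be checked with the closed-form schedule (a plain-math sketch; `cosine_lr` is a hypothetical helper written here for illustration, not a PyTorch function):

```python
import math

# Closed-form cosine annealing. With a larger T_max the cycle stretches,
# so the same epoch sits earlier in the decay and the rate is still higher.
def cosine_lr(epoch, T_max, base_lr=0.1, eta_min=0.01):
    return eta_min + (base_lr - eta_min) * (1 + math.cos(math.pi * epoch / T_max)) / 2

print(round(cosine_lr(5, T_max=10), 4))  # 0.055 -- halfway through the cycle
print(round(cosine_lr(5, T_max=20), 4))  # 0.0868 -- only a quarter of the way in
```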
🔧 Debug
expert
Why does the learning rate not change as expected?
You wrote this code to use CosineAnnealingLR but the learning rate stays constant at 0.1 for all epochs. What is the most likely cause?
PyTorch
import torch
import torch.optim as optim

model_params = [torch.nn.Parameter(torch.randn(2, 2))]
optimizer = optim.SGD(model_params, lr=0.1)
scheduler = optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=10, eta_min=0.01)

for epoch in range(5):
    # training code here
    print(f"Epoch {epoch} lr: {optimizer.param_groups[0]['lr']}")
A. The scheduler.step() function was not called inside the training loop.
B. The model parameters are not passed correctly to the optimizer.
C. The eta_min parameter is set too low to affect the learning rate.
D. The optimizer learning rate was set too high initially.
💡 Hint
Schedulers need to be updated each epoch to change the learning rate.
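For reference, a corrected sketch of the loop (same setup, with a scheduler update restored at the end of each epoch):

```python
import torch
import torch.optim as optim

params = [torch.nn.Parameter(torch.randn(2, 2))]
optimizer = optim.SGD(params, lr=0.1)
scheduler = optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=10, eta_min=0.01)

lrs = []
for epoch in range(5):
    # training code here
    lrs.append(optimizer.param_groups[0]['lr'])
    scheduler.step()  # advance the schedule once per epoch

# With the call in place, the rate decreases instead of staying at 0.1.
print([round(lr, 4) for lr in lrs])
```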