PyTorch · ~10 mins

CosineAnnealingLR in PyTorch - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (easy)

Complete the code to create a CosineAnnealingLR scheduler with 10 epochs.

PyTorch
scheduler = torch.optim.lr_scheduler.[1](optimizer, T_max=10)
A. CosineAnnealingLR
B. ReduceLROnPlateau
C. ExponentialLR
D. StepLR
Common Mistakes
Choosing StepLR or ExponentialLR instead of CosineAnnealingLR.
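The completed line can be checked in a minimal sketch. The tiny `Linear` model, the SGD optimizer, and the base learning rate of 0.1 are assumptions for illustration; the answer being exercised is `CosineAnnealingLR` with `T_max=10`:

```python
import torch

# Throwaway model and optimizer (assumptions) so the scheduler has
# parameter groups to manage.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# The filled-in answer: cosine annealing over T_max=10 epochs.
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=10)

# Before any step, the schedule reports the optimizer's base lr.
initial_lr = scheduler.get_last_lr()[0]
```

Unlike `StepLR` or `ExponentialLR`, which decay in discrete jumps or geometrically, `CosineAnnealingLR` follows a smooth half-cosine from the base lr down to `eta_min` over `T_max` epochs.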
Task 2: Fill in the blank (medium)

Complete the code to update the learning rate scheduler after each epoch.

PyTorch
for epoch in range(20):
    train()
    validate()
    [1]
A. optimizer.update()
B. scheduler.update()
C. optimizer.step()
D. scheduler.step()
Common Mistakes
Calling optimizer.step() instead of scheduler.step().
Using a non-existent scheduler.update() method.
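A runnable version of the loop above, with stub `train()`/`validate()` functions (the model, optimizer, and base lr are assumptions for illustration). The point being tested: `optimizer.step()` updates weights per batch, while `scheduler.step()` advances the lr schedule once per epoch:

```python
import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=20)

def train():
    # Minimal stand-in for a real training step (assumption).
    optimizer.zero_grad()
    loss = model(torch.randn(8, 4)).sum()
    loss.backward()
    optimizer.step()  # updates weights; does NOT advance the lr schedule

def validate():
    pass  # placeholder

for epoch in range(20):
    train()
    validate()
    scheduler.step()  # the filled-in answer: advances the cosine schedule

final_lr = optimizer.param_groups[0]['lr']
```

After 20 scheduler steps with `T_max=20`, the learning rate has annealed essentially to `eta_min` (0 by default). There is no `scheduler.update()` method in PyTorch.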
Task 3: Fill in the blank (hard)

Fix the error in the scheduler initialization to correctly set the minimum learning rate to 0.001.

PyTorch
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=50, [1]=0.001)
A. eta_min
B. min_lr
C. min_learning_rate
D. lr_min
Common Mistakes
Using min_lr or min_learning_rate which are invalid parameters.
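The fix can be verified by running the schedule to `T_max` and checking that the lr bottoms out at the floor. The model, optimizer, and base lr of 0.1 below are assumptions; the parameter under test is `eta_min`:

```python
import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# The corrected call: eta_min (not min_lr) sets the lr floor to 0.001.
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(
    optimizer, T_max=50, eta_min=0.001
)

for _ in range(50):
    optimizer.step()
    scheduler.step()

# At epoch T_max the cosine reaches its trough: lr == eta_min.
lr_at_t_max = optimizer.param_groups[0]['lr']
```

Passing `min_lr=0.001` instead would raise a `TypeError`, since `CosineAnnealingLR` has no such keyword argument.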
Task 4: Fill in the blank (hard)

Fill both blanks to create a scheduler that anneals over a 30-epoch period and sets the minimum learning rate to 0.

PyTorch
scheduler = torch.optim.lr_scheduler.[1](optimizer, T_max=[2], eta_min=0)
A. CosineAnnealingLR
B. StepLR
C. 30
D. 50
Common Mistakes
Using StepLR instead of CosineAnnealingLR.
Setting T_max to 50 instead of 30.
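A sketch of the completed answer (model, optimizer, and base lr 0.1 are assumptions). Note that with `CosineAnnealingLR` the lr follows the cosine curve back up after `T_max`, so the full cycle has period `2 * T_max`; for hard restarts that jump straight back to the base lr, PyTorch provides `torch.optim.lr_scheduler.CosineAnnealingWarmRestarts(optimizer, T_0=30)` instead:

```python
import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# The filled-in answer: CosineAnnealingLR with T_max=30 and eta_min=0.
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(
    optimizer, T_max=30, eta_min=0
)

lrs = []
for _ in range(60):
    optimizer.step()
    scheduler.step()
    lrs.append(optimizer.param_groups[0]['lr'])
```

The trace shows the lr reaching 0 at epoch 30 and climbing back toward the base lr by epoch 60, one full cosine cycle.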
Task 5: Fill in the blank (hard)

Fill all three blanks to print the learning rate at each epoch during training with CosineAnnealingLR.

PyTorch
for epoch in range(40):
    train()
    validate()
    scheduler.[1]()
    lr = scheduler.optimizer.param_groups[[2]]['[3]']
    print(f"Epoch {epoch+1}: lr = {lr}")
A. step
B. 0
C. lr
D. zero
Common Mistakes
Using 'zero' instead of 'lr' as the key.
Accessing param_groups[1] which may not exist.
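A runnable version of the completed loop, with stub `train()`/`validate()` functions and an assumed base lr of 0.1. The blanks resolve to `step`, `0`, and `'lr'`: the first (and here only) parameter group is indexed at 0, and its current learning rate lives under the `'lr'` key:

```python
import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=40)

def train():
    optimizer.step()  # stand-in for a real training step (assumption)

def validate():
    pass  # placeholder

history = []
for epoch in range(40):
    train()
    validate()
    scheduler.step()  # blank [1]: advance the schedule
    # blank [2] is 0 (first param group), blank [3] is the 'lr' key.
    lr = scheduler.optimizer.param_groups[0]['lr']
    history.append(lr)
    print(f"Epoch {epoch+1}: lr = {lr}")
```

`scheduler.get_last_lr()[0]` is an equivalent, slightly cleaner way to read the current rate without reaching into `param_groups`.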