PyTorch · ~10 mins

Learning rate schedulers in PyTorch - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
1. Fill in the blank (easy)

Complete the code to create a learning rate scheduler that decreases the learning rate by a factor of 0.1 every 10 epochs.

PyTorch
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=[1], gamma=0.1)
A. 1
B. 10
C. 20
D. 5
Common Mistakes
Choosing a step_size too small or too large for the training schedule.
Confusing the gamma parameter with step_size.
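For reference, the decay StepLR applies can be sketched in plain Python, assuming the standard multiplicative formula lr · gamma^(epoch // step_size). The helper name `step_lr` is illustrative, not a PyTorch API:

```python
def step_lr(initial_lr, epoch, step_size=10, gamma=0.1):
    # StepLR multiplies the LR by gamma once every step_size epochs,
    # so after `epoch` epochs it has decayed (epoch // step_size) times.
    return initial_lr * gamma ** (epoch // step_size)

# With step_size=10 and gamma=0.1, the LR drops tenfold at epochs 10, 20, ...
lr_at_9 = step_lr(0.1, 9)    # still the initial 0.1
lr_at_10 = step_lr(0.1, 10)  # first drop: about 0.01
```

This makes the distinct roles visible: `step_size` controls *when* the drop happens, `gamma` controls *how much* the LR shrinks each time.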
2. Fill in the blank (medium)

Complete the code to initialize a cosine annealing learning rate scheduler with 50 epochs.

PyTorch
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=[1])
A. 50
B. 25
C. 100
D. 10
Common Mistakes
Setting T_max too low causing multiple cycles.
Confusing T_max with learning rate value.
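The cosine schedule itself is easy to verify by hand. Below is a pure-Python sketch of the annealing formula, η_min + ½(η_max − η_min)(1 + cos(πt/T_max)); the function name `cosine_lr` is illustrative:

```python
import math

def cosine_lr(eta_max, t, T_max=50, eta_min=0.0):
    # Cosine annealing: starts at eta_max (t = 0) and glides down to
    # eta_min at t = T_max along one half of a cosine wave.
    return eta_min + 0.5 * (eta_max - eta_min) * (1 + math.cos(math.pi * t / T_max))

start = cosine_lr(0.1, 0)    # eta_max at the start
middle = cosine_lr(0.1, 25)  # halfway point of the decay
end = cosine_lr(0.1, 50)     # eta_min at t = T_max
```

Setting T_max to the total number of training epochs gives exactly one smooth decay from the initial LR to η_min, which is why T_max=50 matches a 50-epoch run.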
3. Fill in the blank (hard)

Fix the error in the code to correctly update the learning rate scheduler after each epoch.

PyTorch
for epoch in range(num_epochs):
    train()
    validate()
    [1]
A. optimizer.step()
B. scheduler.update()
C. scheduler.step()
D. scheduler.reset()
Common Mistakes
Calling optimizer.step() instead of scheduler.step().
Using non-existent methods like update() or reset().
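To see why the call placement matters, here is a toy stand-in whose step() mirrors StepLR's once-per-epoch update. `ToyStepScheduler` is a hypothetical class for illustration, not a PyTorch API:

```python
class ToyStepScheduler:
    """Hypothetical stand-in mimicking StepLR: one step() call per epoch."""
    def __init__(self, lr, step_size=2, gamma=0.5):
        self.lr, self.step_size, self.gamma = lr, step_size, gamma
        self.epoch = 0

    def step(self):
        # Advance the epoch counter; decay the LR every step_size epochs.
        self.epoch += 1
        if self.epoch % self.step_size == 0:
            self.lr *= self.gamma

scheduler = ToyStepScheduler(lr=0.1)
for epoch in range(4):
    # train(); validate()  # placeholders for the real per-epoch work
    scheduler.step()       # advance the schedule once, at the end of the epoch
```

The key point survives the simplification: `optimizer.step()` updates model weights per batch, while `scheduler.step()` advances the LR schedule per epoch; calling the wrong one (or a method like `update()` that does not exist) either freezes or breaks the schedule.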
4. Fill in the blank (hard)

Fill both blanks to create a learning rate scheduler that reduces the learning rate by 10% every 5 epochs.

PyTorch
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=[1], gamma=[2])
A. 5
B. 0.1
C. 0.9
D. 10
Common Mistakes
Using gamma=0.1 which reduces learning rate by 90%, not 10%.
Setting step_size incorrectly.
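The distinction the hint warns about is worth checking numerically: gamma is the factor the LR is *multiplied* by, so gamma=0.9 keeps 90% of the LR (a 10% reduction) while gamma=0.1 keeps only 10% (a 90% reduction). A quick sketch, with an illustrative helper name:

```python
def step_lr(initial_lr, epoch, step_size, gamma):
    # LR after `epoch` epochs under StepLR-style multiplicative decay.
    return initial_lr * gamma ** (epoch // step_size)

# After the first drop at epoch 5, starting from lr=0.1:
ten_percent_cut = step_lr(0.1, 5, step_size=5, gamma=0.9)     # about 0.09
ninety_percent_cut = step_lr(0.1, 5, step_size=5, gamma=0.1)  # about 0.01
```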
5. Fill in the blank (hard)

Fill all three blanks to create a dictionary comprehension that maps each epoch to its learning rate from the lrs list (the scheduler's history), keeping only epochs where the learning rate is greater than 0.001 and the epoch index is less than 10.

PyTorch
lr_history = {epoch: lr for epoch, lr in enumerate(lrs) if lr [1] [2] and epoch [3] 10}
A. >
B. 0.001
C. <
Common Mistakes
Using wrong comparison operators causing empty or incorrect dictionaries.
Confusing epoch and learning rate conditions.
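With a concrete LR history, the completed comprehension is easy to check by hand. The lrs values below are made up for illustration:

```python
# Hypothetical scheduler history: one LR value per epoch.
lrs = [0.1, 0.05, 0.01, 0.005, 0.001, 0.0005]

# Keep only epochs whose LR is above 0.001, restricted to the first 10 epochs.
lr_history = {epoch: lr for epoch, lr in enumerate(lrs) if lr > 0.001 and epoch < 10}
```

Note that 0.001 itself is excluded: the condition is strictly greater than, so epoch 4 (lr=0.001) does not appear, and flipping either operator would keep the wrong side of each threshold.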