Complete the code to create a StepLR scheduler that decreases the learning rate every 5 epochs.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=[1], gamma=0.1)
The step_size parameter sets how many epochs pass between reductions. Here, it should be 5 so the learning rate is reduced every 5 epochs.
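With the blank filled in, a runnable sketch might look like the following (the toy model and the SGD optimizer with lr=0.1 are illustrative assumptions, not part of the exercise):

```python
import torch

model = torch.nn.Linear(4, 1)                              # toy model (assumption)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)    # assumed initial lr

# step_size=5: multiply the learning rate by gamma every 5 epochs.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.1)

for epoch in range(10):
    optimizer.step()    # stands in for a real training epoch
    scheduler.step()    # advance the schedule once per epoch
```

After 10 epochs the schedule has crossed two step boundaries, so the learning rate has been multiplied by gamma twice.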
Complete the code to create a MultiStepLR scheduler that decreases the learning rate at epochs 10 and 20.
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[1], gamma=0.1)
The milestones parameter is a list of epochs where the learning rate will be reduced. Here, it should be [10, 20].
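A minimal sketch of the completed answer, assuming a toy model and an SGD optimizer with lr=0.1 (both illustrative, not given in the exercise):

```python
import torch

model = torch.nn.Linear(4, 1)                              # toy model (assumption)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)    # assumed initial lr

# Reduce the learning rate by 10x at epochs 10 and 20.
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[10, 20], gamma=0.1)

for epoch in range(25):
    optimizer.step()    # placeholder for one training epoch
    scheduler.step()    # advance the schedule once per epoch
```

By epoch 25 both milestones have passed, so the learning rate has been multiplied by gamma twice.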
Fix the error in the code to correctly update the learning rate scheduler after each epoch.
for epoch in range(num_epochs):
    train()
    validate()
    [1]
The correct method to update the learning rate scheduler after each epoch is scheduler.step(), called at the end of the loop body after train() and validate().
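The corrected loop, made self-contained with stand-in train() and validate() functions and an assumed StepLR schedule (all illustrative, not specified by the exercise):

```python
import torch

model = torch.nn.Linear(4, 1)                              # toy model (assumption)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)    # assumed initial lr
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.1)

def train():        # stand-in for the real training epoch
    optimizer.step()

def validate():     # stand-in for the real validation pass
    pass

num_epochs = 5
for epoch in range(num_epochs):
    train()
    validate()
    scheduler.step()    # the fix: advance the scheduler once per epoch
```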
Fill both blanks to create a StepLR scheduler with a step size of 7 and a decay factor of 0.5.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=[1], gamma=[2])
The step_size should be 7 and the gamma (decay factor) should be 0.5 as specified.
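Both blanks filled in, again with an assumed toy model and SGD optimizer at lr=0.1 for illustration:

```python
import torch

model = torch.nn.Linear(4, 1)                              # toy model (assumption)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)    # assumed initial lr

# Halve the learning rate every 7 epochs.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=7, gamma=0.5)

for epoch in range(14):
    optimizer.step()    # placeholder for one training epoch
    scheduler.step()
```

After 14 epochs the learning rate has been halved twice.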
Fill all three blanks to create a MultiStepLR scheduler with milestones at epochs 8 and 16, and a decay factor of 0.2.
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[1], gamma=[2])
for epoch in range(num_epochs):
    train()
    validate()
    [3]
Milestones are [8, 16], gamma is 0.2, and the scheduler is updated each epoch with scheduler.step().
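All three blanks filled in, as a self-contained sketch with stand-in train() and validate() functions and an assumed SGD optimizer at lr=0.1 (illustrative assumptions only):

```python
import torch

model = torch.nn.Linear(4, 1)                              # toy model (assumption)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)    # assumed initial lr
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[8, 16], gamma=0.2)

def train():        # stand-in for the real training epoch
    optimizer.step()

def validate():     # stand-in for the real validation pass
    pass

num_epochs = 20
for epoch in range(num_epochs):
    train()
    validate()
    scheduler.step()    # fills the final blank
```

After 20 epochs both milestones have been crossed, so the learning rate has been scaled by gamma twice.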