Complete the code to set the learning rate for the optimizer.
optimizer = torch.optim.SGD(model.parameters(), lr=[1])
The learning rate is a number that controls how large a step the optimizer takes at each update. Here, 0.01 is a common starting value.
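To make the effect of the learning rate concrete, here is a hedged one-parameter sketch of gradient descent done by hand. The function, starting point, and values are invented for illustration and are not part of the exercise:

```python
# Minimize f(w) = (w - 3)**2 by hand; its gradient is 2 * (w - 3).
def descend(w, lr, steps):
    for _ in range(steps):
        w -= lr * 2 * (w - 3)  # one SGD-style update: w <- w - lr * grad
    return w

# With the same number of steps, a larger lr moves closer to the
# minimum (w = 3) than a smaller one on this smooth toy problem.
close = descend(0.0, lr=0.1, steps=50)
closer_start = descend(0.0, lr=0.01, steps=50)
```

On real, noisy losses the trade-off is harsher: too large a learning rate can overshoot or diverge, which is why schedules that shrink it over time (next exercise) are common.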
Complete the code to apply a learning rate scheduler that reduces the learning rate every 10 epochs.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=[1], gamma=0.1)
The step_size parameter controls how often the learning rate is reduced (here, every 10 epochs), and gamma=0.1 is the factor the learning rate is multiplied by at each reduction.
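To see what StepLR computes without running a training loop, the decayed rate at any epoch can be written directly. This is a minimal pure-Python sketch of the step-decay formula, not PyTorch's actual implementation:

```python
def step_lr(base_lr, epoch, step_size=10, gamma=0.1):
    """Learning rate after `epoch` completed epochs under step decay."""
    return base_lr * gamma ** (epoch // step_size)

# With base_lr=0.01, step_size=10, gamma=0.1:
# epochs 0-9 use 0.01, epochs 10-19 use roughly 0.001, and so on.
```

Each block of step_size epochs shares one rate, and crossing a block boundary multiplies it by gamma once.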
Fix the error in the training loop to update the learning rate scheduler correctly.
for epoch in range(num_epochs):
    train(model, optimizer)
    loss = validate(model)
    [1]  # update learning rate scheduler
The scheduler.step() call updates the learning rate according to the schedule; it should be called once per epoch, after the optimizer's updates for that epoch.
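As a self-contained illustration of where scheduler.step() fits in the loop, here is a toy stand-in for the scheduler. ToyStepLR only mimics StepLR's step()/get_last_lr() bookkeeping, and the commented-out train/validate calls are placeholders, not real functions:

```python
class ToyStepLR:
    """Minimal stand-in mimicking StepLR's step() / get_last_lr()."""
    def __init__(self, base_lr, step_size=10, gamma=0.1):
        self.base_lr, self.step_size, self.gamma = base_lr, step_size, gamma
        self.epoch = 0

    def step(self):
        self.epoch += 1  # called once per epoch, after the optimizer update

    def get_last_lr(self):
        return [self.base_lr * self.gamma ** (self.epoch // self.step_size)]

scheduler = ToyStepLR(base_lr=0.01)
history = []
for epoch in range(20):
    # train(model, optimizer); loss = validate(model)  # placeholders
    scheduler.step()  # the blank: update the learning rate scheduler
    history.append(scheduler.get_last_lr()[0])
```

After 10 calls to step(), the recorded rate drops from 0.01 to 0.001, matching the schedule from the previous exercise.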
Fill both blanks to create a dictionary comprehension that maps epoch numbers to learning rates using scheduler.get_last_lr().
lr_dict = {epoch: [1] for epoch in range(num_epochs) if epoch [2] 5}
We get the current learning rate from scheduler.get_last_lr()[0]. The condition filters for epochs greater than 5.
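A runnable sketch of the filled-in comprehension, using the step-decay formula in place of a live scheduler (the lr_at helper is an invented stand-in for scheduler.get_last_lr()[0], not part of the exercise):

```python
num_epochs = 15

def lr_at(epoch, base_lr=0.01, step_size=10, gamma=0.1):
    """Stand-in for scheduler.get_last_lr()[0] at a given epoch."""
    return base_lr * gamma ** (epoch // step_size)

# blank [1] -> the current learning rate, blank [2] -> the operator >
lr_dict = {epoch: lr_at(epoch) for epoch in range(num_epochs) if epoch > 5}
```

Only epochs 6 through num_epochs - 1 end up as keys, each mapped to the rate in effect at that epoch.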
Fill all three blanks to create a training loop that updates the optimizer, scheduler, and prints the learning rate.
for epoch in range(num_epochs):
    optimizer.zero_grad()
    output = model(data)
    loss = criterion(output, target)
    loss.backward()
    optimizer.[1]()
    scheduler.[2]()
    print(f"Epoch {epoch}: lr = {scheduler.get_last_lr()[[3]]}")
optimizer.step() updates the model weights, scheduler.step() updates the learning rate, and index 0 selects the current learning rate from the list returned by get_last_lr().
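Putting the pieces together without PyTorch, here is a sketch of the whole loop on a one-parameter problem. The gradient line plays the role of loss.backward(), the update line plays optimizer.step(), and the inline decay plays scheduler.step(); all numbers are invented for illustration:

```python
# Minimize f(w) = (w - 3)**2, decaying the lr every 10 epochs (gamma = 0.1).
w, base_lr, num_epochs = 0.0, 0.1, 30
for epoch in range(num_epochs):
    grad = 2 * (w - 3)                    # loss.backward() analogue
    lr = base_lr * 0.1 ** (epoch // 10)   # scheduler-managed rate for this epoch
    w -= lr * grad                        # optimizer.step() analogue
    # print(f"Epoch {epoch}: lr = {lr}")  # matches the exercise's print
```

The early large-lr epochs do most of the work, while the later decayed epochs make only fine adjustments, which is the intended behavior of step decay.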