Learning rate schedulers adjust the learning rate while training is running. This can speed up convergence and improve final results.
Learning rate schedulers in PyTorch
```python
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)

for epoch in range(num_epochs):
    train(...)  # your training code
    scheduler.step()
```
optimizer is the optimizer you use for training, such as Adam or SGD.
step_size is the number of epochs between learning rate changes.
gamma is the factor the learning rate is multiplied by at each change.
For example, this halves the learning rate every 5 epochs:

```python
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.5)
```
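To see the halving in action, here is a minimal sketch that just steps the scheduler and prints the learning rate each epoch. The single dummy parameter and the initial lr of 0.1 are assumptions for illustration only:

```python
import torch

# Toy setup: one parameter and SGD with an assumed initial lr of 0.1
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([param], lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.5)

for epoch in range(10):
    optimizer.step()   # normally preceded by loss.backward(); a dummy step here
    scheduler.step()
    print(f"Epoch {epoch + 1}: lr = {scheduler.get_last_lr()[0]}")
```

The learning rate stays at 0.1 for the first 5 epochs, drops to 0.05, and halves again after 5 more epochs.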
Other schedulers follow the same pattern. ExponentialLR multiplies the learning rate by gamma every epoch:

```python
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)
```

CosineAnnealingLR decreases the learning rate along a cosine curve over T_max epochs:

```python
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=10)
```

The following code trains a simple linear model on dummy data. The learning rate starts at 0.1 and is divided by 10 every 3 epochs. The print shows the loss and the current learning rate each epoch.
```python
import torch
import torch.nn as nn
import torch.optim as optim

# Simple model
model = nn.Linear(2, 1)

# Optimizer
optimizer = optim.SGD(model.parameters(), lr=0.1)

# Scheduler: multiply LR by 0.1 every 3 epochs
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=3, gamma=0.1)

# Dummy data
inputs = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
targets = torch.tensor([[1.0], [2.0]])
loss_fn = nn.MSELoss()

for epoch in range(6):
    optimizer.zero_grad()
    outputs = model(inputs)
    loss = loss_fn(outputs, targets)
    loss.backward()
    optimizer.step()
    scheduler.step()
    print(f"Epoch {epoch+1}, Loss: {loss.item():.4f}, LR: {scheduler.get_last_lr()[0]:.5f}")
```
Call scheduler.step() after each epoch to update the learning rate.
You can check the current learning rate with scheduler.get_last_lr().
Different schedulers change the learning rate in different ways; choose one that fits your training needs.
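To make the differences concrete, here is a sketch of the closed-form schedules the three schedulers above follow, written in plain Python. The formulas match PyTorch's documented behavior for the basic case (no warm restarts, no chained schedulers); the initial rate lr0 = 0.1 and the default parameters are assumptions for illustration:

```python
import math

lr0 = 0.1  # assumed initial learning rate

def step_lr(epoch, step_size=3, gamma=0.1):
    # StepLR: multiply by gamma once every step_size epochs
    return lr0 * gamma ** (epoch // step_size)

def exponential_lr(epoch, gamma=0.9):
    # ExponentialLR: multiply by gamma every epoch
    return lr0 * gamma ** epoch

def cosine_annealing_lr(epoch, T_max=10, eta_min=0.0):
    # CosineAnnealingLR: follow a half cosine from lr0 down to eta_min
    return eta_min + (lr0 - eta_min) * (1 + math.cos(math.pi * epoch / T_max)) / 2

for epoch in range(11):
    print(f"epoch {epoch:2d}: "
          f"step={step_lr(epoch):.5f}  "
          f"exp={exponential_lr(epoch):.5f}  "
          f"cos={cosine_annealing_lr(epoch):.5f}")
```

StepLR drops in sudden jumps, ExponentialLR decays smoothly, and CosineAnnealingLR starts slow, decays fastest in the middle, and flattens out near the end, which is one reason it is popular for fine-tuning.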
Learning rate schedulers adjust the learning rate during training to help the model learn better.
They are used to reduce the learning rate gradually or at specific times.
PyTorch provides many schedulers like StepLR, ExponentialLR, and CosineAnnealingLR.