Recall & Review
beginner
What is the purpose of ReduceLROnPlateau in PyTorch?
ReduceLROnPlateau lowers the learning rate when a model's performance stops improving, helping the model learn better by taking smaller steps.
beginner
Which metric does ReduceLROnPlateau monitor to decide when to reduce the learning rate?
It monitors a chosen metric like validation loss or accuracy to detect when the model stops improving.
intermediate
What does the 'patience' parameter control in ReduceLROnPlateau?
Patience sets how many epochs to wait without improvement before reducing the learning rate.
intermediate
How does the 'factor' parameter affect learning rate reduction in ReduceLROnPlateau?
Factor is the multiplier applied to the current learning rate when a reduction is triggered; for example, factor=0.1 cuts the rate to 10% of its current value.
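To make the interaction between patience and factor concrete, here is a toy re-implementation of the plateau logic (a sketch only, not PyTorch's actual ReduceLROnPlateau class, which adds extras like threshold and cooldown): the learning rate is multiplied by factor only after more than patience consecutive epochs without improvement.

```python
class ToyPlateauScheduler:
    """Toy sketch of ReduceLROnPlateau's patience/factor logic."""

    def __init__(self, lr, factor=0.1, patience=5, mode='min'):
        self.lr = lr
        self.factor = factor
        self.patience = patience
        self.mode = mode
        self.best = None
        self.bad_epochs = 0  # epochs in a row without improvement

    def step(self, metric):
        if self.mode == 'min':
            improved = self.best is None or metric < self.best
        else:
            improved = self.best is None or metric > self.best
        if improved:
            self.best = metric
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
            if self.bad_epochs > self.patience:  # patience exhausted
                self.lr *= self.factor           # apply the factor
                self.bad_epochs = 0
        return self.lr

sched = ToyPlateauScheduler(lr=0.01, factor=0.1, patience=2)
for loss in [1.0, 0.9, 0.9, 0.9, 0.9]:  # loss stops improving after epoch 2
    lr = sched.step(loss)
# lr is now 0.001: three stalled epochs (> patience=2) triggered one 0.1x cut
```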
beginner
Show a simple PyTorch code snippet using ReduceLROnPlateau with validation loss monitoring.
import torch.optim as optim

optimizer = optim.Adam(model.parameters(), lr=0.01)
# mode='min' because lower validation loss is better
scheduler = optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode='min', factor=0.1, patience=5)

# In the training loop, once per epoch after validation:
# val_loss = compute_validation_loss()
# scheduler.step(val_loss)
What does ReduceLROnPlateau do when the monitored metric stops improving?
ReduceLROnPlateau reduces the learning rate to help the model improve when progress stalls.
Which parameter controls how long to wait before reducing the learning rate?
Patience sets the number of epochs to wait without improvement before reducing the learning rate.
If factor=0.5, what happens to the learning rate when ReduceLROnPlateau triggers?
Factor multiplies the learning rate; with factor=0.5 the learning rate is halved.
Which mode should you use to monitor validation loss with ReduceLROnPlateau?
Use mode='min' when monitoring loss because lower loss is better.
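The mode setting only changes the direction of the "did it improve" comparison. A minimal sketch of that check (improved is a hypothetical helper for illustration, not a PyTorch function; the real scheduler also applies a small threshold):

```python
def improved(metric, best, mode='min'):
    # mode='min': lower is better (e.g. validation loss)
    # mode='max': higher is better (e.g. validation accuracy)
    return metric < best if mode == 'min' else metric > best

assert improved(0.4, 0.5, mode='min')      # loss dropped -> improvement
assert not improved(0.6, 0.5, mode='min')  # loss rose -> no improvement
assert improved(0.92, 0.90, mode='max')    # accuracy rose -> improvement
```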
When should you call scheduler.step() in training with ReduceLROnPlateau?
Call scheduler.step(metric) once per epoch, after computing the validation metric, so the scheduler can decide whether to reduce the learning rate.
Explain how ReduceLROnPlateau helps improve model training and what key parameters control its behavior.
Think about when and how the learning rate changes during training.
Describe how you would implement ReduceLROnPlateau in a PyTorch training loop to adjust learning rate based on validation loss.
Focus on the code steps and when to call the scheduler.