PyTorch · ~10 mins

ReduceLROnPlateau in PyTorch - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (easy)

Complete the code to create a ReduceLROnPlateau scheduler that monitors validation loss.

PyTorch
scheduler = torch.optim.lr_scheduler.[1](optimizer, mode='min')
A. ReduceLROnPlateau
B. ExponentialLR
C. CosineAnnealingLR
D. StepLR
Common Mistakes
Using StepLR instead of ReduceLROnPlateau
Forgetting to set mode='min' for validation loss
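For context, a minimal runnable sketch of the completed snippet; the model and optimizer here are illustrative stand-ins:

```python
import torch

# Illustrative model and optimizer; any torch.optim optimizer works
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# mode='min' tells the scheduler the monitored metric (validation
# loss) should decrease; the LR is cut when it stops improving.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode='min')
```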
Task 2: Fill in the blank (medium)

Complete the code to call the scheduler step function with the validation loss value.

PyTorch
scheduler.[1](val_loss)
A. adjust
B. update
C. reduce
D. step
Common Mistakes
Calling step() without passing the metric value
Using incorrect method names like 'update' or 'reduce'
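A small sketch of what `step(val_loss)` does; the hyperparameters are chosen only so the reduction is visible immediately:

```python
import torch

optimizer = torch.optim.SGD([torch.zeros(1, requires_grad=True)], lr=0.1)
# patience=0 so a single epoch without improvement triggers a cut
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode='min', factor=0.1, patience=0)

scheduler.step(1.0)  # first metric becomes the best so far
scheduler.step(2.0)  # worse metric -> LR multiplied by factor 0.1
print(optimizer.param_groups[0]['lr'])
```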
Task 3: Fill in the blank (hard)

Fix the error in the scheduler initialization by filling the blank with the correct patience value.

PyTorch
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, patience=[1])
A. 5
B. '5'
C. None
D. -1
Common Mistakes
Passing patience as a string instead of integer
Using negative or None values
Task 4: Fill in the blank (hard)

Fill both blanks to create a scheduler that reduces learning rate by a factor of 0.1 after 3 epochs without improvement.

PyTorch
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, factor=[1], patience=[2])
A. 0.1
B. 3
C. 0.5
D. 5
Common Mistakes
Using factor greater than 1
Setting patience too high or as a float
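With these settings, the scheduler waits `patience` epochs without improvement before multiplying the LR by `factor`. A sketch of the timing; the constant loss values are fabricated purely to stall improvement:

```python
import torch

optimizer = torch.optim.SGD([torch.zeros(1, requires_grad=True)], lr=0.1)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode='min', factor=0.1, patience=3)

scheduler.step(1.0)      # epoch 0: best metric so far
for _ in range(3):       # 3 epochs with no improvement: within patience
    scheduler.step(1.0)
print(optimizer.param_groups[0]['lr'])  # unchanged: 0.1
scheduler.step(1.0)      # 4th bad epoch exceeds patience=3
print(optimizer.param_groups[0]['lr'])  # reduced by factor 0.1
```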
Task 5: Fill in the blank (hard)

Fill all three blanks to create a training loop that updates the scheduler with validation loss and prints the learning rate.

PyTorch
for epoch in range(num_epochs):
    train()
    val_loss = validate()
    scheduler.[1](val_loss)
    lr = optimizer.param_groups[0]['[2]']
    print(f"Epoch {epoch+1}, Learning Rate: {lr:.6f}")

# The scheduler step method is called with the validation loss, and the learning rate key is '[3]'.
A. step
B. lr
C. learning_rate
Common Mistakes
Using incorrect method name instead of 'step'
Accessing learning rate with wrong key like 'learning_rate'
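Putting it together, a self-contained version of the loop with `train()` and `validate()` stubbed out; the stub returning a constant (plateaued) loss is purely illustrative:

```python
import torch

model = torch.nn.Linear(8, 1)  # illustrative model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode='min', factor=0.1, patience=1)

def train():
    pass  # placeholder for one training epoch

def validate():
    return 1.0  # placeholder: a validation loss that has plateaued

num_epochs = 5
for epoch in range(num_epochs):
    train()
    val_loss = validate()
    scheduler.step(val_loss)              # pass the monitored metric
    lr = optimizer.param_groups[0]['lr']  # the key is 'lr'
    print(f"Epoch {epoch+1}, Learning Rate: {lr:.6f}")
```

Because the loss never improves, the LR drops every `patience + 1` epochs after the first.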