Recall & Review
beginner
What is gradient clipping in machine learning?
Gradient clipping is a technique that limits ("clips") the magnitude of gradients during training, preventing excessively large weight updates and the exploding-gradient problem.
beginner
Why do exploding gradients cause problems during training?
Exploding gradients cause very large updates to model weights, which can make the training unstable and cause the model to fail to learn properly.
intermediate
How does PyTorch implement gradient clipping?
PyTorch provides functions like torch.nn.utils.clip_grad_norm_ and torch.nn.utils.clip_grad_value_ to clip gradients by norm or by value before the optimizer updates the model weights.
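As a minimal sketch of how these two utilities are called (the model and loss here are illustrative placeholders, not part of any particular training setup):

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)
loss = model(torch.randn(8, 4)).pow(2).mean()
loss.backward()  # gradients must exist before clipping

# Clip by norm: rescales all gradients together so their combined
# L2 norm does not exceed 1.0; returns the norm measured before clipping.
total_norm = torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)

# Clip by value: clamps each gradient element to [-0.5, 0.5] independently.
torch.nn.utils.clip_grad_value_(model.parameters(), clip_value=0.5)
```

Both functions modify the `.grad` tensors in place (hence the trailing underscore in their names).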
intermediate
What is the difference between clipping gradients by norm and by value?
Clipping by norm scales all gradients so their total length (norm) does not exceed a threshold, while clipping by value limits each individual gradient element to a maximum absolute value.
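The difference is easiest to see on a small example. The numbers below are illustrative, and the clipping math is written out in plain Python rather than using the PyTorch functions:

```python
import math

# Illustrative gradient vector with L2 norm 5.0
grads = [3.0, 4.0]

# Clip by norm (max_norm=1.0): when the total norm exceeds the threshold,
# every element is scaled by the same factor, so the direction is preserved.
total_norm = math.sqrt(sum(g * g for g in grads))
by_norm = [g * (1.0 / total_norm) for g in grads] if total_norm > 1.0 else grads
# approx. [0.6, 0.8] -- same direction, norm 1.0

# Clip by value (clip_value=1.0): each element is clamped independently,
# which can change the gradient's direction.
by_value = [max(-1.0, min(1.0, g)) for g in grads]
# [1.0, 1.0] -- direction has changed
```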
beginner
Show a simple PyTorch code snippet to clip gradients by norm.
After computing gradients with loss.backward(), call torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0) before optimizer.step() to cap the total gradient norm at 1.0.
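Placed in a full training loop, the call sits between the backward pass and the optimizer step. This is a minimal sketch; the model, data, and hyperparameters are illustrative:

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
data = torch.randn(32, 10)
targets = torch.randn(32, 1)

for epoch in range(3):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(data), targets)
    loss.backward()                               # 1. compute gradients
    torch.nn.utils.clip_grad_norm_(               # 2. clip before the update
        model.parameters(), max_norm=1.0)
    optimizer.step()                              # 3. apply the (clipped) update
```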
What problem does gradient clipping mainly solve?
Gradient clipping is used to prevent exploding gradients by limiting their size.
Which PyTorch function clips gradients by their norm?
torch.nn.utils.clip_grad_norm_ clips gradients based on their norm.
When should gradient clipping be applied during training?
Gradients are clipped after computing them with loss.backward() and before updating weights with optimizer.step().
Clipping gradients by value means:
Clipping by value limits each gradient element individually.
What happens if gradients are not clipped and explode?
Exploding gradients cause unstable training and can prevent the model from learning.
Explain in your own words what gradient clipping is and why it is useful.
Think about what happens when gradients get too big during training.
Describe how to apply gradient clipping in a PyTorch training loop.
Remember the order of operations in training.