Recall & Review
beginner
What is a learning rate in machine learning?
The learning rate is a small number that controls how much the model adjusts its weights each time it learns from a batch of data. Think of it as the step size, or speed, of learning.
beginner
What does 'learning rate differential' mean?
Learning rate differential (also called discriminative or differential learning rates) means using different learning rates for different parts of a model. Some parts take bigger learning steps, others smaller ones.
intermediate
Why use different learning rates for different layers in a neural network?
Because some layers need larger updates to learn new patterns, while others need smaller updates so they keep what they already know. Matching the learning rate to each layer helps the model train faster without forgetting.
intermediate
How do you set different learning rates in PyTorch?
You pass the optimizer a list of dictionaries, each with a 'params' key for a group of parameters and an 'lr' key for that group's learning rate. PyTorch then updates each group at its own speed.
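A minimal sketch of this pattern (the model and the learning rate values are illustrative, not prescribed):

```python
import torch
import torch.nn as nn

# A tiny two-layer model; indices 0 and 2 are the Linear layers.
model = nn.Sequential(
    nn.Linear(10, 20),  # "early" layer
    nn.ReLU(),
    nn.Linear(20, 2),   # "late" layer
)

# One dict per parameter group; each group carries its own 'lr'.
optimizer = torch.optim.SGD(
    [
        {"params": model[0].parameters(), "lr": 1e-4},  # slower
        {"params": model[2].parameters(), "lr": 1e-2},  # faster
    ],
    lr=1e-3,  # default for any group that omits 'lr'
)

# The groups are stored on the optimizer and used on every step():
for group in optimizer.param_groups:
    print(group["lr"])
```

Any group that does not set its own 'lr' falls back to the default passed to the optimizer's constructor.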
intermediate
What is a practical example of learning rate differential?
When fine-tuning a pre-trained model, you might use a small learning rate for the pre-trained layers to preserve their knowledge, and a larger learning rate for the new layers so they learn quickly.
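A hedged sketch of that fine-tuning setup. Here 'backbone' stands in for the pre-trained layers and 'head' for a freshly initialized classifier; both names and both learning rates are illustrative assumptions:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Linear(128, 64)  # pretend this is pre-trained
        self.head = nn.Linear(64, 10)       # new, randomly initialized

model = Net()

optimizer = torch.optim.Adam([
    {"params": model.backbone.parameters(), "lr": 1e-5},  # small: preserve knowledge
    {"params": model.head.parameters(), "lr": 1e-3},      # large: learn the new task fast
])
```

A common rule of thumb is one or two orders of magnitude between the pre-trained and new-layer rates, but the right ratio depends on the task.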
What does a higher learning rate do?
A higher learning rate means bigger update steps, which can speed up training but may overshoot good solutions and skip over important details.
In PyTorch, how do you apply different learning rates to different layers?
PyTorch allows setting different learning rates by passing parameter groups with their own 'lr' values to the optimizer.
Why might you want a smaller learning rate for pre-trained layers?
A smaller learning rate helps preserve the useful knowledge already learned in pre-trained layers.
What is a risk of using too large a learning rate?
Too large a learning rate can cause training to overshoot the best solution, oscillate, or even diverge.
Learning rate differential is especially useful in which scenario?
Fine-tuning benefits from different learning rates to adjust new and old layers properly.
Explain what learning rate differential is and why it helps in training neural networks.
Think about how some parts of the model might need to learn slower or faster.
Describe how to implement learning rate differential in PyTorch with code.
Focus on how the optimizer receives different learning rates.
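One possible answer, sketched in code: split parameters into groups by name and hand each group its own learning rate. The name prefixes and rate values below are assumptions tied to this particular toy model:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(8, 8), nn.ReLU(), nn.Linear(8, 2))

# named_parameters() yields names like "0.weight", "2.bias";
# here the "0." prefix marks the first layer, "2." the last.
slow, fast = [], []
for name, param in model.named_parameters():
    (slow if name.startswith("0.") else fast).append(param)

optimizer = torch.optim.SGD(
    [
        {"params": slow, "lr": 1e-4},  # first layer: small steps
        {"params": fast, "lr": 1e-2},  # last layer: large steps
    ],
    momentum=0.9,  # shared options apply to all groups
)
```

Options set on the optimizer itself (like momentum here) apply to every group unless a group overrides them with its own key.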