Recall & Review
beginner
What is overfitting in machine learning?
Overfitting happens when a model learns the training data too well, including noise and details that don't apply to new data. This makes the model perform poorly on new, unseen data.
beginner
What does regularization do in a machine learning model?
Regularization adds a penalty to the model's complexity, encouraging it to keep weights smaller or simpler. This helps the model generalize better to new data by avoiding fitting noise.
intermediate
How does L2 regularization (weight decay) work in PyTorch?
L2 regularization adds a penalty proportional to the sum of squared weights to the loss function. In PyTorch, this is typically done by setting the 'weight_decay' parameter in the optimizer, which shrinks weights toward zero at every update step.
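The shrinking effect of weight decay can be sketched in plain Python (no framework needed). This is a minimal illustration of the SGD-with-L2 update rule, w ← w − lr · (grad + λ·w); the function name and values are made up for the example.

```python
def sgd_step_with_weight_decay(w, grad, lr=0.1, weight_decay=0.01):
    """One SGD update with an L2 penalty folded into the gradient."""
    return w - lr * (grad + weight_decay * w)

# Even with a zero gradient, the weight still decays toward zero each step:
w = 1.0
for _ in range(3):
    w = sgd_step_with_weight_decay(w, grad=0.0)
# each step multiplies w by (1 - lr * weight_decay) = 0.999
```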
beginner
Why does regularization reduce overfitting?
Regularization limits how complex the model can get by penalizing large weights. This stops the model from memorizing training data noise and helps it learn patterns that work well on new data.
intermediate
What is the difference between L1 and L2 regularization?
L1 regularization adds the absolute values of weights to the loss, encouraging sparsity (some weights become zero). L2 adds squared weights, encouraging smaller weights but not necessarily zero.
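The sparsity difference comes from the penalty gradients, and a small sketch makes it concrete (pure Python, function names are illustrative). L1's gradient is λ·sign(w), a constant pull toward zero no matter how small the weight; L2's gradient is 2λ·w, a pull that fades as the weight shrinks, so small weights linger instead of reaching exactly zero.

```python
def l1_grad(w, lam=0.1):
    """Gradient of the L1 penalty lam * |w| (subgradient 0 at w == 0)."""
    return lam * (1 if w > 0 else -1 if w < 0 else 0)

def l2_grad(w, lam=0.1):
    """Gradient of the L2 penalty lam * w**2."""
    return 2 * lam * w

# Near zero, L1 still pushes at full strength, L2 barely pushes at all:
small_w = 0.001
# l1_grad(small_w) -> 0.1, l2_grad(small_w) -> 0.0002
```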
What problem does regularization mainly help to solve?
Regularization helps prevent overfitting by controlling model complexity.
In PyTorch, how do you apply L2 regularization?
L2 regularization is applied by setting the weight_decay parameter in the optimizer.
Which regularization method encourages some weights to become exactly zero?
L1 regularization encourages sparsity by pushing some weights to zero.
Why does a model with very large weights tend to overfit?
Very large weights make the model's output highly sensitive to small input changes, letting it fit training noise and overfit.
What is a simple way to explain regularization to a friend?
Regularization keeps the model simple to help it generalize better.
Explain in your own words why regularization helps control overfitting in machine learning models.
Think about how adding a penalty changes the model's learning.
Describe how you would add L2 regularization to a PyTorch model training process.
Focus on the optimizer settings in PyTorch.
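One possible answer can be sketched as a minimal PyTorch training step; the model, data, and hyperparameter values here are placeholders, not from the original text.

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)

# weight_decay is the L2 coefficient: optim.SGD adds
# weight_decay * param to each parameter's gradient before the update.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)

loss_fn = nn.MSELoss()
x = torch.randn(8, 10)  # dummy batch of 8 samples
y = torch.randn(8, 1)

optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()  # weights are updated and also shrunk by the L2 term
```

No change to the loss computation is needed: the penalty is applied inside the optimizer, which is why weight_decay is the usual way to get L2 regularization in PyTorch.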