Challenge - 5 Problems
Label Smoothing Mastery
Get all challenges correct to earn this badge!
Test your skills under time pressure!
🧠 Conceptual
Intermediate · 1:30 remaining
What is the main purpose of label smoothing in classification?
Label smoothing is a technique used when training classification models. What is its main purpose?
Attempts: 2 left
💡 Hint
Think about how label smoothing changes the target labels.
✗ Incorrect
Label smoothing replaces hard 0 and 1 labels with softer values to prevent the model from becoming too confident, which helps generalization.
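The idea can be sketched in a few lines of plain Python (a minimal illustration, not the quiz's own code; in PyTorch the equivalent behavior is available via the `label_smoothing` argument of `nn.CrossEntropyLoss`):

```python
def smooth(one_hot, smoothing=0.1):
    # Soften hard 0/1 targets: the true class keeps most of the mass,
    # while smoothing/K probability is spread uniformly over all K classes.
    k = len(one_hot)
    return [v * (1 - smoothing) + smoothing / k for v in one_hot]

soft = smooth([1.0, 0.0, 0.0, 0.0])
print([round(v, 3) for v in soft])  # [0.925, 0.025, 0.025, 0.025]
```

Because the target is no longer exactly 1.0 for the correct class, the model is never rewarded for driving its predicted probability all the way to 1, which discourages overconfidence.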
❓ Predict Output
Intermediate · 2:00 remaining
What is the output tensor after applying label smoothing?
Given the following PyTorch code applying label smoothing to a batch of 3 samples with 4 classes, what is the smoothed target tensor?
PyTorch
import torch

def smooth_labels(labels, smoothing=0.1, num_classes=4):
    with torch.no_grad():
        smooth_value = smoothing / num_classes
        one_hot = torch.zeros(labels.size(0), num_classes)
        one_hot.scatter_(1, labels.unsqueeze(1), 1.0)
        return one_hot * (1 - smoothing) + smooth_value

labels = torch.tensor([0, 2, 3])
smoothed = smooth_labels(labels)
print(smoothed)
Attempts: 2 left
💡 Hint
Calculate smoothing value as smoothing/num_classes and add it to all classes after scaling the one-hot by (1-smoothing).
✗ Incorrect
With smoothing=0.1 and 4 classes, the smoothing value is 0.1/4 = 0.025. The one-hot vector is scaled by (1 - 0.1) = 0.9, then 0.025 is added to every class, giving 0.925 for the target class and 0.025 for the others.
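As a sanity check on that arithmetic, each smoothed row should still be a valid probability distribution, since (1 - smoothing) + K * (smoothing/K) = 1. A quick plain-Python verification (an illustrative sketch mirroring the formula above, not the torch code itself):

```python
smoothing, k = 0.1, 4
# Target class: scaled one-hot entry plus the uniform mass; others: uniform mass only.
target_p = 1.0 * (1 - smoothing) + smoothing / k
other_p = 0.0 * (1 - smoothing) + smoothing / k
row = [target_p] + [other_p] * (k - 1)
print([round(v, 3) for v in row])  # [0.925, 0.025, 0.025, 0.025]
print(round(sum(row), 6))          # 1.0
```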
❓ Hyperparameter
Advanced · 1:30 remaining
Choosing the label smoothing factor
Which label smoothing factor is most likely to cause the model to underfit?
Attempts: 2 left
💡 Hint
Think about what happens if the smoothing factor is very large.
✗ Incorrect
A very high smoothing factor like 0.5 makes the target labels too soft, reducing the model's ability to learn clear distinctions, leading to underfitting.
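To see why, compare how the gap between the target class and the other classes shrinks as the smoothing factor grows (a small illustrative sketch; the factor values chosen here are arbitrary):

```python
def smoothed_targets(smoothing, k=4):
    # Returns (probability of the true class, probability of each other class).
    return (1 - smoothing) + smoothing / k, smoothing / k

for s in (0.1, 0.3, 0.5):
    true_p, other_p = smoothed_targets(s)
    # The separation the model is asked to learn is exactly 1 - smoothing.
    print(f"smoothing={s}: true={true_p:.3f}, other={other_p:.3f}, gap={true_p - other_p:.3f}")
```

At smoothing=0.5 the true class carries only 0.625 probability versus 0.125 for the rest, so the training signal distinguishing the correct class is half as strong as with hard labels.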
❓ Metrics
Advanced · 1:30 remaining
Effect of label smoothing on training loss
When using label smoothing during training, how does the training loss typically behave compared to training without label smoothing?
Attempts: 2 left
💡 Hint
Consider how smoothing changes the target labels and model confidence.
✗ Incorrect
Label smoothing prevents the targets from assigning full probability to the correct class, so the training loss converges to a higher value: even a model that matches the smoothed targets exactly incurs a nonzero loss, whereas with hard labels the loss can approach zero.
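One way to see this floor: cross-entropy H(q, p) is minimized when the prediction p equals the target q, and for a smoothed target that minimum is the target's own entropy, which is nonzero. A sketch of that computation, assuming the smoothing/num_classes formulation used elsewhere in this challenge:

```python
import math

def loss_floor(smoothing, k=4):
    # Minimum achievable cross-entropy with smoothed targets: reached when
    # the model's predicted distribution equals the smoothed target itself.
    true_p = (1 - smoothing) + smoothing / k
    other_p = smoothing / k
    return -(true_p * math.log(true_p) + (k - 1) * other_p * math.log(other_p))

print(round(loss_floor(0.1), 4))  # a nonzero floor, unlike the 0.0 floor of hard labels
```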
🔧 Debug
Expert · 2:00 remaining
Identify the error in this label smoothing implementation
What error will this PyTorch code raise when applying label smoothing?
PyTorch
import torch

def label_smooth(targets, smoothing=0.1, num_classes=5):
    smooth_val = smoothing / num_classes
    one_hot = torch.zeros(targets.size(0), num_classes)
    one_hot.scatter_(1, targets.unsqueeze(1), 1.0)
    return one_hot * (1 - smoothing) + smooth_val

targets = torch.tensor([1, 3, 4])
smoothed = label_smooth(targets)
print(smoothed)
Attempts: 2 left
💡 Hint
Check the smoothing value calculation and scatter_ usage carefully.
✗ Incorrect
The code correctly implements a common variant of label smoothing (spreading smoothing/num_classes over all classes), and scatter_ is used properly with valid indices (0 to 4 for num_classes=5), so no error occurs.
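For readers unfamiliar with scatter_, here is a plain-Python mirror of the one-hot construction (an illustrative sketch, not PyTorch itself): each row receives 1.0 at the column named by its target index, and an out-of-range index is exactly what would make this step fail.

```python
def one_hot_rows(indices, num_classes):
    # Mirrors torch.zeros(n, k).scatter_(1, idx.unsqueeze(1), 1.0):
    # place 1.0 at each row's target column, zeros elsewhere.
    rows = []
    for i in indices:
        row = [0.0] * num_classes
        row[i] = 1.0  # an index >= num_classes would raise IndexError here
        rows.append(row)
    return rows

print(one_hot_rows([1, 3, 4], 5))
```

Since the targets [1, 3, 4] all fall within 0..4, the construction succeeds, matching the explanation above.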