PyTorch · ML · ~20 mins

Label smoothing in PyTorch - Practice Problems & Coding Challenges

Challenge - 5 Problems
🧠 Conceptual · intermediate
What is the main purpose of label smoothing in classification?
Label smoothing is a technique applied to the target labels when training classification models. What is its main purpose?
A. To reduce overconfidence of the model by softening the target labels
B. To increase the learning rate dynamically during training
C. To add noise to the input data for better generalization
D. To convert multi-class problems into multiple binary problems
💡 Hint
Think about how label smoothing changes the target labels.
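To build intuition for what "softening" the targets means, the transformation can be sketched in a few lines of plain PyTorch (the smoothing factor 0.1 and the 4 classes here are purely illustrative choices):

```python
import torch

# Illustrative sketch: smoothing a single one-hot target over K = 4 classes.
# With smoothing factor eps, the true class gets (1 - eps) + eps / K and
# every other class gets eps / K, so no target is exactly 0 or 1.
eps = 0.1
K = 4
hard = torch.tensor([0.0, 0.0, 1.0, 0.0])  # one-hot target for class 2
soft = hard * (1 - eps) + eps / K          # smoothed target
print(soft)  # tensor([0.0250, 0.0250, 0.9250, 0.0250])
```

Because the smoothed targets never reach exactly 1, the model is penalized for pushing its predicted probability all the way to 1, which discourages overconfident predictions.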
Predict Output · intermediate
What is the output tensor after applying label smoothing?
Given the following PyTorch code applying label smoothing to a batch of 3 samples with 4 classes, what is the smoothed target tensor?
PyTorch
import torch

def smooth_labels(labels, smoothing=0.1, num_classes=4):
    with torch.no_grad():
        smooth_value = smoothing / num_classes
        one_hot = torch.zeros(labels.size(0), num_classes)
        one_hot.scatter_(1, labels.unsqueeze(1), 1.0)
        return one_hot * (1 - smoothing) + smooth_value

labels = torch.tensor([0, 2, 3])
smoothed = smooth_labels(labels)
print(smoothed)
A. [[1, 0, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
B. [[0.9, 0.1, 0.1, 0.1], [0.1, 0.1, 0.9, 0.1], [0.1, 0.1, 0.1, 0.9]]
C. [[0.925, 0.025, 0.025, 0.025], [0.025, 0.025, 0.925, 0.025], [0.025, 0.025, 0.025, 0.925]]
D. [[0.9, 0.0333, 0.0333, 0.0333], [0.0333, 0.0333, 0.9, 0.0333], [0.0333, 0.0333, 0.0333, 0.9]]
💡 Hint
Calculate smoothing value as smoothing/num_classes and add it to all classes after scaling the one-hot by (1-smoothing).
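Once you have committed to an answer, the hint's arithmetic can be checked directly; a minimal sketch of that computation, using the same smoothing=0.1 and num_classes=4 as the snippet above:

```python
# Step-by-step check of the hint's arithmetic (smoothing=0.1, num_classes=4).
smoothing, num_classes = 0.1, 4
smooth_value = smoothing / num_classes    # value added to every class
scale = 1 - smoothing                     # factor applied to the one-hot entry
true_class = scale * 1.0 + smooth_value   # entry at the true class
other_class = scale * 0.0 + smooth_value  # entry at every other class
print(true_class, other_class)
```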
Hyperparameter · advanced
Choosing the label smoothing factor
Which label smoothing factor is most likely to cause the model to underfit?
A. 0.5
B. 0.05
C. 0.0 (no smoothing)
D. 0.1
💡 Hint
Think about what happens if the smoothing factor is very large.
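The trade-off can be made concrete by tabulating the target assigned to the true class for a few candidate factors; a quick sketch (4 classes, values purely illustrative):

```python
# Sketch: the target assigned to the true class shrinks as the smoothing
# factor eps grows (K = 4 classes here). A very large eps pushes the
# targets toward a uniform distribution, leaving little signal to fit.
K = 4
for eps in (0.0, 0.05, 0.1, 0.5):
    true_target = (1 - eps) + eps / K
    print(f"eps={eps}: true-class target = {true_target}")
```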
Metrics · advanced
Effect of label smoothing on training loss
When using label smoothing during training, how does the training loss typically behave compared to training without label smoothing?
A. Training loss fluctuates randomly and unpredictably
B. Training loss is lower because the model fits the smoothed labels better
C. Training loss is the same because label smoothing does not affect loss values
D. Training loss is higher because the model is penalized for being too confident
💡 Hint
Consider how smoothing changes the target labels and model confidence.
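One way to observe this effect is to compare the loss on the same confident logits with and without smoothing, using the built-in `label_smoothing` argument of `nn.CrossEntropyLoss` (available since PyTorch 1.10); the logits below are arbitrary illustrative values:

```python
import torch
import torch.nn as nn

# Compare cross-entropy on the same very confident prediction, with and
# without label smoothing (logits chosen for illustration).
logits = torch.tensor([[5.0, -2.0, -2.0, -2.0]])  # near-certain of class 0
target = torch.tensor([0])

plain = nn.CrossEntropyLoss()(logits, target)
smooth = nn.CrossEntropyLoss(label_smoothing=0.1)(logits, target)
print(plain.item(), smooth.item())  # the smoothed loss is noticeably larger
```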
🔧 Debug · expert
Identify the error in this label smoothing implementation
What error will this PyTorch code raise when applying label smoothing?
PyTorch
import torch

def label_smooth(targets, smoothing=0.1, num_classes=5):
    smooth_val = smoothing / num_classes
    one_hot = torch.zeros(targets.size(0), num_classes)
    one_hot.scatter_(1, targets.unsqueeze(1), 1.0)
    return one_hot * (1 - smoothing) + smooth_val

targets = torch.tensor([1, 3, 4])
smoothed = label_smooth(targets)
print(smoothed)
A. RuntimeError due to scatter_ index out of range
B. No error, outputs smoothed labels correctly
C. TypeError because smoothing is divided by num_classes instead of num_classes - 1
D. ValueError because targets tensor shape is incompatible
💡 Hint
Check the smoothing value calculation and scatter_ usage carefully.
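Once you have picked an answer, you can verify it by running the snippet yourself; the check below reruns the same function and inspects the result's shape and row sums:

```python
import torch

# Re-running the snippet above to inspect what actually happens.
# num_classes=5 means valid scatter_ column indices are 0..4, and the
# targets tensor is [1, 3, 4].
def label_smooth(targets, smoothing=0.1, num_classes=5):
    smooth_val = smoothing / num_classes
    one_hot = torch.zeros(targets.size(0), num_classes)
    one_hot.scatter_(1, targets.unsqueeze(1), 1.0)
    return one_hot * (1 - smoothing) + smooth_val

targets = torch.tensor([1, 3, 4])
smoothed = label_smooth(targets)
print(smoothed.shape)       # torch.Size([3, 5])
print(smoothed.sum(dim=1))  # each row sums to 1
```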