Complete the code to create a label smoothing loss using PyTorch.
import torch.nn as nn

criterion = nn.CrossEntropyLoss(label_smoothing=[1])
Label smoothing is usually a small positive value like 0.1 to soften the labels.
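A minimal sketch of the completed snippet, filling blank [1] with the 0.1 value suggested above (the logits and target here are made-up illustration data, not from the exercise):

```python
import torch
import torch.nn as nn

# Blank [1] filled with 0.1: PyTorch mixes each one-hot target with a
# uniform distribution, so targets are softened toward 1/num_classes.
criterion = nn.CrossEntropyLoss(label_smoothing=0.1)

# Made-up logits for one 3-class example whose prediction is correct.
logits = torch.tensor([[2.0, 0.5, 0.3]])
target = torch.tensor([0])
loss = criterion(logits, target)
```

For a confident correct prediction like this one, the smoothed loss comes out slightly higher than plain cross-entropy, since the soft target never puts full probability on the true class.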
Complete the code to apply label smoothing in a training loop.
outputs = model(inputs)
loss = criterion(outputs, [1])

The loss function expects the original raw labels (class indices), not smoothed labels.
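A runnable sketch of one training step, filling blank [1] with the raw labels tensor. The `nn.Linear` model, random inputs, and random labels are stand-ins; in practice `model`, `inputs`, and `labels` come from your own pipeline:

```python
import torch
import torch.nn as nn

# Stand-in model and data for illustration only.
model = nn.Linear(4, 3)
criterion = nn.CrossEntropyLoss(label_smoothing=0.1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

inputs = torch.randn(8, 4)
labels = torch.randint(0, 3, (8,))  # raw class indices, not smoothed

outputs = model(inputs)
loss = criterion(outputs, labels)   # blank [1] = labels (class indices)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

The smoothing happens inside the loss, so the training loop looks identical to one without label smoothing.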
Fix the error in the label smoothing parameter to avoid invalid values.
criterion = nn.CrossEntropyLoss(label_smoothing=[1])

Label smoothing must be between 0 and 1, so a value like 0.15 is valid, while values outside that range are invalid.
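For illustration, a sketch with the blank filled by the in-range value 0.15 named in the explanation (the logits and target are made-up data); PyTorch rejects `label_smoothing` values outside [0, 1]:

```python
import torch
import torch.nn as nn

# 0.15 lies inside the required [0, 1] range, so the loss works as expected.
criterion = nn.CrossEntropyLoss(label_smoothing=0.15)

logits = torch.tensor([[1.0, 0.2, -0.5]])
target = torch.tensor([0])
loss = criterion(logits, target)
```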
Fill both blanks to create a smoothed label tensor for a batch of size 3 and 5 classes.
import torch

batch_size = 3
num_classes = 5
smoothing = [1]
labels = torch.tensor([0, 2, 4])
smoothed_labels = torch.full((batch_size, num_classes), [2])
smoothed_labels.scatter_(1, labels.unsqueeze(1), 1 - smoothing)
Label smoothing value is 0.1, so smoothing mass is spread as 0.025 to other classes (0.1 / (5-1)).
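With the blanks filled from the explanation ([1] = 0.1 and [2] = smoothing / (num_classes - 1) = 0.025), the snippet runs as:

```python
import torch

batch_size = 3
num_classes = 5
smoothing = 0.1                         # blank [1]
labels = torch.tensor([0, 2, 4])

# Blank [2]: fill every entry with the off-target mass 0.1 / (5 - 1) = 0.025,
# then overwrite each labeled position with 1 - smoothing = 0.9.
smoothed_labels = torch.full((batch_size, num_classes),
                             smoothing / (num_classes - 1))
smoothed_labels.scatter_(1, labels.unsqueeze(1), 1 - smoothing)
```

Each row then sums to 1: 0.9 at the labeled class and 0.025 at each of the other four classes.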
Fill all three blanks to compute the label smoothing loss manually.
import torch
import torch.nn.functional as F

outputs = torch.tensor([[2.0, 0.5, 0.3], [0.1, 1.0, 2.1]])
labels = torch.tensor([0, 2])
smoothing = [1]
num_classes = outputs.size(1)

with torch.no_grad():
    true_dist = torch.full_like(outputs, [2])
    true_dist.scatter_(1, labels.unsqueeze(1), [3])

log_probs = F.log_softmax(outputs, dim=1)
loss = (-true_dist * log_probs).sum(dim=1).mean()
Smoothing is 0.1, off-target value is 0.1/(3-1)=0.05, and target class value is 0.9 (1 - smoothing).
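Putting the three answers together ([1] = 0.1, [2] = smoothing / (num_classes - 1) = 0.05, [3] = 1 - smoothing = 0.9) gives a runnable version. Note that this hand-rolled variant spreads the mass as smoothing / (K - 1) over the off-target classes, which differs slightly from `nn.CrossEntropyLoss(label_smoothing=...)`, whose targets are a mixture with a uniform smoothing / K distribution:

```python
import torch
import torch.nn.functional as F

outputs = torch.tensor([[2.0, 0.5, 0.3], [0.1, 1.0, 2.1]])
labels = torch.tensor([0, 2])
smoothing = 0.1                                   # blank [1]
num_classes = outputs.size(1)

with torch.no_grad():
    # Blank [2]: off-target mass 0.1 / (3 - 1) = 0.05 everywhere...
    true_dist = torch.full_like(outputs, smoothing / (num_classes - 1))
    # ...blank [3]: then 1 - smoothing = 0.9 at each labeled class.
    true_dist.scatter_(1, labels.unsqueeze(1), 1 - smoothing)

# Cross-entropy between the smoothed target distribution and the model's
# log-probabilities, averaged over the batch.
log_probs = F.log_softmax(outputs, dim=1)
loss = (-true_dist * log_probs).sum(dim=1).mean()
```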