PyTorch · ~20 mins

Loss functions (MSELoss, CrossEntropyLoss) in PyTorch - Practice Problems & Coding Challenges

Challenge - 5 Problems
Predict Output
intermediate
Output of MSELoss with simple tensors
What is the output of the following PyTorch code that calculates Mean Squared Error loss?
PyTorch
import torch
import torch.nn as nn

loss_fn = nn.MSELoss()
pred = torch.tensor([2.0, 3.0, 4.0])
target = torch.tensor([1.0, 3.0, 5.0])
loss = loss_fn(pred, target)
print(loss.item())
A. 0.6666667
B. 1.0
C. 2.0
D. 0.5
💡 Hint
Recall that MSELoss computes the average of squared differences.
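For intuition, the default `reduction='mean'` in `nn.MSELoss` averages the squared element-wise differences. A minimal plain-Python sketch of that formula, using made-up values unrelated to the problem above:

```python
# Plain-Python sketch of what nn.MSELoss() computes with reduction='mean'.
# The numbers here are made up so they don't spoil the answer above.
pred = [1.0, 2.0]
target = [0.0, 4.0]

squared_errors = [(p - t) ** 2 for p, t in zip(pred, target)]  # [1.0, 4.0]
mse = sum(squared_errors) / len(squared_errors)
print(mse)  # 2.5
```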
Predict Output
intermediate
Output of CrossEntropyLoss with logits
What is the output of this PyTorch code using CrossEntropyLoss with logits for a 3-class classification?
PyTorch
import torch
import torch.nn as nn

loss_fn = nn.CrossEntropyLoss()
logits = torch.tensor([[1.0, 2.0, 0.5]])  # batch size 1, 3 classes
labels = torch.tensor([1])  # correct class index
loss = loss_fn(logits, labels)
print(round(loss.item(), 4))
A. 0.4644
B. 0.4170
C. 1.0986
D. 0.6931
💡 Hint
CrossEntropyLoss applies softmax internally and computes negative log likelihood.
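As a worked illustration of that hint (with different numbers than the problem), here is a plain-Python sketch of the per-sample computation: softmax over the logits, then the negative log of the probability assigned to the true class. With three equal logits every class gets probability 1/3, so the loss is ln(3):

```python
import math

# Sketch of per-sample CrossEntropyLoss: softmax + negative log likelihood.
# Toy logits, unrelated to the problem above.
logits = [0.0, 0.0, 0.0]
true_class = 0

exps = [math.exp(z) for z in logits]
probs = [e / sum(exps) for e in exps]  # softmax -> [1/3, 1/3, 1/3]
loss = -math.log(probs[true_class])   # -ln(1/3) = ln(3)
print(round(loss, 4))  # 1.0986
```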
Model Choice
advanced
Choosing loss function for regression vs classification
You want to train a model to predict house prices (a continuous value) and another model to classify images into 5 categories. Which loss functions should you choose respectively?
A. MSELoss for both tasks
B. CrossEntropyLoss for house prices, MSELoss for image classification
C. MSELoss for house prices, CrossEntropyLoss for image classification
D. CrossEntropyLoss for both tasks
💡 Hint
Think about the type of output each task produces.
Hyperparameter
advanced
Effect of reduction parameter in MSELoss
What is the effect of setting the 'reduction' parameter to 'sum' in PyTorch's MSELoss compared to the default 'mean'?
A. Loss values are averaged over the batch dimension only
B. Loss values are squared twice
C. Loss values are normalized by the number of classes
D. Loss values are summed over all elements instead of averaged
💡 Hint
Check how reduction changes aggregation of individual losses.
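To see the difference concretely, the per-element squared errors are identical under both settings; only the final aggregation step changes. A plain-Python sketch on made-up values:

```python
# Sketch: reduction='mean' vs reduction='sum' in MSELoss.
# Made-up per-element squared errors, unrelated to the problems above.
squared_errors = [1.0, 4.0, 9.0]

mean_loss = sum(squared_errors) / len(squared_errors)  # what reduction='mean' returns
sum_loss = sum(squared_errors)                         # what reduction='sum' returns
print(mean_loss, sum_loss)
```

Note that with `reduction='sum'` the loss scales with batch size, which effectively scales gradients too; this is one reason `'mean'` is the default.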
Metrics
expert
Interpreting CrossEntropyLoss output for multi-class classification
Given a batch of 2 samples with logits and true labels, what is the CrossEntropyLoss output?
PyTorch
import torch
import torch.nn as nn

loss_fn = nn.CrossEntropyLoss()
logits = torch.tensor([[2.0, 1.0, 0.1], [0.5, 2.5, 0.3]])
labels = torch.tensor([0, 1])
loss = loss_fn(logits, labels)
print(round(loss.item(), 4))
A. 0.5514
B. 0.3185
C. 0.4170
D. 1.0986
💡 Hint
Calculate softmax probabilities for each sample and average negative log likelihood.
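As the hint says, with the default `reduction='mean'` the batch loss is just the average of the per-sample negative log likelihoods. A plain-Python sketch on toy logits (not the ones in the problem, so nothing is spoiled):

```python
import math

def sample_ce(logits, true_class):
    """Per-sample cross-entropy: softmax, then negative log likelihood."""
    exps = [math.exp(z) for z in logits]
    return -math.log(exps[true_class] / sum(exps))

# Toy batch chosen so the per-sample losses come out as ln(3) and ln(2).
batch_logits = [[0.0, 0.0, 0.0], [math.log(2.0), 0.0, 0.0]]
batch_labels = [0, 0]

losses = [sample_ce(l, y) for l, y in zip(batch_logits, batch_labels)]
batch_loss = sum(losses) / len(losses)  # default 'mean' reduction averages over samples
print(round(batch_loss, 4))  # (ln 3 + ln 2) / 2 ≈ 0.8959
```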