Challenge - 5 Problems
Loss Function Mastery
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
intermediate · 2:00 remaining
Output of MSELoss with simple tensors
What is the output of the following PyTorch code that calculates Mean Squared Error loss?
PyTorch
import torch
import torch.nn as nn
loss_fn = nn.MSELoss()
pred = torch.tensor([2.0, 3.0, 4.0])
target = torch.tensor([1.0, 3.0, 5.0])
loss = loss_fn(pred, target)
print(loss.item())
💡 Hint
Recall that MSELoss computes the average of squared differences.
Explanation
MSELoss computes the average of (pred - target)^2. Here the differences are [1, 0, -1], the squares are [1, 0, 1], and the average is (1 + 0 + 1)/3 = 2/3 ≈ 0.6666667.
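The arithmetic can be double-checked without PyTorch; a minimal pure-Python sketch, reusing the values from the question:

```python
# Manual mean-squared-error computation matching nn.MSELoss's
# default reduction='mean': average of element-wise squared differences.
pred = [2.0, 3.0, 4.0]
target = [1.0, 3.0, 5.0]

squared_errors = [(p - t) ** 2 for p, t in zip(pred, target)]  # [1.0, 0.0, 1.0]
mse = sum(squared_errors) / len(squared_errors)                # 2/3
print(round(mse, 7))  # 0.6666667
```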
❓ Predict Output
intermediate · 2:00 remaining
Output of CrossEntropyLoss with logits
What is the output of this PyTorch code using CrossEntropyLoss with logits for a 3-class classification?
PyTorch
import torch
import torch.nn as nn
loss_fn = nn.CrossEntropyLoss()
logits = torch.tensor([[1.0, 2.0, 0.5]])  # batch size 1, 3 classes
labels = torch.tensor([1])  # correct class index
loss = loss_fn(logits, labels)
print(round(loss.item(), 4))
💡 Hint
CrossEntropyLoss applies softmax internally and computes negative log likelihood.
Explanation
Softmax probabilities are approximately [0.2312, 0.6285, 0.1402]. The negative log of the probability for class 1 is -log(0.6285) ≈ 0.4644, so the code prints 0.4644.
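The same number can be reproduced by hand with Python's math module — softmax over the logits, then the negative log of the true class's probability:

```python
import math

# Manual cross-entropy for a single sample, mirroring what
# nn.CrossEntropyLoss does internally (softmax + negative log likelihood).
logits = [1.0, 2.0, 0.5]
true_class = 1

exps = [math.exp(z) for z in logits]
probs = [e / sum(exps) for e in exps]   # ≈ [0.2312, 0.6285, 0.1402]
loss = -math.log(probs[true_class])
print(round(loss, 4))  # 0.4644
```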
❓ Model Choice
advanced · 2:00 remaining
Choosing loss function for regression vs classification
You want to train a model to predict house prices (a continuous value) and another model to classify images into 5 categories. Which loss functions should you choose respectively?
💡 Hint
Think about the type of output each task produces.
Explanation
Regression tasks predict continuous values, so MSELoss fits best. Classification tasks predict class probabilities, so CrossEntropyLoss is appropriate.
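The distinction shows up in the target types: regression targets are floats, classification targets are class indices. A pure-Python sketch of both loss styles, with made-up numbers purely for illustration:

```python
import math

# Regression: continuous targets -> MSE-style loss.
price_pred = [310.0, 450.0]   # hypothetical predicted house prices
price_true = [300.0, 470.0]
mse = sum((p - t) ** 2 for p, t in zip(price_pred, price_true)) / len(price_true)

# Classification: a class index as target -> cross-entropy-style loss.
logits = [0.2, 1.5, 0.1, -0.3, 0.8]   # hypothetical scores for 5 categories
true_class = 1
exps = [math.exp(z) for z in logits]
ce = -math.log(exps[true_class] / sum(exps))

print(mse)            # 250.0
print(round(ce, 4))
```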
❓ Hyperparameter
advanced · 2:00 remaining
Effect of reduction parameter in MSELoss
What is the effect of setting the 'reduction' parameter to 'sum' in PyTorch's MSELoss compared to the default 'mean'?
💡 Hint
Check how reduction changes aggregation of individual losses.
Explanation
'sum' adds all squared errors together, while 'mean' (the default) divides that sum by the number of elements. With N elements, the 'sum' loss is N times the 'mean' loss, which scales the gradients correspondingly and may require adjusting the learning rate.
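A pure-Python sketch of the two reductions, reusing the tensors from the first question:

```python
# 'mean' divides the summed squared errors by the element count;
# 'sum' skips the division, so the two differ by a factor of N.
pred = [2.0, 3.0, 4.0]
target = [1.0, 3.0, 5.0]

squared = [(p - t) ** 2 for p, t in zip(pred, target)]
loss_sum = sum(squared)                   # reduction='sum'  -> 2.0
loss_mean = sum(squared) / len(squared)   # reduction='mean' -> 2/3
print(loss_sum, round(loss_mean, 4))
```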
❓ Metrics
expert · 3:00 remaining
Interpreting CrossEntropyLoss output for multi-class classification
Given a batch of 2 samples with logits and true labels, what is the CrossEntropyLoss output?
Code:
import torch
import torch.nn as nn
loss_fn = nn.CrossEntropyLoss()
logits = torch.tensor([[2.0, 1.0, 0.1], [0.5, 2.5, 0.3]])
labels = torch.tensor([0, 1])
loss = loss_fn(logits, labels)
print(round(loss.item(), 4))
💡 Hint
Calculate softmax probabilities for each sample and average negative log likelihood.
Explanation
For sample 1: softmax probs ≈ [0.659, 0.242, 0.099], loss = -log(0.659) ≈ 0.4170.
For sample 2: softmax probs ≈ [0.109, 0.803, 0.089], loss = -log(0.803) ≈ 0.2200.
Average loss = (0.4170 + 0.2200)/2 ≈ 0.3185, so the code prints 0.3185.
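The batched case can also be verified by hand: compute the per-sample cross-entropy, then average, matching nn.CrossEntropyLoss's default reduction='mean'.

```python
import math

# Per-sample cross-entropy (softmax + negative log likelihood),
# then the batch average -- the default reduction='mean'.
logits = [[2.0, 1.0, 0.1], [0.5, 2.5, 0.3]]
labels = [0, 1]

losses = []
for row, y in zip(logits, labels):
    exps = [math.exp(z) for z in row]
    losses.append(-math.log(exps[y] / sum(exps)))

batch_loss = sum(losses) / len(losses)
print(round(batch_loss, 4))  # 0.3185
```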