Challenge - 5 Problems
Classifier Head Mastery
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
Intermediate · Time limit: 2:00
Output of replacing classifier head in a PyTorch model
What is the output shape of the model's final layer after replacing the classifier head with a new linear layer of 10 output features?
PyTorch

import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18()
num_features = model.fc.in_features
model.fc = nn.Linear(num_features, 10)

input_tensor = torch.randn(4, 3, 224, 224)
output = model(input_tensor)
output_shape = output.shape
print(output_shape)
Attempts: 2 left
💡 Hint
Remember the batch size and the number of output classes in the new classifier head.
✗ Incorrect
Replacing the classifier head changes the output layer to have 10 output features. The batch size is 4, so the output shape is (4, 10).
❓ Model Choice
Intermediate · Time limit: 2:00
Choosing the correct way to replace classifier head in PyTorch
Which option correctly replaces the classifier head of a pretrained VGG16 model to output 5 classes?
Attempts: 2 left
💡 Hint
Check the attribute name and index of the classifier layer in VGG16.
✗ Incorrect
VGG16's classifier is an nn.Sequential module whose final Linear layer sits at index 6. Assigning a new layer to model.classifier[6] correctly replaces the output layer.
❓ Hyperparameter
Advanced · Time limit: 2:00
Effect of freezing layers when replacing classifier head
If you replace the classifier head of a pretrained ResNet50 and freeze all layers except the new head, which statement is true about training?
Attempts: 2 left
💡 Hint
Freezing layers means setting requires_grad to False for those parameters.
✗ Incorrect
Freezing all layers except the new head means only the new head's parameters have requires_grad=True and will update during training.
🔧 Debug
Advanced · Time limit: 2:00
Debugging error after replacing classifier head
After replacing the classifier head of a pretrained ResNet18 with nn.Linear(512, 20), the model raises a runtime error during training: "size mismatch, m1: [4 x 512], m2: [1000 x 20]". What is the cause?
Attempts: 2 left
💡 Hint
Check if the model's classifier attribute was correctly assigned.
✗ Incorrect
The error shows a mismatch between the backbone's output features (512) and the classifier's weight matrix (1000 × 20), indicating the new layer was not assigned to the model's actual head attribute, so the old classifier path is still in play.
🧠 Conceptual
Expert · Time limit: 2:00
Why replace classifier head instead of retraining entire model?
Why is it common practice to replace only the classifier head of a pretrained model when adapting it to a new task?
Attempts: 2 left
💡 Hint
Think about transfer learning and feature reuse.
✗ Incorrect
Pretrained layers capture general-purpose features; retraining only the head adapts the model to new classes efficiently, without requiring a large dataset or long training runs.