PyTorch · ~10 mins

Replacing classifier head in PyTorch - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: fill in the blank (easy)

Complete the code to replace the classifier head of a pretrained model with a new linear layer.

PyTorch
import torch.nn as nn
import torchvision
model = torchvision.models.resnet18(pretrained=True)
model.fc = nn.[1](512, 10)
A. ReLU
B. Conv2d
C. BatchNorm2d
D. Linear
Common Mistakes
Using Conv2d instead of Linear for the classifier head.
Not matching the input feature size of the new layer.
Task 2: fill in the blank (medium)

Complete the code to freeze all layers except the new classifier head.

PyTorch
for param in model.parameters():
    param.[1] = False
for param in model.fc.parameters():
    param.requires_grad = True
A. requires_grad
B. grad
C. requires_grad_
D. requires_grad__
Common Mistakes
Using incorrect attribute names like grad or requires_grad_.
Not freezing the pretrained layers before training.
Task 3: fill in the blank (hard)

Fix the error in the code to correctly replace the classifier head with a new linear layer.

PyTorch
import torchvision
import torch.nn as nn
model = torchvision.models.resnet50(pretrained=True)
model.fc = nn.Linear([1], 5)
A. 2048
B. 512
C. 1024
D. 256
Common Mistakes
Using 512 which is the input size for ResNet18, not ResNet50.
Using arbitrary numbers without checking the model architecture.
Task 4: fill in the blank (hard)

Fill both blanks to create a new classifier head with dropout and linear layers.

PyTorch
model.fc = nn.Sequential(
    nn.Dropout(p=[1]),
    nn.Linear(512, [2])
)
A. 0.5
B. 10
C. 5
D. 0.3
Common Mistakes
Using dropout probability greater than 1 or less than 0.
Mismatch between linear layer output size and number of classes.
Task 5: fill in the blank (hard)

Fill all three blanks to replace the classifier head and freeze pretrained layers except the new head.

PyTorch
import torchvision
import torch.nn as nn
model = torchvision.models.resnet34(pretrained=True)
for param in model.parameters():
    param.[1] = False
model.fc = nn.Linear([2], [3])
for param in model.fc.parameters():
    param.requires_grad = True
A. requires_grad
B. 512
C. 7
D. 1000
Common Mistakes
Not freezing pretrained layers before training.
Using wrong input or output sizes for the linear layer.
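A sketch of the full pipeline with all three blanks filled, targeting 7 classes. Note that a freshly created `nn.Linear` already has `requires_grad=True` on its parameters, so the final loop is technically redundant here; it is kept to mirror the task's code.

```python
import torch.nn as nn
import torchvision

model = torchvision.models.resnet34()

# Freeze the pretrained backbone first.
for param in model.parameters():
    param.requires_grad = False

# ResNet34, like ResNet18, ends in 512 features; map them to 7 classes.
model.fc = nn.Linear(512, 7)

# Explicitly mark the new head as trainable (redundant for a fresh layer).
for param in model.fc.parameters():
    param.requires_grad = True

# Only the head's 512*7 + 7 = 3,591 parameters remain trainable.
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(trainable)  # 3591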