PyTorch · ~20 mins

Model parameters inspection in PyTorch - Practice Problems & Coding Challenges

Challenge - 5 Problems
Predict Output (intermediate)
Output of model parameter count code
What is the output of the following PyTorch code that counts the total number of parameters in a simple linear model?
PyTorch
import torch
import torch.nn as nn

model = nn.Linear(3, 2)

param_count = sum(p.numel() for p in model.parameters())
print(param_count)
A) 9
B) 6
C) 5
D) 8
💡 Hint
Remember that nn.Linear has weights and biases. Weights shape is (out_features, in_features).
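As a minimal sketch of the counting logic (using a hypothetical layer with different sizes than the question, so nothing is given away): an `nn.Linear` holds a weight of shape `(out_features, in_features)` plus a bias of shape `(out_features,)`, and `numel()` counts elements per tensor.

```python
import torch.nn as nn

# Hypothetical layer, not the one from the question: nn.Linear(4, 3)
layer = nn.Linear(4, 3)

weight_count = layer.weight.numel()  # 3 * 4 = 12 weight entries
bias_count = layer.bias.numel()      # 3 bias entries
total = sum(p.numel() for p in layer.parameters())

print(weight_count, bias_count, total)  # 12 3 15
```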
🧠 Conceptual (intermediate)
Understanding requires_grad attribute
Which statement correctly describes the effect of setting requires_grad=False on a model parameter in PyTorch?
A) The parameter will not be updated during training because gradients are not computed for it.
B) The parameter will be updated normally but gradients will be zero.
C) The parameter will cause an error during backpropagation.
D) The parameter will be frozen but still receive gradient updates.
💡 Hint
Think about what happens when gradients are not computed for a parameter.
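A small sketch of the freezing behavior, using a throwaway one-layer model: after setting `requires_grad=False` on the weight, `backward()` leaves its `.grad` untouched (`None`), while the still-trainable bias accumulates a gradient.

```python
import torch
import torch.nn as nn

model = nn.Linear(3, 1)
model.weight.requires_grad = False  # freeze the weight

# Run a forward/backward pass on dummy data.
out = model(torch.randn(2, 3)).sum()
out.backward()

print(model.weight.grad)  # None: no gradient computed for the frozen weight
print(model.bias.grad)    # a tensor: the bias still receives gradients
```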
Metrics (advanced)
Inspecting model parameter shapes
Given the following PyTorch model, what is the shape of the parameter named 'fc2.weight'?
PyTorch
import torch.nn as nn

class SimpleNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 5)
        self.fc2 = nn.Linear(5, 2)

model = SimpleNN()
print(model.fc2.weight.shape)
A) torch.Size([5, 2])
B) torch.Size([2, 5])
C) torch.Size([10, 5])
D) torch.Size([5, 10])
💡 Hint
Recall that nn.Linear weight shape is (out_features, in_features).
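To see this convention in isolation (with a hypothetical layer whose sizes differ from the question's): `nn.Linear(in_features, out_features)` stores its weight as `(out_features, in_features)`, the transpose of what the constructor order might suggest.

```python
import torch.nn as nn

# Hypothetical layer: 7 inputs, 4 outputs.
layer = nn.Linear(7, 4)

print(layer.weight.shape)  # torch.Size([4, 7]) — (out_features, in_features)
print(layer.bias.shape)    # torch.Size([4])

# named_parameters() exposes the same tensors with their names.
for name, p in layer.named_parameters():
    print(name, tuple(p.shape))
```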
🔧 Debug (advanced)
Identifying error when accessing model parameters
What error will the following code raise when trying to access a non-existent parameter 'fc3.weight' in a PyTorch model?
PyTorch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(4, 3),
    nn.ReLU(),
    nn.Linear(3, 2)
)

param = dict(model.named_parameters())['fc3.weight']
A) KeyError
B) AttributeError
C) TypeError
D) IndexError
💡 Hint
Check what keys are in the named_parameters dictionary.
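A quick way to check, on the same kind of `nn.Sequential` model: print the keys that `named_parameters()` actually produces. `nn.Sequential` names its children by integer index (the parameter-free `ReLU` at index 1 contributes nothing), so no `fcN.*` keys exist.

```python
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(4, 3),
    nn.ReLU(),
    nn.Linear(3, 2),
)

keys = list(dict(model.named_parameters()).keys())
print(keys)  # ['0.weight', '0.bias', '2.weight', '2.bias']
```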
Model Choice (expert)
Choosing model to inspect parameters for convolutional layers
You want to inspect the parameters of a convolutional neural network with two Conv2d layers. Which model definition will allow you to correctly access the weight tensor of the second convolutional layer as 'conv2.weight'?
A)
class CNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer1 = nn.Conv2d(1, 16, 3)
        self.layer2 = nn.Conv2d(16, 32, 3)

model = CNN()
B)
model = nn.Sequential(
    nn.Conv2d(1, 16, 3),
    nn.Conv2d(16, 32, 3)
)

# Access conv2.weight
C)
class CNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 16, 3)
        self.conv2 = nn.Conv2d(16, 32, 3)

model = CNN()
D)
model = nn.ModuleList([
    nn.Conv2d(1, 16, 3),
    nn.Conv2d(16, 32, 3)
])

# Access conv2.weight
💡 Hint
Only named attributes can be accessed by their names directly.
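A sketch of the naming rule, using a hypothetical `TinyCNN` class (not any of the options above): in an `nn.Module` subclass, the attribute name you assign a submodule to becomes the prefix in `named_parameters()`, whereas `nn.Sequential` and `nn.ModuleList` use integer indices.

```python
import torch.nn as nn

class TinyCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # The attribute names 'conv1'/'conv2' become the parameter prefixes.
        self.conv1 = nn.Conv2d(1, 8, 3)
        self.conv2 = nn.Conv2d(8, 16, 3)

model = TinyCNN()
names = [n for n, _ in model.named_parameters()]
print(names)  # ['conv1.weight', 'conv1.bias', 'conv2.weight', 'conv2.bias']
```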