PyTorch · ~20 mins

Defining a model class in PyTorch - Practice Problems & Coding Challenges

Challenge - 5 Problems
Predict Output
intermediate
Output of a simple PyTorch model forward pass
What is the output shape of the tensor after running the forward method of this model with input shape (10, 5)?
PyTorch
import torch
import torch.nn as nn

class SimpleModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(5, 3)

    def forward(self, x):
        return self.linear(x)

model = SimpleModel()
input_tensor = torch.randn(10, 5)
output = model(input_tensor)
output_shape = output.shape
print(output_shape)
A. torch.Size([10, 3])
B. torch.Size([5, 3])
C. torch.Size([10, 5])
D. torch.Size([3, 10])
💡 Hint
Remember the linear layer changes the last dimension from input features to output features.
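To see this rule in action, here is a minimal sketch using a different batch size (2, an arbitrary choice, so it does not simply restate the question):

```python
import torch
import torch.nn as nn

# nn.Linear(in_features, out_features) maps the last dimension
# from in_features to out_features; all leading (batch) dimensions
# pass through unchanged.
layer = nn.Linear(5, 3)
x = torch.randn(2, 5)      # batch of 2, feature size 5
y = layer(x)
print(y.shape)             # torch.Size([2, 3])
```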
Model Choice
intermediate
Choosing the correct model class definition
Which of the following PyTorch model class definitions correctly implements a model with one hidden layer of size 4 and ReLU activation?
A.
class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.hidden = nn.Linear(4, 3)
        self.output = nn.Linear(3, 2)
    def forward(self, x):
        x = torch.relu(self.hidden(x))
        return self.output(x)
B.
class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.hidden = nn.Linear(3, 4)
        self.output = nn.Linear(4, 2)
    def forward(self, x):
        x = self.hidden(x)
        x = torch.relu(self.output(x))
        return x
C.
class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.hidden = nn.Linear(3, 2)
        self.output = nn.Linear(2, 4)
    def forward(self, x):
        x = torch.relu(self.hidden(x))
        return self.output(x)
D.
class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.hidden = nn.Linear(3, 4)
        self.output = nn.Linear(4, 2)
    def forward(self, x):
        x = torch.relu(self.hidden(x))
        return self.output(x)
💡 Hint
Check the input and output sizes of each layer and the order of activation.
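The chaining rule behind this hint can be sketched with a hidden size of 6 (an arbitrary choice, different from the question's, to illustrate the principle without giving the answer away): the out_features of one layer must equal the in_features of the next, and the activation is applied to the hidden layer's output before the output layer.

```python
import torch
import torch.nn as nn

hidden = nn.Linear(3, 6)   # 3 input features -> hidden size 6
output = nn.Linear(6, 2)   # hidden size 6 -> 2 outputs; sizes must match

x = torch.randn(1, 3)
h = torch.relu(hidden(x))  # activation on the hidden layer's output
y = output(h)              # output layer last, no activation here
print(y.shape)             # torch.Size([1, 2])
```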
Hyperparameter
advanced
Effect of changing hidden layer size in a model class
If you increase the hidden layer size from 10 to 100 in a fully connected PyTorch model, which of the following is a likely effect?
A. The model will train faster because it has more neurons.
B. The model will have more parameters and may overfit more easily.
C. The model will always have worse accuracy due to overfitting.
D. The model will use less memory during training.
💡 Hint
Think about how model size affects capacity and training behavior.
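The capacity effect can be made concrete by counting parameters. Below is a sketch assuming illustrative input/output sizes of 3 and 2 (the question does not specify them); each nn.Linear(in, out) contributes in*out weights plus out biases.

```python
import torch.nn as nn

def param_count(hidden):
    # One-hidden-layer fully connected net: 3 -> hidden -> 2.
    # (Input size 3 and output size 2 are assumptions for illustration.)
    model = nn.Sequential(
        nn.Linear(3, hidden),
        nn.ReLU(),
        nn.Linear(hidden, 2),
    )
    return sum(p.numel() for p in model.parameters())

print(param_count(10))   # (3*10 + 10) + (10*2 + 2) = 62
print(param_count(100))  # (3*100 + 100) + (100*2 + 2) = 602
```

A roughly 10x jump in parameters increases capacity (and memory use), which is why overfitting becomes easier, though not guaranteed.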
🔧 Debug
advanced
Identify the error in this PyTorch model class
What error will this PyTorch model class raise when instantiated?
PyTorch
import torch.nn as nn

class FaultyModel(nn.Module):
    def __init__(self):
        self.linear = nn.Linear(5, 2)

    def forward(self, x):
        return self.linear(x)

model = FaultyModel()
A. AttributeError because nn.Module.__init__ is not called
B. TypeError because the forward method is missing
C. SyntaxError due to a missing colon
D. No error; the model instantiates correctly
💡 Hint
Check the __init__ method and superclass initialization.
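For comparison, here is a sketch of the same model with the superclass properly initialized (the class name FixedModel is illustrative): super().__init__() sets up the internal registries nn.Module needs before any submodule can be assigned as an attribute.

```python
import torch.nn as nn

class FixedModel(nn.Module):
    def __init__(self):
        super().__init__()               # must run before assigning submodules
        self.linear = nn.Linear(5, 2)

    def forward(self, x):
        return self.linear(x)

model = FixedModel()                     # instantiates without error
```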
🧠 Conceptual
expert
Why define a model class instead of using nn.Sequential?
Which reason best explains why you might define a custom PyTorch model class instead of using nn.Sequential?
A. Because custom classes always train faster than nn.Sequential models.
B. Because nn.Sequential cannot be used with linear layers.
C. To implement complex forward logic like multiple inputs, conditional operations, or loops.
D. To avoid writing the forward method manually.
💡 Hint
Think about flexibility in model design and forward pass control.
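As one illustration of the kind of flexibility at stake, here is a sketch of forward logic that nn.Sequential cannot express: two inputs merged mid-network plus a conditional branch (the class name and layer sizes are arbitrary choices for this example).

```python
import torch
import torch.nn as nn

class TwoInputModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.branch_a = nn.Linear(4, 8)
        self.branch_b = nn.Linear(6, 8)
        self.head = nn.Linear(8, 2)

    def forward(self, a, b, use_b=True):
        h = torch.relu(self.branch_a(a))
        if use_b:                        # conditional logic inside forward
            h = h + torch.relu(self.branch_b(b))
        return self.head(h)

model = TwoInputModel()
out = model(torch.randn(1, 4), torch.randn(1, 6))
print(out.shape)                         # torch.Size([1, 2])
```

nn.Sequential only chains single-input, single-output modules in a fixed order, so any branching, merging, or looping requires a custom forward method.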