PyTorch · ML · ~20 mins

forward method in PyTorch - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️
Forward Method Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
Predict Output
intermediate
2:00 remaining
What is the output shape after forward pass?

Given the PyTorch model below, what is the shape of the output tensor after running model(x) where x is a tensor of shape (10, 3, 32, 32)?

PyTorch
import torch
import torch.nn as nn

class SimpleCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 6, kernel_size=3, padding=1)
        self.pool = nn.MaxPool2d(2, 2)
        self.fc = nn.Linear(6 * 16 * 16, 10)

    def forward(self, x):
        x = self.pool(torch.relu(self.conv(x)))
        x = x.view(-1, 6 * 16 * 16)
        x = self.fc(x)
        return x

model = SimpleCNN()
x = torch.randn(10, 3, 32, 32)
out = model(x)
output_shape = out.shape
A. (10, 6, 16, 16)
B. (6, 16, 16)
C. (10, 10)
D. (10, 3, 32, 32)
Attempts: 2 left
💡 Hint

Remember the batch size is the first dimension and the final fully connected layer outputs 10 features.
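A useful habit for shape questions is to trace each layer's output by hand or with a few lines of code. As a sketch (using the same layer hyperparameters as the model above, but standalone layers rather than the full class): a 3×3 convolution with padding=1 preserves spatial size, and a 2×2 max pool halves it.

```python
import torch
import torch.nn as nn

# Standalone layers matching the quiz model's hyperparameters.
conv = nn.Conv2d(3, 6, kernel_size=3, padding=1)  # padding=1 keeps H and W at 32
pool = nn.MaxPool2d(2, 2)                         # halves H and W: 32 -> 16

x = torch.randn(10, 3, 32, 32)
after_conv = conv(x)                              # shape (10, 6, 32, 32)
after_pool = pool(torch.relu(after_conv))         # shape (10, 6, 16, 16)
flat = after_pool.view(-1, 6 * 16 * 16)           # flatten keeps batch dim first
print(after_conv.shape, after_pool.shape, flat.shape)
```

Printing intermediate shapes like this is often faster than reasoning through a whole forward pass in your head.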

Model Choice
intermediate
2:00 remaining
Which forward method correctly applies dropout during training only?

Choose the correct forward method implementation that applies dropout only during training in PyTorch.

PyTorch
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(20, 10)
        self.dropout = nn.Dropout(0.5)
        self.fc2 = nn.Linear(10, 1)

    def forward(self, x):
        # Options below
        pass
A
def forward(self, x):
    x = torch.relu(self.fc1(x))
    x = self.dropout.train()(x)
    x = self.fc2(x)
    return x
B
def forward(self, x):
    x = torch.relu(self.fc1(x))
    x = self.dropout(x)
    x = self.fc2(x)
    return x
C
def forward(self, x):
    x = torch.relu(self.fc1(x))
    x = self.dropout(x) if self.training else x
    x = self.fc2(x)
    return x
D
def forward(self, x):
    x = torch.relu(self.fc1(x))
    x = self.dropout.eval()(x)
    x = self.fc2(x)
    return x
Attempts: 2 left
💡 Hint

PyTorch dropout layers automatically apply dropout only during training mode.
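The mechanism behind the hint: every `nn.Module` carries a `training` flag that `model.train()` and `model.eval()` toggle, and `nn.Dropout` checks it internally. A minimal sketch of the behavior (standalone layer, not the quiz's `Net` class):

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)
x = torch.ones(1000)

drop.train()               # training mode: each element zeroed with probability p
train_out = drop(x)
drop.eval()                # eval mode: dropout acts as the identity
eval_out = drop(x)

print(train_out.eq(0).any().item())   # zeros appear in training mode
print(torch.equal(eval_out, x))       # True: input passes through unchanged
```

Because the layer handles the mode check itself, a plain `self.dropout(x)` in `forward` is all that is needed; manually guarding it or calling `.train()`/`.eval()` inside `forward` is redundant or wrong.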

Hyperparameter
advanced
2:00 remaining
Which learning rate will likely cause unstable training in this forward method?

Consider a neural network with the following forward method. Which learning rate is most likely to cause unstable training (loss exploding)?

PyTorch
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(100, 10)

    def forward(self, x):
        return torch.relu(self.fc(x))

model = Net()
optimizer = torch.optim.SGD(model.parameters(), lr=LR)
A. 0.001
B. 0.01
C. 0.1
D. 1.0
Attempts: 2 left
💡 Hint

Higher learning rates can cause the model weights to update too aggressively.
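To make the hint concrete: an SGD step is `w ← w − lr · grad`, so the size of each weight update scales linearly with the learning rate. A sketch comparing update magnitudes for two learning rates (the `update_norm` helper is illustrative, not part of the quiz code; the seed makes both runs see identical gradients):

```python
import torch
import torch.nn as nn

def update_norm(lr):
    """Return the L2 norm of the weight change after one SGD step."""
    torch.manual_seed(0)                  # same init and data for every lr
    model = nn.Linear(100, 10)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    x, y = torch.randn(64, 100), torch.randn(64, 10)
    before = model.weight.detach().clone()
    loss = nn.functional.mse_loss(torch.relu(model(x)), y)
    loss.backward()
    opt.step()
    return (model.weight.detach() - before).norm().item()

# With identical gradients, the lr=1.0 step is 1000x larger than lr=0.001.
print(update_norm(0.001), update_norm(1.0))
```

Steps that large can repeatedly overshoot the minimum, sending the loss upward instead of down, which is the instability the question asks about.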

🔧 Debug
advanced
2:00 remaining
Why does this forward method raise a runtime error?

Examine the forward method below. Why does it raise a runtime error when called?

PyTorch
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 5)

    def forward(self, x):
        x = self.fc1(x)
        x = x.view(-1, 10)
        return x

model = Net()
x = torch.randn(3, 10)
out = model(x)
A. The view operation tries to reshape the tensor to an incompatible size, causing a runtime error.
B. The input tensor has the wrong shape for the linear layer, causing an error.
C. The forward method is missing a return statement, causing None output.
D. The model is missing a call to super().__init__(), causing an initialization error.
Attempts: 2 left
💡 Hint

Check the shape after the linear layer and what the view tries to reshape it into.
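The underlying rule: `view` can only reshape a tensor into a shape with the same total number of elements, and the `-1` dimension must resolve to a whole number. A standalone sketch of the failure mode (a tensor with the same element count as `fc1`'s output, 3 × 5 = 15, which is not a multiple of 10):

```python
import torch

t = torch.randn(3, 5)        # 15 elements, like the (3, 10) input after Linear(10, 5)
try:
    t.view(-1, 10)           # 15 is not divisible by 10 -> RuntimeError
except RuntimeError as e:
    print("RuntimeError:", e)
```

Checking divisibility of the element count against the target shape is usually the fastest way to debug `view`/`reshape` errors.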

🧠 Conceptual
expert
2:00 remaining
What is the role of the forward method in PyTorch's nn.Module?

Choose the best description of the role of the forward method in a PyTorch nn.Module subclass.

A. It defines how input data flows through the layers to produce output during the model's forward pass.
B. It initializes the model's parameters and layers before training starts.
C. It updates the model's weights during backpropagation after loss calculation.
D. It saves the model's state dictionary to disk for later use.
Attempts: 2 left
💡 Hint

Think about what happens when you call model(input).
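What happens on `model(input)`: `nn.Module.__call__` runs any registered hooks and then dispatches to your `forward` method, which is why you define `forward` but never call it directly. A minimal sketch (the `Tiny` class is illustrative, not from the quiz):

```python
import torch
import torch.nn as nn

class Tiny(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

    def forward(self, x):
        # Defines how input flows through the layers.
        return self.fc(x)

model = Tiny()
x = torch.randn(3, 4)
# model(x) goes through __call__, which dispatches to forward(x).
print(torch.equal(model(x), model.forward(x)))  # same computation
```

Calling `model(x)` rather than `model.forward(x)` is the idiomatic form, since it ensures hooks and other `__call__` machinery run.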