Complete the code to define a PyTorch model class that inherits from nn.Module.
import torch.nn as nn

class SimpleModel(nn.Module):
    def __init__(self):
        super(SimpleModel, self).__init__()
        self.linear = nn.Linear(10, 1)

    def [1](self, x):
        return self.linear(x)
The forward method defines how the input data passes through the model layers.
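A minimal runnable sketch of the completed class in use (assuming torch is installed; the batch size and variable names are illustrative):

```python
import torch
import torch.nn as nn

class SimpleModel(nn.Module):
    def __init__(self):
        super(SimpleModel, self).__init__()
        self.linear = nn.Linear(10, 1)

    def forward(self, x):
        # Describes how input data passes through the layers
        return self.linear(x)

model = SimpleModel()
x = torch.randn(2, 10)   # batch of 2 samples, 10 features each
y = model(x)             # calling the model invokes forward()
print(y.shape)           # torch.Size([2, 1])
```

Note that the model is called as `model(x)` rather than `model.forward(x)`; nn.Module's `__call__` runs hooks and then dispatches to `forward`.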
Complete the code to initialize the parent class in the model constructor.
class MyModel(nn.Module):
    def __init__(self):
        [1]
        self.layer = nn.Linear(5, 2)
Calling super(MyModel, self).__init__() properly initializes the parent nn.Module class.
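In Python 3, the shorter `super().__init__()` is equivalent. A sketch showing that initializing the parent class is what registers the layer's parameters (the parameter count shown is just 5×2 weights plus 2 biases):

```python
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        # Python 3 shorthand, equivalent to super(MyModel, self).__init__()
        super().__init__()
        self.layer = nn.Linear(5, 2)

model = MyModel()
# Because nn.Module was initialized first, the Linear layer's
# parameters were registered when it was assigned as an attribute.
n_params = sum(p.numel() for p in model.parameters())
print(n_params)  # 12
```

Skipping the `super().__init__()` call would instead raise an error when `self.layer` is assigned, because nn.Module's internal bookkeeping would not exist yet.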
Fix the error in the forward method to correctly use the linear layer.
class FixModel(nn.Module):
    def __init__(self):
        super(FixModel, self).__init__()
        self.linear = nn.Linear(3, 1)

    def forward(self, x):
        return self.linear[1]
In PyTorch, layers are called like functions with parentheses, e.g., self.linear(x).
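A small sketch of calling a layer directly, outside any model class (the tensor shape here is illustrative):

```python
import torch
import torch.nn as nn

linear = nn.Linear(3, 1)   # maps 3 input features to 1 output
x = torch.randn(4, 3)      # batch of 4 samples
out = linear(x)            # call the layer like a function
print(out.shape)           # torch.Size([4, 1])
```

Using square brackets, as in `linear[x]`, would attempt indexing and fail, since nn.Linear is callable but not subscriptable.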
Fill both blanks to create a model with two layers and use ReLU activation between them.
class TwoLayerModel(nn.Module):
    def __init__(self):
        super(TwoLayerModel, self).__init__()
        self.fc1 = nn.Linear(4, 3)
        self.fc2 = nn.Linear(3, 1)
        self.relu = nn.ReLU()

    def forward(self, x):
        x = self.fc1[1]
        x = self.relu[2]
        return self.fc2(x)
Layers and activations are called with parentheses like functions, e.g., self.fc1(x) and self.relu(x).
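A runnable sketch of the completed two-layer model (the batch size is illustrative):

```python
import torch
import torch.nn as nn

class TwoLayerModel(nn.Module):
    def __init__(self):
        super(TwoLayerModel, self).__init__()
        self.fc1 = nn.Linear(4, 3)
        self.fc2 = nn.Linear(3, 1)
        self.relu = nn.ReLU()

    def forward(self, x):
        x = self.fc1(x)      # 4 features -> 3
        x = self.relu(x)     # nonlinearity between the layers
        return self.fc2(x)   # 3 features -> 1

model = TwoLayerModel()
out = model(torch.randn(2, 4))
print(out.shape)  # torch.Size([2, 1])
```

Without the ReLU between them, the two linear layers would collapse into a single linear transformation; the activation is what lets the model represent nonlinear functions.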
Fill all three blanks to define a model that stores layers as attributes, calls them in forward, and returns the output.
class CustomModel(nn.Module):
    def __init__(self):
        super(CustomModel, self).__init__()
        self.layer1 = nn.Linear(6, 4)
        self.activation = nn.Sigmoid()
        self.layer2 = nn.Linear(4, 2)

    def forward(self, input):
        out = self.layer1[1]
        out = self.activation[2]
        return self.layer2[3]
Each layer and activation is called like a function using parentheses, e.g., self.layer1(input).
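A runnable sketch of the completed model, put through a forward pass (the batch size is illustrative):

```python
import torch
import torch.nn as nn

class CustomModel(nn.Module):
    def __init__(self):
        super(CustomModel, self).__init__()
        self.layer1 = nn.Linear(6, 4)
        self.activation = nn.Sigmoid()
        self.layer2 = nn.Linear(4, 2)

    def forward(self, input):
        out = self.layer1(input)     # 6 features -> 4
        out = self.activation(out)   # squash values into (0, 1)
        return self.layer2(out)      # 4 features -> 2

model = CustomModel()
out = model(torch.randn(3, 6))
print(out.shape)  # torch.Size([3, 2])
```

Storing layers as attributes in `__init__` is what registers their parameters with the module, so `model.parameters()` finds them automatically for optimizers.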