PyTorch · ~10 mins

Why nn.Module organizes model code in PyTorch - Test Your Understanding

Practice - 5 Tasks
Answer the questions below
Task 1: fill in the blank (easy)

Complete the code to define a PyTorch model class that inherits from nn.Module.

PyTorch
import torch.nn as nn

class SimpleModel(nn.Module):
    def __init__(self):
        super(SimpleModel, self).__init__()
        self.linear = nn.Linear(10, 1)

    def [1](self, x):
        return self.linear(x)
A. forward
B. apply_activation
C. backward
D. train
Common Mistakes
Using 'backward' or 'train' instead of 'forward' for the method name.
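For reference, assuming the intended answer is A (forward), a runnable completed version of this task might look like the sketch below; the demo input shape is illustrative.

```python
import torch
import torch.nn as nn

class SimpleModel(nn.Module):
    def __init__(self):
        super(SimpleModel, self).__init__()
        self.linear = nn.Linear(10, 1)

    def forward(self, x):
        # 'forward' is the method nn.Module dispatches to when the model is called
        return self.linear(x)

model = SimpleModel()
out = model(torch.randn(2, 10))  # calling the model invokes forward
print(out.shape)  # torch.Size([2, 1])
```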
Task 2: fill in the blank (medium)

Complete the code to initialize the parent class in the model constructor.

PyTorch
class MyModel(nn.Module):
    def __init__(self):
        [1]
        self.layer = nn.Linear(5, 2)
A. super().__init__()
B. self.__init__()
C. nn.Module.__init__(self)
D. super(MyModel, self).__init__()
Common Mistakes
Calling self.__init__() causes infinite recursion.
super().__init__() without arguments is also valid in Python 3, but this exercise asks for the explicit super(MyModel, self).__init__() form.
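For reference, assuming the intended answer is D (the explicit form), a completed version is sketched below. Calling the parent initializer before assigning layers lets nn.Module register them, so parameters() can find them; the parameter-count check at the end is illustrative.

```python
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super(MyModel, self).__init__()  # explicit form; super().__init__() also works in Python 3
        self.layer = nn.Linear(5, 2)

model = MyModel()
# nn.Linear(5, 2) holds a 2x5 weight matrix plus 2 biases = 12 parameters
print(sum(p.numel() for p in model.parameters()))  # 12
```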
Task 3: fill in the blank (hard)

Fix the error in the forward method to correctly use the linear layer.

PyTorch
class FixModel(nn.Module):
    def __init__(self):
        super(FixModel, self).__init__()
        self.linear = nn.Linear(3, 1)

    def forward(self, x):
        return self.linear[1](x)
A. .call
B. .forward
C. ()
D. .apply
Common Mistakes
Trying to call '.forward(x)' directly instead of using parentheses.
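For reference, assuming the intended answer is C (plain parentheses), the fixed model is sketched below. Calling the layer itself goes through nn.Module.__call__, which runs registered hooks before dispatching to forward; the demo input is illustrative.

```python
import torch
import torch.nn as nn

class FixModel(nn.Module):
    def __init__(self):
        super(FixModel, self).__init__()
        self.linear = nn.Linear(3, 1)

    def forward(self, x):
        # call the layer directly; self.linear.forward(x) would skip hooks
        return self.linear(x)

model = FixModel()
print(model(torch.randn(4, 3)).shape)  # torch.Size([4, 1])
```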
Task 4: fill in the blank (hard)

Fill both blanks to create a model with two layers and use ReLU activation between them.

PyTorch
class TwoLayerModel(nn.Module):
    def __init__(self):
        super(TwoLayerModel, self).__init__()
        self.fc1 = nn.Linear(4, 3)
        self.fc2 = nn.Linear(3, 1)
        self.relu = nn.ReLU()

    def forward(self, x):
        x = self.fc1[1](x)
        x = self.relu[2](x)
        return self.fc2(x)
A. ()
B. .forward
C. .call
D. .apply
Common Mistakes
Using '.forward' or '.call' instead of parentheses to call layers.
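For reference, assuming both blanks are A (plain parentheses), the completed two-layer model is sketched below; the demo input shape is illustrative.

```python
import torch
import torch.nn as nn

class TwoLayerModel(nn.Module):
    def __init__(self):
        super(TwoLayerModel, self).__init__()
        self.fc1 = nn.Linear(4, 3)
        self.fc2 = nn.Linear(3, 1)
        self.relu = nn.ReLU()

    def forward(self, x):
        x = self.fc1(x)    # linear layer called like a function
        x = self.relu(x)   # activation module called the same way
        return self.fc2(x)

model = TwoLayerModel()
print(model(torch.randn(2, 4)).shape)  # torch.Size([2, 1])
```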
Task 5: fill in the blank (hard)

Fill all three blanks to define a model that stores layers as attributes, calls them in forward, and returns the output.

PyTorch
class CustomModel(nn.Module):
    def __init__(self):
        super(CustomModel, self).__init__()
        self.layer1 = nn.Linear(6, 4)
        self.activation = nn.Sigmoid()
        self.layer2 = nn.Linear(4, 2)

    def forward(self, input):
        out = self.layer1[1](input)
        out = self.activation[2](out)
        return self.layer2[3](out)
A. ()
B. .forward
C. .call
D. .apply
Common Mistakes
Trying to call layers with '.forward' or '.call' instead of parentheses.
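For reference, assuming all three blanks are A (plain parentheses), the completed model is sketched below; the demo input shape is illustrative.

```python
import torch
import torch.nn as nn

class CustomModel(nn.Module):
    def __init__(self):
        super(CustomModel, self).__init__()
        self.layer1 = nn.Linear(6, 4)
        self.activation = nn.Sigmoid()
        self.layer2 = nn.Linear(4, 2)

    def forward(self, input):
        out = self.layer1(input)      # each stored module is invoked with ()
        out = self.activation(out)
        return self.layer2(out)

model = CustomModel()
print(model(torch.randn(3, 6)).shape)  # torch.Size([3, 2])
```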