PyTorch · ~10 mins

Defining a model class in PyTorch - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (easy)

Complete the code to import the PyTorch module needed to define a model.

PyTorch
import torch.nn as [1]
A. optim
B. utils
C. nn
D. data
Common Mistakes
Importing 'optim' instead of 'nn' for model definition.
Using 'data' which is for datasets, not models.
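To see why the nn submodule is the one needed here, the following minimal sketch shows what the conventional import gives you: the building blocks for defining models, as opposed to torch.optim (optimizers) or torch.utils.data (datasets).

```python
import torch.nn as nn  # standard alias for PyTorch's neural-network building blocks

# nn provides the base class for models and the layer types used inside them.
print(issubclass(nn.Linear, nn.Module))  # True: layers are themselves Modules
```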
Task 2: Fill in the blank (medium)

Complete the code to define a model class that inherits from the correct PyTorch base class.

PyTorch
class MyModel([1]):
    def __init__(self):
        super().__init__()
A. nn.Linear
B. nn.Module
C. nn.Parameter
D. nn.Sequential
Common Mistakes
Inheriting from nn.Sequential, which is a layer container, not the general-purpose base class.
Inheriting from nn.Linear, which is a single layer, not a base class.
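Putting the task together, a minimal sketch of a model class that inherits from nn.Module: calling super().__init__() first lets PyTorch register any layers you assign as attributes.

```python
import torch
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()              # required so submodules get registered
        self.layer = nn.Linear(10, 5)   # 10 input features -> 5 output features

    def forward(self, x):
        return self.layer(x)

model = MyModel()
out = model(torch.randn(2, 10))  # batch of 2 samples with 10 features each
print(out.shape)                 # torch.Size([2, 5])
```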
Task 3: Fill in the blank (hard)

Fix the error in the forward method definition by completing the method signature correctly.

PyTorch
def [1](self, x):
    return self.layer(x)
A. run
B. predict
C. call
D. forward
Common Mistakes
Naming the method 'predict', which PyTorch never calls automatically.
Using 'call' or 'run', which are likewise not dispatched by nn.Module.
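The method must be named forward because nn.Module.__call__ dispatches to it (and runs any registered hooks along the way). A small sketch to illustrate the dispatch:

```python
import torch
import torch.nn as nn

class Doubler(nn.Module):
    def forward(self, x):
        return x * 2

m = Doubler()
t = torch.ones(3)
# Calling the module invokes forward() via nn.Module.__call__;
# a method named 'predict' or 'run' would never be dispatched this way.
print(m(t))  # tensor([2., 2., 2.])
```

Note that m(t), not m.forward(t), is the idiomatic call: the former also runs hooks and other bookkeeping.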
Task 4: Fill in the blank (hard)

Fill both blanks to define a linear layer with 10 inputs and 5 outputs inside the model's __init__ method.

PyTorch
self.layer = nn.[1]([2], 5)
A. Linear
B. 10
C. Conv2d
D. ReLU
Common Mistakes
Using Conv2d, which is for image-like data, not a fully connected layer.
Putting the output size before the input size: the signature is nn.Linear(in_features, out_features).
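The argument order matters: nn.Linear takes in_features first, then out_features. A quick sketch showing the resulting shapes (note the weight matrix is stored as (out, in)):

```python
import torch
import torch.nn as nn

layer = nn.Linear(10, 5)       # nn.Linear(in_features=10, out_features=5)
x = torch.randn(4, 10)         # batch of 4 samples, 10 features each
y = layer(x)
print(y.shape)                 # torch.Size([4, 5])
print(layer.weight.shape)      # torch.Size([5, 10]): stored as (out, in)
```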
Task 5: Fill in the blank (hard)

Fill all three blanks to complete the forward method that applies the linear layer and then a ReLU activation.

PyTorch
def forward(self, x):
    x = self.layer(x)
    x = nn.[1]()([2])
    return [3]
A. Sigmoid
B. x
C. ReLU
D. self
Common Mistakes
Using Sigmoid instead of the ReLU activation the task asks for.
Returning 'self' (the model) instead of the output tensor 'x'.
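Assembling all five tasks, a sketch of the complete model: a linear layer (10 inputs, 5 outputs) followed by a ReLU activation, written in the inline nn.ReLU()(x) style the task's blanks use.

```python
import torch
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(10, 5)  # 10 inputs -> 5 outputs

    def forward(self, x):
        x = self.layer(x)
        x = nn.ReLU()(x)  # instantiate ReLU and apply it to x, as in the task
        return x          # return the output tensor, not self

model = MyModel()
out = model(torch.randn(8, 10))
print(out.shape)          # torch.Size([8, 5])
print((out >= 0).all())   # ReLU output is never negative: tensor(True)
```

In practice the activation is usually created once in __init__ (self.relu = nn.ReLU()) or applied functionally via torch.relu(x); constructing nn.ReLU() inside forward works but rebuilds the module on every call.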