PyTorch · ~10 mins

forward method in PyTorch - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (easy)

Complete the code to define the forward method input parameter.

PyTorch
def forward(self, [1]):
    return x * 2
A. input
B. self
C. output
D. x
Common Mistakes
Using 'self' as the input parameter instead of 'x'.
Using 'output' or 'input' which are not standard names here.
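For reference, the completed method from Task 1 can be exercised as a minimal runnable module (the class name `Doubler` and the test tensor are illustrative, not part of the task):

```python
import torch
import torch.nn as nn

class Doubler(nn.Module):
    # forward receives the input tensor as x; self is passed implicitly
    def forward(self, x):
        return x * 2

m = Doubler()
print(m(torch.tensor([1.0, 2.0])))  # tensor([2., 4.])
```

Calling `m(...)` invokes `forward` through `nn.Module.__call__`, which is why `forward` is never called directly.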
Task 2: Fill in the blank (medium)

Complete the code to apply a linear layer inside the forward method.

PyTorch
def forward(self, x):
    out = self.[1](x)
    return out
A. linear
B. relu
C. conv
D. dropout
Common Mistakes
Using activation functions like 'relu' instead of the layer.
Using 'conv' or 'dropout' which are different layer types.
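A minimal sketch of Task 2's pattern in context: the layer is created in `__init__` and applied in `forward`. The class name and layer sizes below are illustrative:

```python
import torch
import torch.nn as nn

class LinearNet(nn.Module):
    def __init__(self):
        super().__init__()
        # the layer is defined once in __init__ ...
        self.linear = nn.Linear(4, 2)

    def forward(self, x):
        # ... and applied to the input in forward
        out = self.linear(x)
        return out

m = LinearNet()
print(m(torch.randn(3, 4)).shape)  # torch.Size([3, 2])
```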
Task 3: Fill in the blank (hard)

Fix the error in the forward method to correctly apply ReLU activation.

PyTorch
def forward(self, x):
    out = self.linear(x)
    out = [1](out)
    return out
A. nn.ReLU
B. relu
C. self.relu
D. ReLU
Common Mistakes
Using 'relu(out)' without 'self.' causes a NameError.
Using the class name 'nn.ReLU' without instantiation.
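The fix from Task 3, shown as a complete runnable module (class name and dimensions are illustrative): `nn.ReLU` is instantiated once in `__init__` and stored as an attribute, so `self.relu` is a callable module inside `forward`.

```python
import torch
import torch.nn as nn

class LinearReLU(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 2)
        # instantiate the activation here so self.relu exists in forward
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.linear(x)
        out = self.relu(out)  # bare relu(out) would raise NameError
        return out

m = LinearReLU()
print((m(torch.randn(3, 4)) >= 0).all())  # tensor(True)
```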
Task 4: Fill in the blank (hard)

Fill both blanks to apply the first linear layer followed by its activation.

PyTorch
def forward(self, x):
    out = self.[1](x)
    out = self.[2](out)
    return out
A. linear1
B. linear2
C. relu
D. dropout
Common Mistakes
Using linear2 first instead of linear1.
Using dropout instead of activation.
Task 5: Fill in the blank (hard)

Fill all three blanks to complete the forward method with two linear layers and activation.

PyTorch
def forward(self, x):
    out = self.[1](x)
    out = self.[2](out)
    out = self.[3](out)
    return out
A. linear1
B. relu
C. linear2
D. dropout
Common Mistakes
Mixing the order of layers and activation.
Using dropout instead of linear2.
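Putting Tasks 4 and 5 together, a full two-layer network with the correct ordering (first linear layer, activation, second linear layer). The class name and feature sizes below are illustrative choices, not part of the tasks:

```python
import torch
import torch.nn as nn

class TwoLayerNet(nn.Module):
    def __init__(self, in_features=8, hidden=4, out_features=2):
        super().__init__()
        self.linear1 = nn.Linear(in_features, hidden)
        self.relu = nn.ReLU()
        self.linear2 = nn.Linear(hidden, out_features)

    def forward(self, x):
        out = self.linear1(x)    # first linear layer
        out = self.relu(out)     # activation between the layers
        out = self.linear2(out)  # second linear layer
        return out

m = TwoLayerNet()
print(m(torch.randn(5, 8)).shape)  # torch.Size([5, 2])
```

Swapping `linear1` and `linear2` would fail at runtime here, since `linear2` expects 4 input features but would receive 8.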