Practice - 5 Tasks
Answer the questions below
Task 1 (fill in blank, easy)
Complete the code to define the forward method input parameter.
PyTorch:
    def forward(self, [1]):
        return x * 2
Common Mistakes
Using 'self' as the input parameter instead of 'x'.
Using 'output' or 'input' which are not standard names here.
The forward method takes the input tensor as a parameter, commonly named x.
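A minimal sketch of the full context this blank sits in, assuming a complete nn.Module subclass (the class name Doubler is illustrative):

```python
import torch
import torch.nn as nn

class Doubler(nn.Module):
    # forward receives the input tensor as a parameter, conventionally named x
    def forward(self, x):
        return x * 2

model = Doubler()
out = model(torch.tensor([1.0, 2.0]))  # calling the module invokes forward
```

Note that you call the module itself, not forward directly; nn.Module routes the call through forward for you.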
Task 2 (fill in blank, medium)
Complete the code to apply a linear layer inside the forward method.
PyTorch:
    def forward(self, x):
        out = self.[1](x)
        return out
Common Mistakes
Using activation functions like 'relu' instead of the layer.
Using 'conv' or 'dropout' which are different layer types.
The linear layer is applied by calling self.linear(x) inside the forward method.
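A sketch of how this fits together, assuming the layer is registered in __init__ (the class name and the 4-in/2-out dimensions are illustrative):

```python
import torch
import torch.nn as nn

class LinearModel(nn.Module):
    def __init__(self):
        super().__init__()
        # the layer must be created in __init__ before forward can call it
        self.linear = nn.Linear(4, 2)

    def forward(self, x):
        out = self.linear(x)  # apply the linear layer to the input
        return out

model = LinearModel()
out = model(torch.randn(3, 4))  # batch of 3, each with 4 features
```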
Task 3 (fill in blank, hard)
Fix the error in the forward method to correctly apply ReLU activation.
PyTorch:
    def forward(self, x):
        out = self.linear(x)
        out = [1](out)
        return out
Common Mistakes
Using 'relu(out)' without 'self.' causes a NameError.
Using the class name 'nn.ReLU' without instantiation.
The ReLU activation is applied as self.relu(out) if self.relu is defined in __init__.
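A working sketch, assuming self.relu is instantiated in __init__ as the explanation describes (class name and layer sizes are illustrative):

```python
import torch
import torch.nn as nn

class ReluModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 2)
        self.relu = nn.ReLU()  # instantiate the activation once in __init__

    def forward(self, x):
        out = self.linear(x)
        # self.relu(out) works; bare relu(out) raises NameError,
        # and nn.ReLU(out) would construct a module, not apply one
        out = self.relu(out)
        return out

out = ReluModel()(torch.randn(3, 4))
```

An equally common alternative is the functional form, torch.relu(out), which needs no attribute in __init__.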
Task 4 (fill in blank, hard)
Fill both blanks to complete the forward method with a layer and activation.
PyTorch:
    def forward(self, x):
        out = self.[1](x)
        out = self.[2](out)
        return out
Common Mistakes
Using linear2 first instead of linear1.
Using dropout instead of activation.
The input is passed through linear1 then activated by relu.
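A sketch of the intended order, assuming linear1 and relu are defined in __init__ (class name and dimensions are illustrative):

```python
import torch
import torch.nn as nn

class TwoStep(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear1 = nn.Linear(4, 8)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.linear1(x)  # [1] the linear layer comes first
        out = self.relu(out)   # [2] then its output is activated
        return out

out = TwoStep()(torch.randn(3, 4))
```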
Task 5 (fill in blank, hard)
Fill all three blanks to complete the forward method with two linear layers and activation.
PyTorch:
    def forward(self, x):
        out = self.[1](x)
        out = self.[2](out)
        out = self.[3](out)
        return out
Common Mistakes
Mixing the order of layers and activation.
Using dropout instead of linear2.
The input goes through linear1, then relu activation, then linear2 layer.
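The full pattern as a runnable sketch, assuming the layers are defined in __init__ (class name and the 4 → 8 → 2 dimensions are illustrative):

```python
import torch
import torch.nn as nn

class MLP(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear1 = nn.Linear(4, 8)
        self.relu = nn.ReLU()
        self.linear2 = nn.Linear(8, 2)

    def forward(self, x):
        out = self.linear1(x)   # [1] first linear layer
        out = self.relu(out)    # [2] activation between the layers
        out = self.linear2(out) # [3] second linear layer
        return out

out = MLP()(torch.randn(3, 4))
```

Keeping the activation between the two linear layers matters: two linear layers applied back to back collapse into a single linear map.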