PyTorch · ~10 mins

Forward pass computation in PyTorch - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1 · fill in the blank · easy

Complete the code to perform a forward pass using a PyTorch model.

PyTorch
output = model([1])
A. loss
B. model
C. optimizer
D. input_tensor
Common Mistakes
Passing the model itself instead of the input data.
Passing the loss or optimizer instead of input data.
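For reference, the completed line might look like the sketch below, using a hypothetical `nn.Linear` model and a random input tensor (both invented here for illustration):

```python
import torch
import torch.nn as nn

# Hypothetical model and input, for illustration only.
model = nn.Linear(4, 2)           # maps 4 input features to 2 outputs
input_tensor = torch.randn(3, 4)  # batch of 3 samples

# The blank takes the input data: calling the model runs its forward pass.
output = model(input_tensor)
```

Passing `loss` or `optimizer` here would fail, since the model expects a tensor of input data, not a training utility.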
Task 2 · fill in the blank · medium

Complete the code to apply the ReLU activation function after the linear layer.

PyTorch
x = torch.nn.functional.[1](linear_layer(input_tensor))
A. relu
B. tanh
C. softmax
D. sigmoid
Common Mistakes
Using sigmoid or softmax, which serve different purposes (e.g. producing probabilities) rather than a hidden-layer activation.
Forgetting to apply any activation function.
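One possible completed version is sketched below, with a hypothetical linear layer and input invented for illustration. ReLU zeroes out negative activations, which is easy to verify:

```python
import torch
import torch.nn.functional as F

linear_layer = torch.nn.Linear(4, 2)  # hypothetical layer for illustration
input_tensor = torch.randn(3, 4)

# ReLU replaces every negative activation with zero.
x = F.relu(linear_layer(input_tensor))
```

After this call, every element of `x` is non-negative, which is the defining property of ReLU.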
Task 3 · fill in the blank · hard

Fix the error in the forward pass by completing the missing method call.

PyTorch
output = model.[1](input_tensor)
A. forward
B. backward
C. train
D. eval
Common Mistakes
Using backward which is for gradients, not forward pass.
Using train or eval which are mode settings.
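A minimal sketch of the completed call, using a hypothetical `nn.Linear` model invented for illustration:

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)            # hypothetical model
input_tensor = torch.randn(1, 4)

# forward is the method that maps inputs to outputs.
output = model.forward(input_tensor)
```

Note that in real code the idiomatic call is `model(input_tensor)`, which invokes `forward` through `__call__` and also runs any registered hooks; `backward` computes gradients from a loss, and `train()`/`eval()` only toggle the model's mode.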
Task 4 · fill in the blank · hard

Fill both blanks to compute the output and apply softmax activation.

PyTorch
logits = model([1])
probabilities = torch.nn.functional.[2](logits, dim=1)
A. input_tensor
B. relu
C. softmax
D. sigmoid
Common Mistakes
Using ReLU or sigmoid instead of softmax for probabilities.
Passing something other than input tensor to the model.
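One way the completed snippet might look, assuming a hypothetical 3-class classifier head (the model and tensors here are invented for illustration). Softmax turns raw logits into probabilities, so each row sums to 1:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Linear(4, 3)           # hypothetical 3-class classifier head
input_tensor = torch.randn(2, 4)  # batch of 2 samples

logits = model(input_tensor)              # raw, unnormalized scores
probabilities = F.softmax(logits, dim=1)  # normalize across classes (dim=1)
```

ReLU or sigmoid would not work here: neither guarantees that the class scores in each row sum to 1.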
Task 5 · fill in the blank · hard

Fill all three blanks to perform a forward pass, apply ReLU, and compute loss.

PyTorch
logits = model([1])
activated = torch.nn.functional.[2](logits)
loss = loss_fn(activated, [3])
A. input_tensor
B. relu
C. target_tensor
D. sigmoid
Common Mistakes
Mixing up input and target tensors.
Using sigmoid instead of ReLU for activation here.
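A sketch of the full completed pipeline, with a hypothetical model, MSE loss, and random tensors invented for illustration. The input goes into the model, ReLU is applied to the logits, and the loss compares the activated output against the target:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical model, loss function, and tensors for illustration.
model = nn.Linear(4, 2)
loss_fn = nn.MSELoss()
input_tensor = torch.randn(3, 4)
target_tensor = torch.randn(3, 2)  # same shape as the model output

logits = model(input_tensor)              # blank 1: forward pass on the input
activated = F.relu(logits)                # blank 2: ReLU activation
loss = loss_fn(activated, target_tensor)  # blank 3: compare against the target
```

Swapping `input_tensor` and `target_tensor` is the classic mistake: the model consumes the input, while the loss function consumes the target.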