PyTorch · ~10 mins

Activation functions (ReLU, Sigmoid, Softmax) in PyTorch - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (easy)

Complete the code to apply the ReLU activation function to the tensor.

PyTorch
import torch
import torch.nn.functional as F

x = torch.tensor([-1.0, 0.0, 1.0, 2.0])
output = F.[1](x)
print(output)
A. relu
B. sigmoid
C. softmax
D. tanh
Common Mistakes
Using sigmoid or softmax instead of relu for this task.
Forgetting to import torch.nn.functional as F.
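For reference, a minimal sketch of the completed code, assuming relu (option A) is the intended blank:

```python
import torch
import torch.nn.functional as F

# ReLU clamps negative values to zero and leaves non-negative values unchanged
x = torch.tensor([-1.0, 0.0, 1.0, 2.0])
output = F.relu(x)
print(output)  # tensor([0., 0., 1., 2.])
```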
Task 2: Fill in the blank (medium)

Complete the code to apply the Sigmoid activation function to the tensor.

PyTorch
import torch
import torch.nn.functional as F

x = torch.tensor([-2.0, 0.0, 2.0])
output = F.[1](x)
print(output)
A. softmax
B. relu
C. sigmoid
D. log_softmax
Common Mistakes
Using ReLU instead of Sigmoid for probability outputs.
Using softmax when only one dimension is present.
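A sketch of the completed code, assuming sigmoid (option C) is the intended blank. Sigmoid maps each element independently into the open interval (0, 1), with sigmoid(0) exactly 0.5:

```python
import torch
import torch.nn.functional as F

# Sigmoid squashes each element into (0, 1); torch.sigmoid is the
# preferred spelling, though F.sigmoid also works in the quiz context
x = torch.tensor([-2.0, 0.0, 2.0])
output = torch.sigmoid(x)
print(output)  # approximately tensor([0.1192, 0.5000, 0.8808])
```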
Task 3: Fill in the blank (hard)

Fix the error in applying Softmax activation along the correct dimension.

PyTorch
import torch
import torch.nn.functional as F

x = torch.tensor([[1.0, 2.0, 3.0], [1.0, 2.0, 3.0]])
output = F.softmax(x, dim=[1])
print(output)
A. 0
B. 1
C. 2
D. -1
Common Mistakes
Using dim=0 applies softmax across batches, which is usually incorrect.
Using dim=2 raises an IndexError for a 2D tensor.
Note that dim=-1 refers to the last dimension, so for a 2D tensor it behaves the same as dim=1.
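A sketch of the corrected code, assuming dim=1 (option B) is the intended answer. Softmax over dim=1 normalizes across the feature axis, so each row becomes a probability distribution:

```python
import torch
import torch.nn.functional as F

x = torch.tensor([[1.0, 2.0, 3.0], [1.0, 2.0, 3.0]])

# dim=1 normalizes across features: every row sums to 1
probs = F.softmax(x, dim=1)
print(probs.sum(dim=1))  # tensor([1., 1.])

# dim=-1 is equivalent to dim=1 for a 2D tensor
assert torch.allclose(F.softmax(x, dim=-1), probs)
```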
Task 4: Fill in the blanks (hard)

Fill both blanks to create a tensor and apply ReLU activation.

PyTorch
import torch
import torch.nn.functional as F

x = torch.tensor([1])
output = F.[2](x)
print(output)
A. [[-1.0, 0.0, 1.0], [2.0, -2.0, 3.0]]
B. [1, 2, 3]
C. relu
D. sigmoid
Common Mistakes
Using sigmoid instead of relu changes output range.
Using a 1D tensor when a 2D tensor is expected.
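A sketch of the completed code, assuming the 2D tensor (option A) and relu (option C) are the intended blanks:

```python
import torch
import torch.nn.functional as F

# 2D tensor; ReLU zeroes the negative entries element-wise
x = torch.tensor([[-1.0, 0.0, 1.0], [2.0, -2.0, 3.0]])
output = F.relu(x)
print(output)  # tensor([[0., 0., 1.], [2., 0., 3.]])
```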
Task 5: Fill in the blanks (hard)

Fill all three blanks to create a tensor, apply Softmax along the correct dimension, and print the result.

PyTorch
import torch
import torch.nn.functional as F

x = torch.tensor([1])
output = F.[2](x, dim=[3])
print(output)
A. [[2.0, 1.0, 0.1], [1.0, 3.0, 0.2]]
B. relu
C. 1
D. softmax
Common Mistakes
Using relu instead of softmax changes output meaning.
Using dim=0 applies softmax across batches, not features.
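A sketch of the completed code, assuming the 2D tensor (option A), softmax (option D), and dim=1 (option C) are the intended blanks:

```python
import torch
import torch.nn.functional as F

# 2D tensor; softmax over dim=1 turns each row into a probability distribution
x = torch.tensor([[2.0, 1.0, 0.1], [1.0, 3.0, 0.2]])
output = F.softmax(x, dim=1)
print(output)
print(output.sum(dim=1))  # each row sums to 1
```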