PyTorch · ~20 mins

Activation functions (ReLU, Sigmoid, Softmax) in PyTorch - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️
Activation Mastery
Get all challenges correct to earn this badge!
Test your skills under time pressure!
Predict Output
intermediate
Time limit: 1:30
Output of ReLU activation on a tensor
What is the output of the following PyTorch code applying ReLU activation?
PyTorch
import torch
import torch.nn.functional as F
x = torch.tensor([-2.0, 0.0, 3.0, -1.5, 2.5])
output = F.relu(x)
print(output.tolist())
A. [-2.0, 0.0, 0.0, -1.5, 0.0]
B. [-2.0, 0.0, 3.0, -1.5, 2.5]
C. [2.0, 0.0, 3.0, 1.5, 2.5]
D. [0.0, 0.0, 3.0, 0.0, 2.5]
Attempts allowed: 2
💡 Hint
ReLU replaces negative values with zero and keeps positive values unchanged.
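The hint can be checked without PyTorch. A minimal pure-Python sketch of elementwise ReLU (standard library only, on different sample values so it does not give away the answer above):

```python
# ReLU: f(x) = max(0, x), applied elementwise.
# Negative inputs become 0.0; non-negative inputs pass through unchanged.
def relu(values):
    return [max(0.0, v) for v in values]

print(relu([-3.0, 1.0, -0.5, 4.0]))  # [0.0, 1.0, 0.0, 4.0]
```

The PyTorch call F.relu(x) applies the same rule to every element of the tensor.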
Predict Output
intermediate
Time limit: 1:30
Output of Sigmoid activation on a tensor
What is the output of the following PyTorch code applying Sigmoid activation?
PyTorch
import torch
import torch.nn.functional as F
x = torch.tensor([-1.0, 0.0, 1.0])
output = torch.sigmoid(x)
print([round(v.item(), 3) for v in output])
A. [-0.731, 0.0, 0.731]
B. [0.0, 0.5, 1.0]
C. [0.269, 0.5, 0.731]
D. [0.5, 0.5, 0.5]
Attempts allowed: 2
💡 Hint
Sigmoid outputs values between 0 and 1, with 0.5 at input zero.
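The two properties in the hint can be verified with a pure-Python sigmoid (standard library only; sample inputs chosen so the quiz answer is not spoiled):

```python
import math

# Sigmoid: f(x) = 1 / (1 + e^(-x)), squashes any real input into (0, 1).
def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

print(round(sigmoid(0.0), 3))                   # 0.5 at input zero
print(round(sigmoid(2.0), 3))                   # 0.881
# Symmetry around 0.5: sigmoid(-x) == 1 - sigmoid(x)
print(round(sigmoid(-2.0) + sigmoid(2.0), 3))   # 1.0
```

The symmetry property is a quick way to sanity-check option pairs: the outputs for -x and x must sum to 1.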
Model Choice
advanced
Time limit: 1:30
Choosing activation function for multi-class classification
Which activation function is most appropriate for the output layer of a neural network performing multi-class classification?
A. Sigmoid
B. Softmax
C. ReLU
D. Tanh
Attempts allowed: 2
💡 Hint
The output should represent probabilities that sum to 1 across classes.
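A pure-Python sketch of what the hint describes (standard library only; the max-subtraction trick is a common stability convention, not required by the math):

```python
import math

# Softmax: exponentiate each logit, then normalize so the outputs sum to 1,
# yielding a probability distribution over the classes.
def softmax(logits):
    m = max(logits)  # subtract the max for numerical stability; result unchanged
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([1.0, 2.0, 3.0])
print([round(p, 3) for p in probs])  # [0.09, 0.245, 0.665]
print(round(sum(probs), 3))          # 1.0
```

Note how larger logits get larger probabilities and the whole vector sums to 1, which is exactly the property a multi-class output layer needs.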
Metrics
advanced
Time limit: 1:30
Effect of activation function on model output range
Which activation function restricts the output values strictly between 0 and 1, making it suitable for binary classification output?
A. Sigmoid
B. ReLU
C. Softmax
D. Linear
Attempts allowed: 2
💡 Hint
This function squashes input values into a probability-like range.
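Contrasting the output ranges directly can make the distinction concrete. A small pure-Python comparison (standard library only):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    return max(0.0, x)

# Sigmoid stays strictly inside (0, 1) for these inputs;
# ReLU is unbounded above, so relu(10.0) == 10.0 falls outside that range.
for v in [-10.0, 0.0, 10.0]:
    print(v, round(sigmoid(v), 4), relu(v))
```

Because a single sigmoid output lies in (0, 1), it can be read directly as the probability of the positive class in binary classification.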
🔧 Debug
expert
Time limit: 2:00
Identifying error in Softmax usage
What error will the following PyTorch code raise when applying Softmax incorrectly?
PyTorch
import torch
import torch.nn.functional as F
x = torch.tensor([[1.0, 2.0, 3.0]])
output = F.softmax(x)
print(output)
A. TypeError: softmax() missing 1 required positional argument: 'dim'
B. RuntimeError: input tensor must be 1D or 2D
C. ValueError: invalid shape for softmax input
D. No error, outputs softmax probabilities
Attempts allowed: 2
💡 Hint
Softmax requires an explicit dim argument specifying the dimension along which to normalize; for a (batch, classes) tensor that is the class dimension, e.g. F.softmax(x, dim=1).
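Why the dimension matters can be shown without PyTorch. A pure-Python sketch (standard library only) normalizing a small batch of logits row-wise versus column-wise, mirroring dim=1 versus dim=0 on a 2D tensor:

```python
import math

def softmax(values):
    m = max(values)
    exps = [math.exp(v - m) for v in values]
    total = sum(exps)
    return [e / total for e in exps]

# One row of logits per example, one column per class.
batch = [[1.0, 2.0, 3.0],
         [1.0, 1.0, 1.0]]

# Normalize along the class dimension (rows) -- like F.softmax(x, dim=1).
row_wise = [softmax(row) for row in batch]

# Normalize down the columns instead -- like F.softmax(x, dim=0).
col_wise = [softmax(list(col)) for col in zip(*batch)]

print([round(p, 3) for p in row_wise[1]])  # [0.333, 0.333, 0.333]
print([round(p, 3) for p in col_wise[0]])  # [0.5, 0.5]
```

Only the row-wise version gives one probability distribution per example; picking the wrong dimension silently mixes probabilities across examples, which is why passing dim explicitly is the safe habit.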