Challenge - 5 Problems
Activation Mastery
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
Difficulty: intermediate · Time: 1:30
Output of ReLU activation on a tensor
What is the output of the following PyTorch code applying ReLU activation?
PyTorch
import torch
import torch.nn.functional as F

x = torch.tensor([-2.0, 0.0, 3.0, -1.5, 2.5])
output = F.relu(x)
print(output.tolist())
💡 Hint
ReLU replaces negative values with zero and keeps positive values unchanged.
📘 Explanation
ReLU (Rectified Linear Unit) sets all negative values to zero and leaves positive values unchanged. So -2.0 and -1.5 become 0.0, while 0.0, 3.0, and 2.5 stay the same, giving [0.0, 0.0, 3.0, 0.0, 2.5].
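The expected output can be checked with a standard-library sketch of the same element-wise rule, assuming only that relu(x) = max(0, x):

```python
# ReLU applied element-wise: negatives become 0.0, positives pass through.
x = [-2.0, 0.0, 3.0, -1.5, 2.5]
output = [max(0.0, v) for v in x]
print(output)  # [0.0, 0.0, 3.0, 0.0, 2.5]
```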
❓ Predict Output
Difficulty: intermediate · Time: 1:30
Output of Sigmoid activation on a tensor
What is the output of the following PyTorch code applying Sigmoid activation?
PyTorch
import torch
import torch.nn.functional as F

x = torch.tensor([-1.0, 0.0, 1.0])
output = torch.sigmoid(x)
print([round(v.item(), 3) for v in output])
💡 Hint
Sigmoid outputs values between 0 and 1, with 0.5 at input zero.
📘 Explanation
The sigmoid function is defined as σ(x) = 1 / (1 + exp(-x)). For inputs -1, 0, and 1, the outputs are approximately 0.269, 0.5, and 0.731, so the code prints [0.269, 0.5, 0.731].
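A minimal standard-library check of the same formula, rounded exactly as in the quiz code:

```python
import math

def sigmoid(v):
    # sigmoid(v) = 1 / (1 + exp(-v)); output is strictly between 0 and 1
    return 1.0 / (1.0 + math.exp(-v))

x = [-1.0, 0.0, 1.0]
print([round(sigmoid(v), 3) for v in x])  # [0.269, 0.5, 0.731]
```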
❓ Model Choice
Difficulty: advanced · Time: 1:30
Choosing activation function for multi-class classification
Which activation function is most appropriate for the output layer of a neural network performing multi-class classification?
💡 Hint
The output should represent probabilities that sum to 1 across classes.
📘 Explanation
Softmax converts raw scores into probabilities that sum to 1, which is ideal for multi-class classification tasks.
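A short standard-library sketch of softmax illustrates why it suits a multi-class output layer: every result is positive and the results sum to 1, so they can be read as class probabilities.

```python
import math

def softmax(scores):
    # Subtracting the max before exponentiating is the usual
    # numerical-stability trick; it does not change the result.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([1.0, 2.0, 3.0])
print([round(p, 3) for p in probs])  # [0.09, 0.245, 0.665]
print(round(sum(probs), 6))          # 1.0
```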
❓ Metrics
Difficulty: advanced · Time: 1:30
Effect of activation function on model output range
Which activation function restricts the output values strictly between 0 and 1, making it suitable for binary classification output?
💡 Hint
This function squashes input values into a probability-like range.
📘 Explanation
Sigmoid outputs values strictly between 0 and 1, which can be interpreted as probabilities for binary classification.
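The strict (0, 1) range can be sanity-checked directly with the standard library; even large-magnitude inputs stay inside the open interval (although float rounding can reach exactly 1.0 for very large inputs):

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

for v in [-10.0, -1.0, 0.0, 1.0, 10.0]:
    s = sigmoid(v)
    # Strictly inside (0, 1) for these inputs, so s is probability-like.
    assert 0.0 < s < 1.0
    print(v, round(s, 5))
```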
🔧 Debug
Difficulty: expert · Time: 2:00
Identifying error in Softmax usage
What is wrong with the way the following PyTorch code applies Softmax?
PyTorch
import torch
import torch.nn.functional as F

x = torch.tensor([[1.0, 2.0, 3.0]])
output = F.softmax(x)
print(output)
💡 Hint
Softmax requires the dimension along which to apply the function.
📘 Explanation
PyTorch's softmax needs a 'dim' argument to know which axis to normalize over. In recent PyTorch versions, calling F.softmax without 'dim' emits a UserWarning ("Implicit dimension choice for softmax has been deprecated") and infers the dimension, while the closely related torch.softmax requires 'dim' and raises a TypeError if it is omitted. Always pass it explicitly, e.g. F.softmax(x, dim=1).
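A standard-library sketch (using a hypothetical softmax helper) shows why the normalization axis matters for a 2-D input: normalizing along the wrong dimension gives a very different result.

```python
import math

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

x = [[1.0, 2.0, 3.0]]  # shape (1, 3): one row of class scores

# dim=1 equivalent: normalize each row so its entries sum to 1 (what we want)
row_wise = [softmax(row) for row in x]
print([[round(p, 3) for p in row] for row in row_wise])  # [[0.09, 0.245, 0.665]]

# dim=0 equivalent: normalize each column; with a single row every value is 1.0
col_wise = list(zip(*[softmax(col) for col in zip(*x)]))
print(col_wise)  # [(1.0, 1.0, 1.0)]
```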