Challenge - 5 Problems
Activation Mastery
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Predict Output
intermediate · 2:00 remaining
Output of ReLU activation on a tensor
What is the output of the following TensorFlow code applying ReLU activation?
TensorFlow
import tensorflow as tf

x = tf.constant([-3.0, 0.0, 2.0, -1.0, 5.0])
output = tf.nn.relu(x)
print(output.numpy())
Attempts: 2 left
💡 Hint
ReLU sets all negative values to zero and keeps positive values unchanged.
✗ Incorrect
ReLU (Rectified Linear Unit) replaces negative values with zero and leaves positive values as they are.
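As a check, the same computation can be sketched in NumPy, which for this input is equivalent to `tf.nn.relu`:

```python
import numpy as np

# ReLU: element-wise max(x, 0) — negatives become zero, positives pass through
x = np.array([-3.0, 0.0, 2.0, -1.0, 5.0])
output = np.maximum(x, 0.0)
print(output)  # [0. 0. 2. 0. 5.]
```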
❓ Model Choice
intermediate · 1:30 remaining
Choosing activation for binary classification output layer
Which activation function is most appropriate for the output layer of a binary classification model?
Attempts: 2 left
💡 Hint
Binary classification outputs a probability between 0 and 1.
✗ Incorrect
Sigmoid outputs values between 0 and 1, suitable for binary classification probabilities.
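A minimal NumPy sketch of the sigmoid function, showing why its output can be read directly as a probability for the positive class:

```python
import numpy as np

def sigmoid(z):
    # Squashes any real logit into (0, 1), interpretable as P(class = 1)
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(0.0))   # 0.5 — the decision boundary
print(sigmoid(4.0))   # ~0.98, confident positive
print(sigmoid(-4.0))  # ~0.02, confident negative
```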
❓ Metrics
advanced · 2:30 remaining
Effect of softmax on output probabilities
Given logits [2.0, 1.0, 0.1], what is the output of applying softmax activation?
TensorFlow
import tensorflow as tf

logits = tf.constant([2.0, 1.0, 0.1])
probabilities = tf.nn.softmax(logits)
print(probabilities.numpy())
Attempts: 2 left
💡 Hint
Softmax converts logits into probabilities that sum to 1.
✗ Incorrect
Softmax exponentiates each logit and normalizes by the sum of all exponentials, producing probabilities.
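The exponentiate-and-normalize step can be written out directly in NumPy for the logits above (note that `tf.nn.softmax` additionally subtracts the max logit for numerical stability, which does not change the result):

```python
import numpy as np

logits = np.array([2.0, 1.0, 0.1])

# Softmax: exponentiate each logit, then divide by the sum of exponentials
exp = np.exp(logits)
probabilities = exp / exp.sum()

print(probabilities)        # ≈ [0.659 0.242 0.099]
print(probabilities.sum())  # 1.0 (up to float rounding)
```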
🔧 Debug
advanced · 2:00 remaining
Identifying error in sigmoid activation usage
What error will this TensorFlow code raise?
TensorFlow
import tensorflow as tf

x = tf.constant('string')
output = tf.nn.sigmoid(x)
print(output.numpy())
Attempts: 2 left
💡 Hint
Sigmoid expects numeric input, not strings.
✗ Incorrect
TensorFlow tries to convert string tensor to float for sigmoid but fails, raising ValueError.
🧠 Conceptual
expert · 3:00 remaining
Why softmax is preferred over sigmoid for multi-class classification
Why is softmax activation preferred over sigmoid activation for the output layer in multi-class classification with mutually exclusive classes?
Attempts: 2 left
💡 Hint
Consider the sum of output probabilities for all classes.
✗ Incorrect
Softmax normalizes outputs so all class probabilities sum to 1, reflecting mutually exclusive classes. Sigmoid treats each class independently.
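The difference is easy to see numerically. In this NumPy sketch (using example logits, not from the question above), softmax produces one coupled distribution while element-wise sigmoid scores each class independently:

```python
import numpy as np

logits = np.array([2.0, 1.0, 0.1])

# Softmax: probabilities compete for mass and sum to exactly 1
softmax_out = np.exp(logits) / np.exp(logits).sum()

# Sigmoid: each class gets an independent "yes/no" score; no constraint on the total
sigmoid_out = 1.0 / (1.0 + np.exp(-logits))

print(softmax_out.sum())  # 1.0 — a valid distribution over mutually exclusive classes
print(sigmoid_out.sum())  # ≈ 2.14 — not a distribution
```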