TensorFlow · ~20 mins

Activation functions (ReLU, sigmoid, softmax) in TensorFlow - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️ Activation Mastery
Get all challenges correct to earn this badge!
Test your skills under time pressure!
Predict Output
intermediate
2:00 remaining
Output of ReLU activation on a tensor
What is the output of the following TensorFlow code applying ReLU activation?
TensorFlow
import tensorflow as tf
x = tf.constant([-3.0, 0.0, 2.0, -1.0, 5.0])
output = tf.nn.relu(x)
print(output.numpy())
A. [0. 0. 0. 0. 0.]
B. [-3. 0. 2. -1. 5.]
C. [3. 0. 2. 1. 5.]
D. [0. 0. 2. 0. 5.]
Attempts: 2 left
💡 Hint
ReLU sets all negative values to zero and keeps positive values unchanged.
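The hint can be checked directly. Below is a minimal NumPy sketch of the same rule that tf.nn.relu applies, max(0, x), run on a different sample tensor so the challenge above is not spoiled:

```python
import numpy as np

# ReLU rule: max(0, x). Negative entries become 0,
# non-negative entries pass through unchanged.
x = np.array([-2.0, 4.0, -0.5, 1.5])
relu = np.maximum(0.0, x)
print(relu)  # negatives zeroed, positives kept
```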
Model Choice
intermediate
1:30 remaining
Choosing activation for binary classification output layer
Which activation function is most appropriate for the output layer of a binary classification model?
A. Sigmoid
B. Tanh
C. ReLU
D. Softmax
Attempts: 2 left
💡 Hint
Binary classification outputs a probability between 0 and 1.
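A short sketch of why this matters, in plain NumPy (same math as tf.nn.sigmoid): sigmoid squashes any real-valued logit into the open interval (0, 1), which is exactly the range a single-unit binary classification output needs. The logit value below is a hypothetical example, not from the challenge:

```python
import numpy as np

def sigmoid(z):
    # Maps any real number into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

logit = 0.8  # hypothetical raw output of a final single-unit layer
p = sigmoid(logit)
print(p)  # a probability strictly between 0 and 1
```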
Metrics
advanced
2:30 remaining
Effect of softmax on output probabilities
Given logits [2.0, 1.0, 0.1], what is the output of applying softmax activation?
TensorFlow
import tensorflow as tf
logits = tf.constant([2.0, 1.0, 0.1])
probabilities = tf.nn.softmax(logits)
print(probabilities.numpy())
A. [0.7310586 0.2689414 0.0000000]
B. [0.500000 0.300000 0.200000]
C. [0.65900114 0.24243297 0.09856589]
D. [0.3333333 0.3333333 0.3333333]
Attempts: 2 left
💡 Hint
Softmax converts logits into probabilities that sum to 1.
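The hint's claim is easy to verify. This NumPy sketch of the softmax formula uses different logits than the challenge so it does not give the answer away: exponentiate each logit, then divide by the sum so the outputs form a probability distribution.

```python
import numpy as np

logits = np.array([1.0, 2.0, 3.0])  # sample logits, not the challenge's
exps = np.exp(logits - logits.max())  # subtract the max for numerical stability
probs = exps / exps.sum()

print(probs)
print(probs.sum())  # softmax outputs always sum to 1
```

Subtracting the maximum logit before exponentiating does not change the result (it cancels in the ratio) but avoids overflow for large logits.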
🔧 Debug
advanced
2:00 remaining
Identifying error in sigmoid activation usage
What error will this TensorFlow code raise?
TensorFlow
import tensorflow as tf
x = tf.constant('string')
output = tf.nn.sigmoid(x)
print(output.numpy())
A. TypeError: Input must be a numeric tensor
B. ValueError: Cannot convert string to float
C. AttributeError: 'str' object has no attribute 'numpy'
D. No error, outputs sigmoid of string
Attempts: 2 left
💡 Hint
Sigmoid expects numeric input, not strings.
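The same failure mode can be sketched in plain NumPy (an analogy for the TF behavior, not the exact TF error message): element-wise math like sigmoid is only defined for numeric dtypes, so a string input fails before any sigmoid arithmetic runs.

```python
import numpy as np

def sigmoid(z):
    # Element-wise sigmoid; only valid for numeric input.
    return 1.0 / (1.0 + np.exp(-z))

error = None
try:
    sigmoid(np.array('string'))  # non-numeric dtype, like tf.constant('string')
except TypeError as err:
    error = err

print(isinstance(error, TypeError))
```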
🧠 Conceptual
expert
3:00 remaining
Why softmax is preferred over sigmoid for multi-class classification
Why is softmax activation preferred over sigmoid activation for the output layer in multi-class classification with mutually exclusive classes?
A. Softmax outputs probabilities that sum to 1; sigmoid outputs independent probabilities that do not sum to 1
B. Sigmoid is computationally more expensive than softmax
C. Softmax outputs independent probabilities for each class; sigmoid outputs dependent probabilities
D. Softmax can only be used with binary classification
Attempts: 2 left
💡 Hint
Consider the sum of output probabilities for all classes.
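The hint can be made concrete with a small NumPy comparison (sample logits, chosen only for illustration): softmax couples the classes so the outputs form one probability distribution, while element-wise sigmoid scores each class independently, so the sum is not constrained to 1.

```python
import numpy as np

logits = np.array([1.5, 0.3, -0.7])  # hypothetical 3-class logits

# Softmax: one distribution over mutually exclusive classes.
softmax = np.exp(logits) / np.exp(logits).sum()

# Element-wise sigmoid: an independent score per class.
sigmoids = 1.0 / (1.0 + np.exp(-logits))

print(softmax.sum())   # always 1
print(sigmoids.sum())  # generally not 1
```

This is why element-wise sigmoid outputs suit multi-label problems (each class independently present or absent), while softmax suits mutually exclusive classes.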