TensorFlow · ~10 mins

Activation functions (ReLU, sigmoid, softmax) in TensorFlow - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1 — Fill in the blank (easy)

Complete the code to apply the ReLU activation function to the tensor.

TensorFlow
import tensorflow as tf

x = tf.constant([-3.0, 0.0, 5.0])
relu_output = tf.nn.[1](x)
print(relu_output.numpy())
A. relu
B. sigmoid
C. softmax
D. tanh
Common Mistakes
Using sigmoid or softmax instead of relu for this task.
Trying to call activation as a method on the tensor.
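For reference, a minimal runnable sketch of the behavior being tested here: `tf.nn.relu` clamps negative entries to zero and passes non-negative entries through unchanged (output values assume standard TensorFlow semantics).

```python
import tensorflow as tf

x = tf.constant([-3.0, 0.0, 5.0])
# ReLU: max(0, x) applied element-wise
out = tf.nn.relu(x).numpy()
print(out)  # -> [0. 0. 5.]
```

Note that the activation lives in the `tf.nn` namespace and is called as a function on the tensor, not as a method of the tensor itself.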
Task 2 — Fill in the blank (medium)

Complete the code to apply the sigmoid activation function to the tensor.

TensorFlow
import tensorflow as tf

x = tf.constant([-1.0, 0.0, 1.0])
sigmoid_output = tf.nn.[1](x)
print(sigmoid_output.numpy())
A. softmax
B. relu
C. sigmoid
D. elu
Common Mistakes
Using relu or softmax instead of sigmoid.
Confusing sigmoid with tanh.
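As a quick reference for this task: `tf.nn.sigmoid` squashes every input into the open interval (0, 1), with sigmoid(0) = 0.5 exactly. This is one easy way to distinguish it from tanh, which maps 0 to 0 and ranges over (-1, 1).

```python
import tensorflow as tf

x = tf.constant([-1.0, 0.0, 1.0])
# Sigmoid: 1 / (1 + exp(-x)), element-wise
out = tf.nn.sigmoid(x).numpy()
print(out)  # middle value is exactly 0.5
```

A handy symmetry check: sigmoid(-x) + sigmoid(x) = 1, so the first and last outputs above sum to 1.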
Task 3 — Fill in the blank (hard)

Fix the error in the code to correctly apply the softmax activation function along the last axis.

TensorFlow
import tensorflow as tf

logits = tf.constant([[1.0, 2.0, 3.0]])
softmax_output = tf.nn.softmax(logits, axis=[1])
print(softmax_output.numpy())
A. 0
B. -1
C. 1
D. 2
Common Mistakes
Using axis=0 or axis=1, which may not be the last axis for all input shapes.
Omitting the axis argument causing default behavior that may be incorrect.
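For reference, a minimal sketch of softmax over the last axis: `axis=-1` always selects the final dimension regardless of the tensor's rank, and the resulting probabilities along that axis sum to 1, with larger logits receiving larger probabilities.

```python
import tensorflow as tf

logits = tf.constant([[1.0, 2.0, 3.0]])
# axis=-1 normalizes over the last dimension for any input rank
probs = tf.nn.softmax(logits, axis=-1).numpy()
print(probs)  # each row sums to 1
```

Hard-coding `axis=1` happens to work for a 2-D tensor but silently breaks for 3-D inputs such as batched sequences, which is why `-1` is the safer choice.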
Task 4 — Fill in the blank (hard)

Fill both blanks to create a dictionary comprehension that maps each word to its sigmoid activation applied to its length.

TensorFlow
import tensorflow as tf

words = ['hi', 'hello', 'hey']
result = {word: tf.nn.[1](tf.constant(len(word), dtype=tf.float32)) for word in [2]}
print(result)
A. sigmoid
B. relu
C. words
D. range(3)
Common Mistakes
Using relu instead of sigmoid for activation.
Iterating over range instead of the list of words.
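To illustrate the pattern being tested (a sketch, not the task's answer key): a dictionary comprehension iterates directly over the list of words, and the activation is applied to a scalar tensor built from each word's length.

```python
import tensorflow as tf

words = ['hi', 'hello', 'hey']
# Map each word to sigmoid(len(word)); iterate the list itself, not range()
result = {w: float(tf.nn.sigmoid(tf.constant(len(w), dtype=tf.float32))) for w in words}
print(result)
```

Iterating over `range(3)` instead of `words` would make the integers 0-2 the dictionary keys, losing the word-to-value mapping the task asks for.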
Task 5 — Fill in the blank (hard)

Fill all three blanks to create a dictionary comprehension that maps each word to its softmax probability over lengths greater than 2.

TensorFlow
import tensorflow as tf

words = ['hi', 'hello', 'hey', 'greetings']
lengths = [len(word) for word in words if len(word) [1] 2]
lengths_tensor = tf.constant(lengths, dtype=tf.float32)
softmax_vals = tf.nn.[2](lengths_tensor, axis=[3])
result = {word: softmax_vals[i].numpy() for i, word in enumerate([w for w in words if len(w) > 2])}
print(result)
A. >
B. sigmoid
C. -1
D. softmax
Common Mistakes
Using sigmoid instead of softmax for activation.
Using wrong axis value for softmax.
Using '<' instead of '>' in the filter condition.
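The filter-then-normalize pattern above can be sketched end to end (for reference, under standard TensorFlow semantics): keep only words longer than 2 characters, softmax their lengths along the last axis, and zip the surviving words with the resulting probabilities so the indices always line up.

```python
import tensorflow as tf

words = ['hi', 'hello', 'hey', 'greetings']
# Filter once, up front, so words and probabilities stay aligned
kept = [w for w in words if len(w) > 2]
lengths = tf.constant([len(w) for w in kept], dtype=tf.float32)
probs = tf.nn.softmax(lengths, axis=-1).numpy()
result = {w: float(p) for w, p in zip(kept, probs)}
print(result)  # probabilities sum to 1; the longest word gets the largest share
```

Filtering in two places with mismatched indices (e.g. enumerating the unfiltered list while indexing the filtered tensor) is a common source of off-by-one and out-of-range errors in this pattern.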