TensorFlow · ~10 mins

GRU layer in TensorFlow - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (easy)

Complete the code to create a GRU layer with 32 units.

TensorFlow
gru_layer = tf.keras.layers.GRU([1])
A) 32
B) 64
C) 16
D) 128
Common Mistakes
Using a number too large or too small without reason.
Forgetting to specify the number of units.
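For reference, a runnable version of the completed snippet, with the blank filled by the 32 the task asks for (the dummy input shape is illustrative):

```python
import tensorflow as tf

# GRU layer with 32 units: the hidden state (and output) has size 32.
gru_layer = tf.keras.layers.GRU(32)

# Dummy batch: 4 sequences, 10 timesteps, 8 features each.
x = tf.random.normal((4, 10, 8))
y = gru_layer(x)
print(y.shape)  # (4, 32): by default only the last timestep's output is returned
```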
Task 2: Fill in the blank (medium)

Complete the code to add a GRU layer that returns sequences.

TensorFlow
gru_layer = tf.keras.layers.GRU(64, [1]=True)
A) return_state
B) go_backwards
C) stateful
D) return_sequences
Common Mistakes
Using return_state instead, which additionally returns the final hidden state rather than the per-timestep outputs.
Forgetting to set this argument when sequence output is needed.
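A runnable version of the completed snippet, showing what return_sequences=True does to the output shape (the dummy input shape is illustrative):

```python
import tensorflow as tf

# return_sequences=True yields the output for every timestep,
# not just the last one.
gru_layer = tf.keras.layers.GRU(64, return_sequences=True)

x = tf.random.normal((4, 10, 8))
y = gru_layer(x)
print(y.shape)  # (4, 10, 64): one 64-dim output per timestep
```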
Task 3: Fill in the blank (hard)

Fix the error in the GRU layer creation by completing the missing argument.

TensorFlow
gru_layer = tf.keras.layers.GRU(128, activation=[1])
A) 'softmax'
B) 'sigmoid'
C) 'tanh'
D) 'relu'
Common Mistakes
Using 'softmax', which is meant for classification output layers.
Using 'sigmoid', which is used inside the gates but not as the main activation.
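A runnable version with the standard 'tanh' main activation, which is also the GRU default; tanh keeps the layer's outputs bounded in (-1, 1) (the dummy input shape is illustrative):

```python
import tensorflow as tf

# 'tanh' is the standard (and default) main activation for GRU cells.
gru_layer = tf.keras.layers.GRU(128, activation='tanh')

x = tf.random.normal((2, 5, 3))
y = gru_layer(x)
# All outputs stay strictly inside (-1, 1).
print(bool(tf.reduce_max(tf.abs(y)) < 1.0))  # True
```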
Task 4: Fill in the blank (hard)

Fill both blanks to create a GRU layer with 50 units and dropout rate of 0.2.

TensorFlow
gru_layer = tf.keras.layers.GRU([1], dropout=[2])
A) 50
B) 0.2
C) 0.5
D) 100
Common Mistakes
Confusing dropout with recurrent_dropout.
Using dropout values greater than 1.
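A runnable version with both blanks filled (50 units, dropout=0.2). Note that dropout applies to the layer inputs and is active only during training; recurrent_dropout, not used here, would apply to the recurrent state (the dummy input shape is illustrative):

```python
import tensorflow as tf

# 50 units; 20% of input connections are dropped during training.
gru_layer = tf.keras.layers.GRU(50, dropout=0.2)

x = tf.random.normal((4, 10, 8))
y = gru_layer(x, training=False)  # inference: dropout is disabled
print(y.shape)  # (4, 50)
```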
Task 5: Fill in the blank (hard)

Fill all three blanks to build a GRU layer with 128 units, returning sequences, and using 'relu' activation.

TensorFlow
gru_layer = tf.keras.layers.GRU([1], [2]=True, activation=[3])
A) 128
B) return_sequences
C) 'relu'
D) 'tanh'
Common Mistakes
Using 'tanh' activation instead of 'relu' as requested.
Forgetting to set return_sequences to True.
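A runnable version with all three blanks filled as the task requests: 128 units, per-timestep outputs, and 'relu' in place of the default 'tanh' (the dummy input shape is illustrative):

```python
import tensorflow as tf

# 128 units, output for every timestep, relu main activation.
gru_layer = tf.keras.layers.GRU(128, return_sequences=True, activation='relu')

x = tf.random.normal((4, 10, 8))
y = gru_layer(x)
print(y.shape)  # (4, 10, 128)
# With relu as the main activation, outputs are non-negative.
print(bool(tf.reduce_min(y) >= 0.0))  # True
```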