Complete the code to create a GRU layer with 32 units.
gru_layer = tf.keras.layers.GRU(32)
The GRU layer is created with 32 units, which controls the output dimensionality.
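A quick sketch of what "output dimensionality" means here: applying the layer to a batch of sequences collapses the time dimension and yields one 32-dimensional vector per sequence (the input shape below is illustrative).

```python
import tensorflow as tf

# A batch of 4 sequences, each with 10 timesteps of 8 features (example values).
x = tf.random.normal((4, 10, 8))

gru_layer = tf.keras.layers.GRU(32)
output = gru_layer(x)
print(output.shape)  # (4, 32): one 32-dimensional vector per input sequence
```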
Complete the code to add a GRU layer that returns sequences.
gru_layer = tf.keras.layers.GRU(64, return_sequences=True)
The return_sequences argument makes the GRU layer output the full sequence of hidden states. Not return_state, which instead returns the final hidden state separately alongside the output.
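The shape difference between the two arguments can be sketched as follows (input dimensions are illustrative):

```python
import tensorflow as tf

x = tf.random.normal((4, 10, 8))  # (batch, timesteps, features)

# return_sequences=True: one hidden state per timestep.
seq_out = tf.keras.layers.GRU(64, return_sequences=True)(x)
print(seq_out.shape)  # (4, 10, 64)

# return_state=True: the final hidden state is returned separately.
out, state = tf.keras.layers.GRU(64, return_state=True)(x)
print(out.shape, state.shape)  # (4, 64) (4, 64)
```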
Fix the error in the GRU layer creation by completing the missing argument.
gru_layer = tf.keras.layers.GRU(128, activation='tanh')
The GRU layer typically uses the tanh activation function for its internal state updates. Not softmax, which is for classification outputs, and not sigmoid, which is used inside the gates but not as the main activation.
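Since tanh is also Keras's default for this argument, passing it explicitly is equivalent to omitting it, as this small sketch shows:

```python
import tensorflow as tf

gru_layer = tf.keras.layers.GRU(128, activation='tanh')

# Omitting the argument gives the same activation, since 'tanh' is the default.
default_layer = tf.keras.layers.GRU(128)
print(default_layer.activation.__name__)  # 'tanh'
```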
Fill both blanks to create a GRU layer with 50 units and dropout rate of 0.2.
gru_layer = tf.keras.layers.GRU(50, dropout=0.2)
The GRU layer is set with 50 units and a dropout rate of 0.2 to reduce overfitting.
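One detail worth illustrating: the dropout argument only takes effect during training; at inference the layer behaves deterministically. A minimal sketch (input shape is illustrative):

```python
import tensorflow as tf

x = tf.random.normal((4, 10, 8))
gru_layer = tf.keras.layers.GRU(50, dropout=0.2)

# With training=False, dropout is disabled, so repeated calls
# on the same input produce identical outputs.
y_infer1 = gru_layer(x, training=False)
y_infer2 = gru_layer(x, training=False)
print(bool(tf.reduce_all(y_infer1 == y_infer2)))  # True
```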
Fill all three blanks to build a GRU layer with 128 units, returning sequences, and using 'relu' activation.
gru_layer = tf.keras.layers.GRU(128, return_sequences=True, activation='relu')
This GRU layer has 128 units, returns the full sequence, and uses the 'relu' activation function.
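Returning sequences is what makes this layer stackable: the next GRU needs a 3-D (batch, timesteps, features) input. A minimal sketch putting the pieces together in a Sequential model (the input shape and downstream layers are illustrative):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(10, 8)),
    # return_sequences=True so the next GRU receives a 3-D input.
    tf.keras.layers.GRU(128, return_sequences=True, activation='relu'),
    tf.keras.layers.GRU(32),
    tf.keras.layers.Dense(1),
])
print(model.output_shape)  # (None, 1)
```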