Complete the code to create a bidirectional RNN layer using TensorFlow.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.[1](
        tf.keras.layers.SimpleRNN(32),
        input_shape=(10, 8))
])
The Bidirectional wrapper is used to create a bidirectional RNN layer in TensorFlow.
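With the blank filled in as the explanation states, the completed snippet looks like this (the printed output shape is included as a quick sanity check):

```python
import tensorflow as tf

# Answer: [1] = Bidirectional. The wrapper runs the inner SimpleRNN
# over the sequence forward and backward and, by default, concatenates
# the two outputs.
model = tf.keras.Sequential([
    tf.keras.layers.Bidirectional(
        tf.keras.layers.SimpleRNN(32),
        input_shape=(10, 8))
])

# 32 forward units + 32 backward units -> 64 features per example.
print(model.output_shape)  # (None, 64)
```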
Complete the code to get the output shape of a bidirectional RNN layer.
import tensorflow as tf

layer = tf.keras.layers.Bidirectional(tf.keras.layers.SimpleRNN(16))
input_data = tf.random.normal([5, 10, 8])
output = layer(input_data)
print(output.shape)  # Expected shape: (5, [1])
The output size doubles because the layer runs the RNN forward and backward and concatenates the two results by default, so 16 units become 32.
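A runnable check of the answer, plus a contrast with `merge_mode='sum'`, which combines the two directions by addition instead of concatenation and therefore does not double the feature count:

```python
import tensorflow as tf

# Answer: the blank is 32, i.e. the output shape is (5, 32).
layer = tf.keras.layers.Bidirectional(tf.keras.layers.SimpleRNN(16))
output = layer(tf.random.normal([5, 10, 8]))
print(output.shape)  # (5, 32): 16 forward + 16 backward units

# merge_mode='sum' adds the forward and backward outputs elementwise,
# so the unit count stays at 16.
summed_layer = tf.keras.layers.Bidirectional(
    tf.keras.layers.SimpleRNN(16), merge_mode='sum')
summed = summed_layer(tf.random.normal([5, 10, 8]))
print(summed.shape)  # (5, 16)
```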
Fill in the blank to correctly create a bidirectional LSTM layer.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Bidirectional(
        tf.keras.layers.[1](64),
        input_shape=(20, 10))
])
The LSTM layer is the correct RNN type to use inside the Bidirectional wrapper for LSTM networks.
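The completed code, with the output shape printed to confirm the usual doubling of units:

```python
import tensorflow as tf

# Answer: [1] = LSTM.
model = tf.keras.Sequential([
    tf.keras.layers.Bidirectional(
        tf.keras.layers.LSTM(64),
        input_shape=(20, 10))
])

# 64 forward + 64 backward LSTM units -> 128 output features.
print(model.output_shape)  # (None, 128)
```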
Fill both blanks to create a bidirectional GRU layer that returns sequences.
import tensorflow as tf

layer = tf.keras.layers.Bidirectional(
    tf.keras.layers.[1](128, return_sequences=[2])
)
Use GRU for the layer type and set return_sequences=True to get output at each time step.
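The completed layer, applied to a sample batch (batch size 4 and sequence length 10 are arbitrary choices for the demonstration) to show that `return_sequences=True` preserves the time axis:

```python
import tensorflow as tf

# Answers: [1] = GRU, [2] = True.
layer = tf.keras.layers.Bidirectional(
    tf.keras.layers.GRU(128, return_sequences=True)
)

# With return_sequences=True the layer emits one vector per time step,
# so the (batch, time, features) input keeps its time dimension.
output = layer(tf.random.normal([4, 10, 8]))
print(output.shape)  # (4, 10, 256): 128 forward + 128 backward units
```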
Fill all three blanks to build a bidirectional RNN model with an LSTM layer, a dropout layer, and a dense output layer.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Bidirectional(
        tf.keras.layers.[1](64),
        input_shape=(15, 12)),
    tf.keras.layers.[2](0.5),
    tf.keras.layers.[3](10, activation='softmax')
])
The model uses an LSTM inside Bidirectional, followed by a Dropout layer to reduce overfitting and a Dense layer with softmax activation for classification output.
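Putting all three answers together, the complete model runs end to end on a sample batch (batch size 2 is arbitrary for the demonstration); the softmax output gives one probability distribution over the 10 classes per example:

```python
import tensorflow as tf

# Answers: [1] = LSTM, [2] = Dropout, [3] = Dense.
model = tf.keras.Sequential([
    tf.keras.layers.Bidirectional(
        tf.keras.layers.LSTM(64),
        input_shape=(15, 12)),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(10, activation='softmax')
])

# One 10-way probability vector per input sequence.
probs = model(tf.random.normal([2, 15, 12]))
print(probs.shape)  # (2, 10)
```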