Complete the code to import the Bidirectional layer from Keras.
from tensorflow.keras.layers import [1]
The Bidirectional layer wraps a recurrent layer so that it processes the input sequence in both the forward and backward directions; by default the two directions' outputs are concatenated.
Complete the code to create a Bidirectional LSTM layer with 64 units.
model.add(Bidirectional([1](64)))
The Bidirectional wrapper is used with recurrent layers like LSTM to process sequences forward and backward.
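A minimal runnable sketch of the layer above; the input shape (10 timesteps, 8 features) is an assumed example, not part of the exercise. Note how wrapping a 64-unit LSTM in Bidirectional doubles the output dimension, because the forward and backward outputs are concatenated by default.

```python
from tensorflow.keras.layers import Bidirectional, LSTM, Input
from tensorflow.keras.models import Sequential

# Assumed toy input: sequences of 10 timesteps with 8 features each
model = Sequential([
    Input(shape=(10, 8)),
    Bidirectional(LSTM(64)),  # 64 units per direction, concatenated
])
print(model.output_shape)     # (None, 128): 64 forward + 64 backward
```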
Fix the error in the code to correctly compile the model with categorical crossentropy loss.
model.compile(optimizer='adam', loss='[1]', metrics=['accuracy'])
For multi-class classification with one-hot encoded labels, categorical_crossentropy is the correct loss function (use sparse_categorical_crossentropy when labels are integer class indices).
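A minimal sketch of the compile step in context, assuming a toy 3-class classifier with 4 input features (these numbers are illustrative, not from the exercise):

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Input

# Assumed toy classifier: 4 input features, 3 output classes
model = Sequential([
    Input(shape=(4,)),
    Dense(3, activation='softmax'),  # one probability per class
])
# categorical_crossentropy expects one-hot labels, e.g. [0, 1, 0]
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
```

If the labels were stored as integers (0, 1, 2) instead of one-hot vectors, swapping in `sparse_categorical_crossentropy` avoids the need to one-hot encode them.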
Fill both blanks to create a Bidirectional LSTM layer that returns sequences and uses 128 units.
model.add(Bidirectional([1](128, [2]=True)))
Setting return_sequences=True on the LSTM inside Bidirectional makes it output the hidden state at every timestep rather than only the final one, which is required when stacking another recurrent layer on top.
Fill all three blanks to build a simple Bidirectional LSTM model for text classification.
model = Sequential()
model.add(Embedding(input_dim=[1], output_dim=[2], input_length=100))
model.add(Bidirectional(LSTM([3])))
model.add(Dense(5, activation='softmax'))
The Embedding layer uses input_dim=10000 (vocabulary size) and output_dim=64 (embedding dimension), and the LSTM wrapped by Bidirectional uses 128 units.
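The full model with the blanks filled in can be sketched as follows. The hyperparameters (10,000-token vocabulary, 64-dim embeddings, sequence length 100, 5 classes) come from the exercise; the random batch at the end is only an assumed smoke test. The `input_length` argument is omitted here because recent Keras versions infer the sequence length from the input.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Bidirectional, LSTM, Dense

model = Sequential([
    Embedding(input_dim=10000, output_dim=64),  # vocab 10k, 64-dim vectors
    Bidirectional(LSTM(128)),        # 128 units per direction -> 256 features
    Dense(5, activation='softmax'),  # probability over the 5 classes
])
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# Assumed smoke test: a batch of 2 padded sequences of 100 token ids
batch = np.random.randint(0, 10000, size=(2, 100))
print(model(batch).shape)  # (2, 5)
```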