A Bidirectional RNN reads a sequence both forwards and backwards, so the model can learn from past and future context at the same time.
Bidirectional RNN in TensorFlow
tf.keras.layers.Bidirectional(tf.keras.layers.SimpleRNN(units))
The Bidirectional layer wraps a regular RNN layer to run it forwards and backwards.
You can use different RNN types inside, like SimpleRNN, LSTM, or GRU.
bidirectional_rnn = tf.keras.layers.Bidirectional(tf.keras.layers.SimpleRNN(32))
bidirectional_lstm = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64))

model = tf.keras.Sequential([
    tf.keras.layers.Bidirectional(tf.keras.layers.GRU(50)),
    tf.keras.layers.Dense(1, activation='sigmoid')
])
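By default, the Bidirectional wrapper concatenates the forward and backward outputs, which doubles the output width. The merge_mode argument controls this; a small sketch (layer sizes here are illustrative, not from the examples above):

```python
import tensorflow as tf

inputs = tf.keras.Input(shape=(10, 8))  # 10 time steps, 8 features

# Default merge_mode='concat': output width is 2 * units
concat = tf.keras.layers.Bidirectional(tf.keras.layers.GRU(32))(inputs)

# merge_mode='sum' adds the two directions, keeping the original width
summed = tf.keras.layers.Bidirectional(
    tf.keras.layers.GRU(32), merge_mode='sum')(inputs)

print(concat.shape)  # (None, 64)
print(summed.shape)  # (None, 32)
```

Other supported modes include 'mul' and 'ave', which also keep the wrapped layer's width.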
This code creates a small dataset and trains a bidirectional LSTM model to classify sequences. It then predicts on new data.
import tensorflow as tf
import numpy as np

# Create dummy sequential data: 100 samples, 10 time steps, 8 features
x_train = np.random.random((100, 10, 8))
y_train = np.random.randint(2, size=(100, 1))

# Build model with Bidirectional LSTM
model = tf.keras.Sequential([
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(16)),
    tf.keras.layers.Dense(1, activation='sigmoid')
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# Train for 3 epochs
history = model.fit(x_train, y_train, epochs=3, batch_size=16, verbose=2)

# Predict on new data
x_test = np.random.random((5, 10, 8))
predictions = model.predict(x_test)
print('Predictions:', predictions.flatten())
Bidirectional RNNs double the wrapped layer's parameter count because they run two RNNs, one forward and one backward, and by default concatenate their outputs.
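The doubling is easy to verify by counting parameters on the same LSTM with and without the wrapper (the sizes below match the training example above):

```python
import tensorflow as tf

inputs = tf.keras.Input(shape=(10, 8))  # 10 time steps, 8 features

uni = tf.keras.layers.LSTM(16)
bi = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(16))

uni_out = uni(inputs)  # calling the layer builds its weights
bi_out = bi(inputs)

print(uni.count_params())  # 1600 = 4 * ((8 + 16) * 16 + 16)
print(bi.count_params())   # 3200: forward copy + backward copy
print(uni_out.shape)       # (None, 16)
print(bi_out.shape)        # (None, 32): outputs concatenated by default
```

Note that the output width also doubles, so any following Dense layer grows as well.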
They work best when the entire sequence is available before processing.
Use them when future context helps understand the data better.
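A typical case where future context helps is sequence labeling, where every time step gets a prediction (for example, tagging each token in a sentence). A minimal sketch, with illustrative sequence length, feature size, and tag count:

```python
import tensorflow as tf
import numpy as np

# Per-time-step classifier: return_sequences=True keeps the output
# for every step so each one can be labeled. All sizes are assumptions.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(20, 8)),  # 20 steps, 8 features per step
    tf.keras.layers.Bidirectional(
        tf.keras.layers.LSTM(32, return_sequences=True)),
    tf.keras.layers.Dense(5, activation='softmax')  # 5 possible tags
])

out = model(np.random.random((2, 20, 8)))
print(out.shape)  # (2, 20, 5): one tag distribution per time step
```

At each step the forward pass has seen the past and the backward pass has seen the future, which is why this setup is standard for tagging tasks.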
Bidirectional RNNs read sequences forwards and backwards to capture full context.
They improve performance on tasks where both past and future information matter.
They are easy to use in TensorFlow: wrap any RNN layer with tf.keras.layers.Bidirectional.