
Bidirectional RNN in TensorFlow

Introduction

A Bidirectional RNN reads a sequence both forwards and backwards to capture the context on either side of each element. This lets the model learn from past and future information at the same time. Typical situations where it helps:

When you want to understand sentences where words depend on both previous and next words.
When analyzing time series data where future and past values affect the current value.
When working with speech recognition to capture context from both directions.
When processing DNA sequences where patterns can depend on both ends.
When you want to improve accuracy by using information from the whole sequence.
Syntax
TensorFlow
tf.keras.layers.Bidirectional(tf.keras.layers.SimpleRNN(units))

The Bidirectional layer wraps a regular RNN layer to run it forwards and backwards.

You can use different RNN types inside, like SimpleRNN, LSTM, or GRU.
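Whichever cell type you wrap, by default the layer concatenates the forward and backward outputs (merge_mode='concat'), so the feature dimension doubles. A quick sketch to check the output shape:

TensorFlow
```python
import tensorflow as tf

# The wrapper runs the inner GRU forwards and backwards and, with the
# default merge_mode='concat', joins both outputs along the feature axis.
layer = tf.keras.layers.Bidirectional(tf.keras.layers.GRU(32))
out = layer(tf.zeros((1, 10, 8)))  # 1 sample, 10 time steps, 8 features
print(out.shape)  # (1, 64): 32 forward units + 32 backward units
```

merge_mode also accepts 'sum', 'mul', 'ave', or None if you want the two directions combined differently.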

Examples
This creates a bidirectional SimpleRNN with 32 units.
TensorFlow
bidirectional_rnn = tf.keras.layers.Bidirectional(tf.keras.layers.SimpleRNN(32))
This creates a bidirectional LSTM with 64 units, which is better suited to longer-range dependencies.
TensorFlow
bidirectional_lstm = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64))
A simple model using a bidirectional GRU layer followed by a dense output layer for binary classification.
TensorFlow
model = tf.keras.Sequential([
  tf.keras.layers.Bidirectional(tf.keras.layers.GRU(50)),
  tf.keras.layers.Dense(1, activation='sigmoid')
])
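When stacking bidirectional layers, every layer except the last needs return_sequences=True on the inner RNN so the next layer receives an output for every time step rather than only the final one. A minimal sketch (the input shape here is arbitrary):

TensorFlow
```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(10, 8)),
    # return_sequences=True: emit one output per time step for the next layer
    tf.keras.layers.Bidirectional(tf.keras.layers.GRU(32, return_sequences=True)),
    tf.keras.layers.Bidirectional(tf.keras.layers.GRU(16)),
    tf.keras.layers.Dense(1, activation='sigmoid')
])
print(model.output_shape)  # (None, 1)
```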
Sample Model

This code creates a small dataset and trains a bidirectional LSTM model to classify sequences. It then predicts on new data.

TensorFlow
import tensorflow as tf
import numpy as np

# Create dummy sequential data: 100 samples, 10 time steps, 8 features
x_train = np.random.random((100, 10, 8))
y_train = np.random.randint(2, size=(100, 1))

# Build model with Bidirectional LSTM
model = tf.keras.Sequential([
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(16)),
    tf.keras.layers.Dense(1, activation='sigmoid')
])

model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# Train for 3 epochs
history = model.fit(x_train, y_train, epochs=3, batch_size=16, verbose=2)

# Predict on new data
x_test = np.random.random((5, 10, 8))
predictions = model.predict(x_test)

print('Predictions:', predictions.flatten())
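Because the final Dense layer uses a sigmoid, each prediction is a probability between 0 and 1; thresholding at 0.5 turns these into hard class labels. A small sketch with made-up probabilities:

TensorFlow
```python
import numpy as np

probs = np.array([0.12, 0.85, 0.49, 0.51, 0.97])  # example sigmoid outputs
labels = (probs > 0.5).astype(int)
print(labels)  # [0 1 0 1 1]
```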
Important Notes

Bidirectional RNNs double the number of parameters because they run two RNNs (forward and backward).
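The doubling is easy to verify by comparing parameter counts for a plain RNN and its bidirectional wrapper over the same input (the shapes below are arbitrary):

TensorFlow
```python
import tensorflow as tf

uni = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(10, 8)),
    tf.keras.layers.SimpleRNN(16),
])
bi = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(10, 8)),
    tf.keras.layers.Bidirectional(tf.keras.layers.SimpleRNN(16)),
])

print(uni.count_params())  # 400 = 16 * (8 inputs + 16 recurrent + 1 bias)
print(bi.count_params())   # 800: forward and backward RNNs together
```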

They work best when the entire sequence is available before processing.

Use them when future context helps understand the data better.

Summary

Bidirectional RNNs read sequences forwards and backwards to capture full context.

They improve performance on tasks where both past and future information matter.

They are easy to use in TensorFlow: wrap any RNN layer with tf.keras.layers.Bidirectional.