RNNs read data step-by-step, remembering what happened before. This helps them understand sequences like sentences or time series.
Why RNNs process sequential data in TensorFlow
Introduction
Use an RNN when the order of inputs matters, for example:
When you want to predict the next word in a sentence.
When analyzing time-based data like stock prices.
When processing audio or speech signals.
When working with video frames in order.
When understanding sequences of user actions over time.
Syntax
TensorFlow
model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(units, input_shape=(timesteps, features)),
    tf.keras.layers.Dense(output_units)
])
units is the number of memory cells the RNN has.
input_shape tells the model the sequence length (timesteps) and the number of features per step.
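To make the shapes concrete, here is a minimal sketch (the batch size, units, and feature counts below are arbitrary illustration values, not from the examples that follow): the model expects input of shape (batch, timesteps, features), and the final Dense layer determines the per-sample output size.

```python
import numpy as np
import tensorflow as tf

# Arbitrary illustration values: 8 samples, 5 timesteps, 2 features per step
batch, timesteps, features = 8, 5, 2
x = np.random.rand(batch, timesteps, features).astype(np.float32)

model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(16, input_shape=(timesteps, features)),
    tf.keras.layers.Dense(1)
])

# The RNN reads all 5 timesteps and emits one 16-dimensional state per sample;
# Dense(1) then maps that state to a single output value per sample.
print(model.predict(x, verbose=0).shape)  # (8, 1)
```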
Examples
RNN with 10 units processing sequences of length 5 with 1 feature each.
TensorFlow
model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(10, input_shape=(5, 1)),
    tf.keras.layers.Dense(1)
])
RNN with 20 units for sequences of length 10 and 3 features per step, outputting 2 values.
TensorFlow
model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(20, input_shape=(10, 3)),
    tf.keras.layers.Dense(2)
])
Sample Model
This code creates a small RNN that reads sequences step-by-step. It trains on two simple sequences and tries to predict target values. After training, it prints the loss and predictions.
TensorFlow
import tensorflow as tf
import numpy as np

# Create simple sequential data: batch of 2 sequences, each 4 steps long, 1 feature per step
x = np.array([[[1], [2], [3], [4]], [[4], [3], [2], [1]]], dtype=np.float32)

# Define RNN model
model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(5, input_shape=(4, 1)),
    tf.keras.layers.Dense(1)
])
model.compile(optimizer='adam', loss='mse')

# Dummy targets
y = np.array([[10], [20]], dtype=np.float32)

# Train model for 3 epochs
history = model.fit(x, y, epochs=3, verbose=0)

# Predict on input
predictions = model.predict(x)
print(f"Loss after training: {history.history['loss'][-1]:.4f}")
print(f"Predictions:\n{predictions}")
Important Notes
RNNs keep a memory of previous steps, which helps with sequences.
They work well for short to medium sequences but can struggle with very long ones.
Other RNN types like LSTM or GRU improve memory handling.
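As a sketch of that last point: LSTM and GRU layers are drop-in replacements for SimpleRNN in Keras, taking the same units and input_shape arguments (the unit and shape values below are arbitrary illustration values).

```python
import tensorflow as tf

# Same structure as the SimpleRNN models above, with the recurrent layer swapped out.
# LSTM and GRU add gating mechanisms that help retain information over longer sequences.
lstm_model = tf.keras.Sequential([
    tf.keras.layers.LSTM(10, input_shape=(5, 1)),
    tf.keras.layers.Dense(1)
])

gru_model = tf.keras.Sequential([
    tf.keras.layers.GRU(10, input_shape=(5, 1)),
    tf.keras.layers.Dense(1)
])
```

Both models accept exactly the same input as a SimpleRNN version and can be compiled and trained the same way.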
Summary
RNNs read data one step at a time, remembering past information.
This makes them good for tasks with ordered data like text or time series.
Simple syntax lets you build RNNs easily in TensorFlow.