
Time series with RNN in TensorFlow

Introduction

Recurrent neural networks (RNNs) model data that changes over time, such as weather readings or stock prices. By carrying information forward from earlier steps, they can predict what comes next in a sequence.

Predicting tomorrow's temperature based on past days
Forecasting sales for the next month using previous sales data
Analyzing heart rate signals to detect irregularities
Predicting energy usage in a building over time
Understanding speech or text sequences for voice assistants
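For all of these tasks, the raw series first has to be cut into overlapping windows of past values paired with the value that follows. A minimal NumPy sketch (the series here is synthetic, just for illustration):

```python
import numpy as np

# Synthetic series standing in for e.g. daily temperatures
series = np.arange(20, dtype=np.float32)  # 20 consecutive readings

time_steps = 5  # each input window covers 5 past steps

# Slide a window over the series: each X row holds 5 past values,
# each Y entry is the value immediately after that window
X = np.array([series[i:i + time_steps] for i in range(len(series) - time_steps)])
Y = series[time_steps:]

X = X[..., np.newaxis]  # add a feature axis -> (samples, time_steps, features)

print(X.shape)  # (15, 5, 1)
print(Y.shape)  # (15,)
```

This (samples, time steps, features) layout is exactly what the RNN layers below expect.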
Syntax
TensorFlow
model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(units, input_shape=(time_steps, features)),
    tf.keras.layers.Dense(output_units)
])

model.compile(optimizer='adam', loss='mse')

The SimpleRNN layer processes a sequence one time step at a time, updating its hidden state at each step.

input_shape is (time_steps, features): the length of each input sequence and the number of features per step.
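To see that shape requirement in action, here is a small sketch that passes a random batch through such a model (the sizes 5, 1, and 10 are illustrative choices, not fixed by the API):

```python
import numpy as np
import tensorflow as tf

# Model expecting sequences of 5 steps with 1 feature each
model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(10, input_shape=(5, 1)),
    tf.keras.layers.Dense(1)
])

# Dummy batch shaped (batch, time_steps, features)
batch = np.random.rand(4, 5, 1).astype("float32")
out = model(batch)

print(out.shape)  # (4, 1): one prediction per sample
```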

Examples
RNN with 10 units, input sequences of length 5 with 1 feature each, output is 1 number.
TensorFlow
model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(10, input_shape=(5, 1)),
    tf.keras.layers.Dense(1)
])
Stacked RNN layers: first returns full sequence, second returns last output, then dense layer.
TensorFlow
model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(20, return_sequences=True, input_shape=(10, 3)),
    tf.keras.layers.SimpleRNN(10),
    tf.keras.layers.Dense(1)
])
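The effect of return_sequences can be checked directly on dummy data; a sketch using the same layer sizes as above (the batch of random values is only for illustration):

```python
import numpy as np
import tensorflow as tf

rnn_all = tf.keras.layers.SimpleRNN(20, return_sequences=True)
rnn_last = tf.keras.layers.SimpleRNN(10)

x = np.random.rand(2, 10, 3).astype("float32")  # (batch, time_steps, features)

seq_out = rnn_all(x)          # one 20-unit output per time step
last_out = rnn_last(seq_out)  # only the final step's 10-unit output

print(seq_out.shape)   # (2, 10, 20)
print(last_out.shape)  # (2, 10)
```

The first layer must return the full sequence so the second RNN layer has a sequence to consume.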
Sample Model

This code trains a simple RNN to predict the next number in a sequence. It learns from sequences of 10 consecutive numbers, each paired with the number that follows. After training, it predicts the value that comes after the sequence 100-109.

TensorFlow
import numpy as np
import tensorflow as tf

# Create simple time series data: y = next value after sequence
np.random.seed(0)
time_steps = 10
features = 1
samples = 100

# Generate sequences
X = np.array([np.arange(i, i + time_steps) for i in range(samples)])
X = X[..., np.newaxis]  # shape (samples, time_steps, features)

# Targets are next value after sequence
Y = np.array([i + time_steps for i in range(samples)])

# Build model
model = tf.keras.Sequential([
    tf.keras.layers.SimpleRNN(20, input_shape=(time_steps, features)),
    tf.keras.layers.Dense(1)
])

model.compile(optimizer='adam', loss='mse')

# Train model
history = model.fit(X, Y, epochs=10, verbose=0)

# Predict next value for a new sequence
test_seq = np.array([np.arange(100, 110)])[..., np.newaxis]
pred = model.predict(test_seq)

print(f"Prediction for input sequence 100-109: {pred[0,0]:.2f}")
print(f"Actual next value: 110")
print(f"Final training loss: {history.history['loss'][-1]:.4f}")
Important Notes

RNNs remember recent steps well, but tend to forget information from many steps back.

For longer sequences, consider LSTM or GRU layers instead of SimpleRNN.
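A minimal sketch of the sample model with an LSTM layer in place of SimpleRNN; the window length of 50 is an illustrative assumption for a "longer" sequence:

```python
import numpy as np
import tensorflow as tf

# Same architecture as the sample model, but with an LSTM cell,
# which retains long-range information better than SimpleRNN
model = tf.keras.Sequential([
    tf.keras.layers.LSTM(20, input_shape=(50, 1)),
    tf.keras.layers.Dense(1)
])
model.compile(optimizer='adam', loss='mse')

# Sanity check on a dummy batch: still one prediction per sample
out = model(np.random.rand(2, 50, 1).astype("float32"))
print(out.shape)  # (2, 1)
```

GRU is a drop-in alternative via tf.keras.layers.GRU with the same call pattern.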

Normalize your data for better training results.
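One common approach is standardization using statistics computed from the training windows only, applying the same mean and standard deviation at prediction time. A sketch on synthetic windows like those in the sample model:

```python
import numpy as np

# Synthetic training windows, as in the sample model above
X_train = np.array([np.arange(i, i + 10) for i in range(100)], dtype=np.float32)

# Compute scaling statistics from the training data only
mean, std = X_train.mean(), X_train.std()

# Scaled data has roughly zero mean and unit variance
X_scaled = (X_train - mean) / std

print(float(X_scaled.mean()))  # close to 0
print(float(X_scaled.std()))   # close to 1
```

Remember to undo the scaling (pred * std + mean) when interpreting predictions made on scaled targets.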

Summary

RNNs help predict future values from past sequences.

Use SimpleRNN layer for basic sequence learning.

Train with sequences shaped as (samples, time steps, features).