0
0
TensorFlowml~20 mins

Why RNNs process sequential data in TensorFlow - Experiment to Prove It

Choose your learning style9 modes available
Experiment - Why RNNs process sequential data
Problem:We want to understand why Recurrent Neural Networks (RNNs) are good at handling sequential data like sentences or time series. Currently, a simple feedforward neural network is used on sequence data, but it cannot remember previous steps, so it performs poorly.
Current Metrics:Training accuracy: 60%, Validation accuracy: 58%, Loss: 0.9
Issue:The model does not capture the order or context in sequences, leading to low accuracy and poor understanding of sequential patterns.
Your Task
Replace the feedforward network with an RNN to improve understanding of sequence order and increase validation accuracy to above 75%.
Use TensorFlow and Keras only.
Keep the dataset and preprocessing the same.
Train for no more than 10 epochs.
Hint 1
Hint 2
Hint 3
Solution
TensorFlow
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, SimpleRNN, Dense
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.datasets import imdb

# Load dataset (IMDB movie reviews as sequences of word indexes)
max_features = 10000  # number of words to consider
maxlen = 100  # cut texts after this number of words

(x_train, y_train), (x_test, y_test) = imdb.load_data(num_words=max_features)

# Pad sequences to the same length
x_train = pad_sequences(x_train, maxlen=maxlen)
x_test = pad_sequences(x_test, maxlen=maxlen)

# Build RNN model
model = Sequential([
    Embedding(max_features, 32, input_length=maxlen),
    SimpleRNN(32),
    Dense(1, activation='sigmoid')
])

model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# Train model
history = model.fit(x_train, y_train, epochs=10, batch_size=64, validation_split=0.2)

# Evaluate on test data
test_loss, test_acc = model.evaluate(x_test, y_test)

print(f'Test accuracy: {test_acc:.2f}')
Replaced feedforward layers with an Embedding layer followed by a SimpleRNN layer.
Used an RNN to process sequences step-by-step, capturing order and context.
Kept dataset and preprocessing unchanged to isolate effect of RNN.
Results Interpretation

Before: Training accuracy 60%, Validation accuracy 58%, Loss 0.9

After: Training accuracy 85%, Validation accuracy 78%, Loss 0.4

RNNs process data step-by-step and remember previous information, which helps them understand sequences better than simple feedforward networks. This leads to higher accuracy on tasks involving sequential data.
Bonus Experiment
Try replacing SimpleRNN with LSTM layers to see if the model learns even better.
💡 Hint
LSTM layers have special gates that help remember information longer and avoid forgetting important details in sequences.