RNNs carry information from earlier steps forward, which lets them model ordered data such as sentences or time series.
Why RNNs handle sequences in PyTorch
Introduction
When you want to predict the next word in a sentence.
When analyzing time series data like stock prices.
When processing audio signals that change over time.
When translating languages where word order matters.
When recognizing handwriting or gestures that happen step by step.
Syntax
PyTorch
rnn = torch.nn.RNN(input_size, hidden_size, num_layers)
output, hidden = rnn(input_sequence, hidden_state)
input_sequence shape is (sequence_length, batch_size, input_size).
hidden_state holds memory from previous steps; it is optional, and if omitted PyTorch initializes it to zeros.
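A minimal sketch of this behavior (sizes here are arbitrary, chosen just for illustration): because a missing hidden state defaults to zeros, calling the RNN with no hidden state gives the same result as passing an explicit zero tensor.

```python
import torch

rnn = torch.nn.RNN(4, 6, 1)   # input_size=4, hidden_size=6, num_layers=1
seq = torch.randn(3, 1, 4)    # (sequence_length, batch_size, input_size)

# Omitting the hidden state...
out_default, h_default = rnn(seq)

# ...is equivalent to passing zeros explicitly.
h0 = torch.zeros(1, 1, 6)     # (num_layers, batch_size, hidden_size)
out_zero, h_zero = rnn(seq, h0)

print(torch.allclose(out_default, out_zero))  # True
```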
Examples
Creates an RNN with input size 10 and hidden size 20, then processes a sequence of length 5.
PyTorch
rnn = torch.nn.RNN(10, 20, 1)
input_seq = torch.randn(5, 1, 10)
output, hidden = rnn(input_seq)
Starts with an initial hidden state of zeros to process the input sequence.
PyTorch
hidden = torch.zeros(1, 1, 20)
output, hidden = rnn(input_seq, hidden)
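To see that the hidden state really carries memory between calls, here is a sketch (using the same sizes as the examples above) that processes the sequence in two chunks, threading the hidden state across the split, and compares against a single full pass:

```python
import torch

rnn = torch.nn.RNN(10, 20, 1)
input_seq = torch.randn(5, 1, 10)

# Process the whole sequence at once.
full_out, full_h = rnn(input_seq)

# Process it in two chunks, carrying the hidden state across the split.
out_a, h = rnn(input_seq[:2])        # first 2 steps
out_b, h = rnn(input_seq[2:], h)     # remaining 3 steps, with memory

chunked_out = torch.cat([out_a, out_b], dim=0)
print(torch.allclose(full_out, chunked_out, atol=1e-6))  # True (up to float precision)
```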
Sample Model
This code creates a simple RNN to process a sequence of 4 steps, each with 3 features. It shows how the RNN outputs a result for each step and keeps a hidden state that remembers past information.
PyTorch
import torch
import torch.nn as nn

# Define RNN parameters
input_size = 3
hidden_size = 5
sequence_length = 4
batch_size = 1

# Create RNN layer
rnn = nn.RNN(input_size, hidden_size, num_layers=1)

# Create a random input sequence (sequence_length, batch_size, input_size)
input_seq = torch.randn(sequence_length, batch_size, input_size)

# Initialize hidden state with zeros
hidden = torch.zeros(1, batch_size, hidden_size)

# Forward pass through RNN
output, hidden = rnn(input_seq, hidden)

print("Input sequence shape:", input_seq.shape)
print("Output shape:", output.shape)
print("Hidden state shape:", hidden.shape)
print("Output at last time step:", output[-1])
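A quick side check (a sketch, not part of the model above): for a single-layer, unidirectional RNN, the final hidden state is the same vector as the output at the last time step, which makes the "hidden state as memory of the last step" idea concrete.

```python
import torch
import torch.nn as nn

rnn = nn.RNN(3, 5, num_layers=1)
input_seq = torch.randn(4, 1, 3)
output, hidden = rnn(input_seq)

# output[-1] is the last step's output; hidden[0] is the final hidden
# state of the (only) layer -- for this setup they coincide.
print(torch.allclose(output[-1], hidden[0]))  # True
```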
Important Notes
RNNs process data step by step, passing information forward through hidden states.
They are good for sequences but can struggle with very long ones, because gradients vanish over many steps and early information is gradually forgotten.
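The step-by-step processing described above can be sketched as a manual loop (assuming a single-layer nn.RNN with arbitrary sizes) that feeds one time step at a time and threads the hidden state through; it matches a full-sequence pass:

```python
import torch
import torch.nn as nn

rnn = nn.RNN(3, 5, num_layers=1)
seq = torch.randn(6, 1, 3)

# Full-sequence pass for reference.
full_out, _ = rnn(seq)

# Manual loop: one time step at a time, passing the hidden state forward.
h = torch.zeros(1, 1, 5)
steps = []
for t in range(seq.shape[0]):
    out_t, h = rnn(seq[t:t+1], h)  # seq[t:t+1] keeps the time dimension
    steps.append(out_t)

loop_out = torch.cat(steps, dim=0)
print(torch.allclose(full_out, loop_out, atol=1e-6))  # True (up to float precision)
```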
Summary
RNNs remember past inputs to understand sequences.
They take input one step at a time and keep a hidden state as memory.
This makes them useful for language, time series, and other ordered data.