PyTorch · ~5 min read

nn.RNN layer in PyTorch - Cheat Sheet & Quick Revision

Recall & Review
beginner
What is the purpose of the nn.RNN layer in PyTorch?
The nn.RNN layer processes sequences of data by passing information from one time step to the next, allowing the model to learn patterns over time.
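The idea above can be sketched in a few lines. This is a minimal example, not from the original cheat sheet; the hyperparameters (input_size=4, hidden_size=8) and the toy tensor are arbitrary illustrations.

```python
import torch
import torch.nn as nn

# A single-layer RNN: the hidden state carries information across time steps.
rnn = nn.RNN(input_size=4, hidden_size=8)

seq_len, batch_size = 5, 2
x = torch.randn(seq_len, batch_size, 4)  # (seq_len, batch_size, input_size)

output, hidden = rnn(x)
# output: hidden state at every time step -> (5, 2, 8)
# hidden: final hidden state            -> (1, 2, 8)
```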
beginner
What are the main inputs to an nn.RNN layer?
The main inputs are the sequence data (shape: (seq_len, batch_size, input_size) by default, or (batch_size, seq_len, input_size) if batch_first=True) and an optional initial hidden state (shape: (num_layers * num_directions, batch_size, hidden_size)).
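A quick sketch of both input layouts, with arbitrary example sizes (these numbers are illustrative, not from the cheat sheet):

```python
import torch
import torch.nn as nn

# Default layout: time dimension first.
rnn = nn.RNN(input_size=3, hidden_size=6)            # batch_first=False (default)
x = torch.randn(7, 4, 3)                             # (seq_len, batch_size, input_size)
h0 = torch.zeros(1, 4, 6)                            # (num_layers * num_directions, batch, hidden)
out, hn = rnn(x, h0)

# With batch_first=True the batch dimension comes first in input and output.
rnn_bf = nn.RNN(input_size=3, hidden_size=6, batch_first=True)
x_bf = torch.randn(4, 7, 3)                          # (batch_size, seq_len, input_size)
out_bf, hn_bf = rnn_bf(x_bf)
```

Note that `hn` keeps the shape (num_layers * num_directions, batch, hidden_size) even when batch_first=True.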
beginner
What does the 'hidden_size' parameter control in nn.RNN?
It controls the size of the hidden state vector, which stores information from previous time steps and affects the model's capacity to learn patterns.
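One way to see the effect of hidden_size is to compare two otherwise identical RNNs; the sizes below are arbitrary examples. A larger hidden_size yields a bigger state vector and more trainable parameters (greater capacity).

```python
import torch
import torch.nn as nn

small = nn.RNN(input_size=10, hidden_size=16)
large = nn.RNN(input_size=10, hidden_size=64)

x = torch.randn(5, 1, 10)
_, h_small = small(x)   # final hidden state: (1, 1, 16)
_, h_large = large(x)   # final hidden state: (1, 1, 64)

# More hidden units means more parameters.
n_small = sum(p.numel() for p in small.parameters())
n_large = sum(p.numel() for p in large.parameters())
```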
intermediate
How does nn.RNN handle multiple layers and directions?
You can set 'num_layers' to stack multiple RNN layers, and 'bidirectional=True' to process the sequence both forwards and backwards, which doubles the feature dimension of the output to 2 * hidden_size.
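The shapes make this concrete. A sketch with arbitrary example sizes (two stacked layers, both directions):

```python
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=5, hidden_size=8, num_layers=2, bidirectional=True)
x = torch.randn(6, 3, 5)           # (seq_len, batch_size, input_size)
output, hidden = rnn(x)

# output concatenates forward and backward states at each step:
#   last dim is 2 * hidden_size = 16
# hidden stacks the final state of every layer and direction:
#   num_layers * num_directions = 4 states
```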
beginner
What are the outputs of the nn.RNN layer?
It outputs 'output' (all hidden states for each time step) and 'hidden' (the last hidden state for each layer and direction).
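The relationship between the two outputs can be checked directly. For a single-layer, unidirectional RNN (arbitrary example sizes below), the last time step of `output` is exactly the final hidden state in `hidden`:

```python
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=3, hidden_size=5)   # single layer, unidirectional
x = torch.randn(4, 2, 3)
output, hidden = rnn(x)

# output holds the hidden state at every time step; hidden holds only the last.
same = torch.allclose(output[-1], hidden[0])
```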
What shape should the input sequence to nn.RNN have (with the default batch_first=False)?
A. (batch_size, seq_len, input_size)
B. (seq_len, batch_size, input_size)
C. (input_size, seq_len, batch_size)
D. (batch_size, input_size, seq_len)
What does setting 'bidirectional=True' do in nn.RNN?
A. Processes the sequence forwards and backwards
B. Stacks multiple RNN layers
C. Changes the activation function
D. Disables the hidden state
Which of these is NOT an output of nn.RNN?
A. Output for all time steps
B. Last hidden state
C. Predicted class labels
D. None of the above
What does the 'hidden_size' parameter affect?
A. Size of the hidden state vector
B. Length of the input sequence
C. Number of layers
D. Batch size
How can you provide an initial hidden state to nn.RNN?
A. By setting a parameter during initialization
B. You cannot provide an initial hidden state
C. By modifying the input sequence
D. By passing it as the second argument to the forward method
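The initial hidden state is passed as the second positional argument to the forward call; if it is omitted, PyTorch uses zeros. A small sketch with arbitrary sizes confirms the two are equivalent:

```python
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=2, hidden_size=4)
x = torch.randn(3, 1, 2)

# Explicit zero initial state vs. the implicit default.
h0 = torch.zeros(1, 1, 4)           # (num_layers * num_directions, batch, hidden)
out_explicit, _ = rnn(x, h0)
out_default, _ = rnn(x)             # same as passing zeros

match = torch.allclose(out_explicit, out_default)
```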
Explain how the nn.RNN layer processes a sequence of data step-by-step.
Think about how information flows through time steps in the RNN.
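The step-by-step recurrence can be reproduced by hand. This sketch (arbitrary sizes, default tanh nonlinearity) unrolls the loop nn.RNN performs internally, h_t = tanh(W_ih·x_t + b_ih + W_hh·h_{t-1} + b_hh), and checks it against the layer's own output:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
rnn = nn.RNN(input_size=3, hidden_size=4)
x = torch.randn(5, 1, 3)                 # (seq_len, batch_size, input_size)

# Manual unroll: each step mixes the current input with the previous hidden state.
h = torch.zeros(1, 4)                    # h_0 for a batch of 1
outputs = []
for t in range(x.size(0)):
    h = torch.tanh(x[t] @ rnn.weight_ih_l0.T + rnn.bias_ih_l0
                   + h @ rnn.weight_hh_l0.T + rnn.bias_hh_l0)
    outputs.append(h)

manual = torch.stack(outputs)            # (seq_len, batch_size, hidden_size)
ref, _ = rnn(x)                          # the layer's own forward pass
close = torch.allclose(manual, ref, atol=1e-5)
```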
Describe the difference between a unidirectional and bidirectional nn.RNN layer.
Consider how reading the sequence backwards adds information.
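A side-by-side comparison makes the difference visible in the shapes (example sizes are arbitrary): the bidirectional layer concatenates a forward and a backward pass, so each time step carries context from both directions and the output feature dimension doubles.

```python
import torch
import torch.nn as nn

uni = nn.RNN(input_size=3, hidden_size=5)
bi = nn.RNN(input_size=3, hidden_size=5, bidirectional=True)

x = torch.randn(4, 2, 3)                 # (seq_len, batch_size, input_size)
out_uni, h_uni = uni(x)                  # features per step: hidden_size = 5
out_bi, h_bi = bi(x)                     # features per step: 2 * hidden_size = 10

# h_bi holds one final state per direction: (2, batch, hidden_size)
```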