Recall & Review
beginner
What is the purpose of the nn.RNN layer in PyTorch?
The nn.RNN layer processes sequences of data by passing information from one time step to the next, allowing the model to learn patterns over time.
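A minimal sketch of this in code (the layer sizes are arbitrary, chosen for illustration):

```python
import torch
import torch.nn as nn

# An RNN that maps 10 input features per step to a 20-dim hidden state.
rnn = nn.RNN(input_size=10, hidden_size=20)

# A batch of 3 sequences, each 5 steps long
# (default layout: seq_len, batch_size, input_size).
x = torch.randn(5, 3, 10)

# The hidden state is carried from one time step to the next.
output, hidden = rnn(x)
print(output.shape)  # torch.Size([5, 3, 20])
print(hidden.shape)  # torch.Size([1, 3, 20])
```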
beginner
What are the main inputs to an nn.RNN layer?
The main inputs are the input sequence tensor of shape (seq_len, batch_size, input_size) — or (batch_size, seq_len, input_size) when batch_first=True — and an optional initial hidden state of shape (num_layers * num_directions, batch_size, hidden_size).
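For example, with the default batch_first=False layout (sizes are illustrative):

```python
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=10, hidden_size=20)  # num_layers=1, unidirectional

x = torch.randn(5, 3, 10)    # (seq_len, batch_size, input_size)
h0 = torch.zeros(1, 3, 20)   # (num_layers * num_directions, batch_size, hidden_size)

# h0 is optional; omitting it is equivalent to passing zeros.
output, hn = rnn(x, h0)
assert torch.allclose(output, rnn(x)[0])
```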
beginner
What does the 'hidden_size' parameter control in nn.RNN?
It controls the size of the hidden state vector, which stores information from previous time steps and affects the model's capacity to learn patterns.
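A quick sketch of the effect (the two sizes are arbitrary): hidden_size sets the width of the hidden state, and therefore the feature dimension of every per-step output.

```python
import torch
import torch.nn as nn

x = torch.randn(5, 3, 10)

small = nn.RNN(input_size=10, hidden_size=8)   # less capacity
large = nn.RNN(input_size=10, hidden_size=64)  # more capacity, more parameters

out_small, _ = small(x)
out_large, _ = large(x)
print(out_small.shape[-1], out_large.shape[-1])  # 8 64
```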
intermediate
How does nn.RNN handle multiple layers and directions?
You can set 'num_layers' to stack multiple RNN layers (each layer feeds its hidden states to the next), and 'bidirectional=True' to process the sequence both forwards and backwards, which doubles the feature dimension of the output to 2 * hidden_size.
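A sketch of how both options change the output shapes (sizes chosen for illustration):

```python
import torch
import torch.nn as nn

# 2 stacked layers, each run in both directions.
rnn = nn.RNN(input_size=10, hidden_size=20, num_layers=2, bidirectional=True)
x = torch.randn(5, 3, 10)
output, hn = rnn(x)

print(output.shape)  # torch.Size([5, 3, 40]) -> 2 * hidden_size per step
print(hn.shape)      # torch.Size([4, 3, 20]) -> num_layers * num_directions
```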
beginner
What are the outputs of the nn.RNN layer?
It outputs 'output' (all hidden states for each time step) and 'hidden' (the last hidden state for each layer and direction).
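For a single-layer, unidirectional RNN the relationship between the two outputs is easy to check: 'output' stacks the hidden state at every time step, and 'hidden' is just the final one (a sketch with arbitrary sizes):

```python
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=10, hidden_size=20)  # single layer, unidirectional
x = torch.randn(5, 3, 10)
output, hn = rnn(x)

# The last slice of 'output' is the same tensor as the final hidden state.
assert torch.allclose(output[-1], hn[0])
```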
What shape should the input sequence to nn.RNN have?
By default (batch_first=False), nn.RNN expects input of shape (seq_len, batch_size, input_size).
What does setting 'bidirectional=True' do in nn.RNN?
A bidirectional RNN processes the input sequence in both the forward and backward directions.
Which of these is NOT an output of nn.RNN?
nn.RNN outputs hidden states, not direct class predictions.
What does the 'hidden_size' parameter affect?
'hidden_size' controls the size of the hidden state vector in the RNN.
How can you provide an initial hidden state to nn.RNN?
You can pass the initial hidden state as the second argument when calling the RNN layer.
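One practical use of that second argument is threading the hidden state across chunks of a long sequence; processing the chunks with the carried-over state matches processing the whole sequence at once (a sketch, sizes arbitrary):

```python
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=10, hidden_size=20)
x = torch.randn(8, 3, 10)

# Process the whole sequence at once...
full_out, _ = rnn(x)

# ...or in two chunks, passing the returned hidden state back in
# as the second argument. The results match.
out_a, h = rnn(x[:4])
out_b, _ = rnn(x[4:], h)
assert torch.allclose(full_out, torch.cat([out_a, out_b]), atol=1e-5)
```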
Explain how the nn.RNN layer processes a sequence of data step-by-step.
Think about how information flows through time steps in the RNN.
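One way to make the step-by-step flow concrete is to replay the recurrence nn.RNN uses, h_t = tanh(W_ih x_t + b_ih + W_hh h_{t-1} + b_hh), in a plain Python loop and check that it reproduces the layer's output (a sketch with arbitrary sizes):

```python
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=4, hidden_size=6)
x = torch.randn(5, 1, 4)
output, _ = rnn(x)

# Replay the same recurrence by hand, one time step at a time.
h = torch.zeros(1, 6)
manual = []
for t in range(x.size(0)):
    h = torch.tanh(x[t] @ rnn.weight_ih_l0.T + rnn.bias_ih_l0
                   + h @ rnn.weight_hh_l0.T + rnn.bias_hh_l0)
    manual.append(h)
manual = torch.stack(manual)

assert torch.allclose(output, manual, atol=1e-5)
```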
Describe the difference between a unidirectional and bidirectional nn.RNN layer.
Consider how reading the sequence backwards adds information.
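A sketch contrasting the two (sizes arbitrary): the bidirectional layer's per-step output concatenates the forward and backward states, and each direction "finishes" at the opposite end of the sequence.

```python
import torch
import torch.nn as nn

uni = nn.RNN(input_size=10, hidden_size=20)
bi = nn.RNN(input_size=10, hidden_size=20, bidirectional=True)
x = torch.randn(5, 3, 10)

out_uni, _ = uni(x)
out_bi, hn = bi(x)
print(out_uni.shape[-1], out_bi.shape[-1])  # 20 40

# Forward direction ends at the last step, backward at the first.
assert torch.allclose(out_bi[-1, :, :20], hn[0])
assert torch.allclose(out_bi[0, :, 20:], hn[1])
```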