Recall & Review
beginner
What is a Bidirectional RNN?
A Bidirectional RNN is a type of recurrent neural network that processes data in both forward and backward directions. This helps the model understand context from past and future inputs, improving sequence learning.
beginner
How does a Bidirectional RNN differ from a standard RNN?
A standard RNN processes the sequence only from start to end (forward). A Bidirectional RNN processes the sequence twice: once forward and once backward, then combines both outputs for better context understanding.
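To make the forward/backward idea concrete, here is a toy, framework-free sketch: a scalar "RNN" cell with fixed illustrative weights (`w` and `u` are made-up constants, not learned values) is scanned once left-to-right and once right-to-left, and the two hidden states are paired at each position.

```python
import math

def rnn_scan(xs, w=0.5, u=0.3):
    """Toy scalar RNN cell: h_t = tanh(w * x_t + u * h_{t-1})."""
    h, outs = 0.0, []
    for x in xs:
        h = math.tanh(w * x + u * h)
        outs.append(h)
    return outs

def bidirectional_scan(xs):
    fwd = rnn_scan(xs)              # left-to-right pass: sees the past
    bwd = rnn_scan(xs[::-1])[::-1]  # right-to-left pass: sees the future, re-aligned
    # Pair the two directions at each position (PyTorch concatenates instead).
    return list(zip(fwd, bwd))

out = bidirectional_scan([1.0, 2.0, 3.0])
```

A standard RNN would return only `fwd`; the bidirectional version gives each position a second feature that summarizes everything after it.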
beginner
In PyTorch, how do you enable bidirectionality in an RNN layer?
You set the argument `bidirectional=True` when creating the RNN, LSTM, or GRU layer. For example: `nn.LSTM(input_size, hidden_size, bidirectional=True)`.
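A minimal sketch of flipping the flag (the sizes below are arbitrary examples, not values from the card):

```python
import torch
import torch.nn as nn

# Arbitrary example sizes.
input_size, hidden_size, seq_len, batch = 10, 20, 5, 3

# The only change from a unidirectional LSTM is bidirectional=True.
bilstm = nn.LSTM(input_size, hidden_size, bidirectional=True)

x = torch.randn(seq_len, batch, input_size)
output, (h_n, c_n) = bilstm(x)
# output: (seq_len, batch, 2 * hidden_size)
# h_n, c_n: (num_layers * 2, batch, hidden_size)
```

The same flag works on `nn.RNN` and `nn.GRU`.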
intermediate
What is the shape of the output from a bidirectional RNN layer in PyTorch?
With the default `batch_first=False`, the output shape is `(seq_len, batch, num_directions * hidden_size)`. Since `num_directions` is 2 for a bidirectional layer, the last dimension of the output doubles.
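The doubled last dimension can be split back into its two halves: the first `hidden_size` channels are the forward direction, the rest the backward direction. A sketch with a single-layer `nn.RNN` and illustrative sizes; for a single layer, the forward half at the last time step matches `h_n[0]` and the backward half at the first time step matches `h_n[1]`.

```python
import torch
import torch.nn as nn

seq_len, batch, input_size, hidden_size = 7, 2, 4, 6  # illustrative sizes
rnn = nn.RNN(input_size, hidden_size, bidirectional=True)

x = torch.randn(seq_len, batch, input_size)
output, h_n = rnn(x)  # output: (seq_len, batch, 2 * hidden_size)

fwd = output[..., :hidden_size]   # forward-direction features
bwd = output[..., hidden_size:]   # backward-direction features
```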
intermediate
Why might bidirectional RNNs improve performance on tasks like speech recognition or text analysis?
Because they consider both past and future context in the sequence, bidirectional RNNs can better understand dependencies and meaning, leading to more accurate predictions.
What does setting `bidirectional=True` do in a PyTorch RNN layer?
Setting `bidirectional=True` makes the RNN process the input sequence in both forward and backward directions.
If a unidirectional LSTM has hidden size 128, what is the hidden size of a bidirectional LSTM output?
Bidirectional LSTM doubles the hidden size in output because it concatenates forward and backward outputs: 128 * 2 = 256.
Which of these tasks benefits most from bidirectional RNNs?
Sequence labeling tasks benefit from understanding context before and after each element, which bidirectional RNNs provide.
In PyTorch, what is the output shape of a bidirectional RNN given input shape (seq_len, batch, input_size)?
The output shape is (seq_len, batch, 2 * hidden_size), because the forward and backward outputs are concatenated along the feature dimension.
What is a key advantage of using bidirectional RNNs?
Bidirectional RNNs improve context understanding by looking at both past and future inputs.
Explain how a bidirectional RNN processes a sequence differently than a standard RNN.
Think about reading a sentence forwards and backwards.
Describe how to implement a bidirectional LSTM in PyTorch and what changes in the output shape.
Check the PyTorch LSTM parameters and output dimensions.