
Bidirectional RNNs in PyTorch - Cheat Sheet & Quick Revision

Recall & Review
beginner
What is a Bidirectional RNN?
A Bidirectional RNN is a type of recurrent neural network that processes data in both forward and backward directions. This helps the model understand context from past and future inputs, improving sequence learning.
beginner
How does a Bidirectional RNN differ from a standard RNN?
A standard RNN processes the sequence only from start to end (forward). A Bidirectional RNN processes the sequence twice: once forward and once backward, then combines both outputs for better context understanding.
beginner
In PyTorch, how do you enable bidirectionality in an RNN layer?
You set the argument `bidirectional=True` when creating the RNN, LSTM, or GRU layer. For example: `nn.LSTM(input_size, hidden_size, bidirectional=True)`.
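A minimal sketch of this (assuming PyTorch is installed): passing `bidirectional=True` to the constructor is the only change needed, and the output feature dimension doubles while the hidden state gains a direction axis.

```python
import torch
import torch.nn as nn

# Enabling bidirectionality is a single constructor argument.
lstm = nn.LSTM(input_size=10, hidden_size=20, bidirectional=True)

x = torch.randn(7, 3, 10)        # (seq_len, batch, input_size)
out, (h_n, c_n) = lstm(x)

print(out.shape)   # torch.Size([7, 3, 40])  -> num_directions * hidden_size
print(h_n.shape)   # torch.Size([2, 3, 20])  -> num_layers * num_directions
```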
intermediate
What is the shape of the output from a bidirectional RNN layer in PyTorch?
With the default `batch_first=False`, the output shape is `(seq_len, batch, num_directions * hidden_size)`. Since `num_directions` is 2 for a bidirectional layer, the feature dimension of the output doubles.
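A quick sketch to check this (assuming PyTorch is installed): the first `hidden_size` features of the output come from the forward pass and the rest from the backward pass, so the two halves can be sliced apart.

```python
import torch
import torch.nn as nn

hidden_size = 16
rnn = nn.RNN(input_size=8, hidden_size=hidden_size, bidirectional=True)

x = torch.randn(5, 2, 8)                     # (seq_len, batch, input_size)
out, h_n = rnn(x)

assert out.shape == (5, 2, 2 * hidden_size)  # feature dim doubles

forward_out  = out[:, :, :hidden_size]       # forward-direction features
backward_out = out[:, :, hidden_size:]       # backward-direction features
```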
intermediate
Why might bidirectional RNNs improve performance on tasks like speech recognition or text analysis?
Because they consider both past and future context in the sequence, bidirectional RNNs can better understand dependencies and meaning, leading to more accurate predictions.
What does setting `bidirectional=True` do in a PyTorch RNN layer?
A. Processes the sequence forwards and backwards
B. Processes the sequence only forwards
C. Processes the sequence only backwards
D. Disables the RNN layer
If a unidirectional LSTM has hidden size 128, what is the hidden size of a bidirectional LSTM output?
A. 128
B. 512
C. 64
D. 256
Which of these tasks benefits most from bidirectional RNNs?
A. Image classification
B. Sequence labeling like named entity recognition
C. Sorting numbers
D. Simple linear regression
In PyTorch, what is the output shape of a bidirectional RNN given input shape (seq_len, batch, input_size)?
A. (seq_len, batch, hidden_size)
B. (batch, seq_len, hidden_size)
C. (seq_len, batch, 2 * hidden_size)
D. (seq_len, 2 * batch, hidden_size)
What is a key advantage of using bidirectional RNNs?
A. Better context understanding from both past and future
B. Uses less memory
C. Faster training time
D. Simpler model architecture
Explain how a bidirectional RNN processes a sequence differently than a standard RNN.
Think about reading a sentence forwards and backwards.
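One way to picture the answer as code (a conceptual sketch, assuming PyTorch is installed): a bidirectional layer behaves like two independent RNNs, one reading the sequence forwards and one reading it backwards, whose outputs are concatenated feature-wise.

```python
import torch
import torch.nn as nn

# Two independent unidirectional RNNs stand in for the two directions.
fwd = nn.RNN(input_size=4, hidden_size=6)
bwd = nn.RNN(input_size=4, hidden_size=6)

x = torch.randn(9, 1, 4)                      # (seq_len, batch, input_size)
out_f, _ = fwd(x)                             # read left-to-right
out_b, _ = bwd(torch.flip(x, dims=[0]))       # read right-to-left
out_b = torch.flip(out_b, dims=[0])           # re-align to forward time order

combined = torch.cat([out_f, out_b], dim=-1)  # (seq_len, batch, 2 * hidden)
```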
Describe how to implement a bidirectional LSTM in PyTorch and what changes in the output shape.
Check the PyTorch LSTM parameters and output dimensions.
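A minimal implementation sketch for this question (the `BiLSTMTagger` name and sizes are illustrative, not from the source): because the bidirectional output has `2 * hidden_size` features, any layer stacked on top must be sized accordingly.

```python
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    """Illustrative sequence-labeling head over a bidirectional LSTM."""
    def __init__(self, input_size=32, hidden_size=64, num_tags=5):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, bidirectional=True)
        # The classifier must accept 2 * hidden_size features,
        # since forward and backward outputs are concatenated.
        self.classifier = nn.Linear(2 * hidden_size, num_tags)

    def forward(self, x):              # x: (seq_len, batch, input_size)
        out, _ = self.lstm(x)          # (seq_len, batch, 2 * hidden_size)
        return self.classifier(out)    # (seq_len, batch, num_tags)

model = BiLSTMTagger()
logits = model(torch.randn(12, 4, 32))
```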