PyTorch · ~10 mins

Bidirectional RNNs in PyTorch - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
1. Fill in the blank (easy)

Complete the code to create a bidirectional RNN layer in PyTorch.

PyTorch
import torch.nn as nn

rnn = nn.RNN(input_size=10, hidden_size=20, num_layers=1, bidirectional=[1])
A. None
B. True
C. False
D. 0
Common Mistakes
Setting bidirectional to False or 0 will create a unidirectional RNN.
Using None will cause an error because bidirectional expects a boolean.
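A runnable sketch of the correct call with the question's own sizes; `bidirectional` takes a boolean, so `True` is what enables the backward pass:

```python
import torch.nn as nn

# bidirectional must be a boolean: True enables the backward direction,
# False (or a falsy value like 0) leaves the RNN unidirectional.
rnn = nn.RNN(input_size=10, hidden_size=20, num_layers=1, bidirectional=True)
print(rnn.bidirectional)  # True
```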
2. Fill in the blank (medium)

Complete the code to get the output shape of a bidirectional RNN layer.

PyTorch
output, hidden = rnn(input_seq)
output_shape = output.shape  # (seq_len, batch, [1] * hidden_size)
A. 1
B. num_layers
C. 2
D. input_size
Common Mistakes
Using 1 instead of 2 will give incorrect output shape.
Confusing num_layers with bidirectionality factor.
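A short sketch showing the doubled last dimension; the sequence length and batch size below are illustrative, not from the question:

```python
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=10, hidden_size=20, num_layers=1, bidirectional=True)
input_seq = torch.randn(5, 3, 10)  # (seq_len=5, batch=3, input_size=10)
output, hidden = rnn(input_seq)

# Forward and backward states are concatenated along the last dimension,
# so it is 2 * hidden_size, independent of num_layers.
print(output.shape)  # torch.Size([5, 3, 40])
print(hidden.shape)  # torch.Size([2, 3, 20]) -> (num_layers * 2, batch, hidden_size)
```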
3. Fill in the blank (hard)

Fix the error in the code to correctly initialize a bidirectional LSTM.

PyTorch
lstm = nn.LSTM(input_size=15, hidden_size=30, num_layers=2, bidirectional=[1])
A. True
B. False
C. None
D. 2
Common Mistakes
Using integer 2 instead of boolean True causes an error.
Setting bidirectional to False disables bidirectionality.
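A sketch of the corrected LSTM with the question's sizes; the input shape below is an assumption for illustration:

```python
import torch
import torch.nn as nn

# bidirectional is a boolean flag, not a direction count: pass True, not 2.
lstm = nn.LSTM(input_size=15, hidden_size=30, num_layers=2, bidirectional=True)
x = torch.randn(7, 4, 15)  # (seq_len=7, batch=4, input_size=15)
output, (hn, cn) = lstm(x)

print(output.shape)  # torch.Size([7, 4, 60]) -> last dim is 2 * hidden_size
print(hn.shape)      # torch.Size([4, 4, 30]) -> (num_layers * 2, batch, hidden_size)
```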
4. Fill in the blank (hard)

Complete the code to slice the forward and backward outputs from a bidirectional RNN.

PyTorch
forward_out = output[:, :, :[1]]
backward_out = output[:, :, [1]:[2]]
A. hidden_size
B. 2 * hidden_size
C. num_layers
D. input_size
Common Mistakes
Using 2 * hidden_size for the forward slice.
Forgetting the end slice for backward output.
Confusing output slicing with hidden state slicing.
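A runnable sketch of the output slicing; the RNN sizes here are illustrative assumptions:

```python
import torch
import torch.nn as nn

hidden_size = 20
rnn = nn.RNN(input_size=10, hidden_size=hidden_size, bidirectional=True)
output, _ = rnn(torch.randn(5, 3, 10))  # output: (5, 3, 2 * hidden_size)

# The forward direction fills the first hidden_size channels of the last
# dimension; the backward direction fills the remaining hidden_size channels.
forward_out = output[:, :, :hidden_size]
backward_out = output[:, :, hidden_size:2 * hidden_size]
print(forward_out.shape, backward_out.shape)  # both torch.Size([5, 3, 20])
```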
5. Fill in the blank (hard)

Fill all three blanks to extract forward and backward hidden states using slicing.

PyTorch
forward_hn = hn[[1]::[2]]
backward_hn = hn[[3]::[2]]
A. 0
B. 1
C. 2
D. num_layers
Common Mistakes
Swapping start indices (using 1::2 for forward).
Using step=1 or other values instead of 2.
Using contiguous view instead of simple slicing.
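A sketch of the hidden-state slicing with illustrative sizes. In `hn` of shape `(num_layers * num_directions, batch, hidden_size)`, the directions interleave along dim 0: even indices hold forward states, odd indices hold backward states, hence start indices 0 and 1 with step 2:

```python
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=10, hidden_size=20, num_layers=2, bidirectional=True)
_, hn = rnn(torch.randn(5, 3, 10))  # hn: (num_layers * 2, batch, hidden_size)

# dim 0 is ordered [layer0-fwd, layer0-bwd, layer1-fwd, layer1-bwd, ...],
# so a stride of 2 separates the two directions per layer.
forward_hn = hn[0::2]
backward_hn = hn[1::2]
print(forward_hn.shape, backward_hn.shape)  # both torch.Size([2, 3, 20])
```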