
nn.GRU layer in PyTorch - Cheat Sheet & Quick Revision

Recall & Review
beginner
What does the nn.GRU layer in PyTorch do?
The nn.GRU layer processes sequences by using Gated Recurrent Units to keep track of information over time, helping models understand order and context in data like sentences or time series.
intermediate
What are the main components inside a GRU cell?
A GRU cell has two gates: the update gate, which decides how much past information to keep, and the reset gate, which decides how to combine new input with past memory.
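The two gates above can be sketched directly in code. This is an illustrative walk-through of one GRU step with made-up dimensions and randomly initialized weight matrices (the names W_z, U_z, etc. are for illustration, not nn.GRU's internal parameter names):

```python
import torch

torch.manual_seed(0)
input_size, hidden_size = 4, 3
x = torch.randn(input_size)   # current input
h = torch.randn(hidden_size)  # previous hidden state

# Hypothetical weights, one (input-to-hidden, hidden-to-hidden) pair per gate
W_z, U_z = torch.randn(hidden_size, input_size), torch.randn(hidden_size, hidden_size)
W_r, U_r = torch.randn(hidden_size, input_size), torch.randn(hidden_size, hidden_size)
W_n, U_n = torch.randn(hidden_size, input_size), torch.randn(hidden_size, hidden_size)

z = torch.sigmoid(W_z @ x + U_z @ h)     # update gate: how much past to keep
r = torch.sigmoid(W_r @ x + U_r @ h)     # reset gate: how much past feeds the candidate
n = torch.tanh(W_n @ x + r * (U_n @ h))  # candidate hidden state
h_new = (1 - z) * n + z * h              # blend old state and candidate
print(h_new.shape)  # torch.Size([3])
```

Note how the update gate z interpolates between the old state h and the candidate n, which is exactly "deciding how much past information to keep."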
beginner
How do you create a simple nn.GRU layer in PyTorch for input size 10 and hidden size 20?
Use: nn.GRU(input_size=10, hidden_size=20). This sets the input feature size to 10 and the hidden state size to 20.
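A minimal sketch of creating that layer and running a forward pass (the sequence length and batch size here are arbitrary):

```python
import torch
import torch.nn as nn

# 10 input features per time step, 20-dimensional hidden state
gru = nn.GRU(input_size=10, hidden_size=20)

# Default layout is (seq_len, batch, input_size)
x = torch.randn(7, 3, 10)  # 7 time steps, batch of 3
output, h_n = gru(x)
print(output.shape)  # torch.Size([7, 3, 20]) - hidden state at every step
print(h_n.shape)     # torch.Size([1, 3, 20]) - final hidden state
```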
intermediate
What is the shape of the output from nn.GRU when batch_first=True and input shape is (batch, seq_len, input_size)?
The output shape is (batch, seq_len, num_directions * hidden_size). It gives the hidden states for each time step in the sequence.
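A quick shape check, using arbitrary sizes. Note that batch_first only affects the input and output tensors; h_n keeps its (num_layers * num_directions, batch, hidden_size) layout either way:

```python
import torch
import torch.nn as nn

gru = nn.GRU(input_size=10, hidden_size=20, batch_first=True)
x = torch.randn(3, 7, 10)  # (batch, seq_len, input_size)
output, h_n = gru(x)

# output: (batch, seq_len, num_directions * hidden_size)
assert output.shape == (3, 7, 20)
# h_n: (num_layers * num_directions, batch, hidden_size) - not batch-first
assert h_n.shape == (1, 3, 20)
```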
intermediate
Why might you choose GRU over LSTM in a model?
GRUs are simpler and faster to train because they have fewer gates than LSTMs, but still handle sequence data well, making them good for smaller datasets or faster experiments.
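The "fewer gates" claim shows up directly in parameter counts. For the same sizes, a GRU layer has 3 gate blocks to an LSTM's 4, so roughly three-quarters the parameters (sizes below are arbitrary):

```python
import torch.nn as nn

gru = nn.GRU(input_size=10, hidden_size=20)
lstm = nn.LSTM(input_size=10, hidden_size=20)

def n_params(m):
    return sum(p.numel() for p in m.parameters())

print(n_params(gru), n_params(lstm))  # 1920 2560
```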
What does the update gate in a GRU control?
A. How much past information to keep
B. How to reset the hidden state
C. The input feature size
D. The output sequence length
In PyTorch, what argument makes nn.GRU expect input shape as (batch, seq_len, input_size)?
A. dropout=0.5
B. bidirectional=True
C. num_layers=2
D. batch_first=True
Which of these is NOT a gate in a GRU cell?
A. Forget gate
B. Update gate
C. Reset gate
D. None of the above
What is the main advantage of GRU compared to LSTM?
A. Requires more memory
B. Handles longer sequences better
C. Simpler and faster to train
D. Has more gates
What does the hidden_size parameter in nn.GRU specify?
A. The length of the input sequence
B. The size of the hidden state vector
C. The number of layers
D. The batch size
Explain how a GRU layer processes sequence data and why it is useful.
Think about how GRU keeps important information from the past while reading new data.
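One way to make the time-step-by-time-step processing concrete is to unroll the loop yourself with nn.GRUCell, which applies a single GRU step; nn.GRU runs this same loop internally over the whole sequence (sizes here are arbitrary):

```python
import torch
import torch.nn as nn

cell = nn.GRUCell(input_size=10, hidden_size=20)
seq = torch.randn(7, 1, 10)  # 7 time steps, batch of 1
h = torch.zeros(1, 20)       # start with an empty memory

hidden_states = []
for x_t in seq:
    h = cell(x_t, h)         # gates decide what to keep vs. overwrite
    hidden_states.append(h)

output = torch.stack(hidden_states)
print(output.shape)  # torch.Size([7, 1, 20])
```

Because h is passed from one step to the next, information read early in the sequence can still influence the state at the end, which is what lets the model use order and context.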
Describe how to set up and use an nn.GRU layer in PyTorch, including input and output shapes.
Consider the shape of input and output tensors and the key parameters.
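As a worked example covering the key parameters, here is a sketch with num_layers and bidirectional set, to show how both dimensions of the output shapes scale (all sizes are arbitrary):

```python
import torch
import torch.nn as nn

gru = nn.GRU(input_size=10, hidden_size=20, num_layers=2,
             bidirectional=True, batch_first=True)
x = torch.randn(3, 7, 10)  # (batch, seq_len, input_size)
output, h_n = gru(x)

# output last dim = num_directions * hidden_size = 2 * 20
assert output.shape == (3, 7, 40)
# h_n first dim = num_layers * num_directions = 2 * 2
assert h_n.shape == (4, 3, 20)
```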