PyTorch · ML · ~20 mins

Why RNNs handle sequences in PyTorch - Challenge Your Understanding

Challenge - 5 Problems
🎖️ Sequence Mastery Badge
Get all challenges correct to earn this badge!
Test your skills under time pressure!
🧠 Conceptual · intermediate · 1:30 time limit
Why do RNNs remember previous inputs?

Recurrent Neural Networks (RNNs) are special because they can remember information from earlier in a sequence. Why is this possible?

A. Because RNNs treat each input independently without sharing information.
B. Because RNNs have loops that pass information from one step to the next in the sequence.
C. Because RNNs use random noise to guess previous inputs.
D. Because RNNs use convolution layers to scan the entire sequence at once.
💡 Hint

Think about how RNNs connect one step to the next in a sequence.
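The hint can be made concrete with a minimal sketch: a toy scalar "RNN" (hypothetical weights `w_x` and `w_h`, chosen only for illustration) whose update rule mixes the current input with the previous hidden state. The recurrence is what lets information from early inputs persist:

```python
import math

def rnn_step(x_t, h_prev, w_x=0.5, w_h=0.8):
    # The new hidden state depends on the current input AND the
    # previous hidden state — this recurrent connection is the "loop".
    return math.tanh(w_x * x_t + w_h * h_prev)

def run_sequence(xs):
    h = 0.0          # initial hidden state
    hs = []
    for x in xs:
        h = rnn_step(x, h)   # h is carried forward step to step
        hs.append(h)
    return hs

states = run_sequence([1.0, 0.0, 0.0])
# Even though the inputs at steps 2 and 3 are zero, the hidden state
# stays nonzero: the first input is "remembered" through the loop.
```

A feedforward network applied to each `x_t` independently would output zero for the zero inputs; the recurrence is exactly what option B describes.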

Predict Output · intermediate · 2:00 time limit
Output shape of RNN with sequence input

Consider this PyTorch code snippet creating an RNN and passing a batch of sequences through it. What is the shape of the output tensor?

PyTorch
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=5, hidden_size=3, num_layers=1, batch_first=True)
input_seq = torch.randn(4, 7, 5)  # batch=4, seq_len=7, input_size=5
output, hidden = rnn(input_seq)
print(output.shape)
A. torch.Size([4, 7, 3])
B. torch.Size([7, 4, 3])
C. torch.Size([4, 3, 7])
D. torch.Size([3, 4, 7])
💡 Hint

Remember that batch_first=True means batch size is the first dimension.
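After making your prediction, you can verify it by running the question's snippet yourself. A sketch (same sizes as the question):

```python
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=5, hidden_size=3, num_layers=1, batch_first=True)
input_seq = torch.randn(4, 7, 5)  # batch=4, seq_len=7, input_size=5
output, hidden = rnn(input_seq)

# With batch_first=True, output is (batch, seq_len, hidden_size).
print(output.shape)   # torch.Size([4, 7, 3])
# hidden is (num_layers, batch, hidden_size), regardless of batch_first.
print(hidden.shape)   # torch.Size([1, 4, 3])
```

Note that `batch_first=True` only reorders `output` (and the expected input); the final `hidden` tensor always keeps layers first.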

Model Choice · advanced · 1:30 time limit
Choosing RNN for sequence data

You want to build a model that predicts the next word in a sentence. Which model type is best suited to handle the sequence nature of this task?

A. A clustering algorithm that groups similar words.
B. A feedforward neural network that treats each word independently.
C. A convolutional neural network that scans fixed-size windows of words.
D. A recurrent neural network that processes words one by one and remembers previous words.
💡 Hint

Think about which model can remember previous words in a sentence.

Hyperparameter · advanced · 1:30 time limit
Effect of hidden size in RNNs

In an RNN, what is the effect of increasing the hidden_size parameter?

A. It increases the number of features the RNN can remember at each step, allowing it to capture more complex patterns.
B. It reduces the sequence length the RNN can process.
C. It changes the input size of the RNN to accept larger vectors.
D. It decreases the number of layers in the RNN.
💡 Hint

Think about what hidden_size controls inside the RNN cell.
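One way to see what `hidden_size` controls: it sets the width of the hidden state, so it changes both the number of output features per step and the number of learnable weights — not the input size or sequence length. A small sketch (the sizes 3 and 16 are arbitrary choices for comparison):

```python
import torch
import torch.nn as nn

def n_params(m):
    # Total count of learnable weights and biases.
    return sum(p.numel() for p in m.parameters())

small = nn.RNN(input_size=5, hidden_size=3, batch_first=True)
large = nn.RNN(input_size=5, hidden_size=16, batch_first=True)

# The SAME input works for both models: hidden_size changes the
# output features per step, not what the RNN accepts as input.
x = torch.randn(2, 4, 5)  # batch=2, seq_len=4, input_size=5
out_small, _ = small(x)   # last dim is 3
out_large, _ = large(x)   # last dim is 16
```

For a single-layer `nn.RNN`, the parameter count is `hidden*input + hidden*hidden + 2*hidden` (input weights, recurrent weights, two bias vectors), so it grows quadratically in `hidden_size` — more capacity to capture complex patterns, as option A says.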

Metrics · expert · 2:00 time limit
Evaluating sequence prediction with RNN

You trained an RNN to predict the next character in a text sequence. After training, you want to measure how well it predicts. Which metric is most appropriate to evaluate this task?

A. Mean squared error between predicted and true characters encoded as numbers.
B. Silhouette score measuring cluster separation.
C. Accuracy of predicted characters compared to true next characters.
D. Confusion matrix of predicted vs true sentence lengths.
💡 Hint

Think about what it means to predict the next character correctly.
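Next-character prediction is a classification task over the character vocabulary, so per-position accuracy is the natural fit. A minimal sketch (the helper name `char_accuracy` is hypothetical):

```python
def char_accuracy(predicted, target):
    # Fraction of positions where the predicted next character
    # matches the true next character.
    assert len(predicted) == len(target)
    correct = sum(p == t for p, t in zip(predicted, target))
    return correct / len(target)

print(char_accuracy("hellp", "hello"))  # 0.8 — 4 of 5 characters match
```

By contrast, MSE on character codes (option A) would penalize "b" vs "c" less than "b" vs "z" even though both are simply wrong — character identity is categorical, not numeric.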