Practice - 5 Tasks
Answer the questions below
1. Fill in the blank (easy)
Complete the code to create a simple sequence classification model using PyTorch's nn.Module.
PyTorch
import torch.nn as nn

class SequenceClassifier(nn.Module):
    def __init__(self, input_dim, hidden_dim, output_dim):
        super().__init__()
        self.rnn = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, [1])

    def forward(self, x):
        _, (hidden, _) = self.rnn(x)
        out = self.fc(hidden[-1])
        return out
Common Mistakes
Using hidden_dim instead of output_dim for the linear layer output size.
Confusing input_dim with output_dim.
The final linear layer should map from the hidden dimension to the output dimension, which represents the number of classes.
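The completed model can be sketched as below, with the blank filled by output_dim. The dimensions and batch sizes here are illustrative, not part of the task:

```python
import torch
import torch.nn as nn

class SequenceClassifier(nn.Module):
    def __init__(self, input_dim, hidden_dim, output_dim):
        super().__init__()
        self.rnn = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        # The classification head maps hidden_dim -> output_dim (number of classes).
        self.fc = nn.Linear(hidden_dim, output_dim)

    def forward(self, x):
        _, (hidden, _) = self.rnn(x)
        # hidden[-1] is the last layer's final hidden state, shape (batch, hidden_dim).
        return self.fc(hidden[-1])

# Illustrative sizes: 8 input features, 16 hidden units, 3 classes.
model = SequenceClassifier(input_dim=8, hidden_dim=16, output_dim=3)
logits = model(torch.randn(4, 10, 8))  # batch of 4 sequences, length 10
print(logits.shape)  # torch.Size([4, 3]): one score per class, per sample
```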
2. Fill in the blank (medium)
Complete the code to apply a softmax activation to the model output for classification probabilities.
PyTorch
import torch.nn.functional as F

outputs = model(inputs)
probabilities = F.[1](outputs, dim=1)
Common Mistakes
Using sigmoid instead of softmax for multi-class outputs.
Forgetting to specify the dimension for softmax.
Softmax converts raw model outputs (logits) into probabilities that sum to 1 across classes.
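A quick check of this property, using hand-picked logits for a batch of 2 samples and 3 classes (the numbers are illustrative):

```python
import torch
import torch.nn.functional as F

# Illustrative raw outputs (logits) for 2 samples x 3 classes.
logits = torch.tensor([[2.0, 1.0, 0.1],
                       [0.5, 0.5, 3.0]])

# dim=1 normalizes across the class dimension, so each row sums to 1.
probabilities = F.softmax(logits, dim=1)
print(probabilities.sum(dim=1))  # tensor([1., 1.])
```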
3. Fill in the blank (hard)
Fix the error in the training loop by completing the missing optimizer step.
PyTorch
for inputs, labels in dataloader:
    optimizer.zero_grad()
    outputs = model(inputs)
    loss = criterion(outputs, labels)
    loss.backward()
    [1]
Common Mistakes
Calling step() on loss or model instead of optimizer.
Forgetting to call optimizer.step() causing no training progress.
After computing gradients with loss.backward(), optimizer.step() updates the model parameters.
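A minimal sketch of one full update, using a toy linear model and a synthetic batch (the model, sizes, and learning rate are illustrative). Without optimizer.step(), the weights would stay unchanged no matter how many batches are processed:

```python
import torch
import torch.nn as nn

# Toy setup: a linear classifier over 4 features and 2 classes.
model = nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()

inputs = torch.randn(8, 4)            # synthetic batch of 8 samples
labels = torch.randint(0, 2, (8,))    # synthetic class labels

before = model.weight.detach().clone()
optimizer.zero_grad()                 # clear gradients from the previous step
loss = criterion(model(inputs), labels)
loss.backward()                       # compute gradients
optimizer.step()                      # apply the update: w <- w - lr * grad

# The weights have moved; skipping step() would leave them equal to `before`.
print(torch.equal(before, model.weight.detach()))
```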
4. Fill in the blank (hard)
Fill both blanks to create a dictionary comprehension that maps each word to its length only if the length is greater than 3.
Python
words = ['apple', 'cat', 'banana', 'dog']
lengths = {word: [1] for word in words if [2]}
Common Mistakes
Using word > 3 which compares string to int and causes error.
Using word.length which is not valid in Python.
We map each word to its length using len(word), and filter words with length greater than 3 using len(word) > 3.
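Running the completed comprehension on the task's word list shows the filter in action: 'cat' and 'dog' (length 3) are dropped because 3 is not greater than 3:

```python
words = ['apple', 'cat', 'banana', 'dog']

# Key: the word itself; value: len(word); filter: keep only lengths > 3.
lengths = {word: len(word) for word in words if len(word) > 3}
print(lengths)  # {'apple': 5, 'banana': 6}
```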
5. Fill in the blank (hard)
Fill all three blanks to create a dictionary comprehension that maps each word in words to its uppercase form only if the word length is less than 5.
Python
words = ['apple', 'cat', 'banana', 'dog']
result = {[1]: [2] for [3] in words if len([3]) < 5}
Common Mistakes
Using 'words' instead of 'word' in the for loop.
Using the wrong variable name causing NameError.
We use 'word' as the key, 'word.upper()' as the value, and iterate with 'for word in words'.
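With all three blanks filled, the comprehension keeps only the short words ('apple' has 5 letters and 'banana' has 6, so both fail len(word) < 5):

```python
words = ['apple', 'cat', 'banana', 'dog']

# Key: word; value: word.upper(); loop variable: word.
result = {word: word.upper() for word in words if len(word) < 5}
print(result)  # {'cat': 'CAT', 'dog': 'DOG'}
```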