
Application Q8 of 15 (Hard)
NLP - Sequence Models for NLP
You are designing a GRU-based model to classify text documents of varying lengths. Which PyTorch utility helps efficiently process batches with padded sequences?
A. pack_padded_sequence and pad_packed_sequence
B. torch.nn.utils.rnn.pad_sequence only
C. torch.nn.functional.one_hot encoding
D. torch.utils.data.DataLoader with shuffle=True
Step-by-Step Solution
  1. Step 1: Understand variable-length sequence handling

    A GRU can process sequences of any length one at a time, but a batched input tensor must be rectangular, so shorter documents are padded to the length of the longest one in the batch.
  2. Step 2: Efficient processing with packing

    PyTorch provides torch.nn.utils.rnn.pack_padded_sequence to pack a padded batch together with the true sequence lengths, so the GRU skips computation on padding positions.
  3. Step 3: Unpack outputs

    pad_packed_sequence converts the packed GRU output back into a padded tensor (and also returns the original lengths) for downstream layers such as a classifier head.
  4. Final Answer:

    pack_padded_sequence and pad_packed_sequence -> Option A
  5. Quick Check:

    Use packing utilities for variable-length batches [OK]
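The three steps above can be sketched end to end; the embedding dimension, hidden size, and sequence lengths below are illustrative values, not part of the question:

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pad_sequence, pack_padded_sequence, pad_packed_sequence

# Three "documents" of embedded tokens with different lengths (5, 3, 2 tokens, dim 8)
seqs = [torch.randn(5, 8), torch.randn(3, 8), torch.randn(2, 8)]
lengths = torch.tensor([len(s) for s in seqs])

# Step 1: pad to a rectangular batch of shape (batch=3, max_len=5, dim=8)
padded = pad_sequence(seqs, batch_first=True)

# Step 2: pack with the true lengths so the GRU skips padding positions
packed = pack_padded_sequence(padded, lengths, batch_first=True, enforce_sorted=False)

gru = nn.GRU(input_size=8, hidden_size=16, batch_first=True)
packed_out, h_n = gru(packed)

# Step 3: unpack back to a padded tensor for downstream layers
out, out_lengths = pad_packed_sequence(packed_out, batch_first=True)
print(out.shape)        # torch.Size([3, 5, 16])
print(out_lengths)      # tensor([5, 3, 2])
```

For classification, h_n[-1] (the final hidden state of each sequence) is a common input to a linear classifier, since packing guarantees it corresponds to each document's last real token rather than a padding token.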
Quick Trick: Use pack_padded_sequence for variable-length inputs [OK]
Common Mistakes:
  • Using only pad_sequence without packing
  • Confusing one-hot encoding with padding
  • Ignoring sequence length information
