Hard · 📝 Conceptual · Q10 of 15
NLP - Sequence Models for NLP
Why is it important to choose the correct padding side ('pre' vs 'post') when preparing sequences for models like LSTM?
A. Because it affects the position of meaningful tokens relative to the model input
B. Because padding side changes the vocabulary size
C. Because only 'post' padding is supported by LSTM layers
D. Because padding side determines the batch size
Step-by-Step Solution
  1. Step 1: Understand token position importance

    LSTMs process sequences in order, so the padding side determines where the meaningful tokens sit within the fixed-length input.
  2. Step 2: Effect on model learning

    Choosing the wrong padding side shifts important tokens toward or away from the final time steps, which can hurt model performance.
  3. Final Answer:

    Because it affects the position of meaningful tokens relative to the model input → Option A
  4. Quick Check:

    Padding side determines where the real tokens sit within the model input. ✓
Quick Trick: Padding changes where tokens sit in the input, not the vocabulary, the batch size, or which layers you can use.
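The effect described in the steps above can be seen in a minimal sketch. This is a plain-Python stand-in (the function name and behavior mirror utilities such as Keras's `pad_sequences`, but nothing here depends on any framework; the token ids are made up for illustration):

```python
def pad_sequences(sequences, maxlen, padding="pre", value=0):
    """Pad (or truncate) each token-id sequence to exactly maxlen."""
    padded = []
    for seq in sequences:
        seq = seq[:maxlen]                    # truncate overly long sequences
        pad = [value] * (maxlen - len(seq))   # padding tokens (id 0)
        # 'pre' places padding before the real tokens, 'post' after them
        padded.append(pad + seq if padding == "pre" else seq + pad)
    return padded

tokens = [[7, 3, 9]]  # hypothetical token ids for one short sentence

print(pad_sequences(tokens, maxlen=5, padding="pre"))   # [[0, 0, 7, 3, 9]]
print(pad_sequences(tokens, maxlen=5, padding="post"))  # [[7, 3, 9, 0, 0]]
```

With 'pre' padding the real tokens end up at the final time steps, right next to the LSTM's last hidden state; with 'post' padding the model's final steps see only padding. This is why the padding side matters when only the last hidden state is used downstream.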
Common Mistakes:
  • Thinking padding changes vocabulary size
  • Believing only one padding side works with LSTM
  • Confusing padding side with batch size
