Hard · 📝 Model Choice · Q15 of 15
NLP · Sequence Models for NLP
You have text sequences of varying lengths. You want to pad them to length 10 but keep the last 10 words only if longer. Which code correctly achieves this using Keras?
A. pad_sequences(sequences, maxlen=10, padding='post', truncating='pre')
B. pad_sequences(sequences, maxlen=10, padding='post', truncating='post')
C. pad_sequences(sequences, maxlen=10, padding='pre', truncating='post')
D. pad_sequences(sequences, maxlen=10, padding='pre', truncating='pre')
Step-by-Step Solution
  1. Step 1: Understand the padding and truncating sides

    padding='pre' adds zeros at the start of a sequence; truncating='pre' removes words from the start, keeping the last words.
  2. Step 2: Match the requirement to keep the last 10 words

    To keep the last 10 words, truncate from the start (truncating='pre') and pad at the start (padding='pre'). These are also the Keras defaults for both parameters.
  3. Final Answer:

    pad_sequences(sequences, maxlen=10, padding='pre', truncating='pre') -> Option D
  4. Quick Check:

    Keep last words = truncating='pre' [OK]
Quick Trick: Use truncating='pre' to keep the last words and padding='pre' to pad at the start. [OK]
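To see the answer's behavior concretely, here is a minimal pure-Python sketch of what pad_sequences does with these arguments (the function below is an illustrative stand-in, not the Keras implementation; in practice call tf.keras.utils.pad_sequences):

```python
# Sketch of Keras pad_sequences semantics for padding/truncating
# (illustrative only; use tf.keras.utils.pad_sequences in real code).
def pad_sequences(sequences, maxlen, padding='pre', truncating='pre', value=0):
    out = []
    for seq in sequences:
        if len(seq) > maxlen:
            # truncating='pre' drops tokens from the START, keeping the last maxlen
            seq = seq[-maxlen:] if truncating == 'pre' else seq[:maxlen]
        pad = [value] * (maxlen - len(seq))
        # padding='pre' puts the zeros at the START of the sequence
        out.append(pad + seq if padding == 'pre' else seq + pad)
    return out

seqs = [[1, 2, 3], list(range(1, 13))]   # one short sequence, one 12-token sequence
print(pad_sequences(seqs, maxlen=10))
# [[0, 0, 0, 0, 0, 0, 0, 1, 2, 3], [3, 4, 5, 6, 7, 8, 9, 10, 11, 12]]
```

The short sequence is left-padded with zeros, and the 12-token sequence keeps only its last 10 tokens, exactly what Option D requires.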
Common Mistakes:
  • Using padding='post', which pads at the end instead of the start
  • Using truncating='post', which removes the last words instead of keeping them
  • Confusing the padding and truncating parameters
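The second mistake above is the costly one for this question. A quick sketch of the two truncating modes, using plain list slicing (no Keras needed):

```python
# Contrast the two truncating modes on a 12-token sequence.
seq = list(range(1, 13))      # tokens 1..12
maxlen = 10

kept_pre = seq[-maxlen:]      # truncating='pre'  -> keeps the LAST 10 tokens
kept_post = seq[:maxlen]      # truncating='post' -> keeps the FIRST 10, dropping the end

print(kept_pre)   # [3, 4, 5, 6, 7, 8, 9, 10, 11, 12]
print(kept_post)  # [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
```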
