Hard · Application · Q9 of 15
NLP - Sequence Models for NLP
How can you ensure sequences are truncated to length 5 keeping the last tokens and padded if shorter using Keras?
A. pad_sequences(sequences, maxlen=5, truncating='pre', padding='post')
B. pad_sequences(sequences, maxlen=5, truncating='post', padding='post')
C. pad_sequences(sequences, maxlen=5, truncating='post', padding='pre')
D. pad_sequences(sequences, maxlen=5, truncating='pre', padding='pre')
Step-by-Step Solution
Solution:
  1. Step 1: Understand the truncating parameter

    truncating='pre' removes tokens from the beginning of sequences longer than maxlen, so the last tokens are kept. (truncating='post' would remove tokens from the end, discarding the last tokens.)
  2. Step 2: Understand the padding parameter

    padding='post' adds padding tokens at the end of sequences shorter than maxlen.
  3. Step 3: Combine the parameters

    To keep the last tokens and pad at the end, use truncating='pre' and padding='post'.
  4. Final Answer:

    pad_sequences(sequences, maxlen=5, truncating='pre', padding='post') -> Option A
  5. Quick Check:

    Use truncating='pre' to keep the last tokens [OK]
Quick Trick: 'pre'/'post' name the side where tokens are removed or added, not the side that is kept; to keep the last tokens, truncate at the front with truncating='pre' [OK]
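To see the behavior concretely, here is a minimal pure-Python sketch of what pad_sequences does with truncating='pre' and padding='post' (a hypothetical re-implementation for illustration, not the Keras source; the function name pad_to_length is made up):

```python
# Sketch of pad_sequences(..., maxlen=5, truncating='pre', padding='post')
def pad_to_length(sequences, maxlen=5, pad_value=0):
    out = []
    for seq in sequences:
        # truncating='pre': drop tokens from the front, keeping the last maxlen
        kept = seq[-maxlen:]
        # padding='post': append pad_value at the end up to maxlen
        out.append(kept + [pad_value] * (maxlen - len(kept)))
    return out

print(pad_to_length([[1, 2, 3, 4, 5, 6, 7], [8, 9]]))
# -> [[3, 4, 5, 6, 7], [8, 9, 0, 0, 0]]
```

The long sequence keeps its last five tokens; the short one is padded with zeros at the end.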
Common Mistakes:
  • Using truncating='post', which removes the last tokens
  • Padding on the wrong side
  • Not specifying truncating when needed
