Q15 of 15 · Application · Hard
NLP - Word Embeddings
You want to improve a chatbot's understanding by using embeddings. Which approach best captures semantic meaning for similar questions like "How are you?" and "How do you do?"?
A. Use only the first word's embedding as the sentence meaning
B. Use pretrained word embeddings and average their vectors for the whole sentence
C. Use random vectors for each word without training
D. Use one-hot encoding for each word and sum them
Step-by-Step Solution
Solution:
  1. Step 1: Build a sentence embedding from word embeddings

    Averaging pretrained word embeddings (e.g. word2vec or GloVe vectors) produces a single vector that captures the overall semantic content of a sentence, so paraphrases like "How are you?" and "How do you do?" end up with similar vectors.
  2. Step 2: Rule out the other options

    One-hot encoding treats every pair of distinct words as equally dissimilar, so it discards semantic information; random vectors encode no meaning at all; and using only the first word's embedding ignores the rest of the sentence.
  3. Final Answer:

    Use pretrained word embeddings and average their vectors for the whole sentence -> Option B
  4. Quick Check:

    Averaging pretrained embeddings gives a better sentence-level meaning representation. ✓
Quick Trick: Average pretrained word embeddings to get a quick sentence-meaning vector. ✓
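The averaging approach from Option B can be sketched in a few lines. This is a minimal illustration using made-up toy vectors in place of real pretrained embeddings (in practice you would load word2vec or GloVe vectors); the dictionary values below are invented for demonstration only.

```python
import numpy as np

# Toy stand-ins for pretrained embeddings (illustrative values, not real
# word2vec/GloVe vectors).
embeddings = {
    "how":  np.array([0.8, 0.1, 0.1]),
    "are":  np.array([0.2, 0.7, 0.1]),
    "you":  np.array([0.1, 0.2, 0.9]),
    "do":   np.array([0.3, 0.6, 0.2]),
    "cats": np.array([-0.9, 0.4, -0.3]),
    "fly":  np.array([-0.2, -0.8, 0.5]),
}

def sentence_vector(sentence):
    """Average the word vectors of a sentence (Option B)."""
    words = sentence.lower().replace("?", "").split()
    vecs = [embeddings[w] for w in words if w in embeddings]
    return np.mean(vecs, axis=0)

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

v1 = sentence_vector("How are you?")
v2 = sentence_vector("How do you do?")
v3 = sentence_vector("Cats fly")

print(cosine(v1, v2))  # high: the two greetings share overlapping word vectors
print(cosine(v1, v3))  # much lower: unrelated sentence
```

Because the two greetings share words (and, with real embeddings, semantically similar words have nearby vectors), their averaged sentence vectors come out close, while an unrelated sentence scores much lower.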
Common Mistakes:
  • Using one-hot encoding which lacks meaning
  • Using random vectors without training
  • Ignoring all words except the first
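To see concretely why the one-hot mistake (Option D) fails, note that every pair of distinct one-hot vectors is orthogonal, so their similarity is always zero regardless of meaning. A small sketch (the four-word vocabulary is invented for illustration):

```python
import numpy as np

vocab = ["how", "are", "you", "do"]

def one_hot(word):
    """One-hot vector for a single word over the toy vocabulary."""
    v = np.zeros(len(vocab))
    v[vocab.index(word)] = 1.0
    return v

# Any two distinct words are orthogonal: their dot product is 0,
# so one-hot vectors cannot express that any pair of words is related.
print(np.dot(one_hot("how"), one_hot("are")))  # 0.0

# Summing one-hots just yields a bag-of-words count vector; word
# identity is kept, but semantic similarity is still absent.
bow = one_hot("how") + one_hot("are") + one_hot("you")
print(bow)
```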
