Hard · Conceptual · Q10 of 15
NLP - Text Similarity and Search
Why can cosine similarity between embeddings sometimes fail to capture true semantic similarity?
A. Because embeddings are always random vectors
B. Because cosine similarity ignores vector length
C. Because embeddings may not capture context or polysemy well
D. Because cosine similarity is sensitive to vector magnitude
Step-by-Step Solution
  1. Step 1: Understand the limitations of embeddings

    A single static embedding per word cannot represent context-dependent meaning or polysemy (words with multiple senses, such as "bank").
  2. Step 2: Analyze cosine similarity behavior

    Cosine similarity measures only the angle between vectors; if the vectors themselves conflate word senses or ignore context, the similarity score can be misleading even though the measure itself is computed correctly.
  3. Final Answer:

    Because embeddings may not capture context or polysemy well -> Option C
  4. Quick Check:

    The errors come from limits of the embeddings, not from the cosine measure itself [OK]
Quick Trick: Embeddings may miss context or multiple meanings [OK]
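Option D is wrong precisely because cosine similarity normalizes magnitude away. A minimal sketch in plain Python (no external libraries) showing that scaling a vector leaves the cosine score unchanged:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

v = [1.0, 2.0, 3.0]
w = [2.0, 4.0, 6.0]  # same direction, double the magnitude
print(cosine_similarity(v, w))  # 1.0 — the angle is identical
```

Because only the angle matters, doubling every component of a vector does not change its similarity to anything, which is why magnitude-based objections (options B and D) miss the real failure mode.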
Common Mistakes:
  • Thinking cosine similarity depends on vector length (it normalizes length away)
  • Assuming embeddings are random vectors (they are learned from data)
  • Reversing cosine similarity's sensitivity: it ignores vector magnitude rather than being sensitive to it
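The polysemy failure from Step 1 can be made concrete with toy vectors. The embeddings below are invented purely for illustration (not taken from any real model): a static embedding assigns "bank" one vector that blends its financial and riverbank senses, so cosine similarity cannot tell the two uses apart.

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# Hypothetical 2-d static embeddings: one vector per word, so the two
# senses of "bank" are averaged into a single direction.
emb = {
    "bank":  [0.5, 0.5],   # blend of the "finance" and "nature" directions
    "money": [1.0, 0.0],
    "river": [0.0, 1.0],
}

print(cosine(emb["bank"], emb["money"]))  # ~0.707
print(cosine(emb["bank"], emb["river"]))  # ~0.707 — identical score, no disambiguation
```

Both scores come out equal, so a search query about river banks ranks financial-bank documents just as highly. Contextual models (e.g. BERT-style encoders) mitigate this by producing a different vector for each occurrence of the word.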
