NLP - Text Similarity and Search

Question: Given two sentence embeddings `emb1 = [0.5, 0.5]` and `emb2 = [0.5, -0.5]`, what is their cosine similarity?

A. 1.0
B. 0.5
C. -1.0
D. 0.0
Step-by-Step Solution

Step 1: Compute the dot product of emb1 and emb2.
Dot product = 0.5*0.5 + 0.5*(-0.5) = 0.25 - 0.25 = 0

Step 2: Compute the magnitudes and the cosine similarity.
Both magnitudes are sqrt(0.5^2 + 0.5^2) = sqrt(0.5) ≈ 0.707, so cosine similarity = 0 / (0.707 * 0.707) = 0.

Final Answer: 0.0 -> Option D

Quick Check: A zero dot product makes the cosine similarity zero regardless of the vectors' magnitudes, since the dot product is the numerator of the cosine formula. Geometrically, the two vectors are orthogonal.

Common Mistakes:
- Ignoring the negative component when computing the dot product
- Confusing cosine similarity with Euclidean distance
- Assuming cosine similarity is always positive (it ranges from -1 to 1)
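The calculation above can be sketched in Python. This is a minimal illustration using NumPy (not part of the original quiz); the function name `cosine_similarity` is our own choice:

```python
import numpy as np

def cosine_similarity(a, b):
    # cosine similarity = dot(a, b) / (||a|| * ||b||)
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

emb1 = [0.5, 0.5]
emb2 = [0.5, -0.5]
print(cosine_similarity(emb1, emb2))  # 0.0 (the vectors are orthogonal)
```

Because the dot product in the numerator is exactly 0.25 - 0.25 = 0, the result is 0.0 no matter what the magnitudes in the denominator are.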