NLP · ML · ~5 mins

Semantic similarity with embeddings in NLP - Cheat Sheet & Quick Revision

Recall & Review
[beginner]
Q: What is semantic similarity in the context of embeddings?
A: Semantic similarity measures how close the meanings of two pieces of text are, using embeddings that represent those meanings as numbers.
[beginner]
Q: How do embeddings help in measuring semantic similarity?
A: Embeddings convert words or sentences into vectors of numbers, so we can compare these vectors mathematically to find how similar their meanings are.
[intermediate]
Q: Which common metric is used to calculate similarity between two embedding vectors?
A: Cosine similarity is the most common choice; it measures the angle between two vectors, so a smaller angle (cosine closer to 1) indicates more similar meanings, regardless of vector length.
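As a quick sketch, cosine similarity can be computed directly with NumPy. The vectors below are hand-written toy "embeddings" for illustration; real embeddings come from a trained model and have hundreds of dimensions.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 3-dimensional vectors standing in for sentence embeddings
v1 = [0.2, 0.9, 0.1]
v2 = [0.25, 0.85, 0.15]  # points in a similar direction -> similarity near 1
v3 = [0.9, 0.1, 0.0]     # points in a different direction -> lower similarity

print(cosine_similarity(v1, v2))
print(cosine_similarity(v1, v3))
```

Because cosine similarity depends only on direction, it ignores vector length, which is why it works well for comparing embeddings of texts of different sizes.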
[beginner]
Q: What is a real-life example of semantic similarity using embeddings?
A: Finding similar questions in a FAQ by comparing their embeddings to a user's question, so the system can suggest the closest matching answers.
[intermediate]
Q: Why might two sentences with different words have high semantic similarity?
A: Because embeddings capture meaning beyond exact words, sentences with different wording but similar meanings can have vectors that are close together.
Multiple Choice

Q: What does an embedding represent in NLP?
A. A numerical vector representing the meaning of text
B. A list of keywords from the text
C. The length of the text in characters
D. The frequency of each word in the text
Answer: A

Q: Which similarity metric is most commonly used with embeddings?
A. Hamming distance
B. Euclidean distance
C. Jaccard index
D. Cosine similarity
Answer: D

Q: If two sentences have very different words but similar meanings, their embeddings will likely be:
A. Very different vectors
B. Zero vectors
C. Close vectors
D. Random vectors
Answer: C

Q: Semantic similarity helps machines to:
A. Understand how close meanings are between texts
B. Translate text into another language
C. Count words in a sentence
D. Detect spelling errors
Answer: A

Q: Which of these is NOT a use case of semantic similarity with embeddings?
A. Recommending similar articles
B. Sorting numbers in ascending order
C. Finding duplicate questions
D. Matching customer queries to answers
Answer: B
Open-Ended

Q: Explain how embeddings are used to measure semantic similarity between two sentences.
Hint: Think about how numbers can represent meaning and how we compare those numbers.
Q: Describe a simple real-world example where semantic similarity with embeddings can improve user experience.
Hint: Consider how a FAQ or search engine might use this.
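The FAQ-matching example discussed above can be sketched in a few lines. The FAQ entries and their vectors here are hypothetical, hand-written stand-ins; in practice each question (and the user query) would be embedded by the same model.

```python
import numpy as np

# Hypothetical pre-computed embeddings for FAQ questions; in a real system
# these would come from an embedding model, not hand-written vectors.
faq = {
    "How do I reset my password?":    np.array([0.9, 0.1, 0.0]),
    "What are your shipping times?":  np.array([0.1, 0.9, 0.1]),
    "Can I change my account email?": np.array([0.7, 0.3, 0.1]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(query_vec):
    """Return the FAQ question whose embedding is closest to the query."""
    return max(faq, key=lambda q: cosine(faq[q], query_vec))

# A query like "I forgot my login" might embed near the password question,
# even though it shares no words with it.
query = np.array([0.85, 0.15, 0.05])
print(best_match(query))  # -> "How do I reset my password?"
```

This is the core of semantic search: embed everything once, then answer each query with a nearest-neighbor lookup under cosine similarity.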