Recall & Review
beginner
What is a word embedding in simple terms?
A word embedding is a way to turn words into numbers so a computer can understand them. It puts similar words close together in a space, like friends sitting near each other.
beginner
What does Word2Vec do?
Word2Vec learns to represent words as vectors (lists of numbers) by looking at the words around them in sentences. It helps computers understand word meanings based on context.
intermediate
What are the two main models in Word2Vec?
The two main models are CBOW (Continuous Bag of Words), which predicts a word from its neighbors, and Skip-gram, which predicts a word's neighbors from the word itself.
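As a rough sketch (plain Python with a window size of 1, not actual training code), here is how the two models' training pairs differ for one example sentence:

```python
# Illustrative sketch: build the training pairs CBOW and Skip-gram
# would learn from, using a context window of 1 on either side.
sentence = ["the", "cat", "sat", "on", "the", "mat"]
window = 1

cbow_pairs = []       # (context words -> target word)
skipgram_pairs = []   # (target word -> one context word)

for i, target in enumerate(sentence):
    context = [sentence[j]
               for j in range(max(0, i - window),
                              min(len(sentence), i + window + 1))
               if j != i]
    cbow_pairs.append((context, target))       # CBOW: neighbors predict the word
    for c in context:
        skipgram_pairs.append((target, c))     # Skip-gram: the word predicts each neighbor

print(cbow_pairs[1])        # (['the', 'sat'], 'cat')
print(skipgram_pairs[:2])   # [('the', 'cat'), ('cat', 'the')]
```

In a real implementation (for example gensim's `Word2Vec`, where `sg=0` selects CBOW and `sg=1` selects Skip-gram), these pairs are what the neural network trains on.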
beginner
Why are word embeddings better than one-hot encoding?
Word embeddings capture meaning and similarity between words, while one-hot encoding treats every word as completely different with no relation.
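A small sketch with made-up numbers shows the difference: every pair of distinct one-hot vectors has a dot product of 0 (no relation), while hand-picked toy embeddings can express that "cat" is more like "dog" than like "car":

```python
# One-hot vectors: each word gets its own axis, so all pairs are unrelated.
one_hot = {
    "cat": [1, 0, 0],
    "dog": [0, 1, 0],
    "car": [0, 0, 1],
}
# Hypothetical 2-d embeddings, chosen by hand purely for illustration.
embedding = {
    "cat": [0.9, 0.8],
    "dog": [0.85, 0.75],
    "car": [-0.7, 0.6],
}

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

print(dot(one_hot["cat"], one_hot["dog"]))      # 0 -> "completely different"
print(dot(embedding["cat"], embedding["dog"]))  # large -> similar
print(dot(embedding["cat"], embedding["car"]))  # small -> dissimilar
```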
intermediate
How can you tell if two words are similar using Word2Vec embeddings?
You can measure the distance between their vectors, or the angle between them (cosine similarity). If the vectors are close together or point in similar directions, the words have similar meanings.
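The angle measurement above is usually computed as cosine similarity. A minimal sketch, using hand-picked toy vectors rather than real learned embeddings:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between vectors a and b (1 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Made-up 3-d vectors for illustration only.
king  = [0.8, 0.6, 0.1]
queen = [0.7, 0.7, 0.2]
apple = [0.1, -0.5, 0.9]

print(cosine_similarity(king, queen))  # close to 1 -> similar words
print(cosine_similarity(king, apple))  # near or below 0 -> dissimilar words
```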
What does Word2Vec use to learn word meanings?
Word2Vec learns word meanings by looking at the context words around a target word.
Which Word2Vec model predicts a word from its neighbors?
CBOW (Continuous Bag of Words) predicts a word based on its surrounding words.
Why are word embeddings useful compared to one-hot encoding?
Word embeddings capture relationships and similarities between words, unlike one-hot encoding.
What does a vector in Word2Vec represent?
In Word2Vec, each word is represented as a vector, which is a list of numbers capturing its meaning.
If two word vectors are close in space, what does it mean?
Close vectors mean the words are similar in meaning or used in similar contexts.
Explain how Word2Vec learns to represent words as vectors.
Think about how the words around a word can help predict that word, or how the word can help predict its neighbors.
Describe why word embeddings are important in natural language processing.
Consider how computers need numbers and how embeddings help with meaning.