Recall & Review
beginner
What is the main goal of Word2Vec?
Word2Vec aims to learn word meanings by mapping words to numeric vectors (embeddings) so that words with similar meanings end up with similar vectors.
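As a sketch of this idea: the toy vectors below are made-up 3-dimensional examples (real Word2Vec embeddings typically have 100–300 dimensions). Cosine similarity measures how alike two vectors are, so similar words should score close to 1:

```python
import math

# Hypothetical toy embeddings, for illustration only.
vectors = {
    "king":  [0.90, 0.80, 0.10],
    "queen": [0.85, 0.82, 0.15],
    "apple": [0.10, 0.20, 0.90],
}

def cosine(a, b):
    """Cosine similarity: near 1.0 for similar directions, near 0 for unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Words with similar meanings get similar vectors, hence higher similarity.
print(cosine(vectors["king"], vectors["queen"]))  # high
print(cosine(vectors["king"], vectors["apple"]))  # much lower
```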
beginner
Explain the Continuous Bag of Words (CBOW) model in Word2Vec.
CBOW predicts a target word based on its surrounding words (context). It looks at the words around a missing word and guesses what the missing word is.
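A minimal sketch of how CBOW training examples are built from a sentence (the window size of 2 is an illustrative choice): the surrounding words are the input, and the center word is what the model must guess.

```python
def cbow_pairs(tokens, window=2):
    """Build (context words -> center word) training examples for CBOW."""
    pairs = []
    for i, target in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        context = [tokens[j] for j in range(lo, hi) if j != i]
        pairs.append((context, target))
    return pairs

sentence = "the quick brown fox jumps".split()
for context, target in cbow_pairs(sentence):
    print(context, "->", target)
# e.g. ['the', 'quick', 'fox', 'jumps'] -> brown
```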
beginner
What does the Skip-gram model do in Word2Vec?
Skip-gram takes a target word and tries to predict the words around it (context). It learns which words tend to appear near the target word.
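Skip-gram reverses the direction: a sketch of its training examples, where the center word is the input and each nearby word is a separate prediction target (again with an illustrative window of 2):

```python
def skipgram_pairs(tokens, window=2):
    """Build (center word -> one context word) training examples for Skip-gram."""
    pairs = []
    for i, target in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

sentence = "the quick brown fox jumps".split()
for target, context in skipgram_pairs(sentence):
    print(target, "->", context)
# e.g. brown -> the, brown -> quick, brown -> fox, brown -> jumps
```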
beginner
Why is Word2Vec useful in real life?
Word2Vec helps computers understand language better, which improves applications such as search engines, chatbots, and machine translation by capturing word meanings and relationships.
intermediate
How do CBOW and Skip-gram differ in training focus?
CBOW predicts the center word from context words, while Skip-gram predicts context words from the center word. CBOW is faster; Skip-gram works better with rare words.
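One way to see why Skip-gram helps rare words: from the same sentence, CBOW produces one training example per position, while Skip-gram produces one example per (center, context) pair, so every word, including a rare one, gets several separate updates. A sketch comparing the counts (window size 2 is an illustrative choice):

```python
tokens = "the cat sat on the mat".split()
window = 2

# CBOW: one example per position (many context words -> one center word).
cbow = [
    ([tokens[j] for j in range(max(0, i - window), min(len(tokens), i + window + 1)) if j != i], tokens[i])
    for i in range(len(tokens))
]

# Skip-gram: one example per (center, context) pair (one word -> one word).
skip = [
    (tokens[i], tokens[j])
    for i in range(len(tokens))
    for j in range(max(0, i - window), min(len(tokens), i + window + 1))
    if j != i
]

print(len(cbow), len(skip))  # 6 18
```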
What does the Skip-gram model predict?
Skip-gram predicts the context words around a given target word.
Which Word2Vec model is generally faster to train?
CBOW is usually faster because it predicts one word from many context words.
What is the main output of Word2Vec?
Word2Vec outputs word vectors that capture word meanings.
Which model is better for rare words?
Skip-gram works better with rare words by focusing on predicting context from the target word.
In CBOW, what is used to predict the target word?
CBOW uses the surrounding words to predict the missing target word.
Describe how the CBOW and Skip-gram models work in Word2Vec and their main differences.
Think about which words are inputs and which are outputs in each model.
Explain why Word2Vec embeddings are useful for language tasks.
Consider how turning words into numbers helps machines.
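A sketch of one concrete payoff: with made-up 2-dimensional embeddings (real models learn hundreds of dimensions), a search engine can match a query to a related document even when the strings share no letters at all, because their vectors are close.

```python
import math

# Hypothetical toy embeddings, for illustration only.
emb = {"dog": [0.90, 0.10], "puppy": [0.85, 0.20], "car": [0.10, 0.90], "auto": [0.15, 0.85]}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# "puppy" matches "dog" by meaning, not by string overlap.
query, docs = "puppy", ["dog", "car", "auto"]
best = max(docs, key=lambda d: cosine(emb[query], emb[d]))
print(best)  # dog
```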