
Word2Vec (CBOW and Skip-gram) in NLP - Cheat Sheet & Quick Revision

Recall & Review
beginner
What is the main goal of Word2Vec?
Word2Vec aims to learn word meanings by turning words into numbers (vectors) so that words with similar meanings have similar vectors.
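The core idea, "similar meanings get similar vectors," is usually measured with cosine similarity. A minimal sketch with made-up 3-dimensional toy vectors (real Word2Vec embeddings typically have 100-300 dimensions, and these values are illustrative, not trained):

```python
from math import sqrt

# Hypothetical toy embeddings for illustration only.
vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.8, 0.9, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity: close to 1.0 means the vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm

print(cosine(vectors["king"], vectors["queen"]))  # high: related words
print(cosine(vectors["king"], vectors["apple"]))  # lower: unrelated words
```

With trained embeddings, the same function is what powers "find words similar to X" lookups.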
beginner
Explain the Continuous Bag of Words (CBOW) model in Word2Vec.
CBOW predicts a target word based on its surrounding words (context). It looks at the words around a missing word and guesses what the missing word is.
beginner
What does the Skip-gram model do in Word2Vec?
Skip-gram takes a target word and tries to predict the words around it (context). It learns which words tend to appear near the target word.
beginner
Why is Word2Vec useful in real life?
Word2Vec helps computers understand language better, which improves things like search engines, chatbots, and translation by knowing word meanings and relationships.
intermediate
How do CBOW and Skip-gram differ in training focus?
CBOW predicts the center word from context words, while Skip-gram predicts context words from the center word. CBOW is faster; Skip-gram works better with rare words.
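The difference in training focus comes down to how (input, output) pairs are built from a sentence. A minimal sketch (window size of 1 is an assumption for brevity; real setups typically use 5-10):

```python
def training_pairs(tokens, window=1):
    """Build (input, output) training pairs for both Word2Vec variants.

    CBOW:      input = list of context words, output = center word.
    Skip-gram: input = center word, output = one context word per pair.
    """
    cbow, skipgram = [], []
    for i, center in enumerate(tokens):
        # Context words within `window` positions on either side of the center.
        context = [tokens[j]
                   for j in range(max(0, i - window),
                                  min(len(tokens), i + window + 1))
                   if j != i]
        cbow.append((context, center))
        for ctx in context:
            skipgram.append((center, ctx))
    return cbow, skipgram

cbow, sg = training_pairs(["the", "cat", "sat"], window=1)
print(cbow)  # [(['cat'], 'the'), (['the', 'sat'], 'cat'), (['cat'], 'sat')]
print(sg)    # [('the', 'cat'), ('cat', 'the'), ('cat', 'sat'), ('sat', 'cat')]
```

Note that Skip-gram produces one training pair per context word, while CBOW pools the whole context into a single prediction; that is why CBOW trains faster and Skip-gram gives rare words more individual updates.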
What does the Skip-gram model predict?
A. Surrounding words from the target word
B. The target word from surrounding words
C. The next sentence in a paragraph
D. The frequency of a word in a document
Answer: A
Which Word2Vec model is generally faster to train?
A. Neither, they are slow
B. Skip-gram
C. Both are equally fast
D. CBOW
Answer: D
What is the main output of Word2Vec?
A. Word frequency counts
B. Part-of-speech tags
C. Word vectors (embeddings)
D. Sentence summaries
Answer: C
Which model is better for rare words?
A. CBOW
B. Skip-gram
C. Both perform the same
D. Neither handles rare words
Answer: B
In CBOW, what is used to predict the target word?
A. The surrounding context words
B. Random words from the corpus
C. The entire sentence
D. The target word itself
Answer: A
Describe how the CBOW and Skip-gram models work in Word2Vec and their main differences.
Think about which words are inputs and which are outputs in each model.
Explain why Word2Vec embeddings are useful for language tasks.
Consider how turning words into numbers helps machines.