ML Python · ~5 mins

Word embeddings concept (Word2Vec) in ML Python - Cheat Sheet & Quick Revision

Recall & Review
beginner
What is a word embedding in simple terms?
A word embedding is a way to turn words into numbers so a computer can understand them. It puts similar words close together in a space, like friends sitting near each other.
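To make the "similar words sit close together" idea concrete, here is a toy sketch with made-up 3-dimensional vectors (the values are illustrative, not learned by any model):

```python
import math

# Hand-made toy "embeddings" (illustrative values only).
embeddings = {
    "cat":    [0.9, 0.8, 0.1],
    "dog":    [0.8, 0.9, 0.2],
    "banana": [0.1, 0.2, 0.9],
}

def distance(a, b):
    """Euclidean distance between two vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Similar words end up close together; unrelated words far apart.
print(distance(embeddings["cat"], embeddings["dog"]))     # small
print(distance(embeddings["cat"], embeddings["banana"]))  # large
```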
beginner
What does Word2Vec do?
Word2Vec learns to represent words as vectors (lists of numbers) by looking at the words around them in sentences. It helps computers understand word meanings based on context.
intermediate
What are the two main models in Word2Vec?
The two main models are CBOW (Continuous Bag of Words), which predicts a target word from its surrounding context words, and Skip-gram, which predicts the surrounding context words from a target word.
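The difference between the two models can be sketched by generating the training pairs each one sees from a single sentence (a simplified sketch; real implementations add batching, subsampling, and dynamic window sizes):

```python
sentence = ["the", "cat", "sat", "on", "mat"]
window = 1  # how many neighbors on each side count as context

cbow_pairs, skipgram_pairs = [], []
for i, target in enumerate(sentence):
    context = [sentence[j]
               for j in range(max(0, i - window),
                              min(len(sentence), i + window + 1))
               if j != i]
    # CBOW: the context words jointly predict the target word.
    cbow_pairs.append((context, target))
    # Skip-gram: the target word predicts each context word.
    skipgram_pairs.extend((target, c) for c in context)

print(cbow_pairs[1])       # (['the', 'sat'], 'cat')
print(skipgram_pairs[:2])  # [('the', 'cat'), ('cat', 'the')]
```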
beginner
Why are word embeddings better than one-hot encoding?
Word embeddings capture meaning and similarity between words, while one-hot encoding treats every word as completely different with no relation.
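A quick sketch of why one-hot vectors carry no similarity information: the dot product between any two distinct one-hot vectors is always 0, so "cat" looks exactly as unrelated to "dog" as it does to "banana".

```python
vocab = ["cat", "dog", "banana"]

def one_hot(word):
    """One-hot vector: 1 at the word's index, 0 everywhere else."""
    return [1 if w == word else 0 for w in vocab]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Every pair of distinct words scores 0 — no notion of meaning survives.
print(dot(one_hot("cat"), one_hot("dog")))     # 0
print(dot(one_hot("cat"), one_hot("banana")))  # 0
```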
intermediate
How can you tell if two words are similar using Word2Vec embeddings?
You can measure the distance or angle between their vectors. If they are close or point in similar directions, the words have similar meanings.
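The standard way to measure the angle between two word vectors is cosine similarity; a minimal pure-Python version:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors:
    1 = same direction, 0 = unrelated, -1 = opposite."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Vectors pointing the same way score near 1 even if their lengths differ.
print(cosine_similarity([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]))  # 1.0
print(cosine_similarity([1.0, 0.0, 0.0], [0.0, 1.0, 0.0]))  # 0.0
```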
What does Word2Vec use to learn word meanings?
A. The length of the word
B. The words around a target word
C. The number of vowels in the word
D. The font style of the word
Which Word2Vec model predicts a word from its neighbors?
A. CBOW
B. TF-IDF
C. One-hot encoding
D. Skip-gram
Why are word embeddings useful compared to one-hot encoding?
A. They show relationships between words
B. They use less memory
C. They are easier to read
D. They count word frequency
What does a vector in Word2Vec represent?
A. A word's spelling
B. A sentence length
C. A word as a list of numbers
D. A document's topic
If two word vectors are close in space, what does it mean?
A. The words have different meanings
B. The words appear in different languages
C. The words are spelled the same
D. The words have similar meanings
Explain how Word2Vec learns to represent words as vectors.
Think about how the words around a word help guess it or its neighbors.
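The learning loop described above can be sketched as a tiny pure-Python Skip-gram trainer with negative sampling (all names and hyperparameters here are illustrative; real Word2Vec implementations, such as gensim's, are far more optimized):

```python
import math
import random

# Toy corpus; a real corpus would have millions of sentences.
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

dim, window, lr, epochs, negatives = 8, 2, 0.05, 200, 3
random.seed(0)
# Target-word vectors (the embeddings we keep) and context-word vectors.
W_in = [[random.uniform(-0.5, 0.5) for _ in range(dim)] for _ in vocab]
W_out = [[0.0] * dim for _ in vocab]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

for _ in range(epochs):
    for pos, word in enumerate(corpus):
        t = idx[word]
        for off in range(-window, window + 1):
            c_pos = pos + off
            if off == 0 or not 0 <= c_pos < len(corpus):
                continue
            c = idx[corpus[c_pos]]
            # One true (target, context) pair with label 1, plus a few
            # random "negative" words with label 0.  (A negative may
            # collide with the true context — a sketch-level shortcut.)
            samples = [(c, 1.0)] + [(random.randrange(len(vocab)), 0.0)
                                    for _ in range(negatives)]
            for o, label in samples:
                p = sigmoid(dot(W_in[t], W_out[o]))
                g = lr * (label - p)  # push p toward the label
                for d in range(dim):
                    # RHS uses pre-update values (tuple assignment).
                    W_out[o][d], W_in[t][d] = (W_out[o][d] + g * W_in[t][d],
                                               W_in[t][d] + g * W_out[o][d])

# W_in[idx["cat"]] is now the learned vector for "cat".
```

Each update nudges a target vector toward the vectors of words it actually appears with and away from randomly sampled words, which is exactly how context shapes the final embeddings.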
Describe why word embeddings are important in natural language processing.
Consider how computers need numbers and how embeddings help with meaning.