Recall & Review
beginner
What does GloVe stand for in GloVe embeddings?
GloVe stands for Global Vectors for Word Representation. It is a method to create word embeddings by capturing global word co-occurrence statistics.
intermediate
How does GloVe differ from Word2Vec in learning word embeddings?
GloVe uses a matrix factorization approach on the global word co-occurrence matrix, while Word2Vec learns embeddings by predicting words in local context windows using a neural network.
beginner
What is the main input data structure used by GloVe to learn embeddings?
GloVe uses a word co-occurrence matrix that counts how often pairs of words appear together in a large text corpus.
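The co-occurrence matrix described above can be sketched in a few lines of Python. The toy corpus and window size are made up for illustration; the 1/distance weighting mirrors how the GloVe reference implementation counts nearer context words more heavily:

```python
from collections import defaultdict

def cooccurrence_counts(tokens, window=2):
    """Count how often word pairs appear within `window` tokens of each other."""
    counts = defaultdict(float)
    for i in range(len(tokens)):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if i != j:
                # Nearer context words contribute more (1/distance weighting).
                counts[(tokens[i], tokens[j])] += 1.0 / abs(i - j)
    return counts

# Toy corpus for illustration only.
X = cooccurrence_counts("the cat sat on the mat".split(), window=2)
print(X[("cat", "sat")])  # adjacent pair -> 1.0
```

A real co-occurrence matrix is built the same way, just over a corpus of billions of tokens, and is typically stored sparsely since most word pairs never co-occur.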
beginner
Why are GloVe embeddings useful in natural language processing tasks?
They capture semantic relationships between words by encoding how frequently words co-occur globally, helping models understand word meanings and similarities.
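One way to make "similarities" concrete is cosine similarity, the standard comparison between embedding vectors. The 3-dimensional vectors here are invented for the example (pre-trained GloVe vectors are typically 50 to 300 dimensions):

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norms = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norms

# Hypothetical toy embeddings: related words point in similar directions.
king = [0.9, 0.8, 0.1]
queen = [0.85, 0.82, 0.12]
banana = [0.1, 0.05, 0.95]
print(cosine_similarity(king, queen) > cosine_similarity(king, banana))  # True
```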
intermediate
What kind of mathematical operation does GloVe use to learn word vectors from the co-occurrence matrix?
GloVe performs matrix factorization by minimizing a weighted least squares objective, finding word vectors whose dot products reconstruct the logarithms of the co-occurrence counts.
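That objective can be sketched for a single word pair as follows. The weighting function uses the defaults reported in the GloVe paper (x_max = 100, alpha = 0.75); the example vectors and count are hypothetical:

```python
import math

def weight(x, x_max=100.0, alpha=0.75):
    """GloVe's f(x): down-weights rare pairs and caps very frequent ones at 1."""
    return (x / x_max) ** alpha if x < x_max else 1.0

def pair_loss(w_i, c_j, b_i, b_j, x_ij):
    """Weighted squared error between the model score and log co-occurrence."""
    score = sum(a * b for a, b in zip(w_i, c_j)) + b_i + b_j
    return weight(x_ij) * (score - math.log(x_ij)) ** 2

# Hypothetical 2-d word/context vectors and a co-occurrence count of 10.
print(pair_loss([0.1, 0.2], [0.3, 0.4], 0.0, 0.0, 10.0) > 0.0)  # True
```

The full objective sums this term over every nonzero cell of the co-occurrence matrix.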
What is the main data structure GloVe uses to learn word embeddings?
GloVe learns embeddings by factorizing the word co-occurrence matrix, which counts how often word pairs appear together.
Which of the following best describes GloVe's approach?
GloVe uses matrix factorization on the global co-occurrence matrix, unlike Word2Vec which predicts words from local context.
What does a GloVe embedding vector represent?
GloVe embeddings capture semantic meaning by encoding how words co-occur globally in the corpus.
Which is a key advantage of GloVe embeddings?
GloVe embeddings capture global co-occurrence statistics across the whole corpus, unlike methods such as Word2Vec that learn only from local context windows.
What kind of loss function does GloVe minimize during training?
GloVe minimizes a weighted least squares loss that fits word vectors to the log co-occurrence counts, down-weighting rare pairs.
Explain in your own words how GloVe embeddings are created from a text corpus.
Think about how often words appear together and how that information is turned into vectors.
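One possible answer, as runnable code: a minimal end-to-end sketch that trains GloVe-style vectors by plain gradient descent on a hand-made co-occurrence matrix for a hypothetical 3-word vocabulary (the real implementation uses AdaGrad and vastly more data):

```python
import math
import random

X = {(0, 1): 8.0, (0, 2): 2.0, (1, 2): 4.0}  # made-up co-occurrence counts
dim, lr = 2, 0.05
random.seed(0)
W = [[random.uniform(-0.5, 0.5) for _ in range(dim)] for _ in range(3)]  # word vectors
C = [[random.uniform(-0.5, 0.5) for _ in range(dim)] for _ in range(3)]  # context vectors
bw, bc = [0.0] * 3, [0.0] * 3                                            # bias terms

def total_loss():
    """Weighted least squares objective summed over all nonzero counts."""
    loss = 0.0
    for (i, j), x in X.items():
        f = min((x / 100.0) ** 0.75, 1.0)  # GloVe weighting f(x)
        diff = sum(a * b for a, b in zip(W[i], C[j])) + bw[i] + bc[j] - math.log(x)
        loss += f * diff ** 2
    return loss

before = total_loss()
for epoch in range(200):
    for (i, j), x in X.items():
        f = min((x / 100.0) ** 0.75, 1.0)
        diff = sum(a * b for a, b in zip(W[i], C[j])) + bw[i] + bc[j] - math.log(x)
        for d in range(dim):  # descend on both vector sets
            gw, gc = f * diff * C[j][d], f * diff * W[i][d]
            W[i][d] -= lr * gw
            C[j][d] -= lr * gc
        bw[i] -= lr * f * diff
        bc[j] -= lr * f * diff
print(total_loss() < before)  # True: training reduces the reconstruction error
```

After training, GloVe typically takes W[i] + C[i] as the final embedding for word i, since both vector sets carry equivalent information.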
Describe the main difference between GloVe and Word2Vec embeddings.
Focus on what data each method uses to learn embeddings.