
Medium · Predict Output · Q4 of 15
NLP - Word Embeddings

Given the following code snippet using Gensim's Word2Vec with CBOW, what will be the shape of the learned word vectors?

```python
from gensim.models import Word2Vec

sentences = [['cat', 'sat', 'on', 'mat'], ['dog', 'barked', 'loudly']]
model = Word2Vec(sentences, vector_size=50, window=2, sg=0, min_count=1)
print(model.wv['cat'].shape)
```
A. (4, 50)
B. (50,)
C. (2, 50)
D. (1, 50)
Step-by-Step Solution
Solution:
  1. Step 1: Understand the vector_size parameter

     vector_size=50 means each word is represented by a 50-dimensional vector.
  2. Step 2: Check the output of model.wv[word]

     model.wv[word] returns a 1D NumPy array of length 50, so its shape is (50,).
  3. Final Answer:

     (50,) -> Option B
  4. Quick Check:

     Word vector shape = (vector_size,) [OK]
Quick Trick: A single word vector's shape is always (vector_size,), regardless of window size or corpus length. [OK]
Common Mistakes:
  • Confusing window size with vector size
  • Expecting 2D shape for single word vector
  • Assuming shape depends on sentence length
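The distinction between options A and B can be illustrated with plain NumPy: conceptually, the model stores one embedding matrix of shape (vocabulary_size, vector_size), and looking up a single word returns just one row of it. The matrix below is a stand-in for the learned weights (random values, hypothetical), not gensim's actual internal storage.

```python
import numpy as np

# Hypothetical embedding matrix standing in for the trained weights:
# 7 vocabulary words (cat, sat, on, mat, dog, barked, loudly),
# each mapped to a 50-dimensional vector (vector_size=50).
vocab = ['cat', 'sat', 'on', 'mat', 'dog', 'barked', 'loudly']
vectors = np.random.rand(len(vocab), 50)

# The full matrix is 2D: one row per vocabulary word.
print(vectors.shape)                       # (7, 50)

# Indexing a single word selects one row, which is a 1D array.
print(vectors[vocab.index('cat')].shape)   # (50,)
```

This is why the answer is (50,) and not (4, 50): the sentence length never enters the shape, only vector_size does.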
