Complete the code to create an embedding layer with input dimension 1000 and output dimension 64.
embedding = nn.Embedding([1], 64)
The input dimension is the size of the vocabulary, which is 1000 here.
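A completed version of this exercise might look like the sketch below, with the blank filled in as 1000 (the vocabulary size from the prompt):

```python
import torch.nn as nn

# Blank [1] filled in: 1000 rows (vocabulary size), each a 64-dim vector.
embedding = nn.Embedding(1000, 64)

# The learnable weight matrix has shape (num_embeddings, embedding_dim).
print(embedding.weight.shape)  # torch.Size([1000, 64])
```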
Complete the code to convert input word indices to embeddings using the embedding layer.
embedded_words = embedding([1])
You pass the word indices tensor to the embedding layer to get embeddings.
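As a runnable sketch of the lookup step (the index values here are hypothetical), the blank is filled with a tensor of word indices:

```python
import torch
import torch.nn as nn

embedding = nn.Embedding(1000, 64)

# Hypothetical word indices; each must be in the range [0, 999].
word_indices = torch.tensor([2, 7, 99])

# The layer looks up one 64-dim row per index.
embedded_words = embedding(word_indices)
print(embedded_words.shape)  # torch.Size([3, 64])
```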
Fix the error in the embedding layer initialization by filling the correct padding index.
embedding = nn.Embedding(1000, 64, padding_idx=[1])
The padding index is usually 0; setting padding_idx tells the embedding layer to map that index to a zero vector and to skip gradient updates for it during training.
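One way to see the effect of padding_idx is to inspect the corresponding weight row, which PyTorch initializes to zeros and excludes from gradient updates:

```python
import torch.nn as nn

# Blank filled in with padding index 0.
embedding = nn.Embedding(1000, 64, padding_idx=0)

# Row 0 of the weight matrix is all zeros, so index 0 always
# embeds to the zero vector.
print(embedding.weight[0].abs().sum().item())  # 0.0
```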
Fill both blanks to create an embedding layer and get embeddings for input indices.
embedding = nn.Embedding([1], [2])
embedded = embedding(input_indices)
The embedding layer is created with input dimension 1000 and output dimension 64.
Fill all three blanks to create an embedding layer, set padding index, and get embeddings.
embedding = nn.Embedding([1], [2], padding_idx=[3])
embedded = embedding(batch_indices)
The embedding layer uses vocabulary size 10000, embedding size 300, and padding index 0.
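Putting the three blanks together, a completed sketch might look like this (the batch of indices is hypothetical, with 0 used as padding):

```python
import torch
import torch.nn as nn

# Blanks filled in: vocabulary 10000, embedding size 300, padding index 0.
embedding = nn.Embedding(10000, 300, padding_idx=0)

# Hypothetical padded batch: 2 sequences of length 4, 0 = padding.
batch_indices = torch.tensor([[5, 42, 0, 0],
                              [7,  8, 9, 0]])

# Output adds an embedding dimension: (batch, seq_len, embedding_dim).
embedded = embedding(batch_indices)
print(embedded.shape)  # torch.Size([2, 4, 300])
```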