Recall & Review
beginner
What is a pre-trained embedding in NLP?
A pre-trained embedding is a set of word or token vectors learned from a large text dataset before being used in a new task. It helps represent words as numbers that capture their meanings and relationships.
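To make "vectors that capture meaning" concrete, here is a minimal sketch with hand-made toy vectors standing in for real pre-trained embeddings (real ones come from models like Word2Vec or GloVe and have 100-300 dimensions). Cosine similarity between two word vectors measures how related the words are:

```python
import numpy as np

# Toy, hand-made vectors standing in for real pre-trained embeddings.
embeddings = {
    "cat": np.array([0.9, 0.8, 0.1]),
    "dog": np.array([0.8, 0.9, 0.2]),
    "car": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

sim_cat_dog = cosine_similarity(embeddings["cat"], embeddings["dog"])
sim_cat_car = cosine_similarity(embeddings["cat"], embeddings["car"])
print(sim_cat_dog > sim_cat_car)  # related words score higher
```

With real pre-trained vectors the same comparison works at scale: "cat" ends up closer to "dog" than to "car" because the two animal words appear in similar contexts in the training corpus.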
beginner
Why use pre-trained embeddings instead of training from scratch?
Pre-trained embeddings save time and resources because they already capture useful language patterns. They improve model performance, especially when you have limited data for your task.
beginner
Name two popular pre-trained embedding models.
Two popular pre-trained embedding models are Word2Vec and GloVe. Both learn word vectors from large text corpora, but Word2Vec trains a shallow predictive network (skip-gram or CBOW), while GloVe factorizes global word co-occurrence statistics.
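Pre-trained GloVe vectors are distributed as plain text, one word per line followed by its numbers. A small sketch of loading that format (the sample values below are made up, but the layout matches the published `glove.6B.*.txt` files):

```python
import io
import numpy as np

def load_embeddings(file_obj):
    """Parse GloVe-style text: one line per word, 'word v1 v2 ...'."""
    vectors = {}
    for line in file_obj:
        parts = line.rstrip().split(" ")
        vectors[parts[0]] = np.array(parts[1:], dtype=np.float32)
    return vectors

# Tiny made-up sample in the same layout as a real GloVe file;
# in practice you would pass open("glove.6B.50d.txt") instead.
sample = io.StringIO("the 0.1 0.2 0.3\ncat 0.4 0.5 0.6\n")
vecs = load_embeddings(sample)
print(sorted(vecs), vecs["cat"].shape)
```

Word2Vec vectors are usually loaded the same way through a library such as gensim rather than parsed by hand.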
intermediate
How do you use pre-trained embeddings in a neural network?
You load the pre-trained vectors and use them as the initial weights for the embedding layer in your neural network. You can keep them fixed or allow fine-tuning during training.
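A minimal sketch of that idea, using a toy numpy embedding layer (the vocabulary and vectors are hypothetical; frameworks expose the same switch, e.g. Keras `Embedding(..., trainable=...)` or PyTorch `nn.Embedding.from_pretrained(..., freeze=...)`):

```python
import numpy as np

class EmbeddingLayer:
    """Minimal embedding layer initialized from pre-trained vectors.

    trainable=False freezes the weights; trainable=True would allow
    them to be fine-tuned during training.
    """
    def __init__(self, pretrained, trainable=False):
        self.weights = pretrained.astype(np.float32).copy()
        self.trainable = trainable

    def forward(self, token_ids):
        # Look up one row per token id -> shape (len(token_ids), dim)
        return self.weights[token_ids]

# Hypothetical 4-word vocabulary with 3-dimensional pre-trained vectors.
pretrained = np.arange(12, dtype=np.float32).reshape(4, 3)
layer = EmbeddingLayer(pretrained, trainable=False)
out = layer.forward(np.array([2, 0]))
print(out.shape)
```

The key point is that the pre-trained matrix simply becomes the layer's weight table; each token id selects one row of it.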
intermediate
What is fine-tuning in the context of pre-trained embeddings?
Fine-tuning means updating the pre-trained embedding weights slightly during your task's training to better fit your specific data and improve performance.
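Fine-tuning can be sketched as one gradient-descent step applied to the embedding table itself. The weights and gradient below are placeholders (in a real model the gradient comes from backpropagation through the rest of the network):

```python
import numpy as np

# Start from made-up "pre-trained" weights: 4 words, 3 dimensions.
weights = np.ones((4, 3), dtype=np.float32)
lr = 0.1

# Suppose backprop produced this gradient for the rows used in the batch.
batch_ids = np.array([1, 3])
grad = np.full((2, 3), 0.5, dtype=np.float32)  # placeholder gradient

# One SGD step: only the looked-up rows move; words not seen in the
# batch keep their pre-trained vectors unchanged.
np.subtract.at(weights, batch_ids, lr * grad)

print(weights[1], weights[0])
```

Because updates only touch rows that appear in the batch, rare words largely keep their pre-trained vectors while frequent task-relevant words drift toward representations that fit the new data.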
What is the main benefit of using pre-trained embeddings?
Pre-trained embeddings reduce training time and improve performance by providing meaningful word representations learned from large datasets.
Which of these is NOT a popular pre-trained embedding model?
RandomForest is a machine learning algorithm, not a pre-trained embedding model.
What does fine-tuning pre-trained embeddings involve?
Fine-tuning means updating the embeddings during training to better fit the new task's data.
How are pre-trained embeddings usually integrated into a model?
Pre-trained embeddings are loaded as the initial weights of the embedding layer in a neural network.
Which statement about pre-trained embeddings is true?
Pre-trained embeddings capture semantic relationships between words, helping models understand meaning.
Explain what pre-trained embeddings are and why they are useful in NLP tasks.
Think about how embeddings represent words and how pre-training helps.
Describe how you would use pre-trained embeddings in a neural network model and the role of fine-tuning.
Consider the embedding layer and training process.