NLP - Word Embeddings

Why does FastText produce embeddings for words not seen during training, unlike traditional word2vec embeddings?

A. Because it ignores word order completely
B. Because it stores all possible words in a dictionary
C. Because it uses character n-grams to build word vectors from subword units
D. Because it uses one-hot encoding for unknown words
Step-by-Step Solution

Step 1: Recall FastText's subword modeling. FastText breaks each word into character n-grams and represents the word as the combination of its n-gram vectors, so it can build a vector for an unseen word from subword units it has already learned.

Step 2: Contrast with traditional word2vec. Word2vec treats each word as an atomic unit with its own row in the embedding table, so it cannot generate a vector for a word absent from its vocabulary.

Final Answer: Because it uses character n-grams to build word vectors from subword units -> Option C

Quick Check: Subword n-grams enable out-of-vocabulary (OOV) embeddings -> Option C

Quick Trick: Subword n-grams let FastText embed new words.

Common Mistakes:
- Thinking FastText stores all possible words explicitly
- Believing the key difference is that it ignores word order
- Confusing one-hot encoding with learned embeddings
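The mechanism in Step 1 can be sketched in a few lines of Python. This is a minimal illustration, not FastText's actual implementation: the n-gram vectors here are random stand-ins (in real FastText they are trained), and the dimension, n-gram range, and averaging choice are assumptions for the sketch. FastText's convention of wrapping words in `<` and `>` boundary markers is kept.

```python
import numpy as np

def char_ngrams(word, n_min=3, n_max=6):
    """All character n-grams of a word, with FastText-style boundary markers."""
    w = f"<{word}>"
    return [w[i:i + n]
            for n in range(n_min, n_max + 1)
            for i in range(len(w) - n + 1)]

DIM = 8                       # toy embedding dimension (assumption)
rng = np.random.default_rng(0)
ngram_vectors = {}            # trained lookup table in real FastText; random here

def word_vector(word):
    """Compose a word vector by averaging its character n-gram vectors."""
    vecs = []
    for g in char_ngrams(word):
        if g not in ngram_vectors:
            ngram_vectors[g] = rng.standard_normal(DIM)
        vecs.append(ngram_vectors[g])
    return np.mean(vecs, axis=0)

# A word never "seen" as a whole still gets a vector from its subwords,
# and shares n-grams (hence vector components) with related words.
v = word_vector("wording")
shared = set(char_ngrams("word")) & set(char_ngrams("wording"))
```

Because "word" and "wording" share n-grams such as "<wo", "wor", and "ord", their composed vectors overlap, which is exactly why FastText handles morphological variants and OOV words that word2vec cannot.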