NLP - Text Generation

How does combining temperature scaling with top-k sampling improve text generation?

A. Temperature and top-k have no combined effect
B. Temperature selects top-k words; top-k scales probabilities
C. Both methods randomly select words ignoring probabilities
D. Temperature smooths probabilities; top-k limits choices to likely words
Step-by-Step Solution

Step 1: Understand temperature scaling
Temperature divides the model's logits before the softmax, adjusting how sharp the probability distribution is and thereby controlling randomness: values below 1 sharpen the distribution, values above 1 flatten it.

Step 2: Understand top-k sampling
Top-k restricts sampling to the k most probable words, so rare, low-quality words are never drawn.

Step 3: Combined effect
Temperature first smooths (or sharpens) the probabilities; top-k then limits the choices to likely words, improving both quality and diversity. A minimal code sketch of the combined procedure follows below.

Final Answer: Temperature smooths probabilities; top-k limits choices to likely words -> Option D

Quick Check: Temperature + top-k = smoother, focused sampling
Quick Trick: Temperature smooths; top-k filters choices

Common Mistakes:
- Confusing the roles of temperature and top-k
- Thinking both methods ignore probabilities
- Assuming the two methods have no combined effect
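To make the three steps concrete, here is a minimal NumPy sketch of the combined procedure. The function name and the toy logits are illustrative assumptions, not part of the quiz; the order of operations (scale by temperature, filter to top k, renormalize, sample) is the standard one.

import numpy as np

def sample_with_temperature_and_top_k(logits, temperature=0.8, k=5, rng=None):
    """Sample one token id from raw `logits` using temperature scaling
    followed by top-k filtering. `logits` is a 1-D array with one raw
    model score per vocabulary token."""
    rng = rng or np.random.default_rng()

    # Step 1: temperature scaling -- dividing logits by T < 1 sharpens
    # the distribution; T > 1 flattens it (more randomness).
    scaled = np.asarray(logits, dtype=np.float64) / temperature

    # Step 2: top-k filtering -- keep only the k highest-scoring tokens,
    # so rare, low-probability words can never be drawn.
    top_k_ids = np.argsort(scaled)[-k:]
    top_k_logits = scaled[top_k_ids]

    # Step 3: softmax over the surviving k logits (max-subtracted for
    # numerical stability), then sample from the renormalized distribution.
    probs = np.exp(top_k_logits - top_k_logits.max())
    probs /= probs.sum()
    return rng.choice(top_k_ids, p=probs)

# Toy example: a 6-word vocabulary with one dominant logit.
logits = [4.0, 2.5, 2.0, 0.5, -1.0, -3.0]
token_id = sample_with_temperature_and_top_k(logits, temperature=0.8, k=3)
print(token_id)  # always one of the 3 most probable token ids

Note the design point the quiz is testing: temperature reshapes the whole distribution, while top-k only decides which tokens survive; neither step replaces the other, which is why Option D is correct.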
Master "Text Generation" in NLP9 interactive learning modes - each teaches the same concept differentlyLearnWhyDeepModelTryChallengeExperimentRecallMetrics