NLP - Text Generation

Why does setting the temperature too high (e.g., > 5) often produce poor text generation results?

A. Because the model ignores the temperature parameter
B. Because probabilities become nearly uniform, causing random outputs
C. Because the model always picks the highest-probability word
D. Because the model outputs only the first word repeatedly
Step-by-Step Solution

Step 1: Understand the effect of very high temperature. Dividing the logits by a large temperature flattens them, so the softmax probabilities become close to equal across the vocabulary.

Step 2: Effect on output randomness. With nearly uniform probabilities, sampling picks words almost at random, producing incoherent, less meaningful text.

Final Answer: Because probabilities become nearly uniform, causing random outputs -> Option B

Quick Check: High temperature = nearly uniform probabilities = random output.
Quick Trick: High temperature flattens the probability distribution, increasing randomness.

Common Mistakes:
- Thinking high temperature makes the model pick the highest-probability word (that describes low temperature, which sharpens the distribution)
- Assuming the model ignores the temperature parameter
- Believing the output repeats the first word
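The flattening effect described in Step 1 is easy to see numerically. Below is a minimal sketch of temperature-scaled softmax; the logit values and the helper name `softmax_with_temperature` are illustrative, not from any particular library:

```python
import numpy as np

def softmax_with_temperature(logits, temperature):
    """Divide logits by the temperature, then apply softmax."""
    scaled = np.asarray(logits, dtype=float) / temperature
    scaled -= scaled.max()  # subtract max for numerical stability
    exps = np.exp(scaled)
    return exps / exps.sum()

# Hypothetical logits for a 4-word vocabulary
logits = [4.0, 2.0, 1.0, 0.5]

for t in (0.5, 1.0, 10.0):
    probs = softmax_with_temperature(logits, t)
    print(f"T={t}: {np.round(probs, 3)}")
```

Running this shows that at a low temperature (0.5) nearly all probability mass lands on the top word, while at a high temperature (10) the four probabilities are almost equal, so sampling becomes close to uniform random choice.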