NLP - Text Generation

Which sampling method is most affected by the temperature parameter in language models?

A. Greedy sampling
B. Random uniform sampling
C. Temperature-scaled softmax sampling
D. Top-k sampling
Step-by-Step Solution

Step 1: Identify the sampling methods. Greedy decoding picks the highest-probability token, top-k limits the candidate set, and uniform sampling ignores the probabilities entirely.

Step 2: Temperature affects the probability distribution. Temperature scales the logits before the softmax, reshaping the probabilities used in sampling.

Final Answer: Temperature-scaled softmax sampling -> Option C

Quick Check: Temperature affects the softmax probabilities. [OK]

Quick Trick: Temperature changes the softmax probabilities, not greedy picks. [OK]

Common Mistakes:
- Thinking temperature affects greedy sampling
- Confusing top-k with temperature scaling
- Assuming uniform sampling uses temperature
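A minimal sketch of the idea in NumPy: dividing the logits by a temperature before the softmax changes the sampling distribution, while greedy decoding (argmax) is unaffected because scaling does not change which logit is largest. The toy logits and helper names here are illustrative, not from the quiz.

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating
    shifted = logits - np.max(logits)
    exp = np.exp(shifted)
    return exp / exp.sum()

def sample_with_temperature(logits, temperature=1.0, rng=None):
    """Sample a token index from temperature-scaled softmax probabilities."""
    rng = rng or np.random.default_rng()
    # Temperature divides the logits: T < 1 sharpens the distribution,
    # T > 1 flattens it toward uniform
    probs = softmax(np.asarray(logits, dtype=float) / temperature)
    return rng.choice(len(probs), p=probs)

# Toy next-token logits over a 4-word vocabulary (hypothetical values)
logits = [2.0, 1.0, 0.5, -1.0]

# Greedy decoding ignores temperature entirely: argmax is unchanged by scaling
greedy_choice = int(np.argmax(logits))

# Temperature-scaled sampling changes how often each token is drawn
low_t_choice = sample_with_temperature(logits, temperature=0.5)   # near-greedy
high_t_choice = sample_with_temperature(logits, temperature=2.0)  # more random
```

Running the sampler many times at each temperature would show the low-temperature draws concentrating on the top token and the high-temperature draws spreading across the vocabulary, which is exactly why option C is the one most affected by temperature.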
Master "Text Generation" in NLP9 interactive learning modes - each teaches the same concept differentlyLearnWhyDeepModelTryChallengeExperimentRecallMetrics
More NLP Quizzes

- Sequence Models for NLP - Attention mechanism basics - Quiz 4 (medium)
- Sequence Models for NLP - Embedding layer usage - Quiz 2 (easy)
- Text Generation - Beam search decoding - Quiz 12 (easy)
- Text Similarity and Search - Semantic similarity with embeddings - Quiz 10 (hard)
- Topic Modeling - Latent Dirichlet Allocation (LDA) - Quiz 6 (medium)
- Topic Modeling - Choosing number of topics - Quiz 5 (medium)
- Topic Modeling - LDA with Gensim - Quiz 11 (easy)
- Topic Modeling - Topic coherence evaluation - Quiz 12 (easy)
- Word Embeddings - GloVe embeddings - Quiz 13 (medium)
- Word Embeddings - FastText embeddings - Quiz 4 (medium)