NLP - Text Generation

Why does dividing logits by temperature before softmax affect the entropy of the output distribution?

A. Because it changes the sharpness, increasing or decreasing entropy
B. Because it adds noise to logits, increasing entropy
C. Because it normalizes logits to sum to one
D. Because it removes the highest logit value
Step-by-Step Solution

Step 1: Understand entropy in probability distributions. Entropy measures the uncertainty or randomness of the output probabilities.

Step 2: Effect of temperature scaling on logits. Dividing the logits by a temperature T rescales their relative differences: T < 1 amplifies the gaps and makes the distribution sharper, while T > 1 shrinks them and makes it flatter.

Step 3: Impact on entropy. Sharper distributions have lower entropy; flatter distributions have higher entropy.

Final Answer: Because it changes the sharpness, increasing or decreasing entropy -> Option A

Quick Check: Temperature controls distribution sharpness, and sharpness determines entropy.

Quick Trick: Temperature changes sharpness, which changes entropy.

Common Mistakes:
- Thinking temperature adds noise instead of rescaling the logits
- Confusing softmax normalization with temperature scaling
- Assuming the highest logit is removed or zeroed
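The steps above can be sketched in a few lines of Python. This is a minimal illustration (the logit values are made up for demonstration): a temperature-scaled softmax plus a Shannon entropy helper, showing that entropy rises as temperature rises.

```python
import math

def softmax(logits, temperature=1.0):
    # Step 2: divide logits by temperature before exponentiating
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def entropy(probs):
    # Step 1: Shannon entropy (in nats) measures uncertainty
    return -sum(p * math.log(p) for p in probs if p > 0)

# Example logits (hypothetical values)
logits = [2.0, 1.0, 0.1]

# Step 3: lower temperature -> sharper distribution -> lower entropy
for T in (0.5, 1.0, 2.0):
    probs = softmax(logits, T)
    print(f"T={T}: entropy={entropy(probs):.3f}")
```

Running this prints strictly increasing entropy values as T goes from 0.5 to 2.0, confirming Option A: temperature changes sharpness, and sharpness changes entropy.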