Hard · Conceptual · Q10 of 15
NLP - Text Generation
Why does dividing logits by temperature before softmax affect the entropy of the output distribution?
A. Because it changes the sharpness, increasing or decreasing entropy
B. Because it adds noise to logits, increasing entropy
C. Because it normalizes logits to sum to one
D. Because it removes the highest logit value
Step-by-Step Solution
  1. Step 1: Understand entropy in probability distributions

    Entropy measures uncertainty or randomness in output probabilities.
  2. Step 2: Effect of temperature scaling on logits

    Dividing logits by temperature T rescales their relative differences: T < 1 magnifies the gaps, making the distribution sharper, while T > 1 shrinks them, making it flatter.
  3. Step 3: Impact on entropy

    Sharper distributions have lower entropy; flatter ones have higher entropy.
  4. Final Answer:

    Because it changes the sharpness, increasing or decreasing entropy -> Option A
  5. Quick Check:

    Temperature controls distribution sharpness and entropy ✓
Quick Trick: Temperature changes sharpness, which changes entropy ✓
Common Mistakes:
  • Thinking temperature adds noise instead of scaling
  • Confusing normalization with temperature scaling
  • Assuming logits are removed or zeroed
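The effect described in the steps above can be verified numerically. A minimal sketch using NumPy; the logit values here are made up for illustration:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Divide logits by temperature, then apply a numerically stable softmax.
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def entropy(p):
    # Shannon entropy in nats; zero-probability terms contribute nothing.
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

logits = [2.0, 1.0, 0.5, -1.0]  # hypothetical logits

h_cold = entropy(softmax(logits, temperature=0.5))  # sharper distribution
h_base = entropy(softmax(logits, temperature=1.0))
h_hot  = entropy(softmax(logits, temperature=2.0))  # flatter distribution

# Lower temperature sharpens the distribution (lower entropy),
# higher temperature flattens it (higher entropy).
assert h_cold < h_base < h_hot
```

Note that temperature scaling is deterministic: it rescales the existing logits rather than adding noise, which is exactly why option B is wrong.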
