Medium · Q7 of 15
NLP - Text Generation
Why does setting temperature too high (e.g., > 5) often produce poor text generation results?
A. Because the model ignores the temperature parameter
B. Because probabilities become nearly uniform, causing random outputs
C. Because the model always picks the highest probability word
D. Because the model outputs only the first word repeatedly
Step-by-Step Solution
  1. Step 1: Understand the effect of very high temperature

    Temperature divides the logits before the softmax; a very large temperature shrinks the differences between them, so the resulting probabilities become nearly equal.
  2. Step 2: Effect on output randomness

    With a nearly uniform distribution, sampling picks words almost at random, producing incoherent, less meaningful text.
  3. Final Answer:

    Because probabilities become nearly uniform, causing random outputs -> Option B
  4. Quick Check:

    High temperature = nearly uniform, random output [OK]
Quick Trick: High temperature flattens probabilities, increasing randomness [OK]
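The flattening effect can be seen directly in a small sketch. The function below is a minimal temperature-scaled softmax (the function name and example logits are illustrative, not from any particular library):

```python
import math

def softmax_with_temperature(logits, temperature):
    """Divide logits by the temperature, then apply a numerically stable softmax."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max before exp to avoid overflow
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [4.0, 2.0, 0.5]  # hypothetical next-token logits
print(softmax_with_temperature(logits, 1.0))   # peaked: one token dominates
print(softmax_with_temperature(logits, 10.0))  # nearly uniform: sampling is close to random
```

At temperature 1.0 the highest-logit token captures most of the probability mass; at temperature 10.0 the three probabilities are close to 1/3 each, which is why sampled outputs degrade into randomness.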
Common Mistakes:
  • Thinking high temperature picks highest probability word
  • Assuming model ignores temperature
  • Believing output repeats first word
