Conceptual Q1 of 15 · Easy
NLP - Text Generation
What is the main effect of setting a very low temperature (close to 0) in text generation sampling?
A. The model outputs words completely at random
B. The model outputs the most likely next word almost deterministically
C. The model outputs words with equal probability
D. The model outputs only rare words
Step-by-Step Solution
  1. Step 1: Understand temperature's role in sampling

    Temperature controls randomness by scaling the model's logits before the softmax; a low temperature sharpens the probability distribution toward the highest-scoring tokens.
  2. Step 2: Effect of very low temperature

    As temperature approaches zero, the highest-probability word dominates the distribution, so sampling becomes almost deterministic (equivalent to greedy decoding in the limit).
  3. Final Answer:

    The model outputs the most likely next word almost deterministically -> Option B
  4. Quick Check:

    Low temperature = deterministic output [OK]
Quick Trick: Low temperature means less randomness, more certainty [OK]
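The effect described above can be sketched numerically. The snippet below applies temperature scaling to a set of hypothetical next-token logits (the values are illustrative, not from any real model) and shows how a low temperature concentrates nearly all probability mass on the top token:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert logits to probabilities, dividing by temperature first.
    Lower temperature sharpens the distribution; higher flattens it."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for three candidate next tokens
logits = [2.0, 1.0, 0.5]

# At temperature 1.0 the top token gets ~63% of the mass;
# at temperature 0.1 it gets >99.9%, i.e. near-deterministic output.
print(softmax_with_temperature(logits, 1.0))
print(softmax_with_temperature(logits, 0.1))
```

This matches the answer: dividing logits by a small temperature magnifies the gap between them, so after the softmax the most likely token is chosen almost every time.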
Common Mistakes:
  • Confusing low temperature with high randomness
  • Thinking low temperature outputs rare words
  • Assuming equal probability at low temperature
