Hard · Application · Q9 of 15
NLP - Text Generation
How does combining temperature scaling with top-k sampling improve text generation?
A. Temperature and top-k have no combined effect
B. Temperature selects top-k words; top-k scales probabilities
C. Both methods randomly select words, ignoring probabilities
D. Temperature smooths probabilities; top-k limits choices to likely words
Step-by-Step Solution
  1. Step 1: Understand temperature scaling

    Temperature scaling divides the model's logits by a temperature T before the softmax: T < 1 sharpens the distribution (less randomness), while T > 1 flattens it (more randomness).
  2. Step 2: Understand top-k sampling

    Top-k sampling restricts sampling to the k most probable words, preventing rare, low-probability (and often nonsensical) words from being chosen.
  3. Step 3: Combined effect

    Temperature first reshapes the probabilities to the desired level of randomness; top-k then limits the candidate set to the most likely words. Together they improve both output quality and diversity.
  4. Final Answer:

    Temperature smooths probabilities; top-k limits choices to likely words -> Option D
  5. Quick Check:

    Temperature + top-k = smoother, focused sampling ✓
Quick Trick: Temperature smooths; top-k filters choices ✓
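The steps above can be sketched in code. This is a minimal toy implementation (the function name, toy logits, and parameter defaults are illustrative assumptions, not from any particular library): temperature scaling is applied to the logits first, then top-k filtering keeps only the k best candidates before the final softmax and sample.

```python
import math
import random

def sample_next_token(logits, temperature=0.7, k=5, seed=None):
    """Sample a token index using temperature scaling followed by top-k filtering.

    logits: list of unnormalized scores over a toy vocabulary (illustrative).
    """
    rnd = random.Random(seed)

    # Step 1: temperature scaling -- lower temperature sharpens the distribution.
    scaled = [x / temperature for x in logits]

    # Step 2: top-k filtering -- keep only the indices of the k highest-scoring tokens.
    top = sorted(range(len(scaled)), key=lambda i: scaled[i], reverse=True)[:k]

    # Step 3: softmax over the surviving tokens (max-subtraction for stability),
    # then sample from the renormalized distribution.
    m = max(scaled[i] for i in top)
    weights = [math.exp(scaled[i] - m) for i in top]
    total = sum(weights)
    probs = [w / total for w in weights]
    return rnd.choices(top, weights=probs, k=1)[0]
```

With k=2, only the two highest-probability tokens can ever be sampled, regardless of temperature; temperature only controls how the probability mass is split between the survivors.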
Common Mistakes:
  • Confusing roles of temperature and top-k
  • Thinking both ignore probabilities
  • Assuming no combined effect
