Recall & Review
beginner
What does the 'temperature' parameter control in text generation models?
Temperature controls how random or creative the model's output is. Under the hood, the model's scores (logits) are divided by the temperature before being turned into probabilities. A low temperature (close to 0) makes the output more predictable and focused, while a high temperature (above 1) makes it more random and diverse.
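The scaling described above can be sketched in a few lines of plain Python (a minimal illustration, not any particular library's API): dividing the logits by the temperature before the softmax sharpens the distribution when the temperature is low and flattens it when the temperature is high.

```python
import math

def softmax(logits, temperature=1.0):
    """Convert logits to probabilities; lower temperature sharpens the distribution."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]
cool = softmax(logits, temperature=0.5)  # sharper: the top token dominates
warm = softmax(logits, temperature=1.5)  # flatter: probability spreads out
```

With these example logits, `cool[0]` is noticeably larger than `warm[0]`, which is exactly the "predictable vs. diverse" trade-off the answer describes.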
beginner
Explain 'top-k sampling' in simple terms.
Top-k sampling means the model only picks the next word from the top 'k' most likely words. This limits choices to the best options and helps keep the output sensible.
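The idea can be sketched in plain Python (the function name and inputs are illustrative, not a real library's API): keep only the `k` highest-scoring tokens, renormalize their probabilities, and sample from that reduced set.

```python
import math
import random

def top_k_sample(logits, k, rng=random):
    """Sample a token index from the k highest-logit tokens only."""
    # keep the k most likely candidates
    indexed = sorted(enumerate(logits), key=lambda p: p[1], reverse=True)[:k]
    # renormalize over just those candidates
    exps = [math.exp(l) for _, l in indexed]
    total = sum(exps)
    r = rng.random() * total
    cum = 0.0
    for (i, e) in zip((idx for idx, _ in indexed), exps):
        cum += e
        if r <= cum:
            return i
    return indexed[-1][0]

idx = top_k_sample([0.1, 3.0, 0.2], k=2)  # always one of the two most likely tokens
```

With `k=1` this degenerates to always picking the single most likely token, which is the same as greedy decoding.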
intermediate
How does 'top-p' (nucleus) sampling differ from 'top-k' sampling?
Top-p sampling picks words from the smallest group whose combined probability is at least 'p' (like 0.9). This group size can change, unlike top-k which always picks a fixed number of words.
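A minimal sketch of how the nucleus (the variable-size candidate set) is chosen, assuming the model's probabilities are already computed (the function name is illustrative):

```python
def nucleus_set(probs, p):
    """Return the smallest set of token indices whose cumulative probability reaches p."""
    # walk tokens from most to least likely, accumulating probability
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    chosen, cum = [], 0.0
    for i in order:
        chosen.append(i)
        cum += probs[i]
        if cum >= p:
            break
    return chosen

# with p=0.7, the two most likely tokens already cover at least 0.7
print(nucleus_set([0.5, 0.25, 0.15, 0.1], p=0.7))  # → [0, 1]
```

Note how the set size depends on the shape of the distribution: a confident model may need only one or two tokens to reach `p`, while an uncertain one keeps many, which is the key difference from a fixed `k`.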
beginner
What happens if you set temperature to 0 in a language model?
Setting temperature to 0 makes the model always pick the most likely next word, so the output is deterministic: the same prompt always produces the same continuation, which can feel predictable and repetitive.
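A small sketch of this limiting case (illustrative code, not a real library's API): dividing by a temperature of 0 is undefined, so implementations treat temperature 0 as greedy argmax decoding.

```python
import math
import random

def sample_with_temperature(logits, temperature, rng=random):
    """Sample a token index; temperature=0 falls back to greedy argmax."""
    if temperature == 0:
        # deterministic: always the highest-logit token
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    r = rng.random() * sum(exps)
    cum = 0.0
    for i, e in enumerate(exps):
        cum += e
        if r <= cum:
            return i
    return len(logits) - 1
```

Calling this twice with `temperature=0` on the same logits always returns the same index, matching the "very predictable" behavior in the answer.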
beginner
Why might you want to use a higher temperature when generating creative text?
A higher temperature adds randomness, which can make the text more creative and surprising, useful for stories or poems.
What does increasing the temperature parameter do to the model's output?
Increasing temperature increases randomness, making output more diverse.
In top-k sampling, what does 'k' represent?
Top-k sampling picks the next word from the top 'k' most likely words.
Which sampling method adapts the number of candidate words based on cumulative probability?
Top-p sampling selects words from the smallest set whose total probability reaches at least 'p'.
What is the effect of setting temperature to 0?
Temperature 0 means deterministic output by always choosing the highest probability word.
Why use sampling methods like top-k or top-p instead of always picking the most likely word?
Sampling adds randomness, making text more interesting and less repetitive.
Describe how temperature affects the randomness of text generated by a language model.
Think about how predictable or surprising the text feels.
Explain the difference between top-k and top-p sampling in simple terms.
Consider how many words the model can choose from next.