Practice - 5 Tasks
Answer the questions below.
Task 1 (easy): Fill in the blank
Prompt Engineering / GenAI
Complete the code to set the temperature parameter for text generation.

output = model.generate(input_text, temperature=[1])
💡 Hint: Common Mistakes
- Using negative temperature values causes errors.
- Setting temperature too high leads to nonsensical output.
Explanation: The temperature controls randomness; a value of 0.7 gives balanced creativity.
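`model.generate` above is a generic placeholder API. As a minimal sketch of what the temperature parameter actually does, here is a plain-Python softmax with temperature scaling; the function name and toy logits are illustrative, not part of any real library.

```python
import math

def softmax_with_temperature(logits, temperature):
    # Divide logits by the temperature before softmax:
    # T < 1 sharpens the distribution (more deterministic),
    # T > 1 flattens it (more random).
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]  # toy next-token scores
sharp = softmax_with_temperature(logits, 0.2)     # near-greedy
balanced = softmax_with_temperature(logits, 0.7)  # balanced creativity
# The top token receives more probability mass at the lower temperature.
```

This is why negative temperatures are invalid (division flips the ranking) and why very high values make every token nearly equally likely.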
Task 2 (medium): Fill in the blank
Prompt Engineering / GenAI
Complete the code to set the top_p parameter for nucleus sampling.

output = model.generate(input_text, top_p=[1])
💡 Hint: Common Mistakes
- Using values greater than 1 for top_p.
- Using negative values for top_p.
Explanation: top_p sets the cumulative-probability cutoff for nucleus sampling; 0.9 keeps only the most likely tokens.
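The cutoff can be sketched in a few lines of plain Python: keep the smallest set of highest-probability tokens whose cumulative probability reaches top_p. The helper name and toy distribution are illustrative assumptions.

```python
def top_p_keep(probs, top_p):
    # Sort token indices by probability, highest first, then keep the
    # smallest prefix whose cumulative probability reaches top_p.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cumulative = [], 0.0
    for i in order:
        kept.append(i)
        cumulative += probs[i]
        if cumulative >= top_p:
            break
    return kept

probs = [0.5, 0.3, 0.15, 0.05]    # toy next-token distribution
nucleus = top_p_keep(probs, 0.9)  # indices whose mass first reaches 0.9
```

Because the loop accumulates probabilities, a top_p above 1 can never be reached (every token is kept) and a negative top_p degenerates immediately, which is why both are flagged as mistakes.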
Task 3 (hard): Fill in the blank
Prompt Engineering / GenAI
Fix the error in the code by choosing the correct temperature value.

output = model.generate(input_text, temperature=[1])
💡 Hint: Common Mistakes
- Using a zero or negative temperature causes the model to fail.
- Using values above 1 can make the output too random.
Explanation: Temperature must be positive, and is typically kept at or below 1 for stable output.
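One way to enforce these constraints before calling a generation API is a small validation helper; this is a hypothetical sketch, not part of any real library.

```python
def check_temperature(temperature):
    # Reject the values the task warns about: zero or negative
    # temperatures are invalid; values above 1.0 are allowed but
    # flagged as risky rather than rejected outright.
    if temperature <= 0:
        raise ValueError("temperature must be positive")
    if temperature > 1.0:
        print(f"warning: temperature {temperature} may produce unstable output")
    return temperature
```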
Task 4 (hard): Fill in the blanks
Prompt Engineering / GenAI
Fill both blanks to set temperature and top_p for balanced sampling.

output = model.generate(input_text, temperature=[1], top_p=[2])
💡 Hint: Common Mistakes
- Setting top_p above 1 causes errors.
- Setting the temperature too low makes the output repetitive and dull.
Explanation: temperature=0.8 with top_p=0.95 balances creativity and coherence.
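In samplers that support both parameters, temperature scaling is applied first and the top_p cutoff second. The following self-contained sketch combines the two steps on toy logits; the function name and defaults mirror this task's answers but are illustrative assumptions.

```python
import math

def sampling_distribution(logits, temperature=0.8, top_p=0.95):
    # Step 1: temperature-scale the logits, then softmax.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Step 2: keep the smallest top-probability set reaching top_p.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cumulative = [], 0.0
    for i in order:
        kept.append(i)
        cumulative += probs[i]
        if cumulative >= top_p:
            break
    # Step 3: renormalize over the kept tokens; the next token
    # would then be sampled from this reduced distribution.
    mass = sum(probs[i] for i in kept)
    return {i: probs[i] / mass for i in kept}
```

Applying temperature before the cutoff matters: a lower temperature concentrates mass on fewer tokens, so the same top_p keeps a smaller nucleus.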
Task 5 (hard): Fill in the blanks
Prompt Engineering / GenAI
Fill all three blanks to set temperature, top_p, and max_tokens for generation.

output = model.generate(input_text, temperature=[1], top_p=[2], max_tokens=[3])
💡 Hint: Common Mistakes
- Setting max_tokens too low cuts the output short.
- Setting temperature or top_p outside the 0-1 range causes errors.
Explanation: temperature=0.6 and top_p=0.9 give controlled randomness, while max_tokens=100 caps the output length.
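While temperature and top_p shape each token's distribution, max_tokens simply caps how many tokens the decode loop emits. A minimal sketch, assuming a hypothetical incremental token stream in place of a real model:

```python
def generate_tokens(token_stream, max_tokens=100):
    # Collect tokens from a (hypothetical) incremental decoder,
    # stopping once max_tokens have been produced.
    out = []
    for token in token_stream:
        out.append(token)
        if len(out) >= max_tokens:
            break
    return out

# A fake stream standing in for real model output:
fake_stream = (f"tok{i}" for i in range(500))
capped = generate_tokens(fake_stream, max_tokens=100)  # at most 100 tokens
```

This is why too small a max_tokens truncates an answer mid-sentence: the loop stops at the cap regardless of whether the model has finished its thought.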