NLP · ~20 mins

Temperature and sampling in NLP - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️
Temperature and Sampling Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
🧠 Conceptual
intermediate
Effect of Temperature on Sampling Distribution

In natural language generation, the temperature parameter controls randomness in sampling. What happens to the output distribution when the temperature is set very high (e.g., 10)?

A. The distribution becomes nearly uniform, making all words almost equally likely.
B. The distribution becomes very peaked, strongly favoring the most probable word.
C. The sampling ignores the model probabilities and picks words randomly from the vocabulary.
D. The model always picks the word with the highest probability, deterministically.
💡 Hint

Think about how raising temperature affects the sharpness of probabilities.
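The hint can be made concrete with a short numpy sketch (the three-way logits here are a hypothetical toy vocabulary, not from any real model):

```python
import numpy as np

def softmax_with_temperature(logits, temperature):
    # Temperature-scaled softmax, numerically stabilized by
    # subtracting the max logit before exponentiating.
    scaled = logits / temperature
    exp = np.exp(scaled - scaled.max())
    return exp / exp.sum()

logits = np.array([2.0, 1.0, 0.1])  # hypothetical 3-word vocabulary
print(softmax_with_temperature(logits, 1.0))   # peaked: most mass on the first word
print(softmax_with_temperature(logits, 10.0))  # nearly uniform across all three
```

At temperature 10 the scaled logits differ by at most 0.19, so the resulting probabilities are almost equal.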

Predict Output
intermediate
Output of Sampling with Temperature

Given the logits array and temperature, what is the output probability distribution after applying softmax with temperature?

Python
import numpy as np

def softmax_with_temperature(logits, temperature):
    # Scale the logits by the temperature, then apply a numerically
    # stable softmax (subtracting the max before exponentiating).
    scaled_logits = logits / temperature
    exp_logits = np.exp(scaled_logits - np.max(scaled_logits))
    return exp_logits / exp_logits.sum()

logits = np.array([2.0, 1.0, 0.1])
temperature = 1.0
probs = softmax_with_temperature(logits, temperature)
print(probs.round(3))
A. [0.576 0.298 0.126]
B. [0.500 0.300 0.200]
C. [0.843 0.128 0.029]
D. [0.659 0.242 0.099]
💡 Hint

Lower temperature sharpens the distribution; at temperature 1.0, the scaling step is a no-op and this reduces to a standard softmax.
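One way to check your answer: since dividing by a temperature of 1.0 changes nothing, the result is the plain softmax of the raw logits, recomputed here without the helper function:

```python
import numpy as np

logits = np.array([2.0, 1.0, 0.1])
# With temperature 1.0 the scaling step is the identity,
# so this is an ordinary (numerically stable) softmax.
exp = np.exp(logits - logits.max())
probs = exp / exp.sum()
print(probs.round(3))  # compare against the four candidate answers
```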

Model Choice
advanced
Choosing Sampling Strategy for Creative Text Generation

You want to generate creative and diverse text using a language model. Which sampling strategy and temperature setting is best?

A. Beam search with beam width 10 and temperature 0.5
B. Top-k sampling with k=5 and temperature 1.0
C. Random sampling with temperature 5.0
D. Greedy sampling with temperature 0.1
💡 Hint

Consider how top-k limits choices and temperature controls randomness.
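A minimal sketch of top-k sampling with temperature, assuming a toy logits vector and numpy's `Generator.choice` for the draw (the vocabulary and seed are made up for illustration):

```python
import numpy as np

def top_k_sample(logits, k=5, temperature=1.0, rng=None):
    # Keep only the k highest logits, renormalize them with a
    # temperature-scaled softmax, then sample one token index.
    rng = rng if rng is not None else np.random.default_rng(0)
    top = np.argsort(logits)[-k:]          # indices of the k largest logits
    scaled = logits[top] / temperature
    exp = np.exp(scaled - scaled.max())
    probs = exp / exp.sum()
    return top[rng.choice(len(top), p=probs)]

logits = np.array([2.0, 1.5, 1.0, 0.5, 0.1, -1.0, -2.0])
print(top_k_sample(logits, k=5, temperature=1.0))
```

Truncating to the top k tokens keeps the long tail of implausible words out of play, while temperature 1.0 preserves the model's relative preferences among the survivors, which is why this combination balances diversity with coherence.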

Hyperparameter
advanced
Impact of Temperature on Model Output Diversity

While generating text from a trained model, you increase the sampling temperature from 0.7 to 1.5. What is the expected effect on output diversity?

A. Output quality improves, with fewer grammatical errors.
B. Output becomes more deterministic and repetitive.
C. Output becomes more random and diverse, with less repetition.
D. Output length drastically decreases.
💡 Hint

Think about how temperature affects randomness in sampling.
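One way to quantify the effect is the entropy of the sampling distribution, which rises with temperature; this sketch uses made-up logits:

```python
import numpy as np

def entropy_at_temperature(logits, temperature):
    # Shannon entropy (in nats) of the temperature-scaled softmax;
    # higher entropy means more diverse sampling outcomes.
    scaled = logits / temperature
    exp = np.exp(scaled - scaled.max())
    p = exp / exp.sum()
    return -(p * np.log(p)).sum()

logits = np.array([3.0, 1.0, 0.5, 0.1])  # hypothetical next-token logits
print(entropy_at_temperature(logits, 0.7))
print(entropy_at_temperature(logits, 1.5))  # larger: flatter distribution
```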

Metrics
expert
Evaluating Sampling Temperature Effects on Perplexity

You generate text samples from a language model at different temperatures and compute perplexity on a fixed test set. Which temperature setting is likely to yield the lowest perplexity?

A. Temperature 0.1
B. Temperature 1.0
C. Temperature 2.0
D. Temperature 5.0
💡 Hint

Lower perplexity means the model predicts test data better; consider how temperature affects prediction confidence.
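A toy sketch of the measurement (the logits and target tokens are made up, not a real test set): perplexity is the exponential of the mean negative log-likelihood of the targets under each temperature-scaled distribution. Very low temperatures put vanishing probability on any token that is not the argmax, so a single non-argmax target inflates the score enormously; very high temperatures flatten the distribution and waste probability mass.

```python
import numpy as np

def perplexity(logits_seq, targets, temperature):
    # Perplexity = exp(mean negative log-likelihood) of the target
    # tokens under the temperature-scaled softmax at each position.
    nll = 0.0
    for logits, t in zip(logits_seq, targets):
        scaled = logits / temperature
        log_probs = scaled - scaled.max() - np.log(np.exp(scaled - scaled.max()).sum())
        nll -= log_probs[t]
    return float(np.exp(nll / len(targets)))

# Toy data: the same hypothetical next-token logits at four positions.
logits_seq = [np.array([2.0, 1.0, 0.1])] * 4
targets = [0, 0, 1, 0]
for T in (0.1, 1.0, 2.0, 5.0):
    print(T, perplexity(logits_seq, targets, T))
```

With this toy data, the printed perplexity is lowest at T = 1.0 of the four settings tried, illustrating why the unscaled distribution tends to score best on held-out text.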