NLP - Text Generation

Which of the following code snippets correctly applies temperature scaling to logits before sampling in Python?

A. probs = softmax(logits / temperature)
B. probs = softmax(logits * temperature)
C. probs = softmax(logits + temperature)
D. probs = softmax(logits - temperature)
Step-by-Step Solution

Step 1: Recall the temperature scaling formula. Temperature is applied by dividing the logits by the temperature before the softmax; this adjusts the randomness of sampling (temperatures below 1 sharpen the distribution, temperatures above 1 flatten it).

Step 2: Identify the correct operation. Dividing the logits by the temperature scales them correctly; multiplying, adding, or subtracting does not.

Final Answer: probs = softmax(logits / temperature) -> Option A

Quick Check: Divide logits by temperature before the softmax.

Common Mistakes:
- Multiplying logits by temperature instead of dividing
- Adding temperature to logits
- Subtracting temperature from logits
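The correct operation can be sketched as a small NumPy example; the function names here are illustrative, not from any particular library:

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    exp = np.exp(logits - np.max(logits))
    return exp / exp.sum()

def sample_with_temperature(logits, temperature=1.0, rng=None):
    # Temperature scaling: divide logits by temperature before softmax.
    # temperature < 1 sharpens the distribution; temperature > 1 flattens it.
    probs = softmax(np.asarray(logits, dtype=float) / temperature)
    rng = rng if rng is not None else np.random.default_rng()
    return rng.choice(len(probs), p=probs)

logits = [2.0, 1.0, 0.1]
probs_low_t = softmax(np.array(logits) / 0.5)   # sharper: top token more likely
probs_high_t = softmax(np.array(logits) / 2.0)  # flatter: closer to uniform
token_id = sample_with_temperature(logits, temperature=0.7)
```

Comparing `probs_low_t` and `probs_high_t` shows why division is the right operation: lowering the temperature increases the probability mass on the highest-logit token, while raising it spreads mass more evenly.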