NLP · ~5 mins

N-gram language models in NLP - Cheat Sheet & Quick Revision

Recall & Review
beginner
What is an N-gram in language modeling?
An N-gram is a contiguous sequence of N words. Language models use them to predict the next word in a sentence: a bigram is two words, a trigram is three.
beginner
How does an N-gram language model predict the next word?
It looks at the previous N-1 words and calculates the probability of each possible next word based on how often those words appeared together in training data.
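The prediction step described above can be sketched with a toy bigram model. The corpus and word choices here are made-up examples; the idea is just counting how often each word follows the previous one:

```python
from collections import Counter, defaultdict

# Toy corpus (made-up example sentences)
corpus = [
    "the cat sat on the mat",
    "the cat ate the fish",
    "the dog sat on the rug",
]

# Count, for each word, how often each other word follows it
follow_counts = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        follow_counts[prev][nxt] += 1

def predict_next(prev_word):
    """Return the most likely next word after prev_word, with its probability."""
    counts = follow_counts[prev_word]
    total = sum(counts.values())
    word, count = counts.most_common(1)[0]
    return word, count / total

print(predict_next("the"))  # "cat" follows "the" most often in this tiny corpus
```

A real model would use N-1 previous words for an N-gram and train on far more text, but the probability estimate (count of the sequence divided by count of the context) is the same.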
beginner
What is the difference between unigram, bigram, and trigram models?
Unigram models consider single words independently. Bigram models consider pairs of words. Trigram models consider triples of words to predict the next word.
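Extracting the three kinds of N-grams is just sliding a window of size N over the tokens. A minimal sketch (the sentence is a made-up example):

```python
def ngrams(words, n):
    """Slide a window of size n over the token list, yielding tuples."""
    return [tuple(words[i:i + n]) for i in range(len(words) - n + 1)]

tokens = "the cat sat on the mat".split()
print(ngrams(tokens, 1))  # unigrams: single words
print(ngrams(tokens, 2))  # bigrams: adjacent pairs
print(ngrams(tokens, 3))  # trigrams: adjacent triples
```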
intermediate
Why do N-gram models face the problem of data sparsity?
Because many word combinations may not appear in the training data, making it hard to estimate their probabilities accurately.
Click to reveal answer
intermediate
What is smoothing in N-gram language models?
Smoothing is a technique to adjust probabilities so that unseen word sequences get a small, non-zero probability instead of zero, helping the model handle new phrases.
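One common smoothing method is add-one (Laplace) smoothing: add 1 to every count and add the vocabulary size to the denominator, so unseen words get a small non-zero probability. A minimal sketch, with made-up counts and an assumed four-word vocabulary:

```python
from collections import Counter

# Toy bigram counts for words following "the" (made-up numbers)
counts = Counter({"cat": 2, "dog": 1})
vocab = ["cat", "dog", "fish", "mat"]  # assumed vocabulary

def mle_prob(word):
    """Unsmoothed estimate: unseen words get probability 0."""
    return counts[word] / sum(counts.values())

def laplace_prob(word):
    """Add-one (Laplace) smoothing: every vocabulary word gets a non-zero probability."""
    return (counts[word] + 1) / (sum(counts.values()) + len(vocab))

print(mle_prob("fish"))      # 0.0 -- the data-sparsity problem
print(laplace_prob("fish"))  # small but non-zero
```

Note that smoothing slightly lowers the probability of seen words ("cat" drops from 2/3 to 3/7 here); the probability mass it takes away is what gets redistributed to unseen words.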
What does a trigram model use to predict the next word?
A. The previous two words
B. The previous three words
C. Only the last word
D. All words in the sentence
Why is smoothing important in N-gram models?
A. To remove rare words from the model
B. To assign zero probability to unseen word sequences
C. To increase the size of the training data
D. To assign small probabilities to unseen word sequences
Which problem occurs because many word sequences are rare or missing in training data?
A. Overfitting
B. Data sparsity
C. Underfitting
D. Bias
In a bigram model, what is the probability of a word based on?
A. Random chance
B. The word after it
C. The word before it
D. The entire sentence
Which of these is NOT a type of N-gram model?
A. Quadrigram
B. Unigram
C. Trigram
D. Bigram
Explain how an N-gram language model predicts the next word in a sentence.
Think about how you guess the next word when typing a message.
Describe the challenges N-gram models face and how smoothing helps.
Consider what happens when the model sees a new phrase it never learned.