NLP · ~5 mins

N-grams in NLP - Cheat Sheet & Quick Revision

Recall & Review
beginner
What is an N-gram in natural language processing?
An N-gram is a sequence of N words or tokens that appear together in text. For example, a 2-gram (bigram) is two words in a row, like "good morning."
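The definition above can be sketched in a few lines of Python. The `ngrams` helper below is illustrative, not a specific library's API:

```python
def ngrams(tokens, n):
    """Return all N-grams: tuples of n consecutive tokens."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

tokens = "good morning to you".split()
print(ngrams(tokens, 2))  # [('good', 'morning'), ('morning', 'to'), ('to', 'you')]
print(ngrams(tokens, 3))  # the two trigrams in the sentence
```

The same function yields unigrams, bigrams, or trigrams just by changing `n`.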
beginner
What is the difference between a unigram, bigram, and trigram?
A unigram is a single word, a bigram is a pair of two consecutive words, and a trigram is a group of three consecutive words.
intermediate
Why are N-grams useful in language models?
N-grams help predict the next word by looking at the previous N-1 words. This helps computers understand context and improve tasks like text prediction or spelling correction.
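A minimal sketch of that prediction idea, using bigram counts over a toy corpus (the corpus and the `predict` helper are made up for illustration; here N=2, so the "previous N-1 words" is just the single previous word):

```python
from collections import Counter, defaultdict

corpus = "i like tea . i like coffee . i drink tea".split()

# Count how often each word follows each preceding word (a bigram model).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict(prev):
    """Return the most frequent word seen after `prev` in the corpus."""
    return follows[prev].most_common(1)[0][0]

print(predict("i"))  # 'like' (seen twice after 'i', vs. 'drink' once)
```

A real language model would turn these counts into probabilities and smooth them, but the core mechanism is the same lookup.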
intermediate
What is a limitation of using large N in N-grams?
Using large N (like 5-grams or more) can cause data sparsity, meaning many sequences appear rarely or never, making it hard for the model to learn well.
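Sparsity is easy to see empirically: as N grows, a larger share of distinct N-grams occur only once, leaving the model nothing to generalize from. A small demonstration on a toy sentence (the text is invented for illustration):

```python
from collections import Counter

text = "the cat sat on the mat and the dog sat on the rug".split()

# Fraction of distinct N-grams that occur exactly once, for growing N.
singleton_fraction = {}
for n in (1, 2, 3, 4):
    counts = Counter(tuple(text[i:i + n]) for i in range(len(text) - n + 1))
    once = sum(1 for c in counts.values() if c == 1)
    singleton_fraction[n] = once / len(counts)

print(singleton_fraction)  # the fraction rises toward 1.0 as n grows
```

By n=4 every N-gram in this text is unique, which is exactly the sparsity problem at corpus scale.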
beginner
How can N-grams be used to detect common phrases?
By counting how often N-grams appear in text, we can find common phrases or word combinations that occur frequently, like "New York City" or "machine learning."
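Counting and ranking N-grams is a one-liner with `collections.Counter`; the sample text below is made up for illustration:

```python
from collections import Counter

text = ("machine learning is fun and machine learning is useful "
        "because machine learning finds patterns").split()

# Count every adjacent word pair; frequent pairs are candidate phrases.
bigram_counts = Counter(zip(text, text[1:]))
print(bigram_counts.most_common(2))
# [(('machine', 'learning'), 3), (('learning', 'is'), 2)]
```

The same counting scales to trigrams for phrases like "New York City".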
What does a bigram represent?
A. Two consecutive words
B. One single word
C. Three consecutive words
D. A sentence
Which N-gram size is called a trigram?
A. 2
B. 3
C. 1
D. 4
What problem can happen if N is too large in N-grams?
A. Data sparsity
B. Too many common phrases
C. Words lose meaning
D. Model runs faster
How do N-grams help in text prediction?
A. By counting characters
B. By ignoring previous words
C. By translating text
D. By looking at the previous N-1 words to predict the next
Which of these is an example of a unigram?
A. "machine learning"
B. "learning model"
C. "machine"
D. "deep neural network"
Explain what N-grams are and how they are used in language processing.
Think about sequences of words and how they help computers understand text.
Describe one advantage and one limitation of using N-grams in text analysis.
Consider what N-grams help with and what problems happen when N grows.