Prompt Engineering / GenAI · ~5 mins

Text embedding models in Prompt Engineering / GenAI - Cheat Sheet & Quick Revision

Recall & Review
beginner
What is a text embedding model?
A text embedding model converts words or sentences into vectors (lists of numbers) that computers can work with. These vectors capture the meaning of the text, so texts with similar meanings end up with similar vectors.
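A minimal sketch of the "text to numbers" idea: a bag-of-words count vector over a small made-up vocabulary. Real embedding models learn dense vectors that capture meaning; this toy example only illustrates that each text becomes a list of numbers.

```python
# Toy illustration only: real models like Word2Vec learn dense,
# meaning-aware vectors instead of raw word counts.

def bag_of_words(sentence, vocab):
    """Return a count vector with one slot per vocabulary word."""
    words = sentence.lower().split()
    return [words.count(w) for w in vocab]

vocab = ["cats", "dogs", "like", "milk", "bones"]
print(bag_of_words("Cats like milk", vocab))  # [1, 0, 1, 1, 0]
```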
beginner
Why do we use text embeddings in machine learning?
We use text embeddings to turn text into numbers so machines can process and compare text easily. This helps in tasks like search, recommendation, and understanding language.
intermediate
Name two popular types of text embedding models.
Two popular types are Word2Vec, which creates embeddings for individual words, and Sentence Transformers, which create embeddings for whole sentences or paragraphs.
intermediate
How does a text embedding model help in finding similar sentences?
The model turns sentences into vectors. Sentences with similar meanings have vectors close to each other. We measure closeness using math tools like cosine similarity.
intermediate
What is cosine similarity and why is it used with embeddings?
Cosine similarity measures how closely two vectors point in the same direction (the angle between them). It is used with embeddings because it tells us how similar two texts are while ignoring the lengths of their vectors.
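The idea above can be sketched in plain Python. The three "embeddings" below are made up for illustration; in practice they would come from a trained embedding model.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between vectors a and b (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Made-up 3-dimensional "embeddings" for three sentences.
cat_a = [0.9, 0.1, 0.0]   # "The cat sat on the mat"
cat_b = [0.8, 0.2, 0.1]   # "A cat rested on a rug"
stock = [0.0, 0.1, 0.9]   # "Stock markets fell today"

print(cosine_similarity(cat_a, cat_b))  # close to 1: similar meaning
print(cosine_similarity(cat_a, stock))  # close to 0: unrelated
```

Because cosine similarity divides by the vector lengths, a long document and a short sentence about the same topic can still score as similar.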
What does a text embedding model output?
A. A corrected spelling of the text
B. A translated version of the text
C. A summary of the text
D. A list of numbers representing the text
Answer: D
Which of these is a common use of text embeddings?
A. Finding similar sentences
B. Image classification
C. Audio recording
D. Video editing
Answer: A
What does cosine similarity measure?
A. The distance between two points on a map
B. The angle between two vectors
C. The length of a vector
D. The number of words in a sentence
Answer: B
Which model creates embeddings for whole sentences?
A. Sentence Transformers
B. Word2Vec
C. K-Means
D. Decision Trees
Answer: A
Why do we convert text into numbers using embeddings?
A. To print text faster
B. To make text colorful
C. Because computers only understand numbers
D. To translate text into other languages
Answer: C
Explain in your own words what a text embedding model does and why it is useful.
Think about how you might explain turning words into numbers to a friend.
Describe how cosine similarity works with text embeddings to find similar sentences.
Imagine comparing directions of arrows to see if they point the same way.