Prompt Engineering / GenAI (~20 mins)

Why embeddings capture semantic meaning in Prompt Engineering / GenAI - Challenge Your Understanding

Challenge - 5 Problems
🎖️ Semantic Embedding Master
Get all challenges correct to earn this badge. Test your skills under time pressure!
🧠 Conceptual · Intermediate · 2:00 limit
Why do embeddings place similar words close together?

Embeddings are vectors that represent words or items. Why do embeddings place words with similar meanings close to each other in the vector space?

A. Because embeddings are randomly assigned and happen to cluster similar words by chance.
B. Because embeddings store the exact dictionary definitions of words as numbers.
C. Because embeddings are trained to minimize distance between words that appear in similar contexts, capturing their meaning.
D. Because embeddings only consider word length and frequency, not meaning.
💡 Hint

Think about how words used in similar sentences might relate.
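The hint above can be made concrete with a toy sketch (the corpus and vocabulary here are hypothetical, invented for illustration): words that appear in the same contexts end up with similar co-occurrence vectors, and so sit close together under cosine similarity.

```python
import numpy as np

# Hypothetical toy corpus: "cat" and "dog" appear in similar contexts.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the mat",
    "stocks rose on the market",
]
vocab = sorted({w for s in corpus for w in s.split()})
index = {w: i for i, w in enumerate(vocab)}

# Simple co-occurrence matrix: each word's vector counts which
# other words appear in the same sentence.
cooc = np.zeros((len(vocab), len(vocab)))
for sentence in corpus:
    words = sentence.split()
    for w in words:
        for c in words:
            if w != c:
                cooc[index[w], index[c]] += 1

def cosine(u, v):
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Words that share contexts ("cat"/"dog") end up closer
# than words that do not ("cat"/"stocks").
print(cosine(cooc[index["cat"]], cooc[index["dog"]]))
print(cosine(cooc[index["cat"]], cooc[index["stocks"]]))
```

Real embedding models learn dense vectors rather than raw counts, but the driving signal is the same: shared contexts pull vectors together.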

Predict Output · Intermediate · 2:00 limit
Output of cosine similarity between embeddings

Given two word embeddings as vectors, what is the output of the cosine similarity calculation?

Python
import numpy as np

def cosine_similarity(vec1, vec2):
    dot_product = np.dot(vec1, vec2)
    norm1 = np.linalg.norm(vec1)
    norm2 = np.linalg.norm(vec2)
    return dot_product / (norm1 * norm2)

embedding_a = np.array([1, 2, 3])
embedding_b = np.array([2, 4, 6])
result = cosine_similarity(embedding_a, embedding_b)
print(round(result, 2))
A. 0.00
B. 1.00
C. 0.77
D. 0.50
💡 Hint

Consider the angle between vectors that are multiples of each other.
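Restating the helper from the challenge on toy 2-D vectors (chosen here for illustration, not the challenge inputs): cosine similarity depends only on the angle between the vectors, not on their lengths.

```python
import numpy as np

def cosine_similarity(vec1, vec2):
    dot_product = np.dot(vec1, vec2)
    norm1 = np.linalg.norm(vec1)
    norm2 = np.linalg.norm(vec2)
    return dot_product / (norm1 * norm2)

# Perpendicular vectors (90 degrees apart): similarity is 0.
print(round(cosine_similarity(np.array([1, 0]), np.array([0, 1])), 2))  # 0.0

# Vectors 45 degrees apart: similarity is cos(45°) ≈ 0.71.
print(round(cosine_similarity(np.array([1, 0]), np.array([1, 1])), 2))  # 0.71
```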

Model Choice · Advanced · 2:00 limit
Which model type best learns semantic embeddings?

Which type of model is best suited to learn embeddings that capture semantic meaning of words?

A. A model that predicts the next word given previous words (e.g., a language model).
B. A model that randomly assigns vectors to words without training.
C. A model that only counts word frequency in documents.
D. A model that sorts words alphabetically.
💡 Hint

Think about models that learn from context and sequence.

Hyperparameter · Advanced · 2:00 limit
Effect of embedding dimension size on semantic capture

How does increasing the size of the embedding dimension affect the model's ability to capture semantic meaning?

A. Larger dimensions can capture more semantic detail but may require more data to train well.
B. Larger dimensions always reduce semantic capture due to overfitting.
C. Embedding size does not affect semantic meaning capture at all.
D. Smaller dimensions always capture more semantic meaning because they are simpler.
💡 Hint

Think about the trade-off between detail and data needed.
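One way to make the trade-off concrete (the vocabulary size here is a hypothetical example): the embedding table has `vocab_size × embedding_dim` trainable parameters, so capacity grows linearly with dimension, and every extra parameter needs data to pin it down.

```python
# Parameter count of an embedding table: vocab_size x embedding_dim.
vocab_size = 50_000  # hypothetical vocabulary size

for dim in (64, 300, 1024):
    params = vocab_size * dim
    print(f"dim={dim:4d}: {params:,} trainable parameters")
```

A 1024-dimensional table here has 16x the parameters of the 64-dimensional one, which is why larger embeddings tend to underperform on small corpora despite their greater capacity.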

Metrics · Expert · 2:00 limit
Choosing the best metric to evaluate semantic similarity of embeddings

Which metric is most appropriate to evaluate how well embeddings capture semantic similarity between words?

A. Mean Squared Error (MSE) between embedding vectors.
B. Accuracy of predicting the exact word from an embedding.
C. Counting the number of non-zero elements in embeddings.
D. Cosine similarity between embedding vectors.
💡 Hint

Consider metrics that measure angle or direction rather than magnitude.
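To see the difference between direction-based and magnitude-based comparisons, contrast two metrics on toy vectors (chosen for illustration, not taken from any challenge) where one embedding is simply a rescaled copy of the other:

```python
import numpy as np

def cosine(u, v):
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

def mse(u, v):
    return np.mean((u - v) ** 2)

a = np.array([3.0, 1.0, 2.0])
b = 10 * a  # same direction, very different magnitude

# A direction-based metric sees the vectors as identical;
# a magnitude-based metric reports a huge discrepancy.
print(round(cosine(a, b), 2))  # 1.0
print(round(mse(a, b), 2))     # 378.0
```

Because embedding magnitudes often reflect frequency or training artifacts rather than meaning, angle-sensitive comparisons are the usual choice for semantic similarity.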