Challenge - 5 Problems
Semantic Search Embedding Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
🧠 Conceptual
intermediate · 1:30 remaining
Understanding Embedding Vectors
Which statement best describes what an embedding vector represents in semantic search?
Attempts: 2 left
💡 Hint
Think about how semantic similarity is measured between texts.
✗ Incorrect
Embedding vectors convert text into numbers that capture meaning, allowing comparison by distance in space.
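A minimal sketch of the idea, using hypothetical 3-dimensional embeddings (real models produce hundreds of dimensions): semantically similar texts get vectors that point in similar directions, so their cosine similarity is higher.

```python
import numpy as np

# Hypothetical toy embeddings for three short texts.
emb = {
    "a cat sat":         np.array([0.9, 0.1, 0.0]),
    "a kitten sat":      np.array([0.8, 0.2, 0.1]),
    "stock prices fell": np.array([0.0, 0.1, 0.9]),
}

def cosine(u, v):
    # Cosine similarity: near 1.0 for same direction, near 0.0 for unrelated.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

sim_close = cosine(emb["a cat sat"], emb["a kitten sat"])
sim_far = cosine(emb["a cat sat"], emb["stock prices fell"])
print(sim_close > sim_far)  # semantically closer texts score higher
```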
❓ Predict Output
intermediate · 2:00 remaining
Output of Cosine Similarity Calculation
What is the output of this Python code calculating cosine similarity between two embedding vectors?
import numpy as np

def cosine_similarity(vec1, vec2):
    return np.dot(vec1, vec2) / (np.linalg.norm(vec1) * np.linalg.norm(vec2))

vec_a = np.array([1, 2, 3])
vec_b = np.array([4, 5, 6])
result = cosine_similarity(vec_a, vec_b)
print(round(result, 2))
Attempts: 2 left
💡 Hint
Recall the cosine similarity formula and calculate dot product and norms.
✗ Incorrect
Cosine similarity between [1, 2, 3] and [4, 5, 6] is approximately 0.9746, which rounds to 0.97 at two decimals.
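The calculation can be verified step by step:

```python
import numpy as np

vec_a = np.array([1, 2, 3])
vec_b = np.array([4, 5, 6])

dot = np.dot(vec_a, vec_b)        # 1*4 + 2*5 + 3*6 = 32
norm_a = np.linalg.norm(vec_a)    # sqrt(14) ≈ 3.7417
norm_b = np.linalg.norm(vec_b)    # sqrt(77) ≈ 8.7750
result = dot / (norm_a * norm_b)  # 32 / 32.833 ≈ 0.9746

print(round(result, 2))           # → 0.97
```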
❓ Model Choice
advanced · 2:30 remaining
Choosing an Embedding Model for Semantic Search
You want to build a semantic search system for short product descriptions. Which embedding model is best suited?
Attempts: 2 left
💡 Hint
Consider model size, training data relevance, and ability to capture sentence meaning.
✗ Incorrect
Fine-tuned sentence-transformer models capture sentence-level semantics well and adapt to domain-specific data.
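A sketch of the retrieval step such a model enables. The `encode` function below is a hypothetical stand-in (a simple character-count vectorizer), not a real sentence-transformer; in practice it would be replaced by the model's own sentence encoder.

```python
import numpy as np

def encode(text):
    # Hypothetical stand-in for a sentence-transformer's encode():
    # returns a fixed-size, unit-normalized vector from letter counts.
    vec = np.zeros(26)
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1
    return vec / (np.linalg.norm(vec) or 1.0)

docs = [
    "red running shoes",
    "wireless noise-cancelling headphones",
    "trail running sneakers",
]

query_vec = encode("running shoes")
# Vectors are unit-normed, so the dot product equals cosine similarity.
ranked = sorted(docs, key=lambda d: -float(np.dot(encode(d), query_vec)))
print(ranked[0])
```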
❓ Hyperparameter
advanced · 2:00 remaining
Effect of Embedding Dimension Size
What is a likely effect of increasing the embedding vector dimension size in a semantic search model?
Attempts: 2 left
💡 Hint
Think about trade-offs between detail captured and resource use.
✗ Incorrect
Higher dimension embeddings can capture more detail but require more computation and may overfit if data is limited.
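The resource cost scales linearly with dimension. A rough sketch, assuming a hypothetical corpus of 100,000 documents stored as float32 and two common (but here arbitrary) dimension sizes:

```python
import numpy as np

n_docs = 100_000

# Memory needed to store float32 embeddings for the corpus
# at two hypothetical dimension sizes.
sizes = {}
for dim in (384, 1536):
    sizes[dim] = n_docs * dim * np.dtype(np.float32).itemsize
    print(f"dim={dim}: {sizes[dim] / 1e6:.0f} MB")
```

Quadrupling the dimension quadruples both storage and the per-query dot-product work, which is the compute side of the trade-off.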
🔧 Debug
expert · 2:30 remaining
Debugging Semantic Search with Embeddings
Given this code snippet for semantic search, what error will it raise when run?
import numpy as np

embeddings = {'doc1': np.array([0.1, 0.2]), 'doc2': np.array([0.3, 0.4])}
query = np.array([0.1, 0.2, 0.3])
scores = {doc: np.dot(vec, query) for doc, vec in embeddings.items()}
print(scores)
Attempts: 2 left
💡 Hint
Check the shapes of vectors used in dot product.
✗ Incorrect
np.dot requires vectors of the same length; here the query has 3 elements while each document embedding has 2, so NumPy raises a ValueError.
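A sketch of the failure and one possible fix, assuming the fix is simply to embed the query at the same dimensionality as the documents (i.e., with the same embedding model):

```python
import numpy as np

embeddings = {'doc1': np.array([0.1, 0.2]), 'doc2': np.array([0.3, 0.4])}
bad_query = np.array([0.1, 0.2, 0.3])  # 3 elements vs 2-d document vectors

try:
    np.dot(embeddings['doc1'], bad_query)
    raised = False
except ValueError as e:
    raised = True
    print("shape mismatch:", e)

# Fix: a query vector with the same dimension as the document embeddings.
query = np.array([0.1, 0.2])
scores = {doc: float(np.dot(vec, query)) for doc, vec in embeddings.items()}
print(scores)  # doc2 scores higher: 0.3*0.1 + 0.4*0.2 = 0.11 vs 0.05
```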