Agentic AI · ~10 mins

Embedding models for semantic search in Agentic AI - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (easy)

Complete the code to create an embedding vector from text using a model.

embedding = model.[1](text)
A. train
B. decode
C. encode
D. predict
Common Mistakes
Using 'decode', which is for reversing embeddings, instead of 'encode'.
Using 'train', which is for model training, not for embedding generation.
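For reference, a minimal runnable sketch of the completed `encode` call. `ToyEmbeddingModel` is a stand-in defined here purely for illustration; a real embedding model (for example a SentenceTransformer) exposes the same `model.encode(text)` interface.

```python
import numpy as np

class ToyEmbeddingModel:
    """Illustrative stand-in for a real embedding model.

    Real models map text to a dense vector; this toy version hashes
    characters into a fixed-size vector so the example is self-contained.
    """
    def __init__(self, dim=8):
        self.dim = dim

    def encode(self, text):
        # Deterministic pseudo-embedding: fold character codes into the vector.
        vec = np.zeros(self.dim)
        for i, ch in enumerate(text):
            vec[i % self.dim] += ord(ch)
        # Return a unit-length vector, as many embedding models do.
        return vec / (np.linalg.norm(vec) or 1.0)

model = ToyEmbeddingModel()
embedding = model.encode("semantic search")  # the encode() call from the task
```

The key point is that `encode` turns text into a vector; `decode`, `train`, and `predict` do not produce an embedding.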
Task 2: Fill in the blank (medium)

Complete the code to compute cosine similarity between two embedding vectors.

similarity = cosine_similarity(vec1, [1])
A. vec2
B. model
C. embedding
D. text
Common Mistakes
Passing the model or raw text instead of the second vector.
Using the embedding variable without specifying which vector.
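For reference, a minimal NumPy implementation of `cosine_similarity`; the second argument must be the other vector, not the model or raw text. The example vectors are illustrative.

```python
import numpy as np

def cosine_similarity(vec1, vec2):
    # Cosine of the angle between the vectors:
    # dot product divided by the product of their Euclidean norms.
    return float(np.dot(vec1, vec2) / (np.linalg.norm(vec1) * np.linalg.norm(vec2)))

vec1 = np.array([1.0, 0.0, 1.0])
vec2 = np.array([2.0, 0.0, 2.0])
similarity = cosine_similarity(vec1, vec2)  # same direction, so similarity is 1.0
```

Libraries such as scikit-learn ship a batched `cosine_similarity` with the same idea, but both arguments are always vectors (or matrices of vectors).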
Task 3: Fill in the blank (hard)

Fix the error in the code to normalize an embedding vector.

normalized_vec = vec / np.[1](vec)
A. max
B. mean
C. sum
D. linalg.norm
Common Mistakes
Using sum or mean which do not give vector length.
Using max which only finds the largest element.
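For reference, the corrected normalization, shown with an illustrative vector whose length is easy to check by hand:

```python
import numpy as np

vec = np.array([3.0, 4.0])

# np.linalg.norm gives the Euclidean length (here 5.0);
# sum, mean, and max would rescale by the wrong quantity.
normalized_vec = vec / np.linalg.norm(vec)
```

After dividing by the norm, `normalized_vec` has unit length, which is what makes dot products between normalized embeddings equal to their cosine similarity.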
Task 4: Fill in the blank (hard)

Fill both blanks to create a dictionary of words and their embedding lengths greater than 5.

lengths = {word: [1] for word in words if [2] > 5}
A. len(word)
B. word
D. embedding[word]
Common Mistakes
Using the word itself as value instead of embedding length.
Confusing word length with embedding vector length.
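For reference, a runnable sketch of the completed comprehension. The `embedding` dictionary and its vectors are toy data made up for illustration; the point is that both the value and the filter use the embedding's length, not the word's length.

```python
# Toy data for illustration: `embedding` maps each word to its vector.
embedding = {
    "agent": [0.1] * 8,  # 8-dimensional vector
    "tool":  [0.2] * 4,  # 4-dimensional vector
    "plan":  [0.3] * 6,  # 6-dimensional vector
}
words = list(embedding)

# Value and filter both measure the embedding vector, not the word string.
lengths = {word: len(embedding[word]) for word in words if len(embedding[word]) > 5}
```

Writing `len(word)` instead would filter by how many characters the word has, which is the mistake the hint warns about.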
Task 5: Fill in the blank (hard)

Fill all three blanks to filter embeddings with similarity above 0.8 and create a result dictionary.

result = {[1]: [2] for [3] in embeddings if similarity(embeddings[query], embeddings[[1]]) > 0.8}
A. word
B. embeddings[word]
D. item
Common Mistakes
Using incorrect variable names causing runtime errors.
Mixing keys and values in the dictionary comprehension.
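For reference, a runnable sketch of the completed filter. The `similarity` helper, the toy `embeddings` dictionary, and the `query` key are all made up for illustration; what matters is that one loop variable is used consistently as the key, the value lookup, and the filter argument.

```python
import numpy as np

def similarity(a, b):
    # Cosine similarity between two vectors.
    a, b = np.asarray(a), np.asarray(b)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy embeddings; real vectors would come from an embedding model.
embeddings = {
    "query": [1.0, 0.0],
    "close": [0.9, 0.1],
    "far":   [0.0, 1.0],
}
query = "query"

# The single loop variable `word` serves as dictionary key, value lookup,
# and filter argument, avoiding mixed-name runtime errors.
result = {word: embeddings[word]
          for word in embeddings
          if similarity(embeddings[query], embeddings[word]) > 0.8}
```

With these toy vectors, `"close"` points almost the same way as the query and passes the 0.8 threshold, while `"far"` is orthogonal and is filtered out.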