
Vector similarity metrics in Prompt Engineering / GenAI - Full Explanation

Introduction
Imagine you have two lists of numbers representing things like images, words, or sounds, and you want to know how alike they are. Vector similarity metrics help us measure how close or similar these lists, called vectors, are to each other.
Explanation
Cosine Similarity
Cosine similarity measures the angle between two vectors, showing how much they point in the same direction regardless of their length. It gives a value between -1 and 1, where 1 means exactly the same direction, 0 means orthogonal (no similarity), and -1 means opposite directions.
Cosine similarity tells us how aligned two vectors are by measuring the angle between them.
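A minimal sketch of this in plain Python (using only the standard library) might look like the following; the function name and example vectors are illustrative, not from the original:

```python
import math

def cosine_similarity(a, b):
    # Dot product divided by the product of the two vector lengths
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity([1, 2, 3], [2, 4, 6]))  # parallel vectors -> 1.0
print(cosine_similarity([1, 0], [0, 1]))        # orthogonal vectors -> 0.0
```

Note that scaling a vector does not change the result: [1, 2, 3] and [2, 4, 6] score 1.0 because only direction matters.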
Euclidean Distance
Euclidean distance calculates the straight-line distance between two points in space. The smaller the distance, the more similar the vectors are. It works like measuring the shortest path between two points on a map.
Euclidean distance measures how far apart two vectors are in space.
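The same idea in code, as a small sketch (function name chosen here for illustration):

```python
import math

def euclidean_distance(a, b):
    # Square root of the sum of squared component differences
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

print(euclidean_distance([0, 0], [3, 4]))  # classic 3-4-5 triangle -> 5.0
```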
Manhattan Distance
Manhattan distance sums the absolute differences of each dimension between two vectors. It is like walking along a grid of city streets, moving only up/down and left/right, rather than straight across.
Manhattan distance measures similarity by adding up the absolute differences across all dimensions.
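The city-block idea translates to a one-line sketch in Python (name and inputs are illustrative):

```python
def manhattan_distance(a, b):
    # Sum of absolute differences along each dimension (city-block steps)
    return sum(abs(x - y) for x, y in zip(a, b))

print(manhattan_distance([0, 0], [3, 4]))  # 3 + 4 = 7
```

Compare this with the Euclidean result for the same points: 7 grid steps versus a straight-line distance of 5.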
Jaccard Similarity
Jaccard similarity compares two sets by dividing the size of their overlap by the size of their combined elements. When vectors represent sets, this metric shows how much they share in common.
Jaccard similarity measures how much two sets overlap compared to their total size.
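A short sketch using Python sets (the genre labels below are made up for illustration):

```python
def jaccard_similarity(a, b):
    # Intersection size divided by union size of the two sets
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0  # convention: two empty sets are considered identical
    return len(a & b) / len(a | b)

playlist_1 = {"rock", "pop", "jazz"}
playlist_2 = {"pop", "jazz", "blues"}
print(jaccard_similarity(playlist_1, playlist_2))  # 2 shared / 4 total -> 0.5
```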
Real World Analogy

Imagine comparing two playlists of songs to see how similar they are. Cosine similarity is like checking if both playlists have songs that fit the same mood. Euclidean distance is like measuring how different the playlists are by counting how many songs differ. Manhattan distance is like walking through a city comparing each street (song) one by one. Jaccard similarity is like seeing how many songs both playlists share compared to all songs combined.

Cosine Similarity → Checking if two playlists have songs that fit the same mood regardless of playlist length
Euclidean Distance → Measuring how many songs differ between two playlists by counting straight differences
Manhattan Distance → Walking through a city grid comparing each street (song) one by one between playlists
Jaccard Similarity → Seeing how many songs both playlists share compared to all songs combined
Diagram
┌─────────────────────────────────┐
│          Vector Space           │
│                                 │
│   A ●                     ● B   │
│      \                   /      │
│       \  angle (cosine) /       │
│        \               /        │
│         ●─────────────●         │
│        Euclidean distance       │
│                                 │
│  Manhattan distance: sum of     │
│  grid steps between A and B     │
│                                 │
│  Jaccard similarity: overlap    │
│  of sets represented by A & B   │
└─────────────────────────────────┘
This diagram shows two vectors A and B in space, illustrating cosine similarity as the angle between them, Euclidean distance as the straight line, Manhattan distance as grid steps, and Jaccard similarity as overlap of sets.
Key Facts
Cosine Similarity: Measures the angle between two vectors to find how similar their directions are.
Euclidean Distance: Calculates the straight-line distance between two points in vector space.
Manhattan Distance: Sums the absolute differences of vector components, like walking city blocks.
Jaccard Similarity: Measures overlap between two sets divided by their combined size.
Common Confusions
Believing cosine similarity measures distance between vectors. Cosine similarity measures the angle between vectors, not the distance; vectors can be far apart but still have a small angle.
Thinking Euclidean and Manhattan distances always give the same similarity result. Euclidean distance measures straight-line distance, while Manhattan distance sums absolute differences along axes; they can give different similarity rankings.
Assuming Jaccard similarity works on any numeric vectors. Jaccard similarity applies to sets or binary vectors, not to general numeric vectors.
Summary
Vector similarity metrics help compare how alike two lists of numbers are by measuring angles, distances, or overlaps.
Cosine similarity focuses on direction, Euclidean and Manhattan distances focus on how far apart vectors are, and Jaccard similarity measures shared elements in sets.
Choosing the right metric depends on the type of data and what kind of similarity matters most.
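To see how the choice of metric changes the verdict, here is a small sketch (plain Python, standard library only; the vectors are invented for illustration) comparing two vectors that point the same way but differ greatly in length:

```python
import math

a = [1, 2, 3]
b = [10, 20, 30]  # same direction as a, ten times longer

# Cosine similarity: direction only
dot = sum(x * y for x, y in zip(a, b))
cos = dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# Distance metrics: magnitude matters
euclid = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
manhattan = sum(abs(x - y) for x, y in zip(a, b))

print(cos)        # 1.0: identical direction, so cosine calls them maximally similar
print(euclid)     # about 33.67: far apart in space
print(manhattan)  # 54: even farther by city-block counting
```

Cosine similarity judges the pair identical while both distance metrics judge them far apart, which is exactly why the right metric depends on whether direction or magnitude matters for your data.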