LangChain framework · ~5 mins

FAISS vector store setup in LangChain

Introduction

FAISS (Facebook AI Similarity Search) helps you quickly find similar items by comparing lists of numbers called vectors. Setting it up in LangChain lets your program search and match data fast.

You want to find similar documents or images quickly.
You have many pieces of data and need fast searching.
You want to build a recommendation system based on similarity.
You want to store and search text or images as vectors.
You want to use LangChain to manage vector searches easily.
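The "comparing vectors" idea behind all of these use cases can be sketched in plain Python with cosine similarity. The three-dimensional vectors below are invented for illustration; real embedding models produce vectors with hundreds or thousands of dimensions.

```python
import math

def cosine_similarity(a, b):
    # Dot product of the two vectors divided by the product of their lengths
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings" (made up for illustration)
apple = [0.9, 0.1, 0.0]
orange = [0.8, 0.2, 0.1]
car = [0.0, 0.1, 0.9]

print(cosine_similarity(apple, orange))  # close to 1.0: similar
print(cosine_similarity(apple, car))     # close to 0.0: dissimilar
```

Similar items end up with vectors that point in nearly the same direction; FAISS makes this comparison fast even across millions of vectors.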
Syntax
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings

# Create embeddings object
embeddings = OpenAIEmbeddings()

# Create FAISS vector store from texts
vector_store = FAISS.from_texts(texts, embeddings)

# Save vector store to disk
vector_store.save_local("faiss_index")

# Load vector store from disk (the flag confirms you trust the saved pickle file)
vector_store = FAISS.load_local("faiss_index", embeddings, allow_dangerous_deserialization=True)

Use OpenAIEmbeddings() or any other embedding model to convert text to vectors.

Saving and loading lets you reuse the vector store without rebuilding it. Load with the same embedding model you used to build it, so query vectors stay comparable.
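The save-and-reload pattern is the same idea as persisting any computed data. Here is a toy sketch using a plain dict and JSON (not the actual FAISS on-disk format, and the vectors are invented):

```python
import json
import os
import tempfile

# Pretend these vectors were already computed by an embedding model
store = {"Hello world": [0.1, 0.2], "Hi there": [0.3, 0.4]}

# "save_local": write the store to disk once
path = os.path.join(tempfile.mkdtemp(), "toy_index.json")
with open(path, "w") as f:
    json.dump(store, f)

# "load_local": later runs reload instead of re-embedding everything
with open(path) as f:
    loaded = json.load(f)

print(loaded == store)  # True: nothing had to be recomputed
```

Embedding calls cost time and (with hosted models) money, so skipping the rebuild on every run is the main payoff.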

Examples
Create a FAISS vector store from a small list of texts.
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings

texts = ["Hello world", "Hi there"]
embeddings = OpenAIEmbeddings()
vector_store = FAISS.from_texts(texts, embeddings)
Save the vector store to a folder named 'my_faiss' on your computer.
vector_store.save_local("my_faiss")
Load the saved vector store back into your program to use it again.
vector_store = FAISS.load_local("my_faiss", embeddings, allow_dangerous_deserialization=True)
Sample Program

This program creates a FAISS vector store from a few fruit-related sentences, saves it, loads it again, and searches for texts similar to 'I like fruit'. It prints the two closest matches.

from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings

# Sample texts to store
texts = ["I love apples", "I enjoy oranges", "Bananas are great"]

# Create embeddings object
embeddings = OpenAIEmbeddings()

# Build FAISS vector store from texts
vector_store = FAISS.from_texts(texts, embeddings)

# Save the vector store locally
vector_store.save_local("faiss_example")

# Load the vector store back (the flag confirms you trust the saved pickle file)
loaded_store = FAISS.load_local("faiss_example", embeddings, allow_dangerous_deserialization=True)

# Search for similar text
query = "I like fruit"
results = loaded_store.similarity_search(query, k=2)

# Print the top 2 similar texts
for i, doc in enumerate(results, 1):
    print(f"Result {i}: {doc.page_content}")
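What similarity_search does can be approximated by a brute-force nearest-neighbour scan. The embed function below is a made-up stand-in (vowel counts) so the sketch runs without any API key; the real store uses model embeddings and FAISS indexes instead of a full scan.

```python
import math

def embed(text):
    # Hypothetical stand-in for an embedding model: counts of a few letters
    return [text.lower().count(c) for c in "aeiou"]

def toy_similarity_search(texts, query, k=2):
    # Rank stored texts by Euclidean distance to the query vector, keep top k
    q = embed(query)
    ranked = sorted(texts, key=lambda t: math.dist(embed(t), q))
    return ranked[:k]

texts = ["I love apples", "I enjoy oranges", "Bananas are great"]
print(toy_similarity_search(texts, "I like fruit", k=2))
```

The principle is the same as the sample program: embed the query, measure distance to every stored vector, and return the k closest; FAISS just does this with optimized index structures instead of a Python loop.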
Important Notes

Make sure your OpenAI API key is set (for example, the OPENAI_API_KEY environment variable) before using OpenAIEmbeddings.

FAISS is built for large collections of vectors; small sets still work but gain little over a simple linear scan.

Saving and loading vector stores helps avoid rebuilding embeddings every time.

Summary

FAISS vector store helps find similar items fast using vectors.

Use LangChain's FAISS class with embeddings to create and manage the store.

Save your vector store to reuse it later without rebuilding.