Multi-query retrieval for better recall in LangChain
Multi-query retrieval improves recall by issuing several related phrasings of a question instead of a single query, so relevant documents that any one phrasing would miss can still be found.
from langchain.vectorstores import FAISS
from langchain.embeddings import OpenAIEmbeddings
from langchain.retrievers.multi_query import MultiQueryRetriever
from langchain.chat_models import ChatOpenAI

# Load a vector store (example with FAISS)
embedding_function = OpenAIEmbeddings()
vector_store = FAISS.load_local("path_to_index", embedding_function)

# Create a MultiQueryRetriever; the LLM generates several related
# phrasings of the input question automatically
multi_query_retriever = MultiQueryRetriever.from_llm(
    retriever=vector_store.as_retriever(),
    llm=ChatOpenAI(temperature=0),
)

# Retrieve documents
results = multi_query_retriever.get_relevant_documents("main question")
The MultiQueryRetriever uses an LLM to generate several variants of the input question, runs each variant against the underlying retriever, and returns the deduplicated union of the results, which improves recall.
It wraps any vector store retriever, such as one backed by FAISS, Chroma, or another store.
multi_query_retriever = MultiQueryRetriever.from_llm(
    retriever=vector_store.as_retriever(),
    llm=ChatOpenAI(temperature=0),
)

# The LLM rewrites the question into related phrasings (for example
# "climate change" and "global warming") before retrieving
results = multi_query_retriever.get_relevant_documents("effects on environment")

# The same retriever handles other topics without any reconfiguration
results = multi_query_retriever.get_relevant_documents("renewable energy")

# For a question about ethical AI development, generated variants might
# resemble "AI ethics", "machine learning fairness", or "bias in AI"
results = multi_query_retriever.get_relevant_documents("ethical AI development")
The program below loads a saved FAISS vector store, sets up a multi-query retriever, and retrieves documents for a main question about AI history. It then prints the titles of the retrieved documents.
from langchain.vectorstores import FAISS
from langchain.embeddings import OpenAIEmbeddings
from langchain.retrievers.multi_query import MultiQueryRetriever
from langchain.chat_models import ChatOpenAI

# Setup embeddings and vector store
embedding_function = OpenAIEmbeddings()

# Assume we have a FAISS index saved locally
vector_store = FAISS.load_local("./faiss_index", embedding_function)

# Create a MultiQueryRetriever; the LLM generates related queries such as
# "history of AI", "AI milestones", and "AI breakthroughs"
multi_query_retriever = MultiQueryRetriever.from_llm(
    retriever=vector_store.as_retriever(),
    llm=ChatOpenAI(temperature=0),
)

# Retrieve documents for a main question
documents = multi_query_retriever.get_relevant_documents("important events in AI development")

# Print titles of retrieved documents
print("Retrieved document titles:")
for doc in documents:
    print(doc.metadata.get('title', 'No Title'))
Multi-query retrieval improves recall by combining results from several queries.
Retrieval cost grows linearly with the number of generated queries, and each query's cost depends on the size of the vector store.
Common mistake: query variants that drift off-topic dilute the merged result set and reduce precision.
Use multi-query retrieval when you want broader coverage than a single query.
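The recall gain and the per-query cost can both be seen in a small self-contained sketch. This is a toy keyword retriever, not LangChain; every name in it is illustrative:

```python
# Toy corpus: each document is (id, text), all lowercase.
DOCS = [
    (1, "rising sea levels threaten coastal cities"),
    (2, "global warming accelerates ice melt"),
    (3, "climate change drives extreme weather"),
    (4, "renewable energy reduces carbon emissions"),
]

calls = 0  # counts how many times the underlying retriever runs

def retrieve(query):
    """Return ids of documents sharing any word with the query."""
    global calls
    calls += 1
    words = set(query.lower().split())
    return [doc_id for doc_id, text in DOCS if words & set(text.split())]

def multi_query_retrieve(queries):
    """Run every query and return the deduplicated union, preserving order."""
    seen, merged = set(), []
    for q in queries:
        for doc_id in retrieve(q):
            if doc_id not in seen:
                seen.add(doc_id)
                merged.append(doc_id)
    return merged

single = retrieve("climate change")  # one phrasing finds one document
multi = multi_query_retrieve(["climate change", "global warming", "sea levels"])
print(single)  # [3]
print(multi)   # [3, 2, 1]
print(calls)   # 4 retriever runs: cost is linear in the number of queries
```

The union covers documents that no single phrasing matches, which is exactly the recall benefit, and the call counter makes the linear cost visible.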
In short, multi-query retrieval asks several related versions of a question to broaden search results.
It merges and deduplicates the documents returned for each query by a vector store retriever.
This yields more complete and relevant coverage for complex topics than a single query can.
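The mechanism described above has two stages: generate related phrasings of the question, then retrieve with each and merge. In the sketch below the LLM step is faked with a small synonym table; `generate_variants`, `SYNONYMS`, and the substring retriever are illustrative stand-ins, not LangChain APIs:

```python
# Hypothetical synonym table standing in for the LLM that would normally
# rewrite the question into related phrasings.
SYNONYMS = {
    "climate change": ["global warming", "greenhouse effect"],
}

def generate_variants(question):
    """Return the question plus related phrasings (LLM stand-in)."""
    variants = [question]
    for phrase, alts in SYNONYMS.items():
        if phrase in question:
            variants += [question.replace(phrase, alt) for alt in alts]
    return variants

def multi_query(question, retriever):
    """Retrieve with every variant and return the deduplicated union."""
    seen, merged = set(), []
    for q in generate_variants(question):
        for doc in retriever(q):
            if doc not in seen:
                seen.add(doc)
                merged.append(doc)
    return merged

# A trivial retriever over a tiny corpus, matching on shared words.
CORPUS = ["global warming report", "greenhouse effect primer", "climate change policy"]

def retriever(query):
    return [d for d in CORPUS if any(w in d for w in query.split())]

print(multi_query("climate change impacts", retriever))
# → ['climate change policy', 'global warming report', 'greenhouse effect primer']
```

The original phrasing matches only one document; the generated variants recover the other two, which is the behavior the real retriever achieves with LLM-written variants.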