
Pinecone cloud vector store in LangChain

Introduction

The Pinecone cloud vector store saves your data as numeric vectors (embeddings) and searches them by similarity, so finding related items stays fast even at large scale.

You want to find similar images or texts quickly.
You need to store and search large amounts of data by meaning, not exact words.
You are building a recommendation system that suggests items based on user preferences.
You want to add smart search to your app that understands context.
You need a cloud service to handle vector data without managing servers.
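"Search by meaning" ultimately comes down to comparing vectors, most often with cosine similarity. A minimal sketch in plain Python (no Pinecone required) of how two vectors are scored against each other:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot(a, b) / (|a| * |b|), ranging from -1 to 1.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Vectors pointing in similar directions score close to 1.
print(cosine_similarity([1.0, 0.0], [1.0, 0.1]))  # close to 1.0
# Orthogonal vectors score 0.
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0
```

Pinecone performs this kind of comparison across millions of stored vectors and returns the closest matches.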
Syntax
LangChain
from langchain.vectorstores import Pinecone
import pinecone

# Initialize Pinecone
pinecone.init(api_key="YOUR_API_KEY", environment="YOUR_ENVIRONMENT")

# Connect to your Pinecone index
index = pinecone.Index("your-index-name")

# Create Pinecone vector store
vector_store = Pinecone(index, embedding_function, text_key="text")

# Use vector_store to add or search vectors

Replace YOUR_API_KEY and YOUR_ENVIRONMENT with your Pinecone account details.

embedding_function converts your data into vectors before storing.
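The embedding_function can be any callable that maps text to a fixed-length list of floats; in practice LangChain supplies it through an embeddings class such as OpenAIEmbeddings. A toy stand-in (purely illustrative, not a real embedding) shows the contract:

```python
import hashlib

def toy_embedding(text: str, dim: int = 8) -> list:
    # Illustrative only: derive a deterministic fixed-length vector from a
    # hash of the text. Real embeddings (e.g. OpenAIEmbeddings) encode
    # meaning; this just shows the expected input/output shape.
    digest = hashlib.sha256(text.encode("utf-8")).digest()
    return [b / 255.0 for b in digest[:dim]]

vec = toy_embedding("Hello world")
print(len(vec))  # 8 -- every text maps to the same dimensionality
```

Whatever function you use, every text must map to a vector of the same dimension as your Pinecone index.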

Examples
This sets up the Pinecone vector store using your index and a function that turns text into vectors.
LangChain
from langchain.vectorstores import Pinecone

# Create vector store with embedding function
vector_store = Pinecone(index, embedding_function, text_key="text")
Adds a list of texts to the Pinecone vector store after converting them to vectors.
LangChain
vector_store.add_texts(["Hello world", "How are you?"])
Searches for the 2 most similar texts to "Hello" in the vector store.
LangChain
results = vector_store.similarity_search("Hello", k=2)
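Conceptually, similarity_search embeds the query and returns the k stored texts whose vectors score highest against it. A local sketch of that ranking step (no Pinecone involved; the hand-made 2-d vectors stand in for real embeddings):

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def top_k(query_vec, stored, k=2):
    # stored: list of (text, vector) pairs, as a vector store would hold them.
    ranked = sorted(stored, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

stored = [
    ("Hello world", [0.9, 0.1]),
    ("Goodbye world", [0.1, 0.9]),
    ("Hello there", [0.8, 0.2]),
]
print(top_k([1.0, 0.0], stored, k=2))  # ['Hello world', 'Hello there']
```

Pinecone does the same ranking server-side with an index structure, so it scales far beyond a sorted list.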
Sample Program

This example shows how to connect to Pinecone, add some texts, and search for texts similar to "Hello".

LangChain
from langchain.vectorstores import Pinecone
from langchain.embeddings.openai import OpenAIEmbeddings
import pinecone

# Initialize Pinecone
pinecone.init(api_key="YOUR_API_KEY", environment="us-west1-gcp")

# Connect to index
index = pinecone.Index("example-index")

# Create embedding function
embedding_function = OpenAIEmbeddings()

# Create Pinecone vector store
vector_store = Pinecone(index, embedding_function, text_key="text")

# Add texts
texts = ["Hello world", "Goodbye world", "Hello there"]
vector_store.add_texts(texts)

# Search for similar texts
results = vector_store.similarity_search("Hello", k=2)

# Print results
for res in results:
    print(res.page_content)
Output
Prints the two stored texts closest to "Hello", typically "Hello world" and "Hello there".
Important Notes

Make sure your Pinecone index is created before running the code.
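With the classic init-style pinecone-client used above, you can create the index once before running any of the code on this page. A sketch, assuming an index named example-index and a dimension of 1536 to match OpenAIEmbeddings (OpenAI's text-embedding-ada-002 model produces 1536-dimensional vectors):

```python
import pinecone

pinecone.init(api_key="YOUR_API_KEY", environment="YOUR_ENVIRONMENT")

# Create the index once; the dimension must match your embedding model.
if "example-index" not in pinecone.list_indexes():
    pinecone.create_index("example-index", dimension=1536, metric="cosine")
```

This is one-time setup; rerunning it is harmless because of the existence check.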

Embedding functions must match the data type you want to store (text, images, etc.).

Check your API key and environment carefully to avoid connection errors.

Summary

Pinecone stores data as vectors to help find similar items fast.

Use it when you want smart search or recommendations based on meaning.

Connect from LangChain by initializing the Pinecone client and supplying an embedding function.