Langchain · How-To · Beginner · 4 min read

How to Persist Memory in Langchain: Simple Guide

To persist memory in Langchain, use a persistent memory backend like Redis or Chroma that stores conversation history outside the app's runtime. This allows your chatbot to remember past interactions across sessions by loading and saving memory data from these storage systems.
📝

Syntax

In Langchain, persistent memory is created by initializing a memory class with a storage backend. Common options include Redis, Chroma, or file-based memory. You then pass this memory instance to your ConversationChain or other chains.

  • Memory class: Manages conversation state.
  • Storage backend: External system like Redis or Chroma to save memory.
  • Chain: The Langchain component that uses memory to maintain context.
```python
from langchain.memory import ConversationBufferMemory
from langchain.chains import ConversationChain

# Example of in-memory (non-persistent) memory
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)

# Pass memory to chain (llm must already be initialized, e.g. ChatOpenAI)
conversation = ConversationChain(llm=llm, memory=memory)
```
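Classes like ConversationBufferMemory keep everything in RAM, so the history vanishes when the process exits. The load-on-start / save-on-write pattern that any persistent backend adds can be sketched in plain Python; the `FileBackedHistory` class and `chat_history.json` file below are illustrative, not part of Langchain:

```python
import json
from pathlib import Path

class FileBackedHistory:
    """Minimal illustration of the load/save pattern behind persistent memory."""

    def __init__(self, path: str):
        self.path = Path(path)
        # Load any history saved by a previous run
        self.messages = json.loads(self.path.read_text()) if self.path.exists() else []

    def add_message(self, role: str, content: str) -> None:
        self.messages.append({"role": role, "content": content})
        # Persist after every write so a restart loses nothing
        self.path.write_text(json.dumps(self.messages))

# First "session": write some history
history = FileBackedHistory("chat_history.json")
history.add_message("human", "Hello!")

# Second "session" (e.g. after a restart): the history is still there
restored = FileBackedHistory("chat_history.json")
print(restored.messages[0]["content"])  # Hello!
```

Redis-backed memory follows the same shape; it just swaps the JSON file for Redis commands.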
💻

Example

This example shows how to persist memory using Redis as the backend. Redis stores the chat history so it can be retrieved even after the program restarts.

```python
from langchain.memory import RedisChatMessageHistory
from langchain.memory import ConversationBufferMemory
from langchain.chains import ConversationChain
from langchain.chat_models import ChatOpenAI

# Connect to Redis server (make sure Redis is running locally or remotely)
redis_url = "redis://localhost:6379"

# Create Redis-backed message history (session_id is required; the URL parameter is `url`)
message_history = RedisChatMessageHistory(session_id="user123", url=redis_url)

# Create memory with Redis message history
memory = ConversationBufferMemory(memory_key="chat_history", chat_memory=message_history, return_messages=True)

# Initialize LLM
llm = ChatOpenAI(temperature=0)

# Create conversation chain with persistent memory
conversation = ConversationChain(llm=llm, memory=memory)

# Example conversation
print(conversation.run("Hello!"))
print(conversation.run("What did I just say?"))
```
Output

```
Hello! How can I assist you today?
You just said: Hello!
```
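Under the hood, RedisChatMessageHistory stores each session's messages in a Redis list keyed by the session ID. The round-trip can be mimicked with a plain dict standing in for Redis (a sketch of the idea, not the real client; the `message_store:` key prefix is an assumption based on the library's default):

```python
import json

# A dict standing in for Redis: key -> list of serialized messages
store: dict[str, list[str]] = {}

def rpush(key: str, value: str) -> None:
    """Append to the list at key, like Redis RPUSH."""
    store.setdefault(key, []).append(value)

def lrange(key: str) -> list[str]:
    """Read the whole list at key, like Redis LRANGE key 0 -1."""
    return store.get(key, [])

# Each session gets its own key, so histories stay separate
rpush("message_store:user123", json.dumps({"type": "human", "content": "Hello!"}))
rpush("message_store:user456", json.dumps({"type": "human", "content": "Hi there"}))

history = [json.loads(m) for m in lrange("message_store:user123")]
print(history)  # [{'type': 'human', 'content': 'Hello!'}]
```

Because the data lives under a per-session key, reconnecting with the same session_id after a restart reads back exactly the messages written before.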
⚠️

Common Pitfalls

1. Forgetting to set a unique session ID: persistent memory needs a unique session_id per conversation. With a shared or reused ID, different users' histories mix together.

2. Not running the storage backend: Redis (or whatever store you use) must be running and reachable, or every memory read and write will fail.

3. Using in-memory memory for persistence: classes like ConversationBufferMemory alone keep history only in RAM and lose it when the program exits.

```python
from langchain.memory import RedisChatMessageHistory

# Wrong: session_id is required, and a hard-coded shared ID mixes every user's history
message_history = RedisChatMessageHistory(session_id="shared", url="redis://localhost:6379")

# Right: give each user (or conversation) its own session_id to isolate their memory
message_history = RedisChatMessageHistory(session_id="user123", url="redis://localhost:6379")
```
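A simple way to guarantee unique session IDs is to derive them from a stable user identifier plus a random UUID from the standard library; the helper below is illustrative, not a Langchain API:

```python
import uuid

def new_session_id(user_id: str) -> str:
    """Combine a user identifier with a random UUID so each conversation gets its own key."""
    return f"{user_id}-{uuid.uuid4()}"

session_a = new_session_id("user123")
session_b = new_session_id("user123")
print(session_a != session_b)  # True: same user, separate conversations
```

Keeping the user ID in the session key also makes it easy to list or purge one user's conversations later.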
📊

Quick Reference

  • Use RedisChatMessageHistory for Redis persistence.
  • Use session_id to separate conversations.
  • Pass memory to chains to maintain context.
  • Ensure storage backend is running and reachable.
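Before wiring a chain to Redis, it can help to verify the backend is actually reachable. This stdlib-only check (the host and port are assumptions for a local default Redis setup) avoids confusing failures deeper in the chain:

```python
import socket

def backend_reachable(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# 6379 is the default Redis port; prints False if no server is listening locally
print(backend_reachable("localhost", 6379))
```

Run this once at startup and fail fast with a clear message, rather than letting the first memory read raise a connection error mid-conversation.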
✅

Key Takeaways

  • Use persistent memory classes like RedisChatMessageHistory to save conversation state outside runtime.
  • Always provide a unique session_id to keep conversations separate in persistent storage.
  • Pass the persistent memory instance to your Langchain chains to enable context retention.
  • Ensure your storage backend (e.g., Redis) is running and accessible before using persistent memory.
  • In-memory memory classes alone do not persist data after the program ends.