LangChain · How-To · Beginner · 4 min read

How to Use Chat History in LangChain for Contextual Conversations

In LangChain, you use ChatMessageHistory or ConversationBufferMemory to store and manage chat history. This history keeps track of past messages, allowing your chatbot to remember context and respond accordingly.
📝

Syntax

LangChain provides classes like ChatMessageHistory to store messages and ConversationBufferMemory to manage memory in chatbots. You create a history object, add messages, and pass it to your chat model or chain to maintain context.

Key parts:

  • ChatMessageHistory(): Stores messages as a list.
  • add_user_message() and add_ai_message(): Add user and AI messages.
  • ConversationBufferMemory(memory_key="chat_history", chat_memory=chat_history): Wraps history for chains.
```python
from langchain.memory import ConversationBufferMemory, ChatMessageHistory

# Create chat history
chat_history = ChatMessageHistory()

# Add messages
chat_history.add_user_message("Hello!")
chat_history.add_ai_message("Hi there! How can I help?")

# Use memory in a chain
memory = ConversationBufferMemory(memory_key="chat_history", chat_memory=chat_history)
```
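Under the hood, a buffer memory does little more than render the stored messages, in order, into a single string that gets keyed into the prompt. The pure-Python sketch below illustrates that mechanic with simplified stand-in classes; these are not LangChain's actual implementations, just the idea:

```python
# Illustrative stand-ins for ChatMessageHistory and the buffer rendering;
# simplified for clarity, not LangChain's real internals.

class SketchChatHistory:
    def __init__(self):
        self.messages = []  # messages kept in insertion order

    def add_user_message(self, text):
        self.messages.append(("Human", text))

    def add_ai_message(self, text):
        self.messages.append(("AI", text))


def buffer_string(history):
    # Render the history as "Role: text" lines, the way a buffer
    # memory feeds prior turns back into the prompt
    return "\n".join(f"{role}: {text}" for role, text in history.messages)


history = SketchChatHistory()
history.add_user_message("Hello!")
history.add_ai_message("Hi there! How can I help?")
print(buffer_string(history))
# Human: Hello!
# AI: Hi there! How can I help?
```

Because the buffer keeps every message verbatim, it preserves full context but grows with each turn, which is worth keeping in mind for long conversations.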
💻

Example

This example shows how to create chat history, add messages, and use it with an OpenAI chat model to keep conversation context.

```python
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory, ChatMessageHistory
from langchain.chains import ConversationChain

# Initialize chat history
chat_history = ChatMessageHistory()
chat_history.add_user_message("Hello, who won the world cup in 2018?")
chat_history.add_ai_message("France won the 2018 FIFA World Cup.")

# Create memory with chat history; "history" matches the variable
# name that ConversationChain's default prompt expects
memory = ConversationBufferMemory(memory_key="history", chat_memory=chat_history)

# Initialize chat model
chat = ChatOpenAI(temperature=0)

# Create conversation chain with memory
conversation = ConversationChain(llm=chat, memory=memory)

# Continue conversation
response = conversation.predict(input="Who was the top scorer?")
print(response)
```
Output
Harry Kane was the top scorer in the 2018 FIFA World Cup with 6 goals.
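What makes this work is the cycle the chain runs around every call: load the stored history into the prompt, call the model, then save the new user/AI turn back into the history. A minimal pure-Python sketch of that cycle (illustrative only; LangChain's internals differ, and `fake_llm` here is a stand-in for a real model):

```python
# Hypothetical sketch of the load -> predict -> save cycle a
# conversation chain performs; not LangChain's actual code.

def run_turn(history, user_input, fake_llm):
    context = "\n".join(f"{role}: {text}" for role, text in history)  # load memory
    prompt = f"{context}\nHuman: {user_input}\nAI:"                   # build prompt
    reply = fake_llm(prompt)                                          # call the model
    history.append(("Human", user_input))                             # save the new turn
    history.append(("AI", reply))
    return reply

history = [("Human", "Hello, who won the world cup in 2018?"),
           ("AI", "France won the 2018 FIFA World Cup.")]
reply = run_turn(history, "Who was the top scorer?",
                 fake_llm=lambda prompt: "Harry Kane, with 6 goals.")
print(len(history))  # 4 -- both new messages were appended
```

Because the prompt carries the earlier exchange, the model can resolve "the top scorer" to the 2018 World Cup without the user repeating it.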
⚠️

Common Pitfalls

Common mistakes when using chat history in LangChain include:

  • Not initializing ChatMessageHistory before adding messages, causing errors.
  • Forgetting to pass the chat history to ConversationBufferMemory, so memory is empty.
  • Using a memory_key that doesn't match the variable name the chain's prompt expects (ConversationChain's default prompt uses history).
  • Not updating the history with new messages after each interaction, losing context.
```python
from langchain.memory import ConversationBufferMemory, ChatMessageHistory

# Wrong: no chat_memory argument -- the memory starts empty and any
# pre-existing chat_history object is never used
memory = ConversationBufferMemory(memory_key="chat_history")

# Right: pass the history object explicitly
chat_history = ChatMessageHistory()
memory = ConversationBufferMemory(memory_key="chat_history", chat_memory=chat_history)
```
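The memory-key pitfall comes from a simple rule: the chain checks that its memory key appears among the prompt's input variables, and raises an error at construction time if it doesn't. The sketch below illustrates that rule in plain Python; the names and the error message are illustrative, not LangChain's actual internals:

```python
# Sketch of the key-matching rule: a chain's prompt template declares
# variable names, and the memory key must be one of them.
# (Illustrative only; LangChain's validation and message differ.)

PROMPT_VARIABLES = {"history", "input"}  # ConversationChain's default prompt


def validate_memory_key(memory_key):
    if memory_key not in PROMPT_VARIABLES:
        raise ValueError(
            f"memory key {memory_key!r} not found in prompt variables "
            f"{sorted(PROMPT_VARIABLES)}"
        )


validate_memory_key("history")        # passes silently
try:
    validate_memory_key("chat_history")
except ValueError as err:
    print(err)
```

So when you see an unexpected-variables error from a chain, check that memory_key matches the placeholder name in the prompt template before looking anywhere else.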
📊

Quick Reference

| Concept | Description | Usage |
| --- | --- | --- |
| ChatMessageHistory | Stores chat messages in order | `chat_history = ChatMessageHistory()` |
| add_user_message | Add a user message to history | `chat_history.add_user_message("Hi")` |
| add_ai_message | Add an AI message to history | `chat_history.add_ai_message("Hello!")` |
| ConversationBufferMemory | Memory wrapper for chains | `memory = ConversationBufferMemory(memory_key="chat_history", chat_memory=chat_history)` |
| ConversationChain | Chain that uses memory and LLM | `conversation = ConversationChain(llm=chat, memory=memory)` |
✅

Key Takeaways

  • Use ChatMessageHistory to store and manage chat messages in LangChain.
  • Wrap chat history with ConversationBufferMemory to maintain context in conversation chains.
  • Always add new messages to history after each interaction to keep context updated.
  • Pass the chat history object explicitly to memory to avoid empty or lost context.
  • Use a memory key that matches the variable name in the chain's prompt.