LangChain framework · ~10 mins

Chat history management in LangChain - Step-by-Step Execution

Concept Flow - Chat history management
User sends message
Add message to chat history
Pass chat history to language model
Model generates response
Add response to chat history
Return response to user
Wait for next user message or end
This flow shows how each user message and model response is saved in chat history, which is then used to generate context-aware replies.
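The flow above can be sketched in plain Python, without LangChain. The `fake_model` function is a hypothetical stand-in for a real language model; it only reports how many messages it received, which is enough to show the history growing each turn.

```python
def fake_model(history):
    # Hypothetical stand-in for a real language model: it can only
    # "know" whatever is in the history passed to it.
    return f"(model saw {len(history)} messages)"

chat_history = []

def send(user_message):
    chat_history.append({"role": "user", "content": user_message})    # add message to history
    reply = fake_model(chat_history)                                  # pass history to model
    chat_history.append({"role": "assistant", "content": reply})      # add response to history
    return reply                                                      # return response to user

print(send("Hello!"))        # (model saw 1 messages)
print(send("Tell me more"))  # (model saw 3 messages)
```

Note that the second call sees three messages, not one: the first user message, the first reply, and the new user message. That growth is exactly what gives the model conversation context.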
Execution Sample
LangChain
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory
from langchain_community.llms.fake import FakeListLLM

# FakeListLLM returns canned responses, so the example runs without an API key.
llm = FakeListLLM(responses=["Hi! How can I help you today?"])
memory = ConversationBufferMemory()  # stores the full conversation
chain = ConversationChain(llm=llm, memory=memory)

response = chain.predict(input="Hello!")  # "Hi! How can I help you today?"
This code creates a conversation chain whose memory stores the chat history, then generates a response to 'Hello!'.
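Under the hood, ConversationBufferMemory keeps the raw transcript and renders it into the prompt on each turn. A rough plain-Python approximation of that behaviour (the class below is hypothetical, not the real LangChain API):

```python
class BufferMemory:
    """Rough sketch of what a buffer memory does: keep every exchange
    and render the transcript as 'Human:'/'AI:' lines for the prompt."""

    def __init__(self):
        self.messages = []

    def save_context(self, user_input, ai_output):
        # Called after each turn to record both sides of the exchange.
        self.messages.append(("Human", user_input))
        self.messages.append(("AI", ai_output))

    def buffer(self):
        # Rendered transcript that would be injected into the next prompt.
        return "\n".join(f"{role}: {text}" for role, text in self.messages)

mem = BufferMemory()
mem.save_context("Hello!", "Hi! How can I help you today?")
print(mem.buffer())
# Human: Hello!
# AI: Hi! How can I help you today?
```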
Execution Table
| Step | Action | Chat History Before | Input/Output | Chat History After |
|------|--------|---------------------|--------------|--------------------|
| 1 | User sends message | [] | Hello! | [{'role': 'user', 'content': 'Hello!'}] |
| 2 | Pass history to model | [{'role': 'user', 'content': 'Hello!'}] | Model generates response | [{'role': 'user', 'content': 'Hello!'}] |
| 3 | Add model response | [{'role': 'user', 'content': 'Hello!'}] | Hi! How can I help you today? | [{'role': 'user', 'content': 'Hello!'}, {'role': 'assistant', 'content': 'Hi! How can I help you today?'}] |
| 4 | Return response | [{'role': 'user', 'content': 'Hello!'}, {'role': 'assistant', 'content': 'Hi! How can I help you today?'}] | Hi! How can I help you today? | [{'role': 'user', 'content': 'Hello!'}, {'role': 'assistant', 'content': 'Hi! How can I help you today?'}] |
| 5 | Wait for next message | [{'role': 'user', 'content': 'Hello!'}, {'role': 'assistant', 'content': 'Hi! How can I help you today?'}] | Waiting... | [{'role': 'user', 'content': 'Hello!'}, {'role': 'assistant', 'content': 'Hi! How can I help you today?'}] |
💡 Execution stops waiting for next user input or conversation end.
Variable Tracker
| Variable | Start | After Step 1 | After Step 3 | Final |
|----------|-------|--------------|--------------|-------|
| chat_history | [] | [{'role': 'user', 'content': 'Hello!'}] | [{'role': 'user', 'content': 'Hello!'}, {'role': 'assistant', 'content': 'Hi! How can I help you today?'}] | [{'role': 'user', 'content': 'Hello!'}, {'role': 'assistant', 'content': 'Hi! How can I help you today?'}] |
| input_message | None | "Hello!" | "Hello!" | "Hello!" |
| model_response | None | None | "Hi! How can I help you today?" | "Hi! How can I help you today?" |
Key Moments - 2 Insights
Why does the chat history include both user and assistant messages?
Because the model needs the full conversation context to generate relevant responses, as shown in steps 1 and 3 of the execution table.
What happens if we don't update the chat history after the model response?
The conversation loses context for future messages, so the model can't remember past exchanges, breaking the flow (see step 3 vs step 4).
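The second insight can be demonstrated directly. In this sketch (again using a hypothetical `fake_model` stand-in), one history is updated with the model's reply and one is not; on the next turn the model receives visibly less context from the un-updated history.

```python
def fake_model(history):
    # Hypothetical stand-in model: its "memory" is only the history it receives.
    return f"I can see {len(history)} messages in the history."

with_update = [{"role": "user", "content": "Hello!"}]
reply = fake_model(with_update)
with_update.append({"role": "assistant", "content": reply})  # history updated

without_update = [{"role": "user", "content": "Hello!"}]
fake_model(without_update)  # reply discarded, history never updated

# Next turn: both conversations receive a new user message.
with_update.append({"role": "user", "content": "And now?"})
without_update.append({"role": "user", "content": "And now?"})

print(fake_model(with_update))     # I can see 3 messages in the history.
print(fake_model(without_update))  # I can see 2 messages in the history.
```

The un-updated conversation has silently lost the assistant's turn, which is exactly the "broken flow" described above.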
Visual Quiz - 3 Questions
Test your understanding
Looking at the execution table, what is the chat history after step 1?
A. [{'role': 'assistant', 'content': 'Hi! How can I help you today?'}]
B. []
C. [{'role': 'user', 'content': 'Hello!'}]
D. [{'role': 'user', 'content': 'Hello!'}, {'role': 'assistant', 'content': 'Hi! How can I help you today?'}]
💡 Hint
Check the 'Chat History After' column for step 1 in the execution table.
At which step does the model response get added to chat history?
A. Step 3
B. Step 4
C. Step 2
D. Step 1
💡 Hint
Look for the step where chat history grows to include the assistant's message.
If the user sends a new message, how will the chat history change?
A. It will reset to empty.
B. It will add the new user message to the existing history.
C. It will only keep the last user message.
D. It will remove the assistant messages.
💡 Hint
Refer to how chat_history updates after user input in the variable tracker.
Concept Snapshot
Chat history management stores all user and assistant messages.
Each new message is added to history.
History is passed to the model for context.
Model response is appended to history.
This keeps conversation context for better replies.
Full Transcript
Chat history management in LangChain means saving every user message and model reply in a list. When a user sends a message, it is added to the chat history. The whole history is then sent to the language model to generate a response, and the model's response is also added to the history. This way, the conversation remembers what was said before. The example code shows creating a ConversationChain with ConversationBufferMemory to handle this automatically. The execution table traces each step: the user message is added, the model generates a reply, the reply is added, and the response is returned. Variables like chat_history update after each step. Key points include why both user and assistant messages are stored and what happens if the history is not updated. The quiz checks understanding of the chat history state at different steps and how it changes with new messages. This helps beginners see how chat history keeps conversations connected.