
Memory for conversation history in Prompt Engineering / GenAI - Model Pipeline Trace

Model Pipeline - Memory for conversation history

This pipeline shows how a conversational AI remembers past messages to give better answers: it stores and updates the conversation history, then uses that history to interpret new questions and respond coherently.
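A minimal sketch of such a memory store, using a fixed-size sliding window (the class name `ConversationMemory` and the 5-message limit are illustrative assumptions, not part of any specific library):

```python
from collections import deque

class ConversationMemory:
    """Sliding-window store for the most recent conversation messages."""

    def __init__(self, max_messages=5):
        # deque with maxlen drops the oldest message automatically
        self.messages = deque(maxlen=max_messages)

    def add(self, message):
        self.messages.append(message)

    def retrieve(self):
        # Return history oldest-first, as the pipeline expects
        return list(self.messages)

memory = ConversationMemory(max_messages=5)
for msg in ["Hi!", "Hello, how can I help?", "Tell me a joke.",
            "Why did the chicken cross the road?",
            "To get to the other side.", "What is the weather today?"]:
    memory.add(msg)

print(memory.retrieve())  # oldest message ("Hi!") has been evicted
```

Because the window is capped, adding a sixth message silently evicts the oldest one, which is exactly the behavior described in the final pipeline stage.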

Data Flow - 6 Stages
Stage 1: User Input
  Input: 1 message string
  Operation: Receive new user message
  Output: 1 message string
  Example: "What is the weather today?"

Stage 2: Retrieve Conversation History
  Input: previous conversation messages (5 messages x 1 string each)
  Operation: Load stored past messages
  Output: 5 messages x 1 string each
  Example: ["Hi!", "Hello, how can I help?", "Tell me a joke.", "Why did the chicken cross the road?", "To get to the other side."]

Stage 3: Combine Input and History
  Input: 1 new message + 5 past messages
  Operation: Concatenate messages into one context
  Output: 1 combined text string
  Example: "Hi! Hello, how can I help? Tell me a joke. Why did the chicken cross the road? To get to the other side. What is the weather today?"

Stage 4: Encode Context
  Input: 1 combined text string
  Operation: Convert text to numerical vectors
  Output: 1 vector of length 768
  Example: [0.12, -0.05, 0.33, ..., 0.07]

Stage 5: Generate Response
  Input: 1 vector of length 768
  Operation: Model predicts next message tokens
  Output: 1 response string
  Example: "The weather today is sunny with a high of 75 degrees."

Stage 6: Update Conversation History
  Input: previous 5 messages + new user message + new response
  Operation: Add new messages and remove oldest if needed
  Output: updated 5 messages x 1 string each
  Example: ["Tell me a joke.", "Why did the chicken cross the road?", "To get to the other side.", "What is the weather today?", "The weather today is sunny with a high of 75 degrees."]
Training Trace - Epoch by Epoch

Loss
1.2 |*       
1.0 | **     
0.8 |  ***   
0.6 |   **** 
0.4 |    *****
     --------
     Epochs
Epoch | Loss ↓ | Accuracy ↑ | Observation
------|--------|------------|------------
1     | 1.2    | 0.45       | Model starts learning to remember conversation context.
2     | 0.9    | 0.60       | Loss decreases as model better understands history.
3     | 0.7    | 0.72       | Model improves at generating relevant responses.
4     | 0.5    | 0.80       | Training converges; model remembers longer history.
5     | 0.4    | 0.85       | Final epoch with good balance of memory and response quality.
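A small sketch of how per-epoch metrics like these might be logged and a convergence flag raised (the metric values come from the trace above; the 0.2 improvement threshold is an arbitrary illustration):

```python
# Per-epoch (epoch, loss, accuracy) values from the trace above
trace = [(1, 1.2, 0.45), (2, 0.9, 0.60), (3, 0.7, 0.72),
         (4, 0.5, 0.80), (5, 0.4, 0.85)]

def converged(losses, tol=0.2):
    # Flag convergence once the loss improvement drops to `tol` or less
    return len(losses) >= 2 and (losses[-2] - losses[-1]) <= tol

losses = []
for epoch, loss, acc in trace:
    losses.append(loss)
    status = "converging" if converged(losses) else "learning"
    print(f"epoch {epoch}: loss={loss:.1f} acc={acc:.2f} ({status})")
```

In practice the threshold (and whether to stop early on it) depends on the task; this just shows the shape of the bookkeeping.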
Prediction Trace - 6 Layers
Layer 1: Input message
Layer 2: Retrieve conversation history
Layer 3: Combine input and history
Layer 4: Encode context
Layer 5: Generate response
Layer 6: Update conversation history
Model Quiz - 3 Questions
Test your understanding
What happens to the conversation history after a new response is generated?
A. It stays the same without changes.
B. It is updated by adding new messages and removing the oldest ones.
C. It is deleted completely.
D. It is converted into images.
Key Insight
Remembering past conversation messages helps the AI give relevant and coherent answers. The model learns to store and update history, improving response quality over time.