Agentic AI · ~20 mins

Short-term memory (conversation context) in Agentic AI - Practice Problems & Coding Challenges

Challenge - 5 Problems
🧠 Conceptual · intermediate
Understanding Short-term Memory in Conversation Context
Which of the following best describes the role of short-term memory in conversation context for an AI agent?
A. It temporarily holds recent conversation details to maintain context during the interaction.
B. It stores all past conversations permanently for future reference.
C. It deletes all conversation data immediately after each user input.
D. It only stores user preferences, without any conversation details.
💡 Hint
Think about how an AI keeps track of what was just said to respond properly.
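The hint above can be made concrete with a minimal sketch of a sliding conversation window: only the most recent turns are kept, so the agent can respond in context. The window size and message format here are assumptions for illustration.

```python
WINDOW = 3  # number of recent messages kept (an arbitrary choice for this sketch)

history = []

def remember(role, text):
    """Append a message, then trim to the most recent WINDOW entries."""
    history.append((role, text))
    del history[:-WINDOW]  # older turns fall out of short-term memory

remember("user", "Hi")
remember("assistant", "Hello!")
remember("user", "What is AI?")
remember("assistant", "AI is ...")
print(history)  # only the last three turns survive; "Hi" has been forgotten
```

This is the essence of option A above: context is held temporarily, not stored forever or wiped after every input.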
Model Choice · intermediate
Choosing a Model for Short-term Memory in Dialogue
Which model architecture is best suited for capturing short-term memory in conversation context?
A. K-Nearest Neighbors (KNN) for classification
B. Recurrent Neural Network (RNN) with gated units like LSTM or GRU
C. Convolutional Neural Network (CNN) for image recognition
D. Feedforward Neural Network without recurrence
💡 Hint
Consider models designed to handle sequences and remember recent inputs.
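As a rough illustration of why gated recurrent units carry recent inputs forward, here is a toy, simplified GRU-style update in NumPy. The reset gate is omitted, and all sizes and random weights are assumptions for the sketch, not a real trained model.

```python
import numpy as np

rng = np.random.default_rng(0)
hidden, x_dim = 4, 3  # hidden-state and input sizes (arbitrary assumptions)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(h_prev, x, Wz, Wh):
    """One simplified GRU step: the update gate z blends old state with new."""
    z = sigmoid(Wz @ np.concatenate([h_prev, x]))       # update gate in (0, 1)
    h_cand = np.tanh(Wh @ np.concatenate([h_prev, x]))  # candidate new state
    return (1 - z) * h_prev + z * h_cand                # gated blend keeps recent memory

Wz = rng.standard_normal((hidden, hidden + x_dim))
Wh = rng.standard_normal((hidden, hidden + x_dim))

h = np.zeros(hidden)
for _ in range(5):  # feed a short input sequence, carrying state across steps
    h = gru_step(h, rng.standard_normal(x_dim), Wz, Wh)
print(h.shape)  # (4,)
```

The hidden state `h` is threaded through the sequence, which is exactly what KNN, CNNs, and plain feedforward nets lack.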
Metrics · advanced
Evaluating Short-term Memory Effectiveness
Which metric would best help evaluate how well an AI model retains short-term conversation context?
A. BLEU score, measuring translation quality
B. Perplexity, measuring prediction uncertainty on next tokens
C. Recall, measuring how many relevant recent context tokens are used in the response
D. Mean Squared Error, measuring regression accuracy
💡 Hint
Think about measuring how much recent relevant information is captured in the output.
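A recall-style check like the one the hint describes can be sketched as a simple token-overlap score. This is a hypothetical, deliberately simplified metric (whitespace tokenization, no stopword handling); real context-retention evaluations are more nuanced.

```python
def context_recall(recent_context: str, response: str) -> float:
    """Fraction of recent-context tokens that reappear in the response."""
    context_tokens = set(recent_context.lower().split())
    response_tokens = set(response.lower().split())
    if not context_tokens:
        return 0.0
    return len(context_tokens & response_tokens) / len(context_tokens)

score = context_recall("what is agentic ai", "agentic ai is an ai that acts")
print(score)  # 0.75 -- 3 of the 4 context tokens appear in the response
```

A higher score suggests the model is actually using what was just said, which is the intuition behind option C.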
🔧 Debug · advanced
Debugging Short-term Memory Loss in Conversation
An AI agent forgets the last user question immediately after responding. Which of the following is the most likely cause?
A. The AI is using a large context window.
B. The training data is too large.
C. The model uses attention mechanisms correctly.
D. The short-term memory buffer is cleared after each response.
💡 Hint
Consider what happens if the memory holding recent conversation is reset too soon.
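The failure mode in the hint can be reproduced in a few lines. The `respond` function here is a hypothetical stand-in that can only use what the buffer remembers.

```python
from collections import deque

def respond(memory):
    """Hypothetical responder: it can only see what is in the buffer."""
    return list(memory)

memory = deque(maxlen=3)
memory.append("What is AI?")
respond(memory)             # answers in context...
memory.clear()              # <-- bug: short-term buffer wiped after every response
memory.append("And agents?")
print(respond(memory))      # the earlier question is gone
```

Removing the premature `clear()` (and letting `maxlen` evict old turns naturally) fixes the forgetting.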
Predict Output · expert
Output of Short-term Memory Simulation Code
What is the output of this Python code simulating short-term memory with a fixed-size queue?
from collections import deque

memory = deque(maxlen=3)  # fixed-size buffer: keeps only the 3 most recent inputs
inputs = ['Hi', 'How are you?', 'What is AI?', 'Tell me a joke']
outputs = []
for inp in inputs:
    memory.append(inp)            # appending past maxlen evicts the oldest item
    outputs.append(list(memory))  # snapshot the buffer after each turn
print(outputs)
A. [['Hi'], ['Hi', 'How are you?'], ['Hi', 'How are you?', 'What is AI?'], ['How are you?', 'What is AI?', 'Tell me a joke']]
B. [['Hi'], ['How are you?'], ['What is AI?'], ['Tell me a joke']]
C. [['Hi'], ['Hi', 'How are you?'], ['Hi', 'How are you?', 'What is AI?'], ['Hi', 'How are you?', 'What is AI?', 'Tell me a joke']]
D. [['Hi'], ['Hi', 'How are you?'], ['How are you?', 'What is AI?'], ['What is AI?', 'Tell me a joke']]
💡 Hint
Remember that deque with maxlen drops oldest items when full.