DynamoDB · ~20 mins

Stream processing patterns in DynamoDB - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️
Stream Processing Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
🔍 Query Result
intermediate
2:00 remaining
Identify the correct DynamoDB Stream event type for a new item insertion
When a new item is added to a DynamoDB table with streams enabled, which event type will appear in the stream record?
A. MODIFY
B. INSERT
C. REMOVE
D. UPDATE
Attempts: 2 left
💡 Hint
Think about what event type corresponds to adding new data.
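For reference, a DynamoDB Stream record carries its event type in the top-level `eventName` field. Below is a minimal sketch of what a record for a newly inserted item looks like; the key and attribute values are invented for illustration.

```python
# Illustrative shape of a DynamoDB Stream record for a new item.
# Field names follow the stream record format; the values are made up.
sample_record = {
    "eventName": "INSERT",           # one of INSERT, MODIFY, REMOVE
    "eventSource": "aws:dynamodb",
    "dynamodb": {
        "Keys": {"pk": {"S": "user#123"}},
        "NewImage": {"pk": {"S": "user#123"}, "name": {"S": "Ada"}},
        "StreamViewType": "NEW_AND_OLD_IMAGES",
    },
}

def event_type(record):
    """Return the stream event type for a record."""
    return record["eventName"]

print(event_type(sample_record))  # INSERT
```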
🧠 Conceptual
intermediate
2:00 remaining
Understanding the purpose of DynamoDB Streams in event-driven architectures
Why are DynamoDB Streams commonly used in event-driven architectures?
A. To capture and react to data changes in real time for downstream processing
B. To store backups of the entire DynamoDB table periodically
C. To improve query performance by caching data
D. To encrypt data at rest automatically
Attempts: 2 left
💡 Hint
Think about what streams provide besides just storing data.
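To make the event-driven angle concrete, here is a minimal sketch of a Lambda handler that reacts to stream records as changes happen. The downstream action names (`index_new_item`, etc.) are hypothetical placeholders for whatever your pipeline does.

```python
# Sketch of a Lambda handler dispatching DynamoDB Stream records
# to downstream actions by event type (action names are hypothetical).
def handler(event, context=None):
    """Return a list of (action, keys) pairs, one per stream record."""
    actions = []
    for record in event.get("Records", []):
        keys = record["dynamodb"]["Keys"]
        if record["eventName"] == "INSERT":
            actions.append(("index_new_item", keys))
        elif record["eventName"] == "MODIFY":
            actions.append(("refresh_index", keys))
        elif record["eventName"] == "REMOVE":
            actions.append(("drop_from_index", keys))
    return actions

# A tiny hand-built event, mimicking the batch shape Lambda receives.
sample_event = {"Records": [
    {"eventName": "INSERT", "dynamodb": {"Keys": {"pk": {"S": "user#1"}}}},
    {"eventName": "REMOVE", "dynamodb": {"Keys": {"pk": {"S": "user#2"}}}},
]}
print(handler(sample_event))
```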
📝 Syntax
advanced
2:00 remaining
Identify the correct AWS Lambda event source mapping filter for DynamoDB Streams
Which JSON filter pattern correctly filters DynamoDB Stream events to only process INSERT events in an AWS Lambda event source mapping?
A. {"eventName": ["INSERT"]}
B. {"eventName": "INSERT"}
C. {"Records": [{"eventName": "INSERT"}]}
D. {"eventName": {"equals": "INSERT"}}
Attempts: 2 left
💡 Hint
AWS Lambda event source filters expect arrays for matching values.
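A detail that trips people up: in the event source mapping API, the filter pattern is supplied as a JSON-encoded *string* inside a `FilterCriteria` structure. A sketch of building that structure:

```python
import json

# Sketch of the FilterCriteria structure for a Lambda event source mapping.
# Match values must be arrays, so {"eventName": ["INSERT"]} is the valid form.
insert_only = {"eventName": ["INSERT"]}
filter_criteria = {"Filters": [{"Pattern": json.dumps(insert_only)}]}

print(filter_criteria["Filters"][0]["Pattern"])
```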
⚙️ Optimization
advanced
2:00 remaining
Optimize processing of high-volume DynamoDB Streams to avoid Lambda throttling
You have a DynamoDB table with a high write rate generating many stream records. Which approach best helps avoid AWS Lambda throttling when processing these streams?
A. Increase the number of shards by splitting the table into multiple partitions
B. Set the Lambda batch size to 1 and increase the number of concurrent Lambda executions
C. Use DynamoDB Streams with enhanced fan-out to allow multiple consumers to process records independently
D. Disable DynamoDB Streams and poll the table directly for changes
Attempts: 2 left
💡 Hint
Consider how to scale stream consumers efficiently.
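Beyond the options above, the knobs most commonly tuned on the event source mapping itself are batch size and per-shard parallelism. A hedged sketch of those settings as a plain dict (the parameter names follow the AWS Lambda API; the values are illustrative, not recommendations):

```python
# Illustrative event source mapping settings for scaling DynamoDB
# Stream processing (values are examples, not recommendations).
scaling_settings = {
    "BatchSize": 100,                    # records per invocation; larger batches mean fewer invokes
    "ParallelizationFactor": 10,         # concurrent batches per shard (1-10)
    "MaximumRetryAttempts": 3,           # cap retries so a bad record can't block a shard forever
    "BisectBatchOnFunctionError": True,  # split a failing batch to isolate the offending record
}
```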
🔧 Debug
expert
2:00 remaining
Diagnose why a Lambda function triggered by DynamoDB Streams processes duplicate records
A Lambda function triggered by DynamoDB Streams sometimes processes the same record multiple times. What is the most likely cause?
A. The Lambda function is configured with an incorrect IAM role
B. DynamoDB Streams duplicates records by design to ensure delivery
C. The table's stream view type is set to KEYS_ONLY instead of NEW_IMAGE
D. The Lambda function does not checkpoint processed records, causing retries on failure
Attempts: 2 left
💡 Hint
Think about how Lambda handles stream record processing and retries.
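Because Lambda retries an entire batch when the function errors, a record can be delivered more than once, so handlers should be idempotent. A minimal sketch of deduplicating by the record's `SequenceNumber` (here tracked in an in-memory set; a real system would use a durable store such as a table with a conditional write):

```python
# Sketch of idempotent stream processing: skip records whose
# SequenceNumber has already been handled (in-memory for illustration).
processed = set()

def process_once(record, side_effects):
    """Apply a record's side effect exactly once; return False on a duplicate."""
    seq = record["dynamodb"]["SequenceNumber"]
    if seq in processed:
        return False                 # duplicate delivery from a retried batch
    side_effects.append(record["eventName"])
    processed.add(seq)
    return True

effects = []
record = {"eventName": "INSERT", "dynamodb": {"SequenceNumber": "111"}}
process_once(record, effects)        # first delivery: applied
process_once(record, effects)        # retried delivery: ignored
print(effects)  # ['INSERT']
```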