Challenge - 5 Problems
Stream Processing Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ query_result
Difficulty: Intermediate
Identify the correct DynamoDB Stream event type for a new item insertion
When a new item is added to a DynamoDB table with streams enabled, which event type will appear in the stream record?
💡 Hint
Think about what event type corresponds to adding new data.
Explanation
The INSERT event type is generated when a new item is added to the table. MODIFY is for updates, REMOVE is for deletions, and UPDATE is not a valid DynamoDB stream event type.
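The three event types can be seen in a stream-processing handler. Below is a minimal sketch (the function name `handle_stream` is illustrative, not part of any AWS API) that tallies records by `eventName`, the field where INSERT, MODIFY, or REMOVE appears in each stream record:

```python
def handle_stream(event):
    """Count DynamoDB stream records by event type."""
    counts = {"INSERT": 0, "MODIFY": 0, "REMOVE": 0}
    for record in event.get("Records", []):
        # eventName is always one of INSERT, MODIFY, or REMOVE
        counts[record["eventName"]] += 1
    return counts
```

A new-item write produces a record whose `eventName` is `"INSERT"`; there is no `"UPDATE"` value.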
🧠 Conceptual
Difficulty: Intermediate
Understanding the purpose of DynamoDB Streams in event-driven architectures
Why are DynamoDB Streams commonly used in event-driven architectures?
💡 Hint
Think about what streams provide besides just storing data.
Explanation
DynamoDB Streams capture data modification events in real-time, enabling other services to react to changes immediately. They are not used for backups, caching, or encryption.
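The event-driven pattern amounts to translating each change record into an action for a downstream service. A sketch, with the hypothetical helper `to_notifications` standing in for whatever a consumer would actually do (publish to a queue, update an index, and so on):

```python
def to_notifications(event):
    """Turn stream records into messages a downstream consumer could act on."""
    messages = []
    for record in event.get("Records", []):
        messages.append({
            "action": record["eventName"],          # what changed
            "keys": record["dynamodb"]["Keys"],     # which item changed
        })
    return messages
```

The table's writers need no knowledge of the consumers; the stream decouples them.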
📝 Syntax
Difficulty: Advanced
Identify the correct AWS Lambda event source mapping filter for DynamoDB Streams
Which JSON filter pattern correctly filters DynamoDB Stream events to only process INSERT events in an AWS Lambda event source mapping?
💡 Hint
AWS Lambda event source filters expect arrays for matching values.
Explanation
The correct filter pattern uses an array for the eventName key to match INSERT events. The incorrect options each fail for a different reason: one uses a plain string instead of an array, one applies the pattern at the wrong level (filters are evaluated against each individual stream record, so no Records wrapper is involved), and one uses otherwise invalid syntax.
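For reference, this is the shape the pattern takes when building the `FilterCriteria` parameter of Lambda's event source mapping API (e.g. via boto3's `create_event_source_mapping`), where each `Pattern` is a JSON-encoded string:

```python
import json

# The pattern itself: eventName must be an ARRAY of values to match.
insert_only = {"eventName": ["INSERT"]}

# Shape of the FilterCriteria parameter; Pattern is a JSON string.
filter_criteria = {"Filters": [{"Pattern": json.dumps(insert_only)}]}
```

Passing `{"eventName": "INSERT"}` (a bare string) would not match anything, which is the trap in this question.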
❓ optimization
Difficulty: Advanced
Optimize processing of high-volume DynamoDB Streams to avoid Lambda throttling
You have a DynamoDB table with a high write rate generating many stream records. Which approach best helps avoid AWS Lambda throttling when processing these streams?
💡 Hint
Consider how to scale stream consumers efficiently.
Explanation
Enhanced fan-out (available when the table's changes are routed through Kinesis Data Streams for DynamoDB) gives each consumer dedicated read throughput, so multiple Lambda functions or other consumers can receive records without contending for reads, reducing throttling. You cannot directly increase stream shards by splitting the table; DynamoDB manages shards automatically. Setting batch size to 1 increases invocation overhead rather than reducing throttling. Disabling streams removes the event-driven benefit entirely.
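A related knob on the Lambda side is the event source mapping's `ParallelizationFactor`, which allows up to 10 concurrent batches per shard. The settings below are illustrative values for the kind of dictionary you would pass to `update_event_source_mapping`, not a recommendation for any specific workload:

```python
# Illustrative event source mapping settings for a high-volume stream:
# bigger batches, short batching window, and more concurrency per shard.
mapping_settings = {
    "BatchSize": 100,                        # records per invocation
    "ParallelizationFactor": 10,             # concurrent batches per shard (1-10)
    "MaximumBatchingWindowInSeconds": 1,     # wait briefly to fill batches
}
```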
🔧 Debug
Difficulty: Expert
Diagnose why a Lambda function triggered by DynamoDB Streams processes duplicate records
A Lambda function triggered by DynamoDB Streams sometimes processes the same record multiple times. What is the most likely cause?
💡 Hint
Think about how Lambda handles stream record processing and retries.
Explanation
Lambda delivers stream records at least once: when the function returns an error or times out, Lambda retries the entire batch, so records that were processed before the failure are delivered again, producing duplicates. DynamoDB Streams does not duplicate records arbitrarily. Stream view type affects what data each record contains, not how often it is delivered. IAM role issues cause permission errors, not duplicates.
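Because delivery is at-least-once, handlers should be idempotent. A minimal sketch using the record's sequence number as a deduplication key; the in-memory set here stands in for a durable store (in production this would typically be a conditional write to a table):

```python
processed = set()  # stand-in for a durable deduplication store

def process_once(record):
    """Apply a stream record's side effects at most once."""
    seq = record["dynamodb"]["SequenceNumber"]  # unique per record within a shard
    if seq in processed:
        return False  # duplicate delivery: skip
    processed.add(seq)
    # ... real side effects (writes, notifications) would go here ...
    return True
```

With this guard, a retried batch simply skips the records it already handled.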