How to Process Stream Records in DynamoDB Efficiently
To process DynamoDB stream records, enable streams on your table and use an AWS Lambda function or another consumer to read the stream events. Each stream record contains information about an item change, which you can handle in your code to react to inserts, updates, or deletes.
Syntax
To process DynamoDB stream records, you typically use an AWS Lambda function triggered by the stream. The Lambda handler receives an event object containing an array of Records. Each record has details about the change, such as eventName (INSERT, MODIFY, REMOVE) and dynamodb data with the old and new images.
Key parts of the event record:
- eventName: Type of change (INSERT, MODIFY, REMOVE)
- dynamodb.Keys: The primary key of the changed item
- dynamodb.NewImage: The new state of the item (for INSERT and MODIFY)
- dynamodb.OldImage: The old state of the item (for MODIFY and REMOVE)
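To make those fields concrete, here is a hand-written sample of what a single stream record might look like (values are invented, not captured from a real stream). Note that the images use DynamoDB's typed attribute-value format, e.g. `{ S: '...' }` for strings and `{ N: '...' }` for numbers:

```javascript
// Hypothetical sample of one DynamoDB stream record (shape only, values invented)
const sampleRecord = {
  eventName: 'MODIFY',
  dynamodb: {
    Keys: { id: { S: '123' } },                                        // primary key of the changed item
    NewImage: { id: { S: '123' }, name: { S: 'Alice' }, age: { N: '31' } }, // state after the change
    OldImage: { id: { S: '123' }, name: { S: 'Alice' }, age: { N: '30' } }, // state before the change
  },
};

console.log(sampleRecord.eventName);          // 'MODIFY'
console.log(sampleRecord.dynamodb.Keys.id.S); // '123'
```

Note that numbers arrive as strings inside an `N` wrapper; unmarshalling (covered below) converts them back to plain values.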
javascript
exports.handler = async (event) => {
  for (const record of event.Records) {
    console.log('Event Name:', record.eventName);
    console.log('Keys:', JSON.stringify(record.dynamodb.Keys));
    if (record.eventName === 'INSERT' || record.eventName === 'MODIFY') {
      console.log('New Image:', JSON.stringify(record.dynamodb.NewImage));
    }
    if (record.eventName === 'MODIFY' || record.eventName === 'REMOVE') {
      console.log('Old Image:', JSON.stringify(record.dynamodb.OldImage));
    }
  }
};
Example
This example shows a simple AWS Lambda function that processes DynamoDB stream records. It logs the event type and the changed item data. This helps you react to database changes in real time.
javascript
const AWS = require('aws-sdk'); // AWS SDK for JavaScript v2

exports.handler = async (event) => {
  for (const record of event.Records) {
    console.log(`Processing record with event: ${record.eventName}`);
    if (record.eventName === 'INSERT') {
      // Converter.unmarshall turns DynamoDB's typed attribute values into a plain object
      const newItem = AWS.DynamoDB.Converter.unmarshall(record.dynamodb.NewImage);
      console.log('New item added:', newItem);
    } else if (record.eventName === 'MODIFY') {
      const newItem = AWS.DynamoDB.Converter.unmarshall(record.dynamodb.NewImage);
      const oldItem = AWS.DynamoDB.Converter.unmarshall(record.dynamodb.OldImage);
      console.log('Item modified from:', oldItem, 'to:', newItem);
    } else if (record.eventName === 'REMOVE') {
      const oldItem = AWS.DynamoDB.Converter.unmarshall(record.dynamodb.OldImage);
      console.log('Item removed:', oldItem);
    }
  }
};
Output
Processing record with event: INSERT
New item added: { id: '123', name: 'Alice', age: 30 }
Processing record with event: MODIFY
Item modified from: { id: '123', name: 'Alice', age: 30 } to: { id: '123', name: 'Alice', age: 31 }
Processing record with event: REMOVE
Item removed: { id: '123', name: 'Alice', age: 31 }
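Because Lambda hands the stream batch to your handler as a plain event object, the dispatch logic can be exercised locally without AWS by calling a handler with a hand-written synthetic event. This is a sketch: the handler below is a trimmed stand-in that only records which event types it saw, and the event shape is invented for the test:

```javascript
// Trimmed stand-in handler: records the eventName of each stream record it sees
const handler = async (event) => {
  const seen = [];
  for (const record of event.Records) {
    seen.push(record.eventName);
  }
  return seen;
};

// Hand-written synthetic event covering all three change types
const syntheticEvent = {
  Records: [
    { eventName: 'INSERT', dynamodb: { Keys: { id: { S: '1' } }, NewImage: { id: { S: '1' } } } },
    { eventName: 'MODIFY', dynamodb: { Keys: { id: { S: '1' } }, NewImage: {}, OldImage: {} } },
    { eventName: 'REMOVE', dynamodb: { Keys: { id: { S: '1' } }, OldImage: { id: { S: '1' } } } },
  ],
};

handler(syntheticEvent).then((seen) => console.log(seen)); // [ 'INSERT', 'MODIFY', 'REMOVE' ]
```

Testing with synthetic events like this is an easy way to cover all three event types before deploying, which the Common Pitfalls section below recommends.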
Common Pitfalls
Some common mistakes when processing DynamoDB stream records include:
- Not enabling streams on the DynamoDB table or choosing the wrong stream view type.
- Assuming all records have NewImage or OldImage without checking the event type.
- Not handling batch processing properly, which can cause missed records.
- Failing to handle errors inside the Lambda, which causes the whole batch to be retried and can lead to duplicate processing or, eventually, data loss.
Always check eventName before accessing images and test your function with different event types.
javascript
/* Wrong way: Assuming NewImage always exists */
exports.handler = async (event) => {
  for (const record of event.Records) {
    // On REMOVE events NewImage is absent, so this logs undefined,
    // and any code that reads properties of the missing image will throw
    console.log(record.dynamodb.NewImage);
  }
};

/* Right way: Check eventName before accessing images */
exports.handler = async (event) => {
  for (const record of event.Records) {
    if (record.eventName === 'INSERT' || record.eventName === 'MODIFY') {
      console.log(record.dynamodb.NewImage);
    } else if (record.eventName === 'REMOVE') {
      console.log(record.dynamodb.OldImage);
    }
  }
};
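For the batch-processing and error-handling pitfalls, Lambda can retry only the failed records instead of the whole batch if the event source mapping is configured with the ReportBatchItemFailures function response type and the handler returns the sequence numbers of the records that failed. The following is a sketch of that pattern; `processRecord` is a hypothetical stand-in for your real per-record logic:

```javascript
// Hypothetical per-record processing function; throws on a bad record
const processRecord = (record) => {
  if (!record.eventName) throw new Error('malformed record');
};

// Sketch of partial-batch failure reporting. Requires the event source
// mapping to be configured with the ReportBatchItemFailures response type.
const handler = async (event) => {
  const batchItemFailures = [];
  for (const record of event.Records) {
    try {
      processRecord(record);
    } catch (err) {
      // Report this record's sequence number so only it (and later records
      // in the shard) are retried, not the whole batch
      batchItemFailures.push({ itemIdentifier: record.dynamodb.SequenceNumber });
    }
  }
  return { batchItemFailures };
};

exports.handler = handler; // Lambda entry point
```

Returning an empty `batchItemFailures` array tells Lambda the whole batch succeeded; swallowing errors without reporting them has the same effect, which is how records get silently lost.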
Quick Reference
| Concept | Description |
|---|---|
| Stream Enabled | Must enable DynamoDB Streams on the table with a view type (e.g., NEW_AND_OLD_IMAGES) |
| Event Types | INSERT, MODIFY, REMOVE indicate the type of change |
| NewImage | Contains the new item state for INSERT and MODIFY events |
| OldImage | Contains the old item state for MODIFY and REMOVE events |
| AWS Lambda | Common consumer to process stream records automatically |
| Error Handling | Catch errors per record (or report batch item failures) so one bad record does not force the whole batch to be retried |
Key Takeaways
Enable DynamoDB Streams on your table with the correct view type before processing records.
Use AWS Lambda triggered by the stream to process records in real time.
Check the event type before accessing NewImage or OldImage to avoid errors.
Handle batch records and errors properly to ensure reliable processing.
Use AWS SDK's Converter to transform DynamoDB data to plain JavaScript objects.
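On that last point: in AWS SDK for JavaScript v2 the converter is `AWS.DynamoDB.Converter.unmarshall`; in v3 the equivalent is the `unmarshall` function from `@aws-sdk/util-dynamodb`. Conceptually, unmarshalling maps DynamoDB's typed attribute values to plain values, roughly like this simplified sketch (it handles only S, N, and BOOL; the real SDK helpers cover every DynamoDB type, including lists, maps, and sets):

```javascript
// Simplified sketch of what unmarshalling does; use the real SDK helpers
// (AWS.DynamoDB.Converter.unmarshall in v2, unmarshall from
// @aws-sdk/util-dynamodb in v3) in production code.
const simpleUnmarshall = (image) => {
  const out = {};
  for (const [name, attr] of Object.entries(image)) {
    if ('S' in attr) out[name] = attr.S;              // string
    else if ('N' in attr) out[name] = Number(attr.N); // numbers arrive as strings
    else if ('BOOL' in attr) out[name] = attr.BOOL;   // boolean
  }
  return out;
};

console.log(simpleUnmarshall({ id: { S: '123' }, age: { N: '30' } }));
// { id: '123', age: 30 }
```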