DynamoDB · Concept · Beginner · 4 min read

What is DynamoDB Streams: Overview and Usage

DynamoDB Streams is a feature that captures a time-ordered sequence of item-level changes in a DynamoDB table. It lets you track inserts, updates, and deletes in real time, enabling applications to react to data changes asynchronously.
⚙️ How It Works

Imagine you have a notebook where every time you write, erase, or change a note, a copy of that action is automatically recorded on a separate page. DynamoDB Streams works similarly by recording every change made to items in a DynamoDB table. These changes are stored as a stream of events in the order they happened.

Each event in the stream contains information about the type of change (insert, update, or delete) and the data before and/or after the change. This stream is like a timeline that other programs can read to know exactly what changed and when.

This mechanism allows developers to build applications that respond to data changes without constantly checking the database. Instead, they listen to the stream and act only when something changes.
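In practice, consumers often receive stream records in batches (for example, via an AWS Lambda trigger) and branch on the event type. The sketch below shows that pattern with a small, illustrative helper (the function name and sample batch are assumptions, but the record fields match the DynamoDB Streams event format):

```javascript
// Sketch: classify a batch of DynamoDB Streams records by event type.
// Each record's eventName is INSERT, MODIFY, or REMOVE.
function summarizeRecords(records) {
  const summary = { INSERT: 0, MODIFY: 0, REMOVE: 0 };
  for (const record of records) {
    if (record.eventName in summary) summary[record.eventName] += 1;
  }
  return summary;
}

// Example batch, shaped like the records a stream consumer receives:
const batch = [
  { eventName: "INSERT", dynamodb: { NewImage: { id: { S: "1" } } } },
  { eventName: "MODIFY", dynamodb: { OldImage: { id: { S: "1" } }, NewImage: { id: { S: "1" } } } },
];
console.log(summarizeRecords(batch)); // { INSERT: 1, MODIFY: 1, REMOVE: 0 }
```

A real handler would replace the counting with business logic, such as updating a search index on INSERT or invalidating a cache on MODIFY.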

💻 Example

This example shows how to enable DynamoDB Streams on a table and read records from the stream using the AWS SDK for JavaScript (v3).

javascript
import { DynamoDBClient, UpdateTableCommand, DescribeTableCommand } from "@aws-sdk/client-dynamodb";
import { DynamoDBStreamsClient, GetRecordsCommand, GetShardIteratorCommand, DescribeStreamCommand } from "@aws-sdk/client-dynamodb-streams";

const tableName = "MyTable";

// Enable streams on the table (UpdateTable fails if streams are already enabled with the same settings)
const ddbClient = new DynamoDBClient({ region: "us-east-1" });
await ddbClient.send(new UpdateTableCommand({
  TableName: tableName,
  StreamSpecification: {
    StreamEnabled: true,
    StreamViewType: "NEW_AND_OLD_IMAGES"
  }
}));

// Get stream ARN
const tableDescription = await ddbClient.send(new DescribeTableCommand({ TableName: tableName }));
const streamArn = tableDescription.Table.LatestStreamArn;

// Create Streams client
const streamsClient = new DynamoDBStreamsClient({ region: "us-east-1" });

// Describe the stream to list its shards (only the first shard is read here, for brevity)
const streamDesc = await streamsClient.send(new DescribeStreamCommand({ StreamArn: streamArn }));
const shardId = streamDesc.StreamDescription.Shards[0].ShardId;

// Get a shard iterator starting at the oldest available record (TRIM_HORIZON)
const shardIteratorResponse = await streamsClient.send(new GetShardIteratorCommand({
  StreamArn: streamArn,
  ShardId: shardId,
  ShardIteratorType: "TRIM_HORIZON"
}));
const shardIterator = shardIteratorResponse.ShardIterator;

// Read records from the stream
const recordsResponse = await streamsClient.send(new GetRecordsCommand({ ShardIterator: shardIterator }));
console.log(recordsResponse.Records);
Output
[
  {
    eventID: "1",
    eventName: "INSERT",
    dynamodb: {
      Keys: { id: { S: "123" } },
      NewImage: { id: { S: "123" }, name: { S: "Alice" } },
      SequenceNumber: "111",
      SizeBytes: 26,
      StreamViewType: "NEW_AND_OLD_IMAGES"
    }
  }
]
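A single GetRecords call returns only one page; the response also carries a NextShardIterator for continued reading. A simple polling loop could look like the sketch below (the helper name and the in-memory stub are illustrative; a production consumer would also track newly created shards, typically by using a Lambda trigger or the DynamoDB Streams Kinesis adapter instead of hand-rolled polling):

```javascript
// Sketch: drain a shard by following NextShardIterator until the shard closes.
// "client" is anything with a send() method (e.g. a DynamoDBStreamsClient);
// "makeCommand" builds the request for a given iterator (e.g. a GetRecordsCommand).
async function pollShard(client, makeCommand, startIterator, onRecords) {
  let iterator = startIterator;
  while (iterator) {
    const resp = await client.send(makeCommand(iterator));
    if (resp.Records.length > 0) onRecords(resp.Records);
    iterator = resp.NextShardIterator; // undefined once the shard is closed
  }
}

// Demonstration with an in-memory stub standing in for the Streams client:
const pages = [
  { Records: [{ eventName: "INSERT" }], NextShardIterator: "it-2" },
  { Records: [{ eventName: "REMOVE" }], NextShardIterator: undefined },
];
const stubClient = { send: async () => pages.shift() };
const seen = [];
pollShard(stubClient, (it) => it, "it-1", (recs) => seen.push(...recs))
  .then(() => console.log(seen.length)); // 2
```

With the real client from the example above, `makeCommand` would be `(it) => new GetRecordsCommand({ ShardIterator: it })`.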
🎯 When to Use

DynamoDB Streams is useful when you want to react to changes in your database without slowing down your main application. For example:

  • Building real-time analytics dashboards that update when data changes.
  • Synchronizing data between DynamoDB and other systems like search indexes or caches.
  • Triggering workflows or notifications when specific data changes occur.
  • Implementing audit logs to track who changed what and when.

It is ideal when you need an event-driven architecture that responds quickly and scales well.
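The audit-log case is a good illustration: because a stream record can carry both the old and new images, one record is enough to build a complete "who changed what" entry. A minimal sketch (the entry field names like "action" and "before" are illustrative, not an AWS format):

```javascript
// Sketch: turn one DynamoDB Streams record into a flat audit-log entry.
function toAuditEntry(record) {
  return {
    action: record.eventName,            // INSERT | MODIFY | REMOVE
    key: record.dynamodb.Keys,           // the item's primary key
    before: record.dynamodb.OldImage ?? null, // absent for INSERT
    after: record.dynamodb.NewImage ?? null,  // absent for REMOVE
    sequence: record.dynamodb.SequenceNumber, // orders entries within a shard
  };
}

const entry = toAuditEntry({
  eventName: "MODIFY",
  dynamodb: {
    Keys: { id: { S: "123" } },
    OldImage: { id: { S: "123" }, name: { S: "Alice" } },
    NewImage: { id: { S: "123" }, name: { S: "Alicia" } },
    SequenceNumber: "222",
  },
});
console.log(entry.action); // prints: MODIFY
```

Note that both images are only present when the table's stream view type is NEW_AND_OLD_IMAGES, as in the example earlier.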

Key Points

  • DynamoDB Streams captures item-level changes in a time-ordered sequence.
  • Streams can include old and new images of changed items.
  • Applications can read streams to react asynchronously to data changes.
  • Useful for real-time processing, replication, and auditing.
  • Stream records are retained for 24 hours; this window is fixed, not configurable.

Key Takeaways

DynamoDB Streams records every change to your table as a sequence of events.
You can use streams to build applications that react to data changes in real time.
Depending on the stream view type, records can include both the old and new versions of changed items for detailed tracking.
They are great for syncing data, triggering workflows, and auditing changes.
Stream data is available for 24 hours, so process it promptly.