TTL with Streams for archival in DynamoDB - Time & Space Complexity

Time Complexity: TTL with Streams for archival
O(n)
Understanding Time Complexity

When using TTL with Streams for archival in DynamoDB, it's important to understand how the processing time changes as data grows.

We want to know how the time to detect and archive expired items scales with the number of expired records.

Scenario Under Consideration

Analyze the time complexity of the following DynamoDB TTL and Stream processing snippet.


// DynamoDB table with TTL enabled; the stream invokes this Lambda
// when TTL deletes expired items.
const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');

const s3 = new S3Client({});

exports.handler = async (event) => {
  for (const record of event.Records) {
    // TTL deletions arrive as REMOVE events carrying the old item image.
    // (To distinguish TTL deletes from user deletes, you can also check
    // record.userIdentity.principalId === 'dynamodb.amazonaws.com'.)
    if (record.eventName === 'REMOVE') {
      await archiveItem(record.dynamodb.OldImage);
    }
  }
};

// Save the expired item to archival storage. S3 is used here as an
// example destination; the bucket name is an assumed environment variable.
async function archiveItem(item) {
  await s3.send(new PutObjectCommand({
    Bucket: process.env.ARCHIVE_BUCKET,
    Key: `archive/${Date.now()}-${Math.random().toString(36).slice(2)}.json`,
    Body: JSON.stringify(item),
  }));
}

This code listens for items removed by TTL expiration and archives each one individually.
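For context, TTL must be enabled on the table before expirations appear in the stream as REMOVE events. One way to do that is with the AWS CLI (the table name and the TTL attribute name `expiresAt` are illustrative assumptions):

```shell
# Enable TTL on the table; DynamoDB deletes items whose
# 'expiresAt' attribute (a Unix epoch timestamp in seconds) is in the past.
aws dynamodb update-time-to-live \
  --table-name MyTable \
  --time-to-live-specification "Enabled=true, AttributeName=expiresAt"
```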

Identify Repeating Operations

Identify the loops, recursion, or array traversals that repeat.

  • Primary operation: Looping over each expired record in the event stream.
  • How many times: Once per expired item in the batch received from the stream.
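The repeating operation above can be modeled as a pure function that counts how many archival calls a given stream batch would trigger (a simplified sketch; the real handler awaits `archiveItem` for each expired item, which we assume does constant work per item):

```javascript
// Count the archival operations a stream batch would trigger.
// Only REMOVE events (item deletions, including TTL deletions)
// lead to an archive call, so the work is proportional to the
// number of expired items in the batch.
function countArchiveCalls(records) {
  let calls = 0;
  for (const record of records) {
    if (record.eventName === 'REMOVE') {
      calls += 1; // one archiveItem invocation per expired item
    }
  }
  return calls;
}

// Example: a batch with 3 expired items and 1 unrelated update
const batch = [
  { eventName: 'REMOVE' },
  { eventName: 'MODIFY' },
  { eventName: 'REMOVE' },
  { eventName: 'REMOVE' },
];
// countArchiveCalls(batch) === 3
```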
How Execution Grows With Input

As the number of expired items in the stream batch grows, the processing time grows proportionally.

  Input Size (n)    Approx. Operations
  10                10 archive calls
  100               100 archive calls
  1,000             1,000 archive calls

Pattern observation: The time grows linearly with the number of expired items processed.

Final Time Complexity

Time Complexity: O(n)

This means the time to archive expired items grows directly in proportion to how many items expire at once.

Common Mistake

[X] Wrong: "Processing expired items is constant time regardless of how many expire."

[OK] Correct: Each expired item triggers a separate archival operation, so more expired items mean more work.

Interview Connect

Understanding how batch sizes affect processing time helps you design scalable data pipelines and handle real-world data flows confidently.

Self-Check

"What if we batch multiple expired items into a single archival call? How would the time complexity change?"