Scan with filter expressions in DynamoDB - Time & Space Complexity
When using DynamoDB's scan operation with a filter expression, the key question is: how does the cost of the scan grow as the table accumulates more items?
Analyze the time complexity of the following code snippet.
```javascript
const params = {
  TableName: "Products",
  FilterExpression: "Price > :minPrice",
  ExpressionAttributeValues: {
    ":minPrice": { N: "100" }
  }
};

const result = await dynamodb.scan(params).promise();
```
This code scans the entire "Products" table and returns only items where the Price is greater than 100.
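In practice a single scan call returns at most 1 MB of data, so reading the whole table means following `LastEvaluatedKey` across pages. Below is a minimal sketch of that pagination loop; `mockScan`, the page size, and the sample data are all illustrative stand-ins, not part of the original snippet or the real AWS client.

```javascript
// Sample data standing in for the Products table (illustrative only).
const items = Array.from({ length: 10 }, (_, i) => ({ Price: { N: String(i * 30) } }));

// Mock of dynamodb.scan(...).promise(): reads a page, applies the
// filter AFTER reading, and reports a LastEvaluatedKey if more remain.
async function mockScan(params) {
  const pageSize = 4; // real scans page by data size (1 MB), not item count
  const start = params.ExclusiveStartKey ?? 0;
  const page = items.slice(start, start + pageSize);
  const next = start + pageSize < items.length ? start + pageSize : undefined;
  return {
    Items: page.filter(it => Number(it.Price.N) > 100),
    ScannedCount: page.length,
    LastEvaluatedKey: next
  };
}

// Keep scanning until no LastEvaluatedKey is returned.
async function scanAll(params) {
  const results = [];
  let lastKey;
  do {
    const res = await mockScan({ ...params, ExclusiveStartKey: lastKey });
    results.push(...res.Items);
    lastKey = res.LastEvaluatedKey;
  } while (lastKey !== undefined);
  return results;
}
```

The loop shape is the same with the real SDK: pass the previous response's `LastEvaluatedKey` as `ExclusiveStartKey` until it comes back undefined.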
Identify the repeated operations: loops, recursion, or traversals.
- Primary operation: Scanning each item in the table to check the filter condition.
- How many times: Once for every item in the table, regardless of the filter.
As the number of items in the table grows, the scan checks each item once.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 item checks |
| 100 | 100 item checks |
| 1000 | 1000 item checks |
Pattern observation: The number of operations grows directly with the number of items.
Time Complexity: O(n)
This means the time to scan grows linearly as the table gets bigger.
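The table above can be reproduced with a small in-memory model of what scan does; the data and the `checks` counter are illustrative, not part of DynamoDB's API.

```javascript
// In-memory model of scan-with-filter: every item is examined once,
// no matter how many pass the filter.
function scanWithFilter(table, minPrice) {
  let checks = 0;
  const matches = [];
  for (const item of table) { // one check per item -> O(n)
    checks++;
    if (item.price > minPrice) matches.push(item);
  }
  return { matches, checks };
}

for (const n of [10, 100, 1000]) {
  const table = Array.from({ length: n }, (_, i) => ({ price: i }));
  const { checks } = scanWithFilter(table, 100);
  console.log(`n=${n}: ${checks} item checks`);
}
```

Whatever the filter threshold, `checks` always equals the table size: the work tracks n, not the number of matches.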
[X] Wrong: "The filter expression makes the scan only look at matching items, so it's fast even for big tables."
[OK] Correct: The scan still reads every item; the filter only removes items after reading them, so the work grows with table size.
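The scan response itself makes this visible: `ScannedCount` is the number of items DynamoDB evaluated, while `Count` is the number returned after filtering. A mock of that response shape (the sample data is made up) shows how the two can diverge:

```javascript
// ScannedCount counts items read; Count counts items surviving the filter.
function mockScanResponse(table, minPrice) {
  const passing = table.filter(item => item.price > minPrice);
  return {
    Count: passing.length,      // items returned to the client
    ScannedCount: table.length, // items actually read
    Items: passing
  };
}

const table = Array.from({ length: 1000 }, (_, i) => ({ price: i }));
const res = mockScanResponse(table, 995);
console.log(res.ScannedCount, res.Count); // 1000 read, only 4 returned
```

A large gap between `ScannedCount` and `Count` on a real scan is the telltale sign that you are paying for reads the filter then discards.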
Understanding how scan with filters works helps you explain performance trade-offs clearly, a useful skill when designing or troubleshooting databases.
"What if we replaced scan with a query using a key condition? How would the time complexity change?"
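As a starting point for that question: a query with a key condition jumps directly to matching items via the key, so its cost grows with the size of the result set rather than the size of the table. The sketch below assumes a hypothetical partition key named `Category`, since the real key schema of Products isn't shown in the original snippet.

```javascript
const queryParams = {
  TableName: "Products",
  // "Category" is an assumed partition key, used only for illustration.
  KeyConditionExpression: "Category = :cat",
  ExpressionAttributeValues: {
    ":cat": { S: "Electronics" }
  }
};
// const result = await dynamodb.query(queryParams).promise();
```

With a query, items outside the matching partition are never read at all, which is why it can stay fast as the table grows.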