# Filter Expressions in DynamoDB: Time Complexity
When you use filter expressions in DynamoDB, it's important to know how the time to get results changes as the table grows.
The goal here is to understand how filtering affects the work DynamoDB does behind the scenes.
Analyze the time complexity of the following DynamoDB scan with a filter expression.
```javascript
// Scan the entire "Products" table, keeping only items whose Price
// attribute is greater than 100. Note: a single scan call returns at
// most 1 MB of data; reading a larger table requires following
// LastEvaluatedKey across multiple calls.
const params = {
  TableName: "Products",
  FilterExpression: "Price > :minPrice",
  ExpressionAttributeValues: {
    ":minPrice": { N: "100" }
  }
};

const result = await dynamodb.scan(params).promise();
```
This code scans the entire "Products" table and filters for items whose `Price` is greater than 100.
Look for repeated work done during the scan and filtering.
- Primary operation: Reading each item in the table one by one.
- How many times: Once for every item in the table; a scan always reads all items, regardless of the filter.
As the number of items grows, the scan reads more items, checking each against the filter.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 item reads and filter checks |
| 100 | 100 item reads and filter checks |
| 1000 | 1000 item reads and filter checks |
Pattern observation: The work grows directly with the number of items in the table.
Time Complexity: O(n)
This means the time to complete the scan with filter grows linearly as the table size grows.
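The linear growth above can be sketched with a small simulation in plain JavaScript (no AWS SDK; `simulateScan` is a hypothetical helper that mimics a scan's read-then-filter behavior, not a real DynamoDB call):

```javascript
// Hypothetical in-memory model of a scan with a filter expression:
// every item is read first, then the filter is applied to each one.
function simulateScan(items, filter) {
  let reads = 0;
  const results = [];
  for (const item of items) {
    reads++;                          // one read per item, match or not
    if (filter(item)) results.push(item);
  }
  return { results, reads };
}

// 1000 items, only a handful match the filter.
const items = Array.from({ length: 1000 }, (_, i) => ({ Price: i }));
const { results, reads } = simulateScan(items, (item) => item.Price > 995);

console.log(reads);          // 1000: every item was read
console.log(results.length); // 4: only Price 996..999 matched
```

Even though only 4 items match, the read count is still 1000. The work is driven by the table size n, not by the number of matches.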
[X] Wrong: "Filter expressions make the scan only read matching items."
[OK] Correct: The scan reads every item first and applies the filter afterward, so all n items are processed (and consume read capacity) no matter how many match.
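You can see the read-everything behavior in the response itself: a scan response includes both `Count` (items that passed the filter) and `ScannedCount` (items actually read). Below is a sketch of a paginated scan that totals both; the `dynamodb` client is replaced by a tiny in-memory stub (an assumption for illustration) so the loop can be run end to end:

```javascript
// Stub client standing in for the AWS SDK: returns items in pages of
// 400, mimicking scan's Count / ScannedCount / LastEvaluatedKey shape.
const allItems = Array.from({ length: 1000 }, (_, i) => ({ Price: i }));
const dynamodb = {
  scan: (params) => {
    const start = params.ExclusiveStartKey ?? 0;
    const page = allItems.slice(start, start + 400);
    const matched = page.filter((item) => item.Price > 100);
    return {
      promise: async () => ({
        Items: matched,
        Count: matched.length,        // items that passed the filter
        ScannedCount: page.length,    // items actually read
        LastEvaluatedKey:
          start + 400 < allItems.length ? start + 400 : undefined,
      }),
    };
  },
};

// Follow LastEvaluatedKey until the whole table has been scanned,
// accumulating how many items were read vs. returned.
async function scanAll(params) {
  let scanned = 0, returned = 0, key;
  do {
    const result = await dynamodb
      .scan({ ...params, ExclusiveStartKey: key })
      .promise();
    scanned += result.ScannedCount;
    returned += result.Count;
    key = result.LastEvaluatedKey;
  } while (key !== undefined);
  return { scanned, returned };
}
```

With this stub, `scanAll` reports 1000 items scanned but only 899 returned: the gap between `ScannedCount` and `Count` is exactly the work the filter discarded after it was already paid for.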
Understanding how filter expressions affect scan time helps you explain real-world trade-offs when working with large datasets.
"What if we replaced scan with a query using a key condition and filter expression? How would the time complexity change?"