When Scan is acceptable in DynamoDB - Time & Space Complexity
We want to understand how the time to run a Scan operation in DynamoDB changes as the amount of data grows.
How does scanning many items affect the work DynamoDB does?
Analyze the time complexity of the following Scan operation.
```javascript
const AWS = require("aws-sdk");
// AWS SDK v2 DocumentClient, so attribute values can be plain JS numbers
const dynamodb = new AWS.DynamoDB.DocumentClient();

const params = {
  TableName: "Products",
  FilterExpression: "Price > :minPrice",
  ExpressionAttributeValues: { ":minPrice": 100 }
};

const data = await dynamodb.scan(params).promise();
console.log(data.Items);
```
This code scans the entire "Products" table and returns items with Price greater than 100. Note that the FilterExpression is applied *after* items are read: DynamoDB still reads every item in the table (and consumes read capacity for all of them), whether or not an item matches the filter.
Look for repeated work inside the Scan.
- Primary operation: Reading each item in the table one by one.
- How many times: Once for every item stored in the table.
As the number of items grows, the Scan reads more and more data.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | Reads about 10 items |
| 100 | Reads about 100 items |
| 1000 | Reads about 1000 items |
Pattern observation: The work grows directly with the number of items. More items mean more reading.
Time Complexity: O(n)
This means the time to complete the Scan grows linearly with the number of items in the table.
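To make the linear growth concrete, here is a small in-memory simulation (not the AWS SDK; `simulateScan` and `itemsExamined` are names invented for illustration) that mimics what Scan does: examine every stored item, then apply the filter.

```javascript
// Hypothetical in-memory model of Scan: every item is examined,
// and the filter is applied only after the read.
function simulateScan(items, minPrice) {
  let itemsExamined = 0;
  const matches = [];
  for (const item of items) {
    itemsExamined++; // one read per stored item -> O(n)
    if (item.Price > minPrice) {
      matches.push(item);
    }
  }
  return { matches, itemsExamined };
}

// Even if only one item matches, all n items are examined.
const products = [
  { Name: "Pen", Price: 2 },
  { Name: "Desk", Price: 150 },
  { Name: "Mug", Price: 12 }
];
const result = simulateScan(products, 100);
console.log(result.itemsExamined); // 3 items examined
console.log(result.matches.length); // 1 item matched
```

Doubling the size of `products` doubles `itemsExamined`, regardless of how many items match, which is exactly the O(n) pattern in the table above.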
[X] Wrong: "Scan only reads a few items, so it's always fast."
[OK] Correct: Scan reads every item in the table, so if the table is large, it takes longer.
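There is a second reason large tables take longer: a single Scan call returns at most 1 MB of data, so reading a whole table requires looping with `LastEvaluatedKey`. Below is a sketch of that loop against a tiny fake client (`makeFakeClient` and its page size are invented here so the example runs without AWS; with the real SDK the call inside the loop would be `dynamodb.scan(params).promise()`).

```javascript
// Fake paginated scan over an in-memory table: returns `pageSize`
// items per call, plus a LastEvaluatedKey until the table is exhausted.
function makeFakeClient(table, pageSize) {
  return {
    scan(params) {
      const start = params.ExclusiveStartKey || 0;
      const Items = table.slice(start, start + pageSize);
      const next = start + pageSize;
      const page = { Items };
      if (next < table.length) page.LastEvaluatedKey = next;
      return Promise.resolve(page);
    }
  };
}

// The pagination loop has the same shape you would use with the real SDK:
// keep calling scan, passing the previous LastEvaluatedKey, until it is absent.
async function scanAll(client) {
  const all = [];
  let key;
  do {
    const page = await client.scan({ ExclusiveStartKey: key });
    all.push(...page.Items);
    key = page.LastEvaluatedKey; // undefined on the last page
  } while (key !== undefined);
  return all;
}

const client = makeFakeClient([1, 2, 3, 4, 5], 2);
scanAll(client).then(items => console.log(items.length)); // 5
```

More items means more pages, which means more round trips, so the linear cost shows up in network calls as well as in items read.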
Knowing when a Scan is acceptable (small tables, one-off migrations, infrequent reports) and when it is not shows you can balance simplicity against performance in real projects.
"What if we replaced Scan with Query using a partition key? How would the time complexity change?"
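One way to reason about this question: a Query uses the partition key to jump straight to a single item collection, so the work is proportional to the number of items in that one partition rather than the whole table. A hypothetical sketch, using a plain `Map` to stand in for the partition-key index (the table data here is invented for illustration):

```javascript
// Scan: examines every item in every partition -> O(n).
function scanCount(table) {
  let examined = 0;
  for (const items of table.values()) examined += items.length;
  return examined;
}

// Query: the partition key acts like a hash lookup, so only the
// items in one partition are examined.
function queryCount(table, partitionKey) {
  const items = table.get(partitionKey) || [];
  return items.length;
}

const table = new Map([
  ["electronics", [{ Price: 120 }, { Price: 80 }]],
  ["furniture", [{ Price: 300 }]],
  ["stationery", [{ Price: 3 }, { Price: 5 }, { Price: 7 }]]
]);

console.log(scanCount(table));               // 6: every item in the table
console.log(queryCount(table, "furniture")); // 1: only one partition
```

Adding more partitions makes the Scan slower but leaves the Query's cost unchanged, which is why Query on a well-chosen partition key scales where Scan does not.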