AWS SDK for JavaScript/Node.js in DynamoDB - Time & Space Complexity
We want to understand how the time to complete DynamoDB operations changes as we work with more data.
Specifically, how does the number of API calls grow when using AWS SDK for JavaScript/Node.js with DynamoDB?
Analyze the time complexity of the following operation sequence.
```javascript
const AWS = require('aws-sdk');
const dynamodb = new AWS.DynamoDB.DocumentClient();

async function batchGetItems(keys) {
  // Note: batchGet accepts at most 100 keys per request
  const params = {
    RequestItems: {
      'MyTable': { Keys: keys }
    }
  };
  return dynamodb.batchGet(params).promise();
}
```
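The function above passes all keys in a single request, which fails once `keys` exceeds DynamoDB's 100-key limit. A minimal sketch of splitting the keys into batches first (the `chunk` helper and `batchGetAll` wrapper are illustrative, not part of the AWS SDK):

```javascript
// Split an array of keys into chunks of at most `size` items.
// Hypothetical helper; not part of the AWS SDK.
function chunk(keys, size) {
  const batches = [];
  for (let i = 0; i < keys.length; i += size) {
    batches.push(keys.slice(i, i + size));
  }
  return batches;
}

// Sketch: one batchGet call per chunk of up to 100 keys.
async function batchGetAll(dynamodb, tableName, keys) {
  const results = [];
  for (const batch of chunk(keys, 100)) {
    const params = { RequestItems: { [tableName]: { Keys: batch } } };
    const data = await dynamodb.batchGet(params).promise();
    results.push(...(data.Responses[tableName] || []));
  }
  return results;
}
```

The number of loop iterations here, ceil(n / 100), is exactly the API-call count analyzed below.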
This code fetches multiple items from a DynamoDB table in one batch request using the AWS SDK for JavaScript/Node.js.
Identify the API calls, resource provisioning, and data transfers that repeat.
- Primary operation: the batchGet API call to DynamoDB.
- How many times: one call per batch; if the number of keys exceeds 100, multiple batchGet calls are needed.
As the number of keys increases, the number of batchGet calls grows in steps of 100 keys per call.
| Input Size (n) | Approx. API Calls/Operations |
|---|---|
| 10 | 1 |
| 100 | 1 |
| 1000 | 10 |
Pattern observation: The number of API calls grows as ceil(n / 100), i.e. roughly in proportion to the number of keys.
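The table values can be checked directly. Assuming the 100-key batch limit, the call count for n keys is ceil(n / 100):

```javascript
// Number of batchGet calls needed to fetch n keys,
// given DynamoDB's 100-key-per-call limit.
function apiCalls(n) {
  return Math.ceil(n / 100);
}

// Reproduces the table: 10 -> 1, 100 -> 1, 1000 -> 10
for (const n of [10, 100, 1000]) {
  console.log(n, apiCalls(n));
}
```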
Time Complexity: O(n)
This means the time grows linearly with the number of items requested; the factor of 100 is a constant and drops out of the big-O bound.
[X] Wrong: "One batchGet call can fetch any number of items instantly."
[OK] Correct: DynamoDB limits batchGet to 100 items (and 16 MB of data) per call, so larger requests must be split into multiple calls; a response can also return UnprocessedKeys that must be retried, both of which increase total time.
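Even within the 100-item limit, a batchGet response may include UnprocessedKeys (for example, under throttling), and the caller must re-request those keys. A minimal retry sketch, assuming `client` is any object exposing the DocumentClient-style `batchGet(params).promise()` method (exponential backoff is omitted for brevity):

```javascript
// Repeatedly re-issue batchGet until no UnprocessedKeys remain.
// `client` stands in for a DynamoDB DocumentClient (assumption for this sketch).
async function batchGetWithRetry(client, tableName, keys) {
  let request = { [tableName]: { Keys: keys } };
  const items = [];
  while (request && Object.keys(request).length > 0) {
    const data = await client.batchGet({ RequestItems: request }).promise();
    items.push(...(data.Responses[tableName] || []));
    request = data.UnprocessedKeys; // empty object when everything was processed
  }
  return items;
}
```

In production code you would add a backoff delay between retries so repeated throttling does not turn into a tight loop.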
Understanding how API calls scale with input size helps you design efficient data access patterns and shows you can think about performance in real cloud applications.
"What if we changed batchGet to single GetItem calls for each key? How would the time complexity change?"