Transaction limits in DynamoDB - Time & Space Complexity
When using transactions in DynamoDB, it is important to understand how the number of operations inside a transaction affects performance. The question we want to answer: how does the time to complete a transaction grow as we add more operations to it?
Analyze the time complexity of the following DynamoDB transaction code.
```javascript
// Using the AWS SDK v2 DocumentClient, which accepts plain JavaScript
// objects (the low-level client would require typed AttributeValue maps).
const dynamodb = new AWS.DynamoDB.DocumentClient();

const params = {
  TransactItems: [
    { Put: { TableName: 'Orders', Item: order1 } },
    { Update: { TableName: 'Inventory', Key: itemKey, UpdateExpression: 'SET qty = qty - :dec', ExpressionAttributeValues: { ':dec': 1 } } },
    // ... more operations ...
  ]
};

await dynamodb.transactWrite(params).promise();
```
This code runs a transaction with multiple write operations in DynamoDB.
Look at what repeats as the transaction size grows.
- Primary operation: Each write or update inside the transaction.
- How many times: Equal to the number of operations inside the TransactItems array.
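To make the counting concrete, here is a minimal sketch (with a hypothetical `buildTransactItems` helper, not part of the AWS SDK) that builds one Put action per order. The length of `TransactItems` equals the number of input items, which is exactly the n in our analysis:

```javascript
// Hypothetical helper: build one Put action per order.
// The resulting transaction contains exactly orders.length actions,
// so the service does work proportional to n.
function buildTransactItems(orders, tableName) {
  return orders.map(order => ({
    Put: { TableName: tableName, Item: order }
  }));
}

const orders = [{ id: 1 }, { id: 2 }, { id: 3 }];
const params = { TransactItems: buildTransactItems(orders, 'Orders') };
console.log(params.TransactItems.length); // 3 — one action per order
```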
As you add more operations to the transaction, the total work grows roughly in direct proportion.
| Input Size (n) | Approx. Operations |
|---|---|
| 5 | 5 write/update actions |
| 10 | 10 write/update actions |
| 25 | 25 write/update actions |
Pattern observation: Doubling the number of operations roughly doubles the work done.
Time Complexity: O(n)
This means the time to complete the transaction grows linearly with the number of operations inside it.
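The table above can be reproduced with a toy model that makes no AWS calls: each item in `TransactItems` contributes one unit of write work, so the work count tracks n exactly:

```javascript
// Toy model of transaction cost: one write/update action per item.
// No real DynamoDB calls are made; this only demonstrates the counting.
function workFor(n) {
  const transactItems = Array.from({ length: n }, (_, i) => ({
    Put: { TableName: 'Orders', Item: { id: i } }
  }));
  return transactItems.length; // one unit of work per action
}

console.log([5, 10, 25].map(workFor)); // [ 5, 10, 25 ]
```

Doubling n doubles the result, which is the signature of O(n) growth.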
[X] Wrong: "Adding more operations inside a transaction does not affect performance much because it's one request."
[OK] Correct: Each operation inside the transaction adds work, so more operations mean more time and resources used.
Understanding how transaction size affects performance lets you reason about real database limits and costs, a practical skill when designing systems that write at scale.
"What if we split one large transaction into multiple smaller transactions? How would the time complexity change?"
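One way to explore that question: splitting does not change the asymptotic complexity. If n actions are divided into k transactions of at most m items each, every action is still processed once, so the total work remains O(n). Splitting matters in practice because DynamoDB caps the number of items per transaction (100 items per `TransactWriteItems` call at the time of writing; the limit was previously 25), and because each smaller transaction is a separate atomic unit. A hedged sketch of the chunking step:

```javascript
// Sketch: split a large list of actions into batches that fit under
// the per-transaction item limit. Total work across all batches is
// still O(n) — each action is sent exactly once.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Each batch would then be sent as its own transactWrite call, e.g.:
// for (const batch of chunk(allActions, 100)) {
//   await dynamodb.transactWrite({ TransactItems: batch }).promise();
// }
```

Note the trade-off: atomicity now holds only within each batch, not across the whole set, so a failure in one batch leaves earlier batches committed.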