Event triggers for Lambda in AWS - Time & Space Complexity
When using event triggers for AWS Lambda, it's important to understand how the number of events affects the work Lambda does.
The key question: how does the number of triggered events change the total processing time? Let's analyze the time complexity of this Lambda event trigger setup.
```javascript
// Example: S3 bucket triggers Lambda on object creation
const aws = require('aws-sdk');
const s3 = new aws.S3();

exports.handler = async (event) => {
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    const key = record.s3.object.key;
    await processObject(bucket, key);
  }
};

async function processObject(bucket, key) {
  // Process the S3 object, e.g. fetch it with s3.getObject({ Bucket: bucket, Key: key })
}
```
This code is triggered by S3 object-creation events and processes each object in the event's Records array (S3 may deliver multiple events in a single invocation).
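To make the batching concrete, here is a minimal sketch of the event shape the handler receives (fields abbreviated; real S3 events carry many more attributes). It uses only the fields the handler above reads, `record.s3.bucket.name` and `record.s3.object.key`:

```javascript
// Illustrative S3 event: Records is an array, so one invocation
// may carry several object-created events.
const sampleEvent = {
  Records: [
    { s3: { bucket: { name: "my-bucket" }, object: { key: "a.txt" } } },
    { s3: { bucket: { name: "my-bucket" }, object: { key: "b.txt" } } },
  ],
};

// Two records in one invocation means two processObject calls.
console.log(sampleEvent.Records.length); // 2
```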
Look at what repeats when the Lambda runs:
- Primary operation: Processing each S3 object triggered by the event.
- How many times: Once for every object creation event received.
As more objects are created, the total processing work grows linearly with the number of objects.
| Input Size (n) | Approx. API Calls/Operations |
|---|---|
| 10 | 10 processObject calls |
| 100 | 100 processObject calls |
| 1000 | 1000 processObject calls |
Pattern observation: The number of operations grows directly with the number of events.
Time Complexity: O(n)
This means the total work grows linearly: double the number of events, and the total processing roughly doubles.
[X] Wrong: "The Lambda runs once no matter how many events happen at the same time."
[OK] Correct: While S3 may batch multiple events into a single invocation, more events still mean more total work (more processObject calls) and typically more Lambda executions.
Understanding how event triggers scale helps you design systems that handle growing workloads smoothly and predict costs better.
"What if the Lambda function batches multiple events together before processing? How would the time complexity change?"