Change stream events (insert, update, delete) in MongoDB - Time & Space Complexity
When watching for changes in a MongoDB collection, we want to know how the cost grows as more events happen.
How does the system handle more insert, update, or delete events over time?
Analyze the time complexity of the following code snippet.
```javascript
const changeStream = collection.watch();

changeStream.on('change', (event) => {
  if (event.operationType === 'insert') {
    console.log('New document inserted');
  } else if (event.operationType === 'update') {
    console.log('Document updated');
  } else if (event.operationType === 'delete') {
    console.log('Document deleted');
  }
});
```
This code listens for insert, update, and delete events on a collection and reacts to each event as it happens.
- Primary operation: Handling each change event as it arrives.
- How many times: Once per event, repeating indefinitely as changes occur.
Each new event triggers one handling operation. More events mean more operations, growing directly with event count.
| Events (n) | Handler invocations |
|---|---|
| 10 | 10 |
| 100 | 100 |
| 1,000 | 1,000 |
Pattern observation: The work grows linearly as the number of events increases.
Time Complexity: O(n)
This means the time to handle events grows in direct proportion to how many events happen.
[X] Wrong: "Each event is handled in constant time, so the overall cost stays constant no matter how many events occur."
[OK] Correct: Handling a single event is O(1), but the total work across a stream of n events is O(n): more events mean proportionally more handler invocations.
Understanding how event handling scales helps you design systems that stay responsive as data changes grow.
"What if we batch process multiple change events together? How would the time complexity change?"