# Why Atomic Operations Ensure Consistency in Firebase: Performance Analysis
We want to see how the time to complete atomic operations in Firebase changes as we perform more of them. How does the number of atomic updates affect the total work done? To answer that, let's analyze the time complexity of the following atomic update sequence.
```javascript
const db = firebase.firestore();
const docRef = db.collection('counters').doc('countDoc');

await db.runTransaction(async (transaction) => {
  // Read the current count inside the transaction (1 read)
  const doc = await transaction.get(docRef);
  const newCount = (doc.data()?.count || 0) + 1;
  // Write the incremented count back (1 write)
  transaction.update(docRef, { count: newCount });
});
```
This code reads a document, increments a count, and writes it back atomically.
Look at what repeats when we do many atomic increments.
- Primary operation: Reading and writing the document inside a transaction.
- How many times: Once per increment operation requested.
Each atomic increment requires a read and a write inside a transaction.
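We can check this per-increment cost with a small in-memory simulation (a sketch only; `makeCounter` is an illustrative helper, not part of the Firebase SDK). It mimics the read-then-write shape of the transaction above and counts the operations: n increments issue n reads and n writes, for 2n operations total.

```javascript
// In-memory stand-in that counts reads and writes, mimicking the
// read-then-write shape of the Firestore transaction above.
function makeCounter() {
  let count = 0, reads = 0, writes = 0;
  return {
    // One atomic increment: read the count, write count + 1.
    increment() {
      reads += 1;                 // stands in for transaction.get(docRef)
      const newCount = count + 1;
      writes += 1;                // stands in for transaction.update(docRef, ...)
      count = newCount;
    },
    stats: () => ({ count, reads, writes }),
  };
}

const counter = makeCounter();
for (let i = 0; i < 100; i++) counter.increment();
console.log(counter.stats()); // { count: 100, reads: 100, writes: 100 }
```

The 100 reads plus 100 writes match the table's 200 operations for n = 100.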
| Input Size (n) | Approx. API Calls (reads + writes) |
|---|---|
| 10 | 20 (10 reads + 10 writes) |
| 100 | 200 (100 reads + 100 writes) |
| 1000 | 2000 (1000 reads + 1000 writes) |
Pattern observation: The number of operations grows directly with the number of increments.
Time Complexity: O(n)
This means the time grows linearly as you do more atomic increments.
[X] Wrong: "Atomic operations happen instantly no matter how many."
[OK] Correct: Each atomic operation still needs to read and write data, so doing more means more work and time.
Understanding how atomic operations scale helps you design reliable apps that stay consistent even when many users update data.
What if we batch multiple increments in one transaction? How would the time complexity change?
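One way to explore this question is with a tiny in-memory stand-in for a Firestore transaction (a sketch; `makeMockDb` and `batchedIncrement` are illustrative helpers, not Firebase APIs). If k increments are folded into a single transaction, the transaction still performs one read and one write regardless of k, so the per-batch cost is O(1) rather than O(k):

```javascript
// Minimal in-memory stand-in for a Firestore transaction that
// counts how many reads and writes actually occur.
function makeMockDb(initialCount) {
  let count = initialCount;
  let reads = 0;
  let writes = 0;
  return {
    async runTransaction(fn) {
      await fn({
        async get() { reads += 1; return { data: () => ({ count }) }; },
        update(_ref, fields) { writes += 1; count = fields.count; },
      });
    },
    stats: () => ({ count, reads, writes }),
  };
}

// Apply k increments in ONE transaction: read once, write once.
async function batchedIncrement(db, k) {
  await db.runTransaction(async (tx) => {
    const doc = await tx.get();
    tx.update(null, { count: (doc.data()?.count || 0) + k });
  });
}

(async () => {
  const db = makeMockDb(0);
  await batchedIncrement(db, 1000); // 1000 increments, one transaction
  console.log(db.stats()); // { count: 1000, reads: 1, writes: 1 }
})();
```

The trade-off: a single huge transaction reduces API calls but holds contention on one document, so in practice batches are kept modest in size.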