Upload progress monitoring in Firebase - Time Complexity
When uploading files with Firebase, tracking progress helps users see how much is done.
We want to understand how the time to update progress changes as the file size grows.
Analyze the time complexity of the following Firebase upload progress code.
```javascript
// Firebase Storage (v8 / compat API): start the upload and listen for progress.
const uploadTask = storageRef.put(file);

uploadTask.on('state_changed', snapshot => {
  // bytesTransferred / totalBytes gives the fraction uploaded so far.
  const progress = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
  console.log('Upload is ' + progress + '% done');
});
```
This code listens to upload progress events and calculates the percentage uploaded.
Look for repeated actions during the upload.
- Primary operation: The progress event handler runs each time a chunk of data finishes uploading.
- How many times: Once per chunk, so the count depends on how many chunks the file is split into during upload.
As the file size grows, the number of progress updates grows roughly in proportion.
| Input Size (n in MB) | Approx. Progress Events |
|---|---|
| 10 | About 10-20 events |
| 100 | About 100-200 events |
| 1000 | About 1000-2000 events |
Pattern observation: The number of progress events grows in direct proportion to the amount of data uploaded.
Time Complexity: O(n)
This means the number of progress updates grows linearly with the file size.
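This linear relationship can be checked with a small simulation. The sketch below is not the Firebase SDK; it is a hypothetical model of a chunked upload that fires one progress callback per chunk, with an assumed fixed chunk size:

```javascript
// Hypothetical simulation (not the real Firebase SDK): model an upload
// that sends a file in fixed-size chunks and fires a progress callback
// once per chunk, then return how many callbacks fired.
function simulateUpload(totalBytes, chunkSize, onProgress) {
  let bytesTransferred = 0;
  let events = 0;
  while (bytesTransferred < totalBytes) {
    bytesTransferred = Math.min(bytesTransferred + chunkSize, totalBytes);
    events++;
    onProgress({ bytesTransferred, totalBytes });
  }
  return events; // number of progress events fired
}

const MB = 1024 * 1024;
const chunkSize = 1 * MB; // assumed chunk size, for illustration only

// A 10x larger file produces 10x as many progress events: O(n).
const eventsSmall = simulateUpload(10 * MB, chunkSize, () => {});
const eventsLarge = simulateUpload(100 * MB, chunkSize, () => {});
console.log(eventsSmall, eventsLarge); // 10 100
```

The exact event counts in a real upload depend on the SDK's chunking and network conditions, but the proportional growth is the point.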
[X] Wrong: "Progress updates happen only once or twice regardless of file size."
[OK] Correct: The upload sends data in chunks, so progress updates happen many times, increasing with file size.
Understanding how progress updates scale helps you design smooth user experiences and efficient event handling in real projects.
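One common way to keep the UI smooth on large uploads is to throttle the handler so it only reacts when progress has advanced by some minimum step. The helper below is a hypothetical sketch (not part of the Firebase API), driven here by fake snapshots for demonstration:

```javascript
// Hypothetical throttling helper: only report progress when it has
// advanced by at least `step` percent since the last report, so a huge
// upload does not flood the UI with thousands of updates.
function makeThrottledHandler(step, report) {
  let lastReported = -step; // ensures the very first update is reported
  return snapshot => {
    const progress = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
    if (progress - lastReported >= step) {
      lastReported = progress;
      report(progress);
    }
  };
}

// Drive the handler with fake snapshots: 10%, 20%, ..., 100%.
const reported = [];
const handler = makeThrottledHandler(25, p => reported.push(p));
for (let sent = 10; sent <= 100; sent += 10) {
  handler({ bytesTransferred: sent, totalBytes: 100 });
}
console.log(reported); // [ 10, 40, 70, 100 ]
```

A handler like this could be passed to `uploadTask.on('state_changed', ...)` in place of the raw logging callback; the upload still fires O(n) events, but the UI work per event stays constant and most events are ignored cheaply.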
What if the upload used larger chunk sizes? How would the time complexity of progress updates change?
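One way to reason about that question: the event count is roughly the file size divided by the chunk size, so a larger chunk divides the count by a constant factor but does not change its growth rate. A quick arithmetic sketch, with an assumed ceiling-division model of chunking:

```javascript
// Hypothetical model: number of progress events for a file of
// `totalBytes` uploaded in chunks of `chunkSize` bytes (one event per chunk).
const progressEvents = (totalBytes, chunkSize) =>
  Math.ceil(totalBytes / chunkSize);

const MB = 1024 * 1024;

// A 4x larger chunk cuts the event count by a constant factor of 4...
console.log(progressEvents(100 * MB, 1 * MB)); // 100
console.log(progressEvents(100 * MB, 4 * MB)); // 25

// ...but doubling the file size still doubles the count, so it remains O(n).
console.log(progressEvents(200 * MB, 4 * MB)); // 50
```

In big-O terms, n/c is still O(n) for any constant chunk size c; larger chunks only shrink the constant factor.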