Why Cloud Storage Matters for Object Data in GCP: A Performance Analysis
We want to understand how the time to store and retrieve objects in Cloud Storage changes as the amount of data grows.
How does the number of objects affect the work done by Cloud Storage?
Analyze the time complexity of uploading multiple objects to Cloud Storage.
```javascript
// Upload multiple files to a Cloud Storage bucket, one at a time
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

for (let i = 0; i < files.length; i++) {
  await storage.bucket('my-bucket').upload(files[i]); // one API call per file
}
```
This loop uploads the files sequentially: each iteration makes one upload API call and waits for it to complete before starting the next.
Identify the operations that repeat: API calls, resource provisioning, and data transfers.
- Primary operation: Upload API call for each file.
- How many times: Once per file, so n upload calls for n files.
Each new file adds one upload operation, so the total work grows directly with the number of files.
| Input Size (n) | Approx. API Calls/Operations |
|---|---|
| 10 | 10 uploads |
| 100 | 100 uploads |
| 1000 | 1000 uploads |
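The table values follow directly from the one-call-per-file rule. A minimal sketch that models this counting (a hypothetical helper, not part of the lesson's upload code):

```javascript
// Model: each file costs exactly one upload API call,
// so total calls equal the number of files (O(n)).
function countUploadCalls(numFiles) {
  let calls = 0;
  for (let i = 0; i < numFiles; i++) {
    calls++; // one API call per file
  }
  return calls;
}

// Reproduce the table rows above
[10, 100, 1000].forEach((n) => {
  console.log(`${n} files -> ${countUploadCalls(n)} upload calls`);
});
```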
Pattern observation: The number of operations grows linearly with the number of files.
Time Complexity: O(n)
This means the time to upload grows directly in proportion to the number of objects.
[X] Wrong: "Uploading many files happens all at once, so time stays the same no matter how many files."
[OK] Correct: Each file upload is a separate operation that takes time, so more files mean more total time.
Understanding how upload time grows with data size shows you can think about real cloud workloads and their costs.
"What if we upload files in parallel instead of one by one? How would the time complexity change?"
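As a thought experiment on that question: the total work is still n API calls, so the complexity stays O(n), but with up to k uploads in flight at once the wall-clock time drops to roughly O(n/k). A minimal sketch of batched parallel uploads using `Promise.all`, where `uploadFn` stands in for the per-file upload call (the function and its parameters are illustrative, not from the lesson):

```javascript
// Hypothetical parallel version: upload files in batches of `concurrency`.
// Still n total API calls (O(n) work), but uploads within a batch overlap,
// so wall-clock time scales with the number of batches, ceil(n / concurrency).
async function uploadAll(files, uploadFn, concurrency = 10) {
  let rounds = 0;
  for (let i = 0; i < files.length; i += concurrency) {
    rounds++;
    const batch = files.slice(i, i + concurrency);
    await Promise.all(batch.map((file) => uploadFn(file))); // k uploads in flight
  }
  return rounds; // number of sequential rounds
}
```

With n = 100 files and a concurrency of 10, this makes 10 sequential rounds instead of 100 sequential uploads, though each round still pays the cost of its slowest upload.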