Why Data Services Matter in GCP - Performance Analysis
We want to understand how processing time grows as we send more work to cloud data services. This helps us see why these services are important for managing data efficiently.
What happens to the total work done when the data size or the number of requests increases?
Analyze the time complexity of the following operation sequence.
```javascript
// Using Google Cloud BigQuery to run queries on a dataset
const { BigQuery } = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function runQueries(queries) {
  // Each iteration issues one API call and waits for it to
  // complete before starting the next.
  for (const query of queries) {
    await bigquery.query({ query });
  }
}
```
This code runs a list of queries one by one on a BigQuery dataset.
Identify the API calls, resource provisioning, and data transfers that repeat.
- Primary operation: Running a query via BigQuery API call.
- How many times: Once per query in the list.
Each query runs separately and sequentially (the `await` blocks the loop), so the total time grows as we add more queries.
| Input Size (n) | Approx. API Calls/Operations |
|---|---|
| 10 | 10 queries run, 10 API calls |
| 100 | 100 queries run, 100 API calls |
| 1000 | 1000 queries run, 1000 API calls |
Pattern observation: The work grows directly with the number of queries.
Time Complexity: O(n)
This means if you double the number of queries, the total time roughly doubles too.
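This linear growth can be observed locally without touching the real API. The sketch below swaps the BigQuery client for a hypothetical stand-in that simply counts "API calls" (the mock client and its `callCount` helper are illustrations, not part of the library):

```javascript
// Minimal sketch: a stand-in client that counts API calls instead of
// contacting BigQuery, so the O(n) growth is easy to observe locally.
function makeMockClient() {
  let calls = 0;
  return {
    // Same method shape as the real client: query(options) -> Promise.
    query: async () => { calls += 1; },
    callCount: () => calls,
  };
}

// Same structure as the original runQueries: one awaited call per query.
async function runQueries(client, queries) {
  for (const query of queries) {
    await client.query({ query });
  }
  return client.callCount();
}
```

Running this with 10 queries yields 10 calls, and with 20 queries yields 20 calls: doubling the input doubles the work, which is exactly what O(n) predicts.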
[X] Wrong: "Running more queries won't take more time because the cloud is fast."
[OK] Correct: Each query still needs its own processing time and resources, so more queries mean more total work and time.
Understanding how work grows with data operations shows you can think about efficiency in cloud services. This skill helps you design better systems and explain your choices clearly.
"What if we batch multiple queries into one request? How would the time complexity change?"
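One way to think about this question: BigQuery supports multi-statement scripts, where several semicolon-separated statements run in a single query job. A rough sketch, assuming the `buildBatch` helper below (a hypothetical illustration, not a library function):

```javascript
// Hypothetical sketch: combine n statements into one multi-statement
// request. The number of API calls drops from O(n) to O(1), though
// the service still has to execute every statement, so the total
// processing work remains O(n).
function buildBatch(queries) {
  // Join statements with semicolons into a single script string.
  return queries.map((q) => q.trim().replace(/;$/, '')).join(';\n') + ';';
}

async function runBatched(client, queries) {
  // One API call regardless of how many statements are batched.
  await client.query({ query: buildBatch(queries) });
}
```

Batching reduces the per-request overhead (network round trips, job scheduling), but the underlying query execution still scales with n, so the complexity of the total work stays O(n) even though the API-call count becomes constant.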