Why performance tuning matters in PostgreSQL - Performance Analysis
When working with databases, how fast a query runs can change dramatically as the data grows.
We want to understand how a query's running time scales with the number of rows it must examine.
Analyze the time complexity of the following query.
```sql
SELECT *
FROM orders
WHERE customer_id = 12345;
-- This query fetches all orders for one customer.
-- Without an index, it scans the orders table to find matching rows.
```
This query looks for all orders from a specific customer in the orders table.
Identify the operations that repeat (the database-level equivalent of loops or array traversals):
- Primary operation: Scanning the orders table rows to find matches.
- How many times: Once for each row in the orders table.
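The repeated operation above can be sketched in plain Python. This is a minimal model of a sequential (full-table) scan, not real database code; the list of `(order_id, customer_id)` tuples is hypothetical example data standing in for the orders table.

```python
def seq_scan(orders, customer_id):
    """Model of a full table scan: check every row exactly once.

    Returns the matching order IDs and the number of rows checked.
    """
    matches = []
    checks = 0
    for order_id, cust in orders:  # one comparison per row -> O(n)
        checks += 1
        if cust == customer_id:
            matches.append(order_id)
    return matches, checks

# Hypothetical table contents.
orders = [(1, 12345), (2, 99), (3, 12345), (4, 7)]
matches, checks = seq_scan(orders, 12345)
# checks equals len(orders) no matter how many rows actually match.
```

Note that `checks` depends only on the table size, not on how many matches exist: the scan cannot stop early because a matching row might be anywhere.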
As the number of orders grows, the time to find matching orders grows too.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 row checks |
| 100 | 100 row checks |
| 1000 | 1000 row checks |
Pattern observation: The work grows directly with the number of rows in the table.
Time Complexity: O(n)
This means the query's running time grows linearly with the size of the table: roughly double the rows, double the time.
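You can ask the database itself how it plans to execute the query. PostgreSQL exposes this through `EXPLAIN`; the sketch below uses SQLite (available in Python's standard library) as a convenient stand-in, with a hypothetical orders table. The exact plan wording varies by SQLite version, but an unindexed lookup reports a full scan.

```python
import sqlite3

# In-memory database with a hypothetical orders table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, i % 100) for i in range(1000)])

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 45"
).fetchall()

# With no index, the plan reports a full scan of orders: O(n).
print(plan[0][3])  # e.g. 'SCAN orders' (wording varies by version)
```

Seeing `SCAN` in the plan is the practical signal that the database will touch every row, which is exactly the linear growth shown in the table above.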
[X] Wrong: "The query will always run fast no matter how big the table is."
[OK] Correct: Without tuning or indexes, the database checks every row, so bigger tables take longer.
Understanding how query time grows helps you write better database code and shows that you know how to handle real-world data sizes.
"What if we add an index on customer_id? How would the time complexity change?"