Why result control matters in PostgreSQL - Performance Analysis
When working with database queries, controlling how many rows are returned is important for performance. The key question is how the amount of underlying data, not the fixed size of the result, affects how long the query takes. Consider the time complexity of the following query, which limits its results:
```sql
SELECT *
FROM employees
WHERE department = 'Sales'
ORDER BY hire_date DESC
LIMIT 10;
```
This query fetches the 10 most recently hired employees in the Sales department.
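To see the query's behavior end to end, here is a minimal runnable sketch using Python's built-in sqlite3 as a stand-in for PostgreSQL (the table name and rows are invented for illustration; the SQL shape is the same):

```python
import sqlite3

# In-memory database standing in for PostgreSQL (illustration only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, department TEXT, hire_date TEXT)")

# Made-up data: 50 Sales employees plus one from another department.
rows = [(f"emp{i}", "Sales", f"2024-01-{i % 28 + 1:02d}") for i in range(50)]
rows.append(("eng1", "Engineering", "2023-06-01"))
conn.executemany("INSERT INTO employees VALUES (?, ?, ?)", rows)

# Same shape as the query above: filter, sort descending, keep 10.
top10 = conn.execute(
    "SELECT * FROM employees "
    "WHERE department = 'Sales' "
    "ORDER BY hire_date DESC "
    "LIMIT 10"
).fetchall()
print(len(top10))  # 10 rows come back
```

Even though 50 Sales rows match, only 10 come back; the interesting question is how much work happened before those 10 were selected.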
Look at what repeats as data grows.
- Primary operation: scanning the employees table for Sales rows, then sorting them by hire_date.
- How many times: once per matching row; the work depends on the number of Sales employees (call it n), not on the LIMIT of 10.
As the number of Sales employees grows, the database must check more rows and sort more data.
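The scan, sort, and limit steps described above can be sketched in plain Python (a simplified model of the query's work, not PostgreSQL's actual executor; the `employees` list is made up for illustration):

```python
def top_recent_sales(employees, limit=10):
    # Scan: every row is checked against the WHERE clause -- O(n).
    matching = [e for e in employees if e["department"] == "Sales"]
    # Sort: every matching row participates -- O(m log m) for m matches.
    matching.sort(key=lambda e: e["hire_date"], reverse=True)
    # Limit: only now are the top rows kept -- O(limit).
    return matching[:limit]

employees = [{"department": "Sales", "hire_date": f"2024-{m:02d}-01"}
             for m in range(1, 13)]
employees.append({"department": "Engineering", "hire_date": "2023-05-01"})
print(len(top_recent_sales(employees)))
```

No matter how small `limit` is, the scan and the sort still touch every matching row.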
| Input Size (n) | Approx. Work |
|---|---|
| 10 | Scan and sort ~10 rows |
| 100 | Scan and sort ~100 rows |
| 1000 | Scan and sort ~1,000 rows |
Pattern observation: More rows mean more work before picking the top 10 results.
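The pattern in the table can be checked empirically by counting how many comparisons a sort performs (plain Python, not PostgreSQL; comparison counts are a rough proxy for the database's work):

```python
import random

class Counting:
    """Orderable wrapper that counts how many comparisons a sort makes."""
    count = 0
    def __init__(self, value):
        self.value = value
    def __lt__(self, other):
        Counting.count += 1
        return self.value < other.value

def comparisons_to_sort(n, seed=42):
    """Comparisons needed to sort n random values."""
    random.seed(seed)
    data = [Counting(random.random()) for _ in range(n)]
    Counting.count = 0
    sorted(data)
    return Counting.count

for n in (10, 100, 1000):
    print(n, comparisons_to_sort(n))
```

Going from 100 to 1,000 random rows (10x the data) costs more than 10x the comparisons, which is the O(n log n) pattern the table hints at.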
Time Complexity: O(n log n), where n is the number of matching (Sales) rows.
The time grows slightly faster than the number of rows: the scan is O(n), but sorting the matching rows takes O(n log n) steps, and the sort dominates as data grows.
[X] Wrong: "Limiting results to 10 means the query always runs fast regardless of data size."
[OK] Correct: LIMIT only trims the final output. Without a supporting index, the database still scans and sorts every matching row before applying the limit, so more data means more work.
Understanding how result size affects query time helps you write efficient queries and explain performance in real projects.
"What if we added an index on hire_date? How would the time complexity change?"
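One plausible answer: an index keeps entries sorted by hire_date, so PostgreSQL can typically walk the index newest-first and stop as soon as 10 matching rows are found, skipping the sort entirely. A rough model in Python (a sketch of the idea, not PostgreSQL's actual index machinery; the data is made up):

```python
def top_recent_sales_with_index(rows_newest_first, limit=10):
    """Models an index scan on hire_date: rows arrive already sorted
    newest-first, so no sort happens and the walk can stop early."""
    result = []
    for row in rows_newest_first:
        if row["department"] == "Sales":
            result.append(row)
            if len(result) == limit:
                break  # stop early: work no longer depends on total rows
    return result

# Made-up data: a pre-sorted list stands in for the index.
rows = [{"department": "Sales" if i % 2 else "Engineering",
         "hire_date": f"2024-01-{i % 28 + 1:02d}"} for i in range(100)]
rows.sort(key=lambda r: r["hire_date"], reverse=True)
print(len(top_recent_sales_with_index(rows)))
```

In the best case (recent hires are mostly in Sales) this touches only about `limit` entries; in the worst case (Sales rows are rare) it may still walk many index entries before finding 10 matches. A composite index on `(department, hire_date DESC)` would make the early stop reliable.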