LIMIT clause behavior in SQL - Time & Space Complexity
We want to understand how using the LIMIT clause affects the time it takes to get results from a database query.
Specifically, how does the number of rows we ask for change the work the database does?
Analyze the time complexity of the following code snippet.
```sql
SELECT *
FROM employees
ORDER BY hire_date DESC
LIMIT 10;
```
This query gets the 10 most recently hired employees from the employees table.
Identify the repeated work: loops, recursion, or row traversals.
- Primary operation: Scanning and sorting the employees table rows by hire_date.
- How many times: The database reads every row once, performs roughly n log n comparisons to sort them, then picks the top 10 rows.
As the number of rows in the employees table grows, the database must sort more rows before picking the top 10.
| Input Size (n) | Approx. Sort Work (~n log2 n comparisons) | Rows Returned |
|---|---|---|
| 10 | ~33 | 10 |
| 100 | ~664 | 10 |
| 1000 | ~9,966 | 10 |
Pattern observation: The sorting work grows as the table grows, but the number of rows returned stays the same.
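The sort-then-pick behavior can be sketched in Python. This is a simplified model with a hypothetical in-memory "employees" table, not how a real database engine is implemented:

```python
from datetime import date, timedelta
import random

# Hypothetical in-memory "employees" table: (name, hire_date) tuples.
employees = [
    (f"emp{i}", date(2020, 1, 1) + timedelta(days=random.randrange(1500)))
    for i in range(1000)
]

def order_by_hire_date_limit(rows, k=10):
    """Naive model of ORDER BY hire_date DESC LIMIT k:
    sort ALL n rows (O(n log n) work), then slice off the first k."""
    return sorted(rows, key=lambda r: r[1], reverse=True)[:k]

top10 = order_by_hire_date_limit(employees)
print(len(top10))  # always 10, however large the table grows
```

Growing the table from 1,000 to 1,000,000 rows makes the `sorted` call far more expensive, while the slice at the end stays the same size. That mirrors the pattern in the table above.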
Time Complexity: O(n log n). The query time grows slightly faster than linearly in the table size, because the sort dominates the work.
Space Complexity: O(n) in the worst case for the sort buffer; the result set itself stays constant at 10 rows.
[X] Wrong: "LIMIT 10 means the database only looks at 10 rows, so it's always fast."
[OK] Correct: The database usually must look at all rows to sort them before picking the top 10, so the work depends on the total number of rows, not just the limit.
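Many databases soften this cost when the LIMIT is small: instead of fully sorting all n rows, they keep only the current best k rows in a heap (sometimes called a top-N heapsort), which is O(n log k). A sketch of that idea using Python's standard `heapq` module:

```python
import heapq
import random

# Hypothetical rows: (name, sort_key) pairs standing in for employees.
rows = [("emp%d" % i, random.random()) for i in range(10_000)]

# Full sort: O(n log n) comparisons regardless of the limit.
full_sort_top10 = sorted(rows, key=lambda r: r[1], reverse=True)[:10]

# Heap-based top-k: O(n log k). Each of the n rows is compared against
# a heap that never holds more than k = 10 entries.
heap_top10 = heapq.nlargest(10, rows, key=lambda r: r[1])

print(full_sort_top10 == heap_top10)  # both strategies return the same rows
```

Either way, every row still has to be examined at least once, so the work still scales with n; the heap only shrinks the log factor from log n to log k.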
Understanding how LIMIT affects query time helps you explain how databases handle large data efficiently and shows you think about performance in real situations.
"What if we added an index on hire_date? How would the time complexity change?"
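One way to picture the index case: an index on hire_date keeps entries in sorted order, so the database can walk the index from the newest entry and stop after 10 rows, without sorting or even touching the rest of the table. A hedged Python sketch that models the index as an already-sorted list (real B-tree indexes are more involved):

```python
# Hypothetical model: the index is a list of (hire_date, row_id) pairs
# that the database keeps sorted as rows are inserted.
hire_dates = [5, 3, 9, 1, 7, 8, 2, 6, 4, 0, 11, 10]
index = sorted((d, i) for i, d in enumerate(hire_dates))

def limit_with_index(idx, k=10):
    """With a sorted index, ORDER BY hire_date DESC LIMIT k is just
    reading the last k entries: O(k) work, independent of n."""
    return [row_id for _, row_id in idx[-k:]][::-1]

print(limit_with_index(index, 3))  # → [10, 11, 2], the 3 newest rows
```

The query itself drops to roughly O(k) (plus the cost of descending the index), but the savings are not free: the index must be maintained, costing about O(log n) extra work on every insert or update.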