Why REST APIs exist - Performance Analysis
We want to understand how the time it takes to serve a REST API request changes as the amount of data grows. In other words: how does the work grow when more items (or more clients) are involved?
Analyze the time complexity of the following REST API request handling.
GET /items
- Server fetches all items from database
- Server processes each item to prepare response
- Server sends response back to client
This outline describes a simple REST API endpoint that returns a list of items: the server fetches them, processes each one, and sends the result back.
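The steps above can be sketched in Python. Note that `fetch_items` and `prepare` are illustrative stand-ins for the database fetch and the per-item processing, not a real framework API:

```python
def fetch_items(db):
    """Simulate fetching all items from the database (one bulk query)."""
    return list(db)

def prepare(item):
    """Simulate processing a single item for the response."""
    return {"id": item, "label": f"item-{item}"}

def get_items(db):
    """Handle GET /items: fetch, process each item, send response."""
    items = fetch_items(db)                   # fetch all items
    response = [prepare(i) for i in items]    # one processing step per item
    return response                           # send back to client

# Example: a tiny in-memory "database" of three items
print(get_items([1, 2, 3]))
```

The list comprehension is where the input-dependent work happens: it runs `prepare` once per fetched item.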
Look for repeated work that grows with input size.
- Primary operation: Processing each item in the list.
- How many times: Once for every item fetched from the database.
As the number of items grows, the work to process them grows too.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 processing steps |
| 100 | 100 processing steps |
| 1000 | 1000 processing steps |
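The table can be reproduced by counting the per-item work directly. This is a minimal sketch, assuming each item costs one processing step:

```python
def count_processing_steps(n):
    """Count the per-item processing steps for a GET /items request with n items."""
    steps = 0
    for _ in range(n):  # one processing step per item
        steps += 1
    return steps

# Reproduce the table: work grows in lockstep with input size
for n in (10, 100, 1000):
    print(n, count_processing_steps(n))
```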
Pattern observation: The work grows directly with the number of items; doubling items doubles the work.
Time Complexity: O(n)
This means the time to handle the request grows linearly with the number of items.
[X] Wrong: "The time to get all items stays the same no matter how many items there are."
[OK] Correct: Because the server must look at each item to prepare the response, more items mean more work and more time.
Understanding how REST API work time grows helps you explain real-world system behavior clearly and confidently.
"What if the server cached the items instead of fetching each time? How would the time complexity change?"