
Cursor-based pagination in GraphQL - Time & Space Complexity

Time Complexity: Cursor-based pagination
O(k)
Understanding Time Complexity

When using cursor-based pagination in GraphQL, we want to know how the time to get results changes as the data grows.

We ask: How does fetching pages with cursors scale with more data?

Scenario Under Consideration

Analyze the time complexity of the following code snippet.


query GetItems($first: Int, $after: String) {
  items(first: $first, after: $after) {
    edges {
      node {
        id
        name
      }
      cursor
    }
    pageInfo {
      endCursor
      hasNextPage
    }
  }
}
    

This query fetches a page of items after a given cursor, returning a limited number of results.
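To make the cost concrete, here is a minimal sketch of how a server might resolve this query against an in-memory list. The dataset, the base64 cursor encoding, and the `resolve_items` helper are illustrative assumptions, not part of any specific GraphQL library; real servers would back this with a database query.

```python
import base64
from typing import Optional

# Hypothetical in-memory dataset standing in for a database table.
ITEMS = [{"id": str(i), "name": f"Item {i}"} for i in range(1, 1001)]

def encode_cursor(item_id: str) -> str:
    """Encode an item id as an opaque cursor (a common convention)."""
    return base64.b64encode(item_id.encode()).decode()

def decode_cursor(cursor: str) -> str:
    return base64.b64decode(cursor.encode()).decode()

def resolve_items(first: int, after: Optional[str] = None) -> dict:
    """Sketch of a resolver for items(first:, after:)."""
    start = 0
    if after is not None:
        after_id = decode_cursor(after)
        # With a database index this lookup is effectively O(1) or O(log n);
        # here we scan the list only for simplicity of the sketch.
        start = next(i for i, it in enumerate(ITEMS) if it["id"] == after_id) + 1
    page = ITEMS[start:start + first]  # only `first` items are materialized
    edges = [{"node": it, "cursor": encode_cursor(it["id"])} for it in page]
    return {
        "edges": edges,
        "pageInfo": {
            "endCursor": edges[-1]["cursor"] if edges else None,
            "hasNextPage": start + first < len(ITEMS),
        },
    }
```

Fetching the first page and then passing its `endCursor` back as `after` walks the list ten items at a time, mirroring how a client would consume the query above.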

Identify Repeating Operations

Identify the loops, recursion, and array traversals that repeat.

  • Primary operation: retrieving and returning a fixed number of items (the page size, k) after the cursor.
  • How many times: at most k times per request; the server reads items starting just after the cursor position and stops once the page is full.
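
The repeating operation can be modeled as a single bounded loop. This is a sketch that assumes the cursor's position in the data is already known (for example, via an index); the function and parameter names are hypothetical.

```python
def fetch_page(items, cursor_index, page_size):
    """Collect at most page_size items after the cursor position.
    The loop body runs at most page_size times, regardless of len(items)."""
    page = []
    for i in range(cursor_index + 1, min(cursor_index + 1 + page_size, len(items))):
        page.append(items[i])
    return page
```

For a list of 1000 items, `fetch_page(items, 4, 10)` touches only the ten elements at indices 5 through 14; growing the list to a million items does not change that count.
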
How Execution Grows With Input

Explain the growth pattern intuitively.

Input Size (n)    Approx. Operations
10                About 10 items fetched
100               About 10 items fetched
1000              About 10 items fetched

Pattern observation: The work grows linearly with the page size k, and stays constant as the total data size grows.
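
The table can be reproduced with a tiny cost model. This is a deliberate simplification that assumes an indexed cursor lookup: the seek to the cursor is treated as free, and each returned item costs one access.

```python
def operations_to_fetch(total_items: int, page_size: int = 10) -> int:
    """Hypothetical cost model: one access per returned item,
    capped by the page size k (and by the dataset, if smaller)."""
    return min(page_size, total_items)

for n in (10, 100, 1000):
    print(n, "->", operations_to_fetch(n))  # stays at 10 once n >= 10
```
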

Final Time Complexity

Time Complexity: O(k)

This means the time depends mainly on the number of items requested per page, not the total number of items in the database.

Common Mistake

[X] Wrong: "Fetching a page with a cursor takes time proportional to the total number of items in the database."

[OK] Correct: Cursor-based pagination fetches only a limited number of items after the cursor, so time depends on page size, not total data size.
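
The contrast behind this mistake is easiest to see against offset pagination. The sketch below uses an in-memory SQLite table (table and column names are hypothetical): with `OFFSET`, the engine must step over every skipped row, while a keyset-style cursor condition lets the primary-key index seek directly past the cursor value.

```python
import sqlite3

# Hypothetical table standing in for the items field's backing store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO items VALUES (?, ?)",
                 [(i, f"Item {i}") for i in range(1, 1001)])

# Offset pagination: the engine walks past all 500 skipped rows -> O(offset + k).
offset_page = conn.execute(
    "SELECT id, name FROM items ORDER BY id LIMIT 10 OFFSET 500").fetchall()

# Keyset (cursor) pagination: the index seeks past the cursor value,
# then reads k rows -> roughly O(log n + k).
cursor_page = conn.execute(
    "SELECT id, name FROM items WHERE id > ? ORDER BY id LIMIT 10", (500,)).fetchall()
```

Both queries return the same ten rows (ids 501 through 510), but only the cursor form keeps its cost independent of how deep into the dataset the page lies.
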

Interview Connect

Understanding cursor-based pagination time helps you explain efficient data fetching in real apps, showing you know how to handle large lists smoothly.

Self-Check

"What if the cursor is not indexed and the server must scan from the start each time? How would the time complexity change?"