Allocate Minimum Pages (Binary Search on Answer) in DSA C++ - Time & Space Complexity
We want to understand how fast the binary search approach runs when allocating the minimum number of pages among students.
How does the number of operations grow as the number of books or pages increases?
Analyze the time complexity of the following code snippet.
bool isPossible(int arr[], int n, int m, int mid) {
    int studentCount = 1, pageSum = 0;
    for (int i = 0; i < n; i++) {
        if (pageSum + arr[i] <= mid) {
            pageSum += arr[i];           // current student can take this book
        } else {
            studentCount++;              // start assigning to the next student
            pageSum = arr[i];
            // Infeasible if more than m students are needed,
            // or a single book alone exceeds the limit.
            if (studentCount > m || arr[i] > mid) return false;
        }
    }
    return true;
}
int allocateMinimumPages(int arr[], int n, int m) {
    int start = 0, end = 0, result = -1;
    for (int i = 0; i < n; i++) end += arr[i];   // search space: [0, sum of all pages]
    while (start <= end) {
        int mid = start + (end - start) / 2;     // candidate maximum page load
        if (isPossible(arr, n, m, mid)) {
            result = mid;                        // feasible: try an even smaller maximum
            end = mid - 1;
        } else {
            start = mid + 1;                     // infeasible: allow more pages
        }
    }
    return result;
}
This code finds the minimum possible value of the maximum pages assigned to any student by binary searching over the answer space.
Identify the loops, recursion, and array traversals that repeat.
- Primary operation: The binary search loop runs repeatedly, and inside each iteration the `isPossible` function traverses the array once.
- How many times: The binary search runs about log(sum of pages) times, and `isPossible` runs once per binary search step, traversing all n books.
As the number of books (n) or the total page count grows, binary search needs far fewer guesses than trying every possible page limit: the number of guesses grows only logarithmically with the total pages.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 * log(sum of pages) |
| 100 | About 100 * log(sum of pages) |
| 1000 | About 1000 * log(sum of pages) |
Pattern observation: Operations grow linearly with number of books and logarithmically with total pages.
Time Complexity: O(n * log(sum_of_pages))
This means the time grows linearly with the number of books and logarithmically with the total pages to allocate. Space Complexity: O(1), since only a few counter variables are used beyond the input array.
[X] Wrong: "The binary search runs in O(log n) time because it halves the input size each time."
[OK] Correct: The binary search here is on the range of possible page sums, not on the number of books. Each step requires checking all books, so the array traversal makes it O(n * log(sum_of_pages)).
Understanding this time complexity helps you explain how binary search can be applied beyond sorted arrays, showing problem-solving flexibility.
"What if we used a linear search instead of binary search on the answer range? How would the time complexity change?"
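One way to sketch that alternative (the name `allocateLinear` is hypothetical, and the feasibility check from the article is reused): scan every candidate limit from the largest single book up to the total page count, returning the first feasible one. Because feasibility is monotonic in the limit, the first feasible candidate is the minimum.

```cpp
#include <cassert>

// Copy of the article's feasibility check.
bool isPossible(int arr[], int n, int m, int mid) {
    int studentCount = 1, pageSum = 0;
    for (int i = 0; i < n; i++) {
        if (pageSum + arr[i] <= mid) {
            pageSum += arr[i];
        } else {
            studentCount++;
            pageSum = arr[i];
            if (studentCount > m || arr[i] > mid) return false;
        }
    }
    return true;
}

// Hypothetical linear-search variant: try each candidate limit in increasing order.
int allocateLinear(int arr[], int n, int m) {
    int total = 0, largest = 0;
    for (int i = 0; i < n; i++) {
        total += arr[i];
        if (arr[i] > largest) largest = arr[i];
    }
    // First feasible limit is the minimum, since feasibility is monotonic in cap.
    for (int cap = largest; cap <= total; cap++)
        if (isPossible(arr, n, m, cap)) return cap;
    return -1;
}
```

Binary search performs about log2(sum of pages) feasibility checks, while this version may perform up to (sum of pages) of them, so the time complexity worsens from O(n * log(sum_of_pages)) to O(n * sum_of_pages).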