Allocate Minimum Pages (Binary Search on the Answer) in JavaScript - Time & Space Complexity
We want to measure how fast the binary search approach runs when allocating minimum pages to students. How does the number of operations grow as the number of books or the total page count increases?
Analyze the time complexity of the following code snippet.
```javascript
// Greedy feasibility check: can the books be split among at most `students`
// readers so that no one gets more than `maxPages` pages?
function isPossible(books, students, maxPages) {
  let requiredStudents = 1;
  let currentSum = 0;
  for (let pages of books) {
    // A single book larger than the cap can never be allocated.
    if (pages > maxPages) return false;
    if (currentSum + pages > maxPages) {
      // Current student is full; hand this book to the next student.
      requiredStudents++;
      currentSum = pages;
    } else {
      currentSum += pages;
    }
  }
  return requiredStudents <= students;
}

// Binary search on the answer: the smallest cap for which allocation is feasible.
function allocateMinimumPages(books, students) {
  let start = 0; // isPossible rejects any mid smaller than the largest book
  let end = books.reduce((a, b) => a + b, 0); // one student takes everything
  let result = end;
  while (start <= end) {
    let mid = Math.floor((start + end) / 2);
    if (isPossible(books, students, mid)) {
      result = mid; // feasible: remember it and try a smaller cap
      end = mid - 1;
    } else {
      start = mid + 1; // infeasible: need a larger cap
    }
  }
  return result;
}
```
This code tries to find the smallest maximum number of pages assigned to any student by using binary search on the answer range.
- Primary operation: the `while` loop performing binary search on the range of possible page sums.
- How many times: about `log(sum of pages)` iterations.
- Secondary operation: the `for` loop inside `isPossible` that checks whether an allocation is possible for a given `maxPages`.
- How many times: once per binary search iteration, scanning all books, so `n` operations per check.
As the number of books n grows, the inner check runs n times each binary search step. The binary search runs about log(S) times, where S is the sum of all pages.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 (books) | 10 x log(S) (e.g., 10 x 7 = 70) |
| 100 (books) | 100 x log(S) (e.g., 100 x 14 = 1400) |
| 1000 (books) | 1000 x log(S) (e.g., 1000 x 20 = 20000) |
Pattern observation: The operations grow linearly with the number of books and logarithmically with the total pages sum.
Time Complexity: O(n × log S), where n is the number of books and S is the total number of pages.

This means the time grows linearly with the number of books and logarithmically with the total page sum, making the approach efficient even for large inputs.

Space Complexity: O(1) auxiliary, since both functions use only a handful of scalar variables beyond the input array.
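As a rough empirical check, here is a sketch that mirrors the two functions above with a hypothetical counter (`innerOps`, my addition) tracking how many times the inner loop body runs:

```javascript
let innerOps = 0; // counts iterations of the for loop inside isPossible

function isPossible(books, students, maxPages) {
  let requiredStudents = 1;
  let currentSum = 0;
  for (let pages of books) {
    innerOps++; // one unit of work per book per feasibility check
    if (pages > maxPages) return false;
    if (currentSum + pages > maxPages) {
      requiredStudents++;
      currentSum = pages;
    } else {
      currentSum += pages;
    }
  }
  return requiredStudents <= students;
}

function allocateMinimumPages(books, students) {
  let start = 0;
  let end = books.reduce((a, b) => a + b, 0);
  let result = end;
  while (start <= end) {
    const mid = Math.floor((start + end) / 2);
    if (isPossible(books, students, mid)) {
      result = mid;
      end = mid - 1;
    } else {
      start = mid + 1;
    }
  }
  return result;
}

// n = 100 books with page counts cycling 1..10, so S = 550 and log2(S) is about 9:
// innerOps stays on the order of n * log2(S), far below n * S.
const books = Array.from({ length: 100 }, (_, i) => (i % 10) + 1);
const answer = allocateMinimumPages(books, 4);
console.log(answer, innerOps);
```

Doubling n roughly doubles `innerOps`, while multiplying every page count by 1000 only adds about 10 binary search rounds.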
[X] Wrong: "The binary search runs in O(log n) because it depends on the number of books."
[OK] Correct: The binary search runs over the range of possible page sums, so its iteration count is about log(S), where S is the total number of pages; it does not depend directly on the number of books.
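To make this concrete, here is a minimal sketch (my simplification: it only simulates the range shrinking by always taking the `end = mid - 1` branch; the real search halves the range either way):

```javascript
// Iterations of `while (start <= end)` over the answer range [0, sum] when the
// range shrinks by half each step: the count grows with log2(sum), not with n.
function searchSteps(sum) {
  let start = 0;
  let end = sum;
  let count = 0;
  while (start <= end) {
    count++;
    const mid = Math.floor((start + end) / 2);
    end = mid - 1; // always shrink toward the left half
  }
  return count;
}

console.log(searchSteps(200));    // 7
console.log(searchSteps(200000)); // 17: a 1000x larger sum adds only 10 steps
```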
Understanding this time complexity helps you explain how binary search can be applied beyond simple arrays, showing your problem-solving skills with searching answer spaces efficiently.
What if we replaced the linear check in isPossible with a more complex operation? How would that affect the overall time complexity?