Divide and Conquer Strategy and Recurrence Relations in DSA (C) - Time & Space Complexity
We want to understand how dividing a problem into smaller parts affects the time it takes to solve it.
How does breaking a task into pieces change the total work needed?
Analyze the time complexity of the following code snippet.
```c
void divideAndConquer(int n) {
    if (n <= 1) return;       // base case: nothing left to split
    divideAndConquer(n / 2);  // solve the first half
    divideAndConquer(n / 2);  // solve the second half
    // plus a constant amount of work per call
}
```
This code splits the problem into two halves recursively and does a small fixed amount of work each time.
Identify the repeated work: loops, recursive calls, and array traversals.
- Primary operation: Two recursive calls each on half the input size.
- How many times: The input size halves on each call, so the recursion runs about log2(n) levels deep before reaching 1.
Each call splits into two smaller calls, so the number of calls doubles at every level of the recursion while the size of each call halves.
| Input Size (n) | Approx. Calls (about 2n - 1) |
|---|---|
| 10 | About 19 calls |
| 100 | About 199 calls |
| 1000 | About 1999 calls |
Pattern observation: The number of calls roughly doubles each time the input size doubles, which is the signature of linear growth: far slower than squaring.
Time Complexity: O(n)
The recurrence is T(n) = 2T(n/2) + c. Level i of the recursion tree has 2^i calls, each doing constant work, and there are about log2(n) levels, so the total is 1 + 2 + 4 + ... + n, roughly 2n calls. The total work grows in direct proportion to the input size.
[X] Wrong: "Because there are two recursive calls, the time must be exponential like O(2^n)."
[OK] Correct: Each call works on half the input, so the recursion is only log2(n) levels deep and the total number of calls is about 2n, which grows linearly, not exponentially.
Understanding how dividing a problem affects time helps you explain and design efficient algorithms confidently.
"What if the code made three recursive calls instead of two, each on one-third of the input? How would the time complexity change?"