Imagine you have a big problem to solve. You decide to split it into smaller parts, solve each part, and then combine the answers. What is the main advantage of this approach?
Think about how breaking a big task into smaller tasks helps you work faster.
Divide and conquer works by breaking a big problem into smaller, easier problems. This reduces the size of the problem at each step, making it simpler to solve and often faster overall.
What is the printed output after running this simplified merge sort divide step on the array [4, 2, 7, 1]?
#include <stdio.h>

void printArray(int arr[], int size) {
    for (int i = 0; i < size; i++) {
        printf("%d ", arr[i]);
    }
    printf("\n");
}

void mergeSortDivide(int arr[], int start, int end) {
    if (start < end) {
        int mid = (start + end) / 2;
        printArray(arr + start, mid - start + 1);
        printArray(arr + mid + 1, end - mid);
    }
}

int main() {
    int arr[] = {4, 2, 7, 1};
    mergeSortDivide(arr, 0, 3);
    return 0;
}
Look at how the array is split into two halves and printed separately.
With start = 0 and end = 3, mid = (0 + 3) / 2 = 1, so the array [4, 2, 7, 1] is split into [4, 2] and [7, 1]. The code prints each half on its own line:
4 2
7 1
What is the output of the following code that prints the pivot chosen at each recursive call of quick sort on the array [3, 6, 8, 10, 1, 2, 1]?
#include <stdio.h>

void quickSort(int arr[], int low, int high) {
    if (low < high) {
        int pivot = arr[high];
        printf("%d ", pivot);
        int i = low - 1;
        for (int j = low; j < high; j++) {
            if (arr[j] < pivot) {
                i++;
                int temp = arr[i];
                arr[i] = arr[j];
                arr[j] = temp;
            }
        }
        int temp = arr[i + 1];
        arr[i + 1] = arr[high];
        arr[high] = temp;
        int pi = i + 1;
        quickSort(arr, low, pi - 1);
        quickSort(arr, pi + 1, high);
    }
}

int main() {
    int arr[] = {3, 6, 8, 10, 1, 2, 1};
    quickSort(arr, 0, 6);
    return 0;
}
Trace the pivot chosen at each recursive call carefully.
The pivot is always the last element of the current subarray (Lomuto partition). Tracing the recursion: the first call (indices 0-6) picks pivot 1; partitioning places it at index 0, leaving [1, 6, 8, 10, 1, 2, 3]. The call on indices 1-6 picks pivot 3 and splits into indices 1-2 and 4-6. The call on 1-2 prints pivot 2; the call on 4-6 prints pivot 10, and its left half (indices 4-5) prints pivot 8. The printed output is:
1 3 2 10 8
Why does breaking a problem into smaller parts and solving each part separately often lead to faster algorithms?
Think about how working on small tasks separately can save time compared to one big task.
Divide and conquer reduces the total work by solving smaller problems independently and combining the results efficiently, which often improves the time complexity. For example, merge sort's split-and-merge strategy runs in O(n log n), versus O(n^2) for a simple algorithm like insertion sort.
Consider a divide and conquer algorithm that splits a problem of size n into 3 equal parts, solves each recursively, and combines results in linear time. How many subproblems are solved at the second level of recursion?
At each level, the number of subproblems multiplies by how many parts you split into.
At the first level there are 3 subproblems. Each of these splits into 3 more at the second level, so the total at the second level is 3 * 3 = 9. In general, a split into 3 parts produces 3^k subproblems at level k of the recursion.