Convert Sorted Array to Balanced BST in Go - Time & Space Complexity
We want to understand how the time needed to build a balanced binary search tree changes as the input array grows.
Specifically, how does the work increase when we convert a sorted array into a balanced BST?
Analyze the time complexity of the following code snippet.
```go
// Assumes the usual TreeNode definition:
// type TreeNode struct { Val int; Left, Right *TreeNode }
func sortedArrayToBST(nums []int) *TreeNode {
	if len(nums) == 0 {
		return nil
	}
	mid := len(nums) / 2
	root := &TreeNode{Val: nums[mid]}
	root.Left = sortedArrayToBST(nums[:mid])
	root.Right = sortedArrayToBST(nums[mid+1:])
	return root
}
```
This code builds a balanced BST by choosing the middle element as root and recursively doing the same for left and right halves.
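To see the construction end to end, here is a minimal, self-contained sketch that builds the tree from a sorted array and verifies it with an in-order traversal (the `inorder` helper is added here for illustration; it is not part of the original snippet):

```go
package main

import "fmt"

// TreeNode is the standard binary tree node used by sortedArrayToBST.
type TreeNode struct {
	Val         int
	Left, Right *TreeNode
}

func sortedArrayToBST(nums []int) *TreeNode {
	if len(nums) == 0 {
		return nil
	}
	mid := len(nums) / 2
	root := &TreeNode{Val: nums[mid]}
	root.Left = sortedArrayToBST(nums[:mid])
	root.Right = sortedArrayToBST(nums[mid+1:])
	return root
}

// inorder visits left-root-right; for a valid BST this yields sorted order.
func inorder(t *TreeNode, out *[]int) {
	if t == nil {
		return
	}
	inorder(t.Left, out)
	*out = append(*out, t.Val)
	inorder(t.Right, out)
}

func main() {
	nums := []int{1, 2, 3, 4, 5, 6, 7}
	root := sortedArrayToBST(nums)
	var vals []int
	inorder(root, &vals)
	fmt.Println(root.Val) // 4 (the middle element becomes the root)
	fmt.Println(vals)     // [1 2 3 4 5 6 7]
}
```

The in-order traversal returning the original sorted array confirms the BST property holds after construction.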
Identify the repeated work: any loops, recursive calls, or array traversals.
- Primary operation: Recursive calls that split the slice and create one node each.
- How many times: Each element is processed exactly once to create its node.
Each recursive call splits the slice roughly in half, and every element is visited exactly once to create a node. Note that in Go, the sub-slices `nums[:mid]` and `nums[mid+1:]` share the original backing array, so each split is an O(1) operation with no copying.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 node creations and splits |
| 100 | About 100 node creations and splits |
| 1000 | About 1000 node creations and splits |
Pattern observation: The work grows roughly in direct proportion to the input size.
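The table above can be checked empirically. The sketch below is the same midpoint construction instrumented with a counter (the counter parameter is added here for demonstration, not part of the original function):

```go
package main

import "fmt"

type TreeNode struct {
	Val         int
	Left, Right *TreeNode
}

// buildCounted mirrors sortedArrayToBST but increments ops once per
// node created, so we can watch the work grow with n.
func buildCounted(nums []int, ops *int) *TreeNode {
	if len(nums) == 0 {
		return nil
	}
	*ops++ // exactly one node creation per element
	mid := len(nums) / 2
	root := &TreeNode{Val: nums[mid]}
	root.Left = buildCounted(nums[:mid], ops)
	root.Right = buildCounted(nums[mid+1:], ops)
	return root
}

func main() {
	for _, n := range []int{10, 100, 1000} {
		nums := make([]int, n)
		for i := range nums {
			nums[i] = i
		}
		ops := 0
		buildCounted(nums, &ops)
		fmt.Printf("n=%4d -> %4d node creations\n", n, ops)
	}
}
```

Running this prints exactly n node creations for each input size, matching the linear pattern in the table.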
Time Complexity: O(n)
This means the time to build the tree grows linearly with the number of elements in the array.
Space Complexity: O(n) for the output tree itself; the recursion stack adds only O(log n) auxiliary space, since each call halves the slice and the tree is balanced.
[X] Wrong: "Because the function calls itself twice, the time complexity is exponential, like O(2^n)."
[OK] Correct: Although the function calls itself twice, the two calls work on disjoint halves of the array. Each element is processed exactly once, so the total work sums linearly, not exponentially.
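One way to make this concrete is to count the recursive calls directly. For n elements the build makes 2n+1 calls (one per node, plus one empty-slice call for each of the n+1 nil child pointers), which is nowhere near 2^n. A small sketch, with the call counter added for demonstration:

```go
package main

import "fmt"

type TreeNode struct {
	Val         int
	Left, Right *TreeNode
}

// build is the midpoint construction with a counter covering every
// recursive call, including the ones that hit the empty-slice base case.
func build(nums []int, calls *int) *TreeNode {
	*calls++
	if len(nums) == 0 {
		return nil
	}
	mid := len(nums) / 2
	root := &TreeNode{Val: nums[mid]}
	root.Left = build(nums[:mid], calls)
	root.Right = build(nums[mid+1:], calls)
	return root
}

func main() {
	nums := make([]int, 20)
	calls := 0
	build(nums, &calls)
	// 2n+1 = 41 calls for n = 20, versus 2^20 = 1,048,576.
	fmt.Println(calls) // 41
}
```

The two calls per level do not multiply into exponential work because each one receives a strictly smaller, disjoint piece of the input.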
Understanding this helps you explain how recursion and divide-and-conquer work together efficiently to build balanced trees.
"What if we did not pick the middle element but always picked the first element as root? How would the time complexity change?"