Merge Overlapping Intervals in DSA Python - Time & Space Complexity
We want to understand how the time needed to merge overlapping intervals changes as the number of intervals grows.
Specifically, how does the work increase when we have more intervals to merge?
Analyze the time complexity of the following code snippet.
```python
def merge_intervals(intervals):
    # Sort by start time so overlapping intervals end up next to each other
    intervals.sort(key=lambda x: x[0])
    merged = []
    for interval in intervals:
        # Start a new group if there is nothing yet or no overlap with the last group
        if not merged or merged[-1][1] < interval[0]:
            merged.append(interval)
        else:
            # Overlap: extend the end of the last merged interval
            merged[-1][1] = max(merged[-1][1], interval[1])
    return merged
```
This code sorts intervals by start time and then merges any that overlap into one.
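As a quick sanity check, here is a minimal usage sketch; the sample intervals are illustrative and not from the original text (lists are used instead of tuples so the end values can be mutated):

```python
intervals = [[1, 3], [2, 6], [8, 10], [15, 18]]  # [2, 6] overlaps [1, 3]
print(merge_intervals(intervals))  # expected: [[1, 6], [8, 10], [15, 18]]
```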
Identify the loops, recursion, and array traversals that repeat.
- Primary operation: Sorting the list of intervals and then looping through each interval once.
- How many times: Sorting takes roughly n log n comparisons for n intervals, and the merge loop runs once per interval (n times).
Sorting grows faster than just looping, so it dominates the time.
| Input Size (n) | Approx. Operations (n log₂ n) |
|---|---|
| 10 | 10 × log₂ 10 ≈ 33 |
| 100 | 100 × log₂ 100 ≈ 664 |
| 1000 | 1000 × log₂ 1000 ≈ 9966 |
Pattern observation: As input size grows, operations grow a bit more than linearly because of sorting.
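The figures in the table can be reproduced with a quick back-of-the-envelope calculation; this is a sketch assuming base-2 logarithms, which is what the table uses:

```python
import math

for n in [10, 100, 1000]:
    # n * log2(n) is the rough comparison count for a comparison-based sort
    print(n, round(n * math.log2(n)))
# 10 33
# 100 664
# 1000 9966
```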
Time Complexity: O(n log n)
This means the time grows slightly faster than linearly in the number of intervals: sorting contributes the extra log n factor, while the merging pass itself is linear.
[X] Wrong: "Merging intervals takes only O(n) time because we just loop once."
[OK] Correct: Sorting the intervals first takes more time than just looping, so the total time is dominated by sorting, not just the loop.
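If you want to see the two steps separately, a rough timing sketch like the one below compares them. The data generation and names are illustrative, and the exact numbers depend on your machine and Python's constant factors:

```python
import random
import time

n = 200_000
# Illustrative random intervals: unique starts with small random lengths
starts = random.sample(range(10 * n), n)
intervals = [[s, s + random.randint(1, 50)] for s in starts]

t0 = time.perf_counter()
intervals.sort(key=lambda x: x[0])          # the O(n log n) step
t1 = time.perf_counter()

merged = []
for interval in intervals:                  # the O(n) merge pass
    if not merged or merged[-1][1] < interval[0]:
        merged.append(interval)
    else:
        merged[-1][1] = max(merged[-1][1], interval[1])
t2 = time.perf_counter()

print(f"sort: {t1 - t0:.3f}s, merge pass: {t2 - t1:.3f}s")
```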
Understanding this time complexity helps you explain why sorting matters before merging and shows that you can analyze the cost of combined steps.
"What if the intervals were already sorted? How would the time complexity change?"