Array comparison and set operations in Ruby - Time & Space Complexity
When we compare arrays or perform set operations, we want to know how the running time grows as the arrays get bigger. The question to ask: how does the work change as the input arrays grow in size?
Analyze the time complexity of the following code snippet.
```ruby
arr1 = [1, 2, 3, 4, 5]
arr2 = [4, 5, 6, 7, 8]

# Find common elements
common = arr1 & arr2

# Find elements in arr1 but not in arr2
diff = arr1 - arr2

# Check if arr1 includes all elements of arr2
includes_all = (arr2 - arr1).empty?
```
This code finds common elements, differences, and checks inclusion between two arrays.
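For the sample arrays above, the three operations produce:

```ruby
arr1 = [1, 2, 3, 4, 5]
arr2 = [4, 5, 6, 7, 8]

arr1 & arr2           # => [4, 5]        (order follows arr1)
arr1 - arr2           # => [1, 2, 3]
(arr2 - arr1).empty?  # => false         (6, 7, and 8 are missing from arr1)
```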
Identify the repeated work: loops, recursion, or array traversals.
- Primary operation: Building a hash from one array and performing constant-time lookups for elements of the other.
- How many times: Hash construction takes O(m) time, followed by n lookups at O(1) average each, for O(n + m) total.
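The hash-based strategy above can be sketched in plain Ruby. Note this is an illustration of the technique, not Ruby's actual C implementation, and `hash_intersection` / `hash_difference` are hypothetical names:

```ruby
# Build a lookup table from arr2 in O(m), then scan arr1 with
# O(1) average-case hash lookups, for O(n + m) total.
def hash_intersection(arr1, arr2)
  seen = {}
  arr2.each { |x| seen[x] = true }       # O(m) hash construction
  arr1.select { |x| seen.key?(x) }.uniq  # O(n) lookups; & drops duplicates
end

def hash_difference(arr1, arr2)
  seen = {}
  arr2.each { |x| seen[x] = true }       # O(m)
  arr1.reject { |x| seen.key?(x) }       # O(n); - keeps arr1's duplicates
end
```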
As the arrays get bigger, the number of operations grows linearly because Ruby uses hash tables for efficient lookups.
| Input Size (n = m) | Approx. Operations (n + m) |
|---|---|
| 10 | About 20 operations |
| 100 | About 200 operations |
| 1000 | About 2000 operations |
Pattern observation: The work grows linearly with the input size.
Time Complexity: O(n + m)
This means the time needed grows proportionally to the sum of the sizes of the two arrays.
[X] Wrong: "Array set operations like & and - take O(n * m) time with linear scans."
[OK] Correct: Ruby uses hash tables internally for O(1) average-case lookups after O(m) preprocessing, for total O(n + m).
Understanding how array comparisons scale helps you explain your code choices clearly and shows you know how to handle bigger data efficiently.
What if we implemented these operations using nested loops instead? How would the time complexity change?
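One way to explore that question: a naive version (`slow_intersection` and `slow_difference` are hypothetical names) checks every element of arr1 against every element of arr2 with linear scans, so the work grows as O(n * m) instead of O(n + m):

```ruby
# Naive versions: Array#include? is a linear scan, so each of the n
# elements of arr1 triggers up to m comparisons, for O(n * m) total.
def slow_intersection(arr1, arr2)
  arr1.select { |x| arr2.include?(x) }.uniq
end

def slow_difference(arr1, arr2)
  arr1.reject { |x| arr2.include?(x) }
end
```

For n = m = 1000 this is on the order of a million comparisons rather than roughly two thousand operations, which is why the hash-backed built-ins matter as data grows.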