Lazy enumerators in Ruby - Time & Space Complexity
When using lazy enumerators in Ruby, we want to understand how the program's work grows as we process more items.
We ask: How does delaying work until needed affect the total steps the program takes?
Analyze the time complexity of the following Ruby code using lazy enumerators.
```ruby
numbers = (1..Float::INFINITY).lazy
squares = numbers.map { |n| n * n }
first_five = squares.first(5)
puts first_five.inspect
```
This code creates an infinite sequence of numbers, squares them lazily, and then takes the first five squares.
Look for loops or repeated steps in the code.
- Primary operation: Calculating the square of each number.
- How many times: Only for the first 5 numbers: `first(5)` pulls values one at a time, and the lazy chain stops once five squares have been produced.
Since the code only processes as many items as requested, work grows directly with how many items we take.
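We can check this claim directly with a small sketch. The `operations` counter below is illustrative and not part of the original example; it simply records how many times the squaring block actually runs:

```ruby
# Count how many squares are actually computed when we take only 5.
# `operations` is an illustrative counter added for this demonstration.
operations = 0

squares = (1..Float::INFINITY).lazy.map do |n|
  operations += 1
  n * n
end

first_five = squares.first(5)

puts first_five.inspect  # [1, 4, 9, 16, 25]
puts operations          # 5 -- only the requested items were processed
```

Even though the source range is infinite, the block runs exactly five times, because each value is computed only when `first(5)` asks for it.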
| Items Requested (n) | Approx. Operations |
|---|---|
| 10 | 10 squares calculated |
| 100 | 100 squares calculated |
| 1000 | 1000 squares calculated |
Pattern observation: The work grows linearly with the number of items requested, not with the infinite source size.
Time Complexity: O(n), where n is the number of items requested.
This means the program does work proportional to how many items you ask for, not more.
[X] Wrong: "Lazy enumerators process all items in the infinite list before returning results."
[OK] Correct: Lazy enumerators only do work for items actually needed, so they stop early and avoid infinite work.
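One way to see the contrast is to count operations in an eager chain versus a lazy one. A finite range is used here, since an eager `map` over an infinite range would never finish; the counter variables are illustrative additions:

```ruby
# Eager: map builds the entire intermediate array before first(5) runs.
eager_work = 0
(1..1_000).map { |n| eager_work += 1; n * n }.first(5)
puts eager_work  # 1000 -- every element was squared

# Lazy: values are computed one at a time, on demand.
lazy_work = 0
(1..1_000).lazy.map { |n| lazy_work += 1; n * n }.first(5)
puts lazy_work   # 5 -- only the requested elements were squared
```

Both versions return the same five squares, but the eager version does 200 times more work here, and it would do unbounded work on an infinite source.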
Understanding lazy enumerators helps you explain how to handle large or infinite data efficiently, a useful skill in many coding challenges.
"What if we replaced first(5) with take_while { |x| x < 100 }? How would the time complexity change?"
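One way to explore that question is to reuse the operation-counting sketch from above. With `take_while`, the amount of work is bounded by where the predicate first returns false, rather than by a fixed count (the `operations` counter is again an illustrative addition):

```ruby
# take_while pulls values until the predicate returns false,
# so work depends on where the condition fails, not on a fixed count.
operations = 0

under_100 = (1..Float::INFINITY).lazy
  .map { |n| operations += 1; n * n }
  .take_while { |x| x < 100 }
  .to_a

puts under_100.inspect  # [1, 4, 9, 16, 25, 36, 49, 64, 81]
puts operations         # 10 -- stopped when 10 * 10 = 100 failed the test
```

The chain computes one extra square (the first failing value) before stopping, so the work is proportional to the position at which the condition fails, which depends on the data rather than on an explicit request size.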