Lambda with map() in Python - Time & Space Complexity
We want to understand how the time needed to run a lambda function with map() changes as the input list grows.
How does the program's work increase when we give it more items to process?
Analyze the time complexity of the following code snippet.
```python
numbers = [1, 2, 3, 4, 5]
squared = list(map(lambda x: x * x, numbers))
print(squared)  # [1, 4, 9, 16, 25]
```
This code takes a list of numbers and uses map with a lambda to square each number, creating a new list.
Identify the repeated operations: loops, recursion, or traversals over the data.
- Primary operation: Applying the lambda function to each item in the list.
- How many times: Once for every element in the input list.
As the list gets bigger, the program does more work by applying the lambda to each new item.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 lambda calls |
| 100 | 100 lambda calls |
| 1000 | 1000 lambda calls |
Pattern observation: The work grows directly with the number of items. Double the items, double the work.
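One way to check the pattern in the table is to count how many times the function actually runs. This is a small sketch: the helper `count_calls` and the named `square` function (standing in for the lambda so we can attach a counter) are added here for illustration.

```python
def count_calls(n):
    """Return how many times the squaring function runs for n items."""
    calls = 0

    def square(x):
        nonlocal calls
        calls += 1  # tally each application, then do the same work as the lambda
        return x * x

    list(map(square, range(n)))  # list() forces map to process every item
    return calls

print(count_calls(10))   # 10
print(count_calls(100))  # 100
# Doubling the input doubles the calls: the signature of O(n) growth.
print(count_calls(200))  # 200
```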
Time Complexity: O(n)
This means the time to finish grows in a straight line with the number of items to process.
[X] Wrong: "Using map with lambda is faster because it runs all at once."
[OK] Correct: The lambda still runs once per item, so the total work depends on the list size, not on running all at once.
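A quick way to see this for yourself: in Python 3, map returns a lazy iterator, so the lambda runs one item at a time as results are requested, not all at once. In this sketch, the `calls` list is added purely to record each lambda invocation.

```python
calls = []
lazy = map(lambda x: calls.append(x) or x * x, [1, 2, 3])

# Nothing has run yet: map builds a lazy iterator.
print(len(calls))   # 0

first = next(lazy)  # pulls exactly one item through the lambda
print(first)        # 1
print(len(calls))   # 1

rest = list(lazy)   # consuming the rest runs the lambda once per remaining item
print(rest)         # [4, 9]
print(len(calls))   # 3
```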
Understanding how map and lambda scale helps you explain clearly and confidently how your code handles bigger data.
"What if we replaced map with a list comprehension using the same lambda? How would the time complexity change?"
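As a starting point for that question, here is a sketch comparing the two forms side by side. Both apply the squaring function exactly once per element, so both are O(n); the difference is style and readability, not complexity.

```python
numbers = [1, 2, 3, 4, 5]
square = lambda x: x * x  # named here only so both versions share one function

via_map = list(map(square, numbers))
via_comprehension = [square(x) for x in numbers]

print(via_map)            # [1, 4, 9, 16, 25]
print(via_comprehension)  # [1, 4, 9, 16, 25]
# Same results, one function call per element either way: still O(n).
```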