Integer random with integers() in NumPy - Time & Space Complexity
We want to understand how the time it takes to generate random integers changes as we request more of them.
How does the work grow as we increase the number of random integers generated?
Analyze the time complexity of the following code snippet.
```python
import numpy as np

# Generate 1 million random integers between 0 and 9 (high=10 is exclusive)
random_numbers = np.random.default_rng().integers(low=0, high=10, size=1_000_000)
```
This code creates an array of one million random integers from 0 to 9.
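To see that the result really is a one-million-element array of values in that range, we can inspect it directly. This is a minimal sketch; the seed value is our own choice (not from the snippet above) and is there only to make the run reproducible:

```python
import numpy as np

# Seeded generator so the run is reproducible (the seed is an illustrative choice)
rng = np.random.default_rng(seed=42)
random_numbers = rng.integers(low=0, high=10, size=1_000_000)

print(random_numbers.shape)                         # (1000000,)
print(random_numbers.min(), random_numbers.max())   # every value lies in [0, 9]
```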
Identify the loops, recursion, or array traversals that repeat work.
- Primary operation: generating one random integer (done internally, in a compiled loop).
- How many times: once per element requested, i.e. `size` times.
As we ask for more random numbers, the work grows directly with how many numbers we want.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 random numbers generated |
| 100 | About 100 random numbers generated |
| 1000 | About 1000 random numbers generated |
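The pattern in the table can be checked empirically with a rough timing sketch. Absolute times depend entirely on the machine, but each tenfold increase in `n` should cost roughly tenfold more time:

```python
import timeit
import numpy as np

rng = np.random.default_rng()

# Time each size over several repetitions; the sizes and repetition count
# are illustrative choices, not prescribed by the text.
timings = {}
for n in [10_000, 100_000, 1_000_000]:
    timings[n] = timeit.timeit(lambda: rng.integers(0, 10, size=n), number=20)
    print(f"n={n:>9,}: {timings[n]:.4f} s")
```

You should see the measured times grow roughly in proportion to `n`, which is exactly the straight-line pattern the table predicts.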
Pattern observation: The number of operations grows in a straight line with the input size.
Time Complexity: O(n)
Space Complexity: O(n)
The time to generate random integers grows directly with how many numbers you request, and the output array must store every one of those numbers.
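The space side is easy to confirm: the array's memory footprint grows linearly with the number of elements. A small sketch (the sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng()

for n in [10, 100, 1000]:
    arr = rng.integers(0, 10, size=n)
    # nbytes is n elements times the bytes per element (8 for the default int64)
    print(n, arr.nbytes)
```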
[X] Wrong: "Generating many random numbers is almost instant no matter how many I ask for."
[OK] Correct: Each number requires some work, so asking for more numbers takes more time.
Understanding how time grows with input size helps you explain performance clearly and shows you know how algorithms scale in real tasks.
"What if we generate random integers without specifying the size (just one number)? How would the time complexity change?"