Uniform random with random() in NumPy - Time & Space Complexity
We want to understand how the time it takes to generate random numbers grows as we request more of them.
How does the cost change as we increase the number of random values generated?
Analyze the time complexity of the following code snippet.
```python
import numpy as np

n = 1000
random_numbers = np.random.random(n)
```
This code generates n random floats in the half-open interval [0.0, 1.0) using NumPy's random() function.
Identify the loops, recursion, or array traversals that repeat.
- Primary operation: Generating each random number independently.
- How many times: Exactly n times, once per number requested.
Each random number takes about the same time to generate, so the total time grows directly with how many numbers we ask for.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 random generations |
| 100 | 100 random generations |
| 1000 | 1000 random generations |
Pattern observation: Doubling n roughly doubles the work needed.
Time Complexity: O(n)
Space Complexity: O(n), since the result array stores one float per value requested.
This means the time to generate random numbers grows linearly with the number of values requested.
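The linear growth can be checked empirically. The sketch below times `np.random.random(n)` for increasing n using a hypothetical `time_generation` helper (not part of the original snippet); exact timings depend on your machine, but each doubling of n should roughly double the measured time.

```python
import time
import numpy as np

def time_generation(n):
    """Return the wall-clock time (seconds) for one call to np.random.random(n)."""
    start = time.perf_counter()
    np.random.random(n)
    return time.perf_counter() - start

# Warm-up call so one-time setup costs don't skew the first measurement.
time_generation(1_000)

# Each size is double the previous; timings should grow roughly in proportion.
for n in (100_000, 200_000, 400_000):
    print(n, time_generation(n))
```

Benchmarks like this are noisy, so averaging several runs per size gives a more reliable picture than a single measurement.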
[X] Wrong: "Generating multiple random numbers is done all at once, so time stays the same no matter how many numbers we ask for."
[OK] Correct: Each number requires its own calculation. NumPy runs that loop in compiled C, which makes it fast, but the work is still proportional to n.
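One way to see this is to compare the vectorized call with an explicit Python loop using the standard-library `random` module. Conceptually both perform n independent generations; NumPy just hides the loop inside compiled code.

```python
import random
import numpy as np

n = 5

# Vectorized: the O(n) loop runs inside NumPy's compiled C code.
vectorized = np.random.random(n)

# Explicit: the same n independent generations, written out in Python.
looped = [random.random() for _ in range(n)]

# Both produce n values in [0.0, 1.0); only the constant factor differs.
assert len(vectorized) == len(looped) == n
assert all(0.0 <= x < 1.0 for x in vectorized)
```

The vectorized version is much faster in practice, but the asymptotic complexity of both is the same: O(n).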
Understanding how generating random data scales helps you reason about performance in simulations and data sampling tasks.
"What if we generate a 2D array of random numbers with shape (n, m)? How would the time complexity change?"