Generating random samples in NumPy - Time & Space Complexity
We want to understand how the time needed to create random samples changes as we request more of them: how does the work grow as the number of random values generated increases?
Analyze the time complexity of the following code snippet.
```python
import numpy as np

# Generate n random samples from a normal distribution
n = 1000
samples = np.random.normal(loc=0, scale=1, size=n)
```
This code creates an array of n random numbers from a normal distribution.
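The same sampling can be written with NumPy's newer `Generator` API (`np.random.default_rng`), which is the interface recommended in current NumPy documentation; the complexity is identical (a sketch, with an arbitrary seed for reproducibility):

```python
import numpy as np

# The Generator API is the recommended interface in modern NumPy.
rng = np.random.default_rng(seed=42)  # seed chosen arbitrarily for reproducibility

n = 1000
samples = rng.normal(loc=0, scale=1, size=n)

# One value is produced per requested sample.
assert samples.shape == (n,)
```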
Identify the repeated work: any loops, recursion, or array traversals.
- Primary operation: Generating each random number in the array.
- How many times: Exactly n times, once for each sample requested.
As we ask for more samples, the time to generate them grows roughly in direct proportion.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 random number generations |
| 100 | About 100 random number generations |
| 1000 | About 1000 random number generations |
Pattern observation: Doubling the number of samples roughly doubles the work done.
Time Complexity: O(n)
This means the time to generate samples grows linearly with the number of samples requested.
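A rough timing experiment can make the linear growth visible. This is a sketch: exact times depend on your machine, and the specific sizes below are arbitrary choices.

```python
import time
import numpy as np

rng = np.random.default_rng(0)

def time_sampling(n):
    """Return the wall-clock time to draw n normal samples."""
    start = time.perf_counter()
    rng.normal(size=n)
    return time.perf_counter() - start

# Warm up once, then time two sizes that differ by a factor of 10.
time_sampling(10_000)
t_small = time_sampling(1_000_000)
t_large = time_sampling(10_000_000)

print(f"1e6 samples: {t_small:.4f}s, 1e7 samples: {t_large:.4f}s")
# On most machines the ratio of the two times is roughly 10, consistent with O(n).
```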
[X] Wrong: "Generating 1000 samples takes the same time as generating 10 samples because it's just one function call."
[OK] Correct: Each sample requires work, so more samples mean more time, even though it all happens inside a single function call.
Understanding how time grows with input size helps you explain performance clearly and shows you can think about efficiency in real tasks.
"What if we generate samples in batches of fixed size instead of all at once? How would the time complexity change?"