np.savetxt() and np.loadtxt() for text in NumPy - Time & Space Complexity
We want to understand how the time needed to save and load data with NumPy changes as the data size grows.
How does the time to write or read an array scale as the array gets bigger?
Analyze the time complexity of the following code snippet.
```python
import numpy as np

# Create a 1,000 x 10 array of random floats.
arr = np.random.rand(1000, 10)

# Write the array to a text file, one row per line.
np.savetxt('data.txt', arr)

# Parse the text file back into a 2D array.
loaded_arr = np.loadtxt('data.txt')
```
This code saves a 2D array to a text file and then loads it back into memory.
Look at what repeats when saving and loading.
- Primary operation: Writing or reading each element of the array one by one.
- How many times: Once for every element in the array (rows x columns).
As the array size grows, the time to save or load grows roughly in proportion to the number of elements.
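One way to see the per-element work directly is to count what actually lands in the file: `np.savetxt` formats every element as text, so the file contains exactly rows x columns number tokens. A small sketch (the filename `demo.txt` is just an example):

```python
import numpy as np

# Save a small 50 x 10 array, then count the formatted numbers in the file.
arr = np.random.rand(50, 10)
np.savetxt('demo.txt', arr)

with open('demo.txt') as f:
    tokens = f.read().split()

# One formatted number is written for every element (rows x columns).
print(len(tokens), arr.size)  # prints: 500 500
```

Loading does the reverse: `np.loadtxt` must parse every one of those tokens back into a float, which is why both directions scale with the element count.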
| Input Size (rows x columns) | Approx. Operations |
|---|---|
| 10 x 10 = 100 | About 100 operations |
| 100 x 10 = 1,000 | About 1,000 operations |
| 1,000 x 10 = 10,000 | About 10,000 operations |
Pattern observation: The time grows linearly with the total number of elements.
Time Complexity: O(n), where n is the total number of elements (rows x columns).
This means the time to save or load grows directly with the number of elements in the array. Space complexity is also O(n): the array occupies memory proportional to its size, and the file stores one formatted value per element.
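You can check the linear pattern empirically with a rough timing sketch. Absolute times depend entirely on the machine and disk; the point is only that ten times the elements costs roughly ten times the wall-clock time:

```python
import time
import numpy as np

# Time a save + load round trip for two array sizes that differ by 10x.
for rows in (1_000, 10_000):
    arr = np.random.rand(rows, 10)
    start = time.perf_counter()
    np.savetxt('data.txt', arr)
    loaded = np.loadtxt('data.txt')
    elapsed = time.perf_counter() - start
    print(f"{rows * 10:>7} elements: {elapsed:.4f} s")
```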
[X] Wrong: "Saving or loading a file takes the same time no matter how big the array is."
[OK] Correct: The program must process each element, so bigger arrays take more time to write or read.
Understanding how file input/output scales helps you handle data efficiently in real projects and demonstrates an eye for performance.
"What if we saved the array in binary format instead of text? How would the time complexity change?"