transpose() for swapping axes in NumPy - Time & Space Complexity
We want to understand how the time needed to swap axes in a NumPy array changes as the array size grows.
How does the work grow when we use transpose() to swap axes?
Analyze the time complexity of the following code snippet.
```python
import numpy as np

arr = np.random.rand(1000, 500)   # 2D array: 1000 rows, 500 columns
transposed_arr = arr.transpose()  # shape becomes (500, 1000)
```
This code creates a 2D array and swaps its rows and columns using transpose().
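One way to see why this is cheap is to inspect the array's metadata directly; the sketch below (using the standard `shape` and `strides` attributes) shows that `transpose()` merely swaps them:

```python
import numpy as np

arr = np.random.rand(1000, 500)
transposed_arr = arr.transpose()

# The shape and strides tuples are simply swapped; the data buffer is untouched.
print(arr.shape, arr.strides)                        # e.g. (1000, 500) (4000, 8)
print(transposed_arr.shape, transposed_arr.strides)  # e.g. (500, 1000) (8, 4000)

# Strides are in bytes (8 bytes per float64 element).
assert transposed_arr.shape == (500, 1000)
assert transposed_arr.strides == (arr.strides[1], arr.strides[0])
```

Because only these two small tuples change, the cost is the same whether the array has a hundred elements or a billion.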
Identify any repeated work: loops, recursion, or array traversals.
- Primary operation: Adjusting the strides and shape metadata (no data movement).
- How many times: a fixed, small number of metadata updates, independent of input size (O(1)).
When the array size grows, the execution time remains constant since no data is copied or moved.
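An informal timing check (exact numbers will vary by machine) makes this concrete; here is a small sketch that times just the `transpose()` call for increasing sizes:

```python
import time
import numpy as np

for n in (100, 1000, 3000):
    arr = np.random.rand(n, n)
    start = time.perf_counter()
    transposed = arr.transpose()  # metadata-only: swaps shape and strides
    elapsed = time.perf_counter() - start
    print(f"{n} x {n}: {elapsed * 1e6:.1f} microseconds")
```

On a typical machine each call finishes in about a microsecond regardless of `n`, while actually copying the data would slow down roughly 900-fold between the first and last size.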
| Input Size (rows x cols) | Approx. Operations |
|---|---|
| 10 x 10 | 5 |
| 100 x 100 | 5 |
| 1000 x 1000 | 5 |
Pattern observation: The work stays constant regardless of input size (O(1)).
Time Complexity: O(1)
This means the time to transpose is constant, independent of the array size.
[X] Wrong: "Transposing takes time proportional to array size because it rearranges data."
[OK] Correct: NumPy's transpose() creates a view without copying data; it only adjusts the strides and shape, taking constant O(1) time.
Understanding how data rearrangement scales helps you explain performance in data processing tasks clearly and confidently.
What if we used transpose((1, 0, 2)) on a 3D array instead? How would the time complexity change?
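The same reasoning extends to higher dimensions: transpose((1, 0, 2)) just permutes the shape and strides tuples according to the given axis order, so it also returns a view in O(1) time. A small sketch:

```python
import numpy as np

arr3d = np.random.rand(4, 5, 6)
swapped = arr3d.transpose((1, 0, 2))  # swap the first two axes, keep the last

print(swapped.shape)                     # (5, 4, 6)
print(np.shares_memory(arr3d, swapped))  # True: still a view, no copy
```

The permutation tuple has one entry per axis, so even for many dimensions the work is proportional to the (small, fixed) number of axes, not to the number of elements.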