Array Slicing in Bash Scripting: Time & Space Complexity
We want to understand how the time it takes to slice an array grows as the slice gets bigger: how does the number of elements we take affect the work done?
Analyze the time complexity of the following code snippet.
```bash
arr=(one two three four five six seven eight nine ten)
slice=("${arr[@]:2:4}")
echo "${slice[@]}"
```
This code takes a slice of 4 elements starting from index 2 of the array and prints them.
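As a quick sanity check, here is the same slice alongside a second one that runs off the end of the array; Bash clamps the length to the elements that actually remain:

```shell
arr=(one two three four five six seven eight nine ten)

# ${arr[@]:offset:length} -> up to `length` elements starting at `offset`
slice=("${arr[@]:2:4}")
echo "${slice[@]}"    # three four five six

# length is clamped to what remains past the offset
tail=("${arr[@]:8:5}")
echo "${tail[@]}"     # nine ten
```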
Identify the repeated operations: loops, recursion, or array traversals.
- Primary operation: Copying each element from the original array slice to the new slice.
- How many times: Once for each element in the slice (4 times in this example).
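Conceptually, the slice expansion does the same work as this explicit element-by-element copy (a sketch of the idea, not Bash's actual internal implementation):

```shell
arr=(one two three four five six seven eight nine ten)
offset=2
length=4

slice=()
# one copy per element taken -> k operations for a slice of length k
for ((i = offset; i < offset + length && i < ${#arr[@]}; i++)); do
  slice+=("${arr[i]}")
done

echo "${slice[@]}"   # three four five six
```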
As the number of elements you slice grows, the work grows in direct proportion.
| Input Size (slice length) | Approx. Operations |
|---|---|
| 10 | 10 copies |
| 100 | 100 copies |
| 1000 | 1000 copies |
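You can see that only the slice length matters by taking the same-length slice from two arrays of very different sizes (the sizes below are arbitrary): the number of elements copied is identical in both cases.

```shell
small=($(seq 1 100))
big=($(seq 1 100000))

k=50
s1=("${small[@]:0:$k}")
s2=("${big[@]:0:$k}")

# both copies touch exactly k elements, regardless of array size
echo "${#s1[@]} ${#s2[@]}"   # 50 50
```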
Pattern observation: The time grows linearly with the number of elements sliced.
Time Complexity: O(k), where k is the number of elements in the slice
This means the time depends directly on how many elements you slice, not on the total array size. The new slice also stores k elements, so the extra space used is O(k) as well.
[X] Wrong: "Slicing an array always takes time proportional to the whole array size."
[OK] Correct: Only the number of elements you copy matters, not the total array length.
Knowing how slicing scales helps you write efficient scripts and explain your code clearly in interviews.
"What if we slice the array without specifying length (till the end)? How would the time complexity change?"
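One way to explore that question: when the length is omitted, Bash copies everything from the offset to the end of the array, so the work tracks n - offset elements rather than a fixed k.

```shell
arr=(one two three four five six seven eight nine ten)

# Omitting the length slices from the offset to the end
rest=("${arr[@]:6}")
echo "${rest[@]}"     # seven eight nine ten
echo "${#rest[@]}"    # n - offset = 10 - 6 = 4 elements copied
```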