Iterating over arrays in Bash Scripting - Time & Space Complexity
When we run a script that goes through each item in a list, it takes some time. We want to understand how this time changes as the list gets bigger.
The question is: How does the work grow when we look at more items?
Analyze the time complexity of the following code snippet.
```bash
# Define an array
arr=(apple banana cherry date elderberry)

# Loop through each element
for item in "${arr[@]}"; do
  echo "$item"
done
```
This script prints each fruit name from the array one by one.
- Primary operation: the `for` loop that visits each item in the array.
- How many times it runs: once for every element in the array.
As the array gets bigger, the script prints more items, so it takes longer.
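To see this growth directly, here is a minimal sketch that builds arrays of increasing size and counts how many times the loop body runs. The array contents are placeholder numbers generated with `seq` (assumed available on the system); any n-element array behaves the same way.

```bash
#!/usr/bin/env bash
# Count loop iterations for arrays of growing size.
for n in 10 100 1000; do
  # Build an array of n dummy elements.
  arr=($(seq 1 "$n"))
  count=0
  for item in "${arr[@]}"; do
    count=$((count + 1))   # one unit of work per element
  done
  echo "n=$n -> $count iterations"
done
```

The iteration count matches the array size exactly, which is the pattern the table below makes explicit.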
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 print actions |
| 100 | 100 print actions |
| 1000 | 1000 print actions |
Pattern observation: The work grows directly with the number of items. Double the items, double the work.
Time Complexity: O(n)
This means the time to finish grows linearly with the number of items in the array: double the input, double the running time.
[X] Wrong: "The loop runs in constant time no matter how many items there are."
[OK] Correct: Each item needs to be handled once, so more items mean more work and more time.
Understanding how loops grow with input size is a key skill. It helps you explain your code clearly and shows you know how to think about efficiency.
"What if we nested another loop inside to compare each item with every other item? How would the time complexity change?"