Default values (${var:-default}) in Bash Scripting - Time & Space Complexity
We want to understand how the time it takes to run a script changes when using default values in bash variables.
Specifically, how does checking and using a default value affect the script's speed as input grows?
Analyze the time complexity of the following code snippet.
```bash
for file in /some/directory/*; do
  # ${file:-default.txt} expands to $file, or to default.txt
  # if $file is unset or empty.
  filename=${file:-default.txt}
  echo "Processing $filename"
done
```
This script loops over the files in a directory and prints each name. The `${file:-default.txt}` expansion substitutes `default.txt` only when `$file` is unset or empty; inside this loop the glob always sets `file`, so the default acts as a safety fallback rather than something that fires on every iteration.
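To make the expansion's behavior concrete, here is a quick standalone sketch (the variable name and values are just for illustration):

```bash
#!/usr/bin/env bash
# ${var:-default} substitutes the fallback only when var is unset or empty.

unset var
echo "unset:  ${var:-default.txt}"   # prints default.txt

var=""
echo "empty:  ${var:-default.txt}"   # prints default.txt

var="report.csv"
echo "set:    ${var:-default.txt}"   # prints report.csv
```

Note that the related form `${var-default}` (no colon) falls back only when the variable is unset, not when it is empty.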
Identify the repeated operations: loops, recursion, or array traversals.
- Primary operation: The for-loop iterates over each file name.
- How many times: Once for each file in the directory (n times).
- The default value check (${file:-default.txt}) happens inside the loop for each file.
Each file causes the loop to run once, doing a simple check and print.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 checks and prints |
| 100 | About 100 checks and prints |
| 1000 | About 1000 checks and prints |
Pattern observation: The work grows directly with the number of files; doubling files doubles work.
Time Complexity: O(n)
This means the time to run grows in a straight line with the number of files processed.
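One way to see the linear pattern concretely is to count the per-item checks. The `count_ops` helper below is a hypothetical sketch, not part of the original script; it runs the same expansion once per input and reports how many checks were performed:

```bash
#!/usr/bin/env bash
# Count how many times the default-value check runs for n inputs.
count_ops() {
  local n=$1 ops=0 name i
  for ((i = 1; i <= n; i++)); do
    name=${file:-default.txt}   # the constant-time check done per item
    ops=$((ops + 1))
  done
  echo "$ops"
}

count_ops 10     # prints 10
count_ops 100    # prints 100
count_ops 1000   # prints 1000
```

The count grows exactly with `n`, matching the table above: doubling the inputs doubles the work.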
[X] Wrong: "Using a default value check inside the loop makes the script run slower exponentially."
[OK] Correct: The default check is a simple operation done once per item, so it only adds a small constant time each loop, not exponential growth.
Understanding how simple variable checks inside loops affect performance helps you write efficient scripts and explain your reasoning clearly in interviews.
"What if we replaced the default value check with a function call inside the loop? How would the time complexity change?"