File test operators (-f, -d, -e, -r, -w, -x) in Bash Scripting - Time & Space Complexity
We want to see how the time spent checking file properties grows as the number of files increases. How does running file tests on many files affect the total run time? Analyze the time complexity of the following snippet.
```bash
for file in "$@"; do
  if [ -f "$file" ]; then
    echo "$file is a regular file"
  elif [ -d "$file" ]; then
    echo "$file is a directory"
  elif [ -e "$file" ]; then
    echo "$file exists but is not a regular file or directory"
  else
    echo "$file does not exist"
  fi
  if [ -r "$file" ]; then echo "$file is readable"; fi
  if [ -w "$file" ]; then echo "$file is writable"; fi
  if [ -x "$file" ]; then echo "$file is executable"; fi
done
```
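To experiment with the type-classification branch, it can be wrapped in a function and called on a few test paths. This is a minimal sketch; the name `classify_files` and the temporary paths are illustrative, not part of the original script:

```bash
#!/usr/bin/env bash
# Sketch: the type-classification branch from the snippet above,
# wrapped in a hypothetical function so it is easy to exercise.
classify_files() {
  for file in "$@"; do
    if [ -f "$file" ]; then
      echo "$file is a regular file"
    elif [ -d "$file" ]; then
      echo "$file is a directory"
    elif [ -e "$file" ]; then
      echo "$file exists but is not a regular file or directory"
    else
      echo "$file does not exist"
    fi
  done
}

# Try it on a freshly created file, a directory, and a missing path.
tmp=$(mktemp -d)
touch "$tmp/notes.txt"
classify_files "$tmp/notes.txt" "$tmp" "$tmp/missing"
rm -rf "$tmp"
```

Note that quoting `"$file"` matters: without the quotes, paths containing spaces would break the tests.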
This script checks each file passed as an argument for properties such as type and permissions.
Identify the repeated work: loops, recursion, or traversals.
- Primary operation: Looping over each file and running multiple file tests.
- How many times: Once per file in the input list.
Each file test (`-f`, `-d`, `-e`, and so on) is essentially a single constant-time system call, and each file triggers a fixed number of such tests, so the total work grows directly with the number of files.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 times the fixed checks |
| 100 | About 100 times the fixed checks |
| 1000 | About 1000 times the fixed checks |
Pattern observation: The total number of checks increases in direct proportion to the number of files.
Time Complexity: O(n)
This means the run time grows linearly, in a straight line, with the number of files checked.
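To make the linear growth concrete, here is a small counting sketch (the helper `count_checks` and the generated path names are illustrative, not from the article) that tallies four file tests per path:

```bash
#!/usr/bin/env bash
# Count how many file tests run for n paths (most will not exist,
# which is fine: a failed test still counts as one check).
# Four tests per path, so the total is always 4 * n: linear in n.
count_checks() {
  local n=$1 ops=0 i file
  for ((i = 0; i < n; i++)); do
    file="file_$i"                   # hypothetical path names
    [ -f "$file" ]; ops=$((ops + 1))
    [ -r "$file" ]; ops=$((ops + 1))
    [ -w "$file" ]; ops=$((ops + 1))
    [ -x "$file" ]; ops=$((ops + 1))
  done
  echo "$ops"
}

count_checks 10     # prints 40
count_checks 100    # prints 400
```

Doubling the number of paths doubles the count, which is exactly the O(n) pattern in the table above.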
[X] Wrong: "Checking multiple file properties at once makes the script slower exponentially."
[OK] Correct: Each file test is a simple check, so doing several per file adds a fixed amount of work, not exponential growth.
Understanding how loops and repeated checks affect time helps you write scripts that scale well with input size.
"What if we nested the file tests inside another loop over file attributes? How would the time complexity change?"
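One way to sketch that nested version (the function `check_attrs` and its flag list are illustrative assumptions): loop over each file and, inside, over a list of m test flags. That performs n × m checks, i.e. O(n · m); with a fixed flag list (m = 4 here), it is still O(n) in the number of files.

```bash
#!/usr/bin/env bash
# Nested loops: files on the outside, test flags on the inside.
# n files * m flags = O(n * m) checks; m is a constant 4 here.
check_attrs() {
  local flags=(-e -r -w -x) file flag
  for file in "$@"; do
    for flag in "${flags[@]}"; do
      if test "$flag" "$file"; then
        echo "$file passes $flag"
      fi
    done
  done
}

tmp=$(mktemp)            # a readable, writable, non-executable file
check_attrs "$tmp"       # prints the -e, -r, and -w lines
rm -f "$tmp"
```

Only if the attribute list itself grew with the input (say, m tests against each of n files where m also depends on n) would the complexity rise above linear.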