Backup automation script in Bash Scripting - Time & Space Complexity
When automating backups with a script, it helps to understand how the running time grows as the number of files increases.
Analyze the time complexity of the following code snippet.
```bash
#!/bin/bash
SOURCE_DIR="/path/to/source"
BACKUP_DIR="/path/to/backup"

for file in "$SOURCE_DIR"/*; do
    cp "$file" "$BACKUP_DIR/"
done
```
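As an aside, the same linear loop can be written a bit more defensively. This is a self-contained sketch (the temporary directories stand in for the real paths): `nullglob` makes the loop body run zero times when the directory is empty, and the `-f` test skips subdirectories so only regular files are copied.

```shell
#!/bin/bash
# Defensive variant of the same O(n) loop (a sketch):
# - nullglob: the glob expands to nothing for an empty directory,
#   so the loop simply does not run
# - [ -f ... ]: copy only regular files, skip subdirectories
shopt -s nullglob
SOURCE_DIR=$(mktemp -d)    # stand-in for /path/to/source
BACKUP_DIR=$(mktemp -d)    # stand-in for /path/to/backup
touch "$SOURCE_DIR/a.txt" "$SOURCE_DIR/b.txt"
mkdir "$SOURCE_DIR/subdir"

for file in "$SOURCE_DIR"/*; do
    [ -f "$file" ] && cp "$file" "$BACKUP_DIR/"
done

find "$BACKUP_DIR" -type f | wc -l   # counts the 2 copied files
```

The complexity is unchanged: one `cp` per regular file, still O(n).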
This script copies each file from the source folder to the backup folder one by one.
- Primary operation: the `for` loop body, which copies one file with `cp`.
- How many times it runs: once for every file in the source directory.
As the number of files n grows, the script performs one copy per file, so the running time grows roughly in direct proportion to n.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 file copies |
| 100 | 100 file copies |
| 1000 | 1000 file copies |
Pattern observation: Doubling the number of files roughly doubles the time taken.
Time Complexity: O(n)
This means the time to complete the backup grows linearly with the number of files.
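The linear pattern in the table can be checked directly. This self-contained sketch creates n files in a temporary directory, runs the same copy loop, and counts how many `cp` calls it makes:

```shell
#!/bin/bash
# Demo (a sketch): count the cp operations the backup loop
# performs for n files, using temporary directories.
count_copies() {
    local n="$1" copies=0
    local src dst
    src=$(mktemp -d)
    dst=$(mktemp -d)
    for i in $(seq 1 "$n"); do
        touch "$src/file$i"
    done
    for file in "$src"/*; do
        cp "$file" "$dst/"
        copies=$((copies + 1))
    done
    rm -rf "$src" "$dst"
    echo "$copies"
}

count_copies 10     # prints 10
count_copies 100    # prints 100
```

Doubling n doubles the count of copy operations, which is exactly what O(n) predicts.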
[X] Wrong: "The script runs in constant time because it just copies files."
[OK] Correct: Each file must be copied separately, so more files mean more work and more time.
Understanding how loops affect script speed helps you write efficient automation and explain your code clearly in interviews.
"What if the script compressed all files into one archive instead of copying them individually? How would the time complexity change?"
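As one possible answer (a sketch, assuming a standard `tar` is available): a single `tar` invocation replaces the per-file loop, but the archiver still has to read every file, so the time complexity remains O(n) in the number of files, and also grows with the total bytes being compressed. The constant factors change (one process instead of n `cp` processes), not the growth rate.

```shell
#!/bin/bash
# Archive variant (a sketch): one tar command instead of a cp loop.
# tar still visits every file, so the work is still O(n) in file count
# (plus time proportional to the total bytes compressed).
SOURCE_DIR=$(mktemp -d)    # stand-in for /path/to/source
touch "$SOURCE_DIR"/file{1..5}
ARCHIVE=$(mktemp -u).tar.gz

tar -czf "$ARCHIVE" -C "$SOURCE_DIR" .
tar -tzf "$ARCHIVE" | grep -c 'file'   # counts the 5 archived files
```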