Backup strategies in Linux CLI - Time & Space Complexity
When we run backup commands, we want to know how long they take as our files grow.
We ask: How does the time to back up data change when we add more files?
Analyze the time complexity of the following backup script snippet.
```bash
#!/bin/bash
SOURCE_DIR="/home/user/data"
BACKUP_DIR="/backup/data"

# Copy every file in the source directory, one at a time.
for file in "$SOURCE_DIR"/*; do
    cp "$file" "$BACKUP_DIR"
done
```
This script copies each file from the source folder to the backup folder one by one.
Look for repeated actions in the script.
- Primary operation: copying each file with `cp`.
- How many times: once for every file in the source directory.
As the number of files grows, the script copies more files one by one.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 file copies |
| 100 | 100 file copies |
| 1000 | 1000 file copies |
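You can check this linear pattern empirically with the `time` built-in. The sketch below is illustrative: it builds a temporary source directory with `n` small files, then times the same one-file-at-a-time loop. Doubling `n` should roughly double the reported time.

```shell
#!/bin/bash
# Sketch: measure how copy time scales with file count.
# Temporary directories stand in for the real source and backup paths.
SOURCE_DIR=$(mktemp -d)
BACKUP_DIR=$(mktemp -d)

# Create n small test files.
n=100
for i in $(seq 1 "$n"); do
    echo "data $i" > "$SOURCE_DIR/file$i.txt"
done

# Time the same per-file copy loop from the script above.
time for file in "$SOURCE_DIR"/*; do
    cp "$file" "$BACKUP_DIR"
done

# Count the copied files; it should match n.
ls "$BACKUP_DIR" | wc -l
```

Rerunning with `n=200` or `n=1000` and comparing the `real` times gives a hands-on feel for O(n) growth.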
Pattern observation: The time grows directly with the number of files.
Time Complexity: O(n)
This means backup time grows linearly: doubling the number of files roughly doubles the time needed.
[X] Wrong: "Copying many files takes the same time as copying one file."
[OK] Correct: Each file needs its own copy step, so more files mean more time.
Understanding how backup time grows helps you design better scripts and explain your choices clearly.
"What if we used a tool that copies all files in one command instead of looping? How would the time complexity change?"