Bash scripting · How-To · Beginner · 2 min read

Bash Script to Automate Backup with Simple Commands

Automate backups with a Bash script that runs tar -czf backup-$(date +%Y%m%d%H%M%S).tar.gz /path/to/source, compressing the files into a single archive with a timestamped name.
📋

Examples

Input: /home/user/documents
Output: Created backup archive backup-20240601123045.tar.gz containing /home/user/documents

Input: /var/www/html
Output: Created backup archive backup-20240601123100.tar.gz containing /var/www/html

Input: /tmp
Output: Created backup archive backup-20240601123115.tar.gz containing /tmp
🧠

How to Think About It

To automate backup, think about compressing the files or folders you want to save into a single archive file. Use the current date and time in the filename to avoid overwriting old backups. The script should take the source path and create a compressed archive with a unique name.
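The core idea can be tried directly in a terminal before writing a full script. A minimal sketch, using a throwaway example directory (/tmp/demo-src is illustrative, not from the article):

```shell
# Create some sample data to back up (illustrative path).
mkdir -p /tmp/demo-src
echo "hello" > /tmp/demo-src/note.txt

# Compress it into a timestamped archive; -C /tmp archives the
# directory by its relative name instead of the full path.
tar -czf "/tmp/backup-$(date +%Y%m%d%H%M%S).tar.gz" -C /tmp demo-src

# The archive now exists with a unique, timestamped name.
ls /tmp/backup-*.tar.gz
```

Each run produces a new archive name, so old backups are never overwritten.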
📐

Algorithm

1. Get the source directory or file path to back up.
2. Create a filename with the current date and time for uniqueness.
3. Use a compression tool such as tar with gzip to create an archive of the source.
4. Save the archive in a backup directory or the current folder.
5. Print a message confirming the backup creation.
💻

Code

bash
#!/bin/bash

# Usage: ./backup.sh /path/to/source
SOURCE="$1"
BACKUP_DIR="./backups"

# Exit early if no source path was given.
if [ -z "$SOURCE" ]; then
    echo "Usage: $0 /path/to/source" >&2
    exit 1
fi

# Create the backup directory if it doesn't exist.
mkdir -p "$BACKUP_DIR"

# The timestamp makes each archive name unique, e.g. 20240601123045.
TIMESTAMP=$(date +%Y%m%d%H%M%S)
ARCHIVE="$BACKUP_DIR/backup-$TIMESTAMP.tar.gz"

tar -czf "$ARCHIVE" "$SOURCE"
echo "Created backup archive $ARCHIVE containing $SOURCE"
Output
Created backup archive ./backups/backup-20240601123145.tar.gz containing /home/user/documents
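A backup is only useful if the archive is valid. One way to check is to list its contents with tar -tzf; this sketch creates a small test directory, archives it, and lists the result (all paths are illustrative):

```shell
# Create sample data and a backup directory (illustrative paths).
mkdir -p /tmp/verify-src ./backups
echo "data" > /tmp/verify-src/file.txt

# Archive it the same way the script does.
ARCHIVE="./backups/backup-$(date +%Y%m%d%H%M%S).tar.gz"
tar -czf "$ARCHIVE" -C /tmp verify-src

# -t lists the archive's contents without extracting; a valid
# archive prints its files, e.g. verify-src/file.txt.
tar -tzf "$ARCHIVE"
```

Running tar -tzf after every backup is a cheap sanity check that the archive was written correctly.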
🔍

Dry Run

Let's trace backing up /home/user/documents through the code:

1. Set SOURCE: SOURCE="/home/user/documents"
2. Create backup directory: mkdir -p ./backups (creates it if it does not exist)
3. Generate timestamp: TIMESTAMP=20240601123145
4. Set archive path: ARCHIVE=./backups/backup-20240601123145.tar.gz
5. Create archive: tar -czf ./backups/backup-20240601123145.tar.gz /home/user/documents
6. Print confirmation: Created backup archive ./backups/backup-20240601123145.tar.gz containing /home/user/documents

Step | Action | Value
1 | SOURCE | /home/user/documents
2 | Backup directory | ./backups
3 | TIMESTAMP | 20240601123145
4 | ARCHIVE | ./backups/backup-20240601123145.tar.gz
5 | tar command | tar -czf ./backups/backup-20240601123145.tar.gz /home/user/documents
6 | Output | Created backup archive ./backups/backup-20240601123145.tar.gz containing /home/user/documents
💡

Why This Works

Step 1: Using tar with gzip

The tar -czf command creates a compressed archive file, combining files into one and compressing them to save space.

Step 2: Timestamp in filename

Adding $(date +%Y%m%d%H%M%S) ensures each backup file has a unique name based on the current date and time, preventing overwrites.
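The format string can be checked on its own: %Y%m%d%H%M%S yields a 14-digit stamp with no separators, which sorts chronologically when filenames are sorted alphabetically.

```shell
# Year, month, day, hour, minute, second, concatenated:
# e.g. 20240601123045 for 2024-06-01 12:30:45.
STAMP=$(date +%Y%m%d%H%M%S)
echo "$STAMP"
```

Because the fields run from most to least significant, plain `ls` lists the backups in time order.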

Step 3: Backup directory creation

The script creates a backups folder if it doesn't exist, keeping backup files organized in one place.
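The -p flag is what makes this safe to rerun: mkdir -p creates any missing parent directories and succeeds silently if the directory already exists.

```shell
# Without -p, the second call would fail with "File exists".
mkdir -p ./backups
mkdir -p ./backups   # succeeds again; no error on an existing directory
echo "directory ready"
```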

🔄

Alternative Approaches

Using rsync for incremental backup
bash
#!/bin/bash
SOURCE="$1"
DEST="./backups/"
mkdir -p "$DEST"
rsync -av --delete "$SOURCE" "$DEST"
echo "Synced $SOURCE to $DEST"
rsync copies only changed files, saving time on repeated runs, but it does not create compressed archives. Note that --delete removes files from the destination that no longer exist in the source, so use it with care when several sources share one backup directory.
Using zip instead of tar
bash
#!/bin/bash
SOURCE="$1"
BACKUP_DIR="./backups"
mkdir -p "$BACKUP_DIR"
TIMESTAMP=$(date +%Y%m%d%H%M%S)
ARCHIVE="$BACKUP_DIR/backup-$TIMESTAMP.zip"
zip -r "$ARCHIVE" "$SOURCE"
echo "Created zip backup $ARCHIVE containing $SOURCE"
zip creates compressed archives that open natively on Windows, but it may be slower than tar with gzip on Linux.

Complexity: O(n) time, O(n) space

Time Complexity

The script's time depends on the size and number of files in the source directory, as it processes each file once during compression.

Space Complexity

The archive file requires space proportional to the total size of the source files; no extra memory is used beyond temporary buffers.

Which Approach is Fastest?

Using rsync is faster for repeated backups because it copies only changed files, while tar compresses everything each time.

Approach | Time | Space | Best For
tar gzip archive | O(n) | O(n) | Full compressed backups
rsync incremental | O(changed files) | O(changed files) | Fast incremental syncs
zip archive | O(n) | O(n) | Cross-platform compressed backups
💡
Always test your backup script with a small folder first to ensure it works before automating.
⚠️
Forgetting to quote variables like "$SOURCE" can cause errors with paths containing spaces.
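Once the script works on a test folder, it can be scheduled with cron so backups run automatically. A sketch of a crontab entry (the script path /home/user/backup.sh is an assumption; adjust it to where you saved the script):

```
# Add with `crontab -e`: runs the backup script against
# /home/user/documents every day at 02:00.
0 2 * * * /home/user/backup.sh /home/user/documents
```

The five fields are minute, hour, day of month, month, and day of week; an asterisk means "every".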