
Bash Script to Download Multiple Files Easily

Use a Bash script with a loop and wget or curl to download multiple files, for example: for url in url1 url2; do wget "$url"; done.
📋

Examples

Input: urls=("https://example.com/file1.txt" "https://example.com/file2.txt")
Output:
Downloading file1.txt
Downloading file2.txt
Files saved in current directory

Input: urls=("https://example.com/image.png" "https://example.com/document.pdf")
Output:
Downloading image.png
Downloading document.pdf
Files saved in current directory

Input: urls=("https://invalid-url/file.txt")
Output:
Downloading file.txt
wget: unable to resolve host address 'invalid-url'
Download failed for file.txt
🧠

How to Think About It

To download multiple files in Bash, list all URLs and loop over them. For each URL, use a command like wget or curl to fetch the file. This way, you automate downloading many files without typing each command manually.
📐

Algorithm

1. Create a list of URLs to download
2. For each URL in the list, run the download command
3. Save the downloaded file in the current directory
4. Print a message for each download to track progress
💻

Code

bash
#!/bin/bash
urls=("https://example.com/file1.txt" "https://example.com/file2.txt")
for url in "${urls[@]}"; do
  echo "Downloading ${url##*/}"
  wget "$url"
done
Output
Downloading file1.txt
--2024-06-01 12:00:00--  https://example.com/file1.txt
Resolving example.com (example.com)... 93.184.216.34
Connecting to example.com (example.com)|93.184.216.34|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 1234 (1.2K) [text/plain]
Saving to: ‘file1.txt’

file1.txt    100%[===================>]   1.2K  --.-KB/s  in 0s

Downloading file2.txt
--2024-06-01 12:00:01--  https://example.com/file2.txt
Resolving example.com (example.com)... 93.184.216.34
Connecting to example.com (example.com)|93.184.216.34|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 2345 (2.3K) [text/plain]
Saving to: ‘file2.txt’

file2.txt    100%[===================>]   2.3K  --.-KB/s  in 0s
🔍

Dry Run

Let's trace downloading two files through the script.

1. Start loop: urls contains ["https://example.com/file1.txt", "https://example.com/file2.txt"]
2. Download first file: url = https://example.com/file1.txt, filename = file1.txt
3. Download second file: url = https://example.com/file2.txt, filename = file2.txt

Iteration | URL                           | Filename
1         | https://example.com/file1.txt | file1.txt
2         | https://example.com/file2.txt | file2.txt
💡

Why This Works

Step 1: Loop over URLs

The for loop goes through each URL in the list one by one.

Step 2: Extract filename

Using ${url##*/} extracts the file name from the URL to show a friendly message.
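The expansion can be checked on its own in a shell. `##*/` removes the longest prefix matching `*/`, which leaves only the last path segment:

```shell
url="https://example.com/file1.txt"
# ${url##*/} deletes everything up to and including the final slash
echo "${url##*/}"   # prints file1.txt
```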

Step 3: Download file

The wget command downloads the file from the URL and saves it in the current folder.
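wget returns a nonzero exit status on failure, which is how a message like "Download failed" in the third example can be produced. Here is a sketch of per-file error handling; it uses curl with file:// URLs (which wget does not support) so it runs offline, and the temporary file is just a stand-in for a remote file:

```shell
#!/bin/bash
# Create a local file to act as a downloadable resource,
# plus one URL that is guaranteed to fail.
src=$(mktemp)
echo "hello" > "$src"
urls=("file://$src" "file:///no/such/file")

for url in "${urls[@]}"; do
  name=${url##*/}
  echo "Downloading $name"
  # -f: fail on errors, -sS: silent but show errors, -o: output filename
  if curl -fsS -o "$name" "$url"; then
    echo "Saved $name"
  else
    echo "Download failed for $name" >&2
  fi
done
```

The same `if`/`else` pattern works with `wget "$url"` in place of the curl call.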

🔄

Alternative Approaches

Using curl in a loop
bash
#!/bin/bash
urls=("https://example.com/file1.txt" "https://example.com/file2.txt")
for url in "${urls[@]}"; do
  echo "Downloading ${url##*/}"
  curl -O "$url"
done
Uses curl instead of wget; curl comes pre-installed on many systems, including macOS.
Reading URLs from a file
bash
#!/bin/bash
while IFS= read -r url; do
  echo "Downloading ${url##*/}"
  wget "$url"
done < urls.txt
Reads URLs line-by-line from a file named urls.txt, useful for many URLs.
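A common extension is to tolerate blank lines and comments in the URL file. This sketch generates a sample urls.txt so it is self-contained, and echo stands in for wget so it runs without a network:

```shell
#!/bin/bash
# Build a sample urls.txt containing a comment and a blank line.
printf '%s\n' '# my download list' '' \
  'https://example.com/a.txt' 'https://example.com/b.txt' > urls.txt

while IFS= read -r url; do
  [ -z "$url" ] && continue            # skip blank lines
  case $url in '#'*) continue ;; esac  # skip comment lines
  echo "Would download ${url##*/}"     # replace with: wget "$url"
done < urls.txt
```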
Parallel downloads with xargs
bash
xargs -n 1 -P 4 wget < urls.txt
Downloads up to four files in parallel (-P 4 sets the number of concurrent jobs), which speeds things up but makes individual progress harder to track.

Complexity: O(n) time, O(n) space

Time Complexity

The script downloads each file one by one, so time grows linearly with the number of URLs.

Space Complexity

The script stores the list of URLs in memory, which grows linearly with the number of URLs.

Which Approach is Fastest?

Parallel downloads with xargs can be faster but add complexity; simple loops are easier to understand and debug.

Approach            | Time   | Space | Best For
Loop with wget      | O(n)   | O(n)  | Simple scripts, easy to read
Loop with curl      | O(n)   | O(n)  | Systems without wget
Read from file      | O(n)   | O(n)  | Many URLs managed externally
Parallel with xargs | O(n/k) | O(n)  | Speeding up downloads with k parallel connections
💡
Always quote your variables like "$url" to handle spaces or special characters safely.
⚠️
Forgetting to quote URLs or filenames can cause errors if they contain spaces or special characters.
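The effect of quoting can be seen with a value containing a space (a contrived example; real URLs encode spaces as %20):

```shell
url="https://example.com/my file.txt"

set -- $url    # unquoted: word splitting breaks the value into two arguments
echo "$#"      # prints 2

set -- "$url"  # quoted: the value stays a single argument
echo "$#"      # prints 1
```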