Bash Script to Download Multiple Files Easily
Use wget or curl in a loop to download multiple files, for example: `for url in url1 url2; do wget "$url"; done`.
Examples
How to Think About It
Store the URLs in a list, then loop over the list, calling wget or curl to fetch each file. This way, you automate downloading many files without typing each command manually.
Algorithm
1. Define the list of URLs to download.
2. Loop over each URL in the list.
3. Print the name of the file being fetched.
4. Download the file with wget.
Code
```bash
#!/bin/bash

# List of URLs to download
urls=("https://example.com/file1.txt" "https://example.com/file2.txt")

for url in "${urls[@]}"; do
  echo "Downloading ${url##*/}"
  wget "$url"
done
```
Dry Run
Let's trace the script as it downloads two files:
Start loop
urls contains ["https://example.com/file1.txt", "https://example.com/file2.txt"]
Download first file
url = https://example.com/file1.txt, filename = file1.txt
Download second file
url = https://example.com/file2.txt, filename = file2.txt
| Iteration | URL | Filename |
|---|---|---|
| 1 | https://example.com/file1.txt | file1.txt |
| 2 | https://example.com/file2.txt | file2.txt |
Why This Works
Step 1: Loop over URLs
The for loop goes through each URL in the list one by one.
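A quick aside on the quoting: `"${urls[@]}"` expands to exactly one word per array element, so URLs containing characters like `?` or `&` stay intact. A minimal sketch:

```shell
#!/bin/bash
# Each quoted array element stays one word, even with '?' and '&' in it.
urls=("https://example.com/a.txt" "https://example.com/q?x=1&y=2")

count=0
for url in "${urls[@]}"; do
  count=$((count + 1))
done
echo "$count URLs"   # prints: 2 URLs
```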
Step 2: Extract filename
Using ${url##*/} extracts the file name from the URL to show a friendly message.
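You can try this parameter expansion on its own: `##*/` deletes the longest prefix matching `*/`, i.e. everything up to and including the last slash.

```shell
#!/bin/bash
url="https://example.com/file1.txt"

# ## removes the longest match of */ from the front of the value
filename="${url##*/}"
echo "$filename"   # prints: file1.txt
```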
Step 3: Download file
The wget command downloads the file from the URL and saves it in the current folder.
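One refinement worth noting: the basic loop ignores wget's exit status, so a failed download goes unnoticed. A minimal sketch of per-URL error counting, using a hypothetical `download` function in place of `wget -q "$url"` so the sketch runs offline:

```shell
#!/bin/bash
# Hypothetical stand-in for: wget -q "$1" -- here it always fails.
download() { false; }

urls=("https://example.com/file1.txt" "https://example.com/file2.txt")
failed=0

for url in "${urls[@]}"; do
  if download "$url"; then
    echo "OK:     ${url##*/}"
  else
    echo "FAILED: ${url##*/}"
    failed=$((failed + 1))
  fi
done
echo "$failed of ${#urls[@]} downloads failed"
```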
Alternative Approaches
Using curl instead of wget (the `-O` flag saves the file under its remote name):

```bash
#!/bin/bash

urls=("https://example.com/file1.txt" "https://example.com/file2.txt")

for url in "${urls[@]}"; do
  echo "Downloading ${url##*/}"
  curl -O "$url"
done
```
Reading URLs from a file, one per line:

```bash
#!/bin/bash

while IFS= read -r url; do
  echo "Downloading ${url##*/}"
  wget "$url"
done < urls.txt
```
Downloading in parallel with xargs (here, up to 4 downloads at once; the redirect avoids a needless `cat`):

```bash
xargs -n 1 -P 4 wget < urls.txt
```
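For the xargs approach, `urls.txt` is just a plain list with one URL per line. A sketch that builds such a file (the actual xargs line is commented out since it needs network access):

```shell
#!/bin/bash
# One URL per line; blank lines would confuse xargs.
cat > urls.txt <<'EOF'
https://example.com/file1.txt
https://example.com/file2.txt
EOF

# -n 1: pass one URL per wget invocation; -P 4: run up to 4 in parallel
# xargs -n 1 -P 4 wget < urls.txt
wc -l < urls.txt
```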
Complexity: O(n) time, O(n) space
Time Complexity
The script downloads each file one by one, so time grows linearly with the number of URLs.
Space Complexity
The script stores the list of URLs in memory, which grows linearly with the number of URLs.
Which Approach is Fastest?
Parallel downloads with xargs can be faster but add complexity; simple loops are easier to understand and debug.
| Approach | Time | Space | Best For |
|---|---|---|---|
| Loop with wget | O(n) | O(n) | Simple scripts, easy to read |
| Loop with curl | O(n) | O(n) | Systems without wget |
| Read from file | O(n) | O(n) | Many URLs managed externally |
| Parallel with xargs | O(n/k) | O(n) | Speeding up downloads with k parallel connections |