How to add a timer between wget HTML downloads?

I found this very useful script, which does exactly what I'm trying to achieve.

It's been taken from this post: Batch download URLs from a .txt file

To use the following method, you will need to install wget. Create a file with the .sh extension in the same directory as the file containing the links, and add this text to it:

mkdir ~/Desktop/download

while read -r line; do wget -E -H --directory-prefix=/Users/username/Desktop/download -k -p "$line"; done < file.txt

cd ~/Desktop/download

Make sure to edit the script and change username to your username. This reads file.txt for the URLs, runs the wget command on each link one by one, and saves the results to a folder named download on your desktop.

I just need to make a small edit so that the script waits 5 minutes between one request and the next.

Could you please tell me how to edit it?


wget can read URLs directly from a file (-i file) and knows how to pause between downloads (-w seconds), so you don't actually need a loop at all. Also, wget downloads into the current directory by default, so you don't really need --directory-prefix either.

Just run

mkdir -p ~/Desktop/download
cd ~/Desktop/download
wget -E -H -k -p -w $((5*60)) -i file.txt
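
If you would rather keep the loop from your question, the direct edit is to add a sleep after each wget call. Here is a minimal sketch, using $HOME instead of a hard-coded username and assuming file.txt sits next to the script:

mkdir -p ~/Desktop/download

# download each URL, then wait 5 minutes before the next request
while read -r line; do
  wget -E -H -k -p --directory-prefix="$HOME/Desktop/download" "$line"
  sleep 300
done < file.txt

Note that this also sleeps once after the final URL, which wget's built-in -w avoids.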

PS: You might also want to add --random-wait to have wget wait between 0.5 and 1.5 times the number of seconds you've specified. This can help avoid problems with sites that take extreme steps to detect and block wget requests.
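
For example, added to the command above:

wget -E -H -k -p -w $((5*60)) --random-wait -i file.txt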