Download ALL Folders, SubFolders, and Files using Wget
Solution 1:
I'm going to assume you haven't tried this yet:
wget -r --no-parent http://www.mysite.com/Pictures/
Or, to retrieve the content without downloading the "index.html" files:
wget -r --no-parent --reject "index.html*" http://www.mysite.com/Pictures/
Reference: Using wget to recursively fetch a directory with arbitrary files in it
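Note that by default wget recreates the remote host and directory structure locally (e.g. www.mysite.com/Pictures/...). If you'd rather have the files land directly in your current directory, wget's standard -nH (--no-host-directories) and --cut-dirs options strip those path components; a sketch using the example URL above:
wget -r --no-parent -nH --cut-dirs=1 --reject "index.html*" http://www.mysite.com/Pictures/
Here --cut-dirs=1 drops the single leading "Pictures" directory, and -nH drops the "www.mysite.com" directory.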
Solution 2:
I use wget -rkpN -e robots=off http://www.example.com/
-r means download recursively.
-k means convert links, so links in the downloaded pages point to your local copies instead of back to example.com/bla.
-p means download all page requisites (images, JavaScript, and so on) so the site displays properly offline.
-N turns on timestamping, so on a re-run wget skips files that are not newer on the remote site than your local copies.
-e is needed for robots=off to take effect; it executes the setting as if it were a .wgetrc command.
robots=off means ignore the site's robots.txt file.
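As an aside (not part of the original command), wget's -m (--mirror) option is shorthand for -r -N -l inf --no-remove-listing, so the recursion and timestamping parts can also be written as:
wget -m -k -p -e robots=off http://www.example.com/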
I also had -c in this command, so that if the connection dropped, wget would continue where it left off when I re-ran the command. I figured -N would go well with -c.
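Putting it all together with -c, the full command would look something like:
wget -rkpN -c -e robots=off http://www.example.com/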