How to download all files from a site? [duplicate]
Possible Duplicate:
How can I download an entire website
There is a site that has shared images on it, but I only know the URLs to some of them. I suppose there are hundreds of images on the site, and I want to download them all.
Any suggestions?
I suggest you try wget, the "non-interactive downloader". It is a command-line file/website/whatever downloading utility packed with features. I have easily mirrored (i.e. completely downloaded) several sites to my hard drive with it. Don't shrug it off because it is a "command-line" program; it's very easy to use.
You haven't mentioned what operating system you are using. If it's a *nix-like system, you can probably install wget through your package manager. If it's Windows, you can get a Windows version of it here:
http://gnuwin32.sourceforge.net/packages/wget.htm
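For example, on a Debian- or Ubuntu-based system (just an assumption about what you might be running), installing it through the package manager is a one-liner:
sudo apt-get install wget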
I think this wget command should do it for you:
wget --page-requisites --convert-links --html-extension --restrict-file-names=windows www.example.com
Replace www.example.com with the address of the website you want.
If you are completely baffled:
The program's name (wget, of course) is typed at the command line, followed by a list of options ("arguments"). The --page-requisites option tells it to download everything the page needs to display properly, such as the images on it. --convert-links makes it possible to browse the local copy offline. --html-extension and --restrict-file-names=windows help make the result viewable offline on most computers (in a nutshell).
If you want to download (mirror) the entire site to your hard drive, check out the --recursive and --level options, as in the sketch below.
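A minimal sketch, assuming a depth of five links is enough (adjust --level to taste, and again replace www.example.com with the real address):
wget --recursive --level=5 --no-parent --page-requisites --convert-links --html-extension --restrict-file-names=windows www.example.com
If you only care about the images, you can also add something like --accept jpg,jpeg,png,gif (using whatever extensions the site actually serves) so that wget keeps only files with those endings.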
I'm not going to explain working with the command line in full, but if you don't know it: it becomes very easy once you grasp opening a command prompt and changing directories (honest!). You could of course look for a GUI-based download manager instead, but it might just nag you to register, unlike wget, which is open source.
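For instance, assuming you want everything saved under a folder called downloads (a made-up name, use whatever you like), the whole session is just:
mkdir downloads
cd downloads
wget --page-requisites --convert-links --html-extension --restrict-file-names=windows www.example.com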