Problems using wget to transfer files from an FTP server
I use wget to get files from an FTP server, like this:
wget ftp://username:[email protected]/public_html/images/audiothumbs/* .
After I download about 1,600 files, I get this error:
--2010-09-07 01:36:51-- http://./
Resolving .... failed: Name or service not known.
wget: unable to resolve host address `.'
FINISHED --2010-09-07 01:36:52--
Downloaded: 1998 files, 20M in 3m 31s (95.7 KB/s)
Did I get disconnected from the other server?
Secondly, if I use the no-clobber option like so:
wget -r -nc ftp://username:[email protected]/public_html/images/audiothumbs/* .
why do files still get overwritten?
Thanks, all, for any help.
Solution 1:
You've got an extra "." on the end of your command line. wget is not like cp and doesn't take a destination directory argument. So after it downloads all your files from the FTP server, it tries to download a file (using HTTP) from the host ".".
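For example, the corrected version of your command would look something like this (same placeholder credentials and path as in your question); the glob is quoted so your local shell doesn't try to expand it before wget sees it:

wget 'ftp://username:[email protected]/public_html/images/audiothumbs/*'

wget saves into the current directory by default; if you want a different destination, its -P (--directory-prefix) option sets the download directory instead of a trailing path argument.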
And for -nc, it's documented to do something other than what you expect:
When running Wget without -N, -nc, or -r, downloading the same file in the same directory will result in the original copy of file being preserved and the second copy being named file.1. If that file is downloaded yet again, the third copy will be named file.2, and so on. When -nc is specified, this behavior is suppressed, and Wget will refuse to download newer copies of file. Therefore, "no-clobber" is actually a misnomer in this mode---it's not clobbering that's prevented (as the numeric suffixes were already preventing clobbering), but rather the multiple version saving that's prevented.
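To illustrate with a hypothetical URL:

wget http://example.com/logo.png       # first run saves logo.png
wget http://example.com/logo.png       # second run saves logo.png.1, leaving logo.png untouched
wget -nc http://example.com/logo.png   # with -nc, the download is skipped because logo.png already exists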
Solution 2:
Please note that the maximum recursion depth when downloading a directory tree is 5 by default!
When downloading recursively from FTP using wget, you may have to set the recursion depth explicitly:
-l depth
--level=depth
Specify recursion maximum depth level depth. The default maximum depth is 5.
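So if the directory tree on the server goes more than 5 levels deep, pass a larger or unlimited depth; reusing the command from the question, that would look something like:

wget -r -l inf 'ftp://username:[email protected]/public_html/images/audiothumbs/*'

(In GNU wget, -l 0 is treated the same as -l inf, i.e. unlimited depth.)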