Using wget to recursively download whole FTP directories
I want to copy all of the files and folders from one host to another. On the old host the files sit at /var/www/html, I only have FTP access to that server, and I can't tar the files up. A regular FTP connection to the old host drops me into the /home/admin folder.
I tried running the following command from my new server:
wget -r ftp://username:[email protected]
But all I get is an auto-generated index.html file.
What's the right syntax for using wget recursively over FTP?
Solution 1:
Try -m, which is short for --mirror:
wget -m ftp://username:[email protected]
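Note that since the files live under /var/www/html on the remote side, -m will recreate that whole path locally. As a sketch (the flags are standard wget options, but the host and path here are placeholders), -nH and --cut-dirs can flatten the local layout:

```shell
# Mirror the remote tree without recreating hostname/var/www/html locally.
# -m            mirror: recursion plus timestamping
# -nH           don't create a top-level directory named after the host
# --cut-dirs=3  strip the leading var/www/html path components
wget -m -nH --cut-dirs=3 ftp://username:[email protected]/var/www/html/
```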
Solution 2:
You have it right, you just need a trailing *:
wget -r ftp://username:[email protected]/dir/*
On shared servers, you can use it like this:
wget -r ftp://1.2.3.4/dir/* --ftp-user=username --ftp-password=password
Most shared servers use an FTP username of the form username@hostname. The extra @ confuses the URL form used in the first command, so it fails, while the second form, which passes the credentials via --ftp-user and --ftp-password, works fine.
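Alternatively, if you do want to keep the credentials in the URL, the @ inside a username like username@hostname can be percent-encoded as %40 so wget can still find the real user:host separator. A sketch with placeholder credentials:

```shell
# %40 is the URL encoding of @, so the username here is username@hostname.
# Quoting the URL also keeps the shell from expanding the trailing *.
wget -r 'ftp://username%40hostname:[email protected]/dir/*'
```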
Solution 3:
Besides wget, you can also use lftp in script mode. The following command mirrors the contents of a given remote FTP directory into a given local directory, and it can be put into a cron job:
lftp -c 'open <hostname>; user <username> <password>; mirror -e <remote-src-path> <local-dest-path>; quit'
It handles recursion into directories automatically and lets you specify the remote directory to start downloading from.
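As a concrete sketch of the cron usage (hostname, credentials, and paths below are placeholders), a crontab entry along these lines would re-sync the tree nightly; note that mirror -e also deletes local files that no longer exist on the server:

```shell
# Run at 02:30 every night; -e makes the local copy an exact mirror,
# removing local files that were deleted on the server.
30 2 * * * lftp -c 'open ftp.example.com; user admin secret; mirror -e /var/www/html /backup/html; quit'
```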