How to resume an interrupted download automatically in curl?
I'm working with curl
on Linux. I'm downloading part of a file from an FTP server (using the -r
option), but my connection is poor and keeps getting interrupted. I want to write a script that resumes the download when I'm connected again.
I've used this command, but it's not working:
until curl -r 666-9999 -C - --retry 999 -o "path/to/file" "ftp://path/to/remote/file"; do :; done
Solution 1:
curl -L -O your_url
This will download the file.
Now suppose your connection is interrupted:
curl -L -O -C - your_url
This will resume the download from the last byte already downloaded.
From the manpage:
Use "-C -" to tell curl to automatically find out where/how to resume the transfer. It then uses the given output/input files to figure that out.
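Combining `-C -` with a retry loop gives the script the question asks for. A minimal sketch, assuming a POSIX shell; the function name `resume_fetch`, the 5-second pause, and the placeholder arguments are illustrative, not part of the answer above:

```shell
#!/bin/sh
# resume_fetch URL OUTPUT: keep retrying curl until it succeeds.
# "-C -" makes each attempt continue from the bytes already in OUTPUT.
resume_fetch() {
    url=$1
    out=$2
    until curl -L -C - -o "$out" "$url"; do
        sleep 5   # brief pause before retrying, so we don't hammer the server
    done
}
```

It would be called as `resume_fetch "ftp://server/path/to/remote/file" "path/to/file"`; a byte range like the question's `-r 666-9999` could be added to the curl invocation if only part of the file is wanted.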
Solution 2:
wget has been built specifically for this use case. From the man page:
Wget has been designed for robustness over slow or unstable network connections;
if a download fails due to a network problem, it will keep retrying until the
whole file has been retrieved. If the server supports regetting, it will
instruct the server to continue the download from where it left off.
wget is available for almost all Linux distributions; it is probably already installed on yours. Just use wget to download the file: it will keep re-establishing the network connection until the file is completely transferred.
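A sketch of the relevant wget flags, wrapped in a function for illustration; the name `mirror_resume` and the URL are placeholders: `-c` resumes a partial file, `-t 0` means unlimited tries, and `--retry-connrefused` also retries when the server refuses the connection.

```shell
#!/bin/sh
# mirror_resume URL: let wget handle resuming and retrying by itself.
mirror_resume() {
    wget -c -t 0 --retry-connrefused "$1"
}
```

It would be called as `mirror_resume "ftp://server/path/to/remote/file"`.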
Solution 3:
You can check the exit code in a while loop and retry until the exit code indicates that the download has succeeded. Exit code 18 means curl transferred only part of the file (CURLE_PARTIAL_FILE), so this loop retries as long as the transfer ends early:
ec=18; while [ "$ec" -eq 18 ]; do curl -O -C - "http://www.example.com/a-big-archive.zip"; ec=$?; done
The example is taken from http://ilovesymposia.com/2013/04/11/automatically-resume-interrupted-downloads-in-osx-with-curl/.