Copy a large file over an unreliable link
This isn't really a question about programming, but it is a question about working as a programmer. I hope this is an appropriate forum for this question.
I work from home. My Windows XP-based laptop connects through a VPN to my employer's network. Occasionally, I need to download a large file (~2.5 GB) that is shared on a network drive. While it's possible to copy the file by just dragging and dropping it in Windows Explorer, there is a good chance that the VPN will time out or my internet connection will flake out at some point during the transfer.
So what I'm looking for is a way to copy a large file that supports resuming if the connection fails. I initially tried to use rsync from within cygwin, but I don't think I had the right set of options.
I was doing "rsync -aP src_file_path dest_file_path". It worked fine if the transfer completed without error, but if it was interrupted and I issued the command again, it would start downloading the entire file from the beginning.
Solution 1:
You might also try robocopy, an xcopy replacement that ships with Vista and is available in the Windows XP Resource Kit.
http://en.wikipedia.org/wiki/Robocopy
Robocopy has an option (/Z) for copying files in "restartable" (read: resumable) mode.
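For example, a restartable copy from the share to a local folder might look like this (the server, share, and file names here are placeholders; /R and /W control how many times robocopy retries on failure and how long it waits between retries):

```
:: Hypothetical paths -- adjust to your share and destination.
:: /Z resumes an interrupted copy from where it left off;
:: /R:100 retries up to 100 times; /W:10 waits 10 seconds between retries.
robocopy \\fileserver\share C:\Downloads bigfile.dat /Z /R:100 /W:10
```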
Solution 2:
Your rsync command line looks correct (-P is needed to keep partially transferred files), but you have to make sure that you are actually using rsync for the data transfer itself by specifying a remote path:
rsync -aP juser@server:/tmp/data some_directory
If both paths refer to the local filesystem (even when one of them is actually a network filesystem), rsync falls back to a plain copy: it can't use its delta transfer, so after an interruption it will transfer the full file again.
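Even with a remote path, rsync exits with an error when the link drops, so in practice you still re-run it until it succeeds. A minimal sketch of that retry loop, where flaky_copy is a stub standing in for the real rsync invocation (e.g. rsync -aP juser@server:/tmp/data some_directory) so the loop itself can be demonstrated:

```shell
# Sketch: keep re-running the copy until it exits cleanly. rsync returns
# a non-zero status when the connection drops, and -P lets the next run
# resume from the partial file instead of starting over.
attempts=0
flaky_copy() {
    attempts=$((attempts + 1))
    # Stub standing in for the rsync command: fail twice, then succeed.
    [ "$attempts" -ge 3 ]
}

until flaky_copy; do
    sleep 1   # back off briefly before retrying
done
echo "copy finished after $attempts attempts"
```

In real use you would replace flaky_copy with the actual rsync command and pick a longer back-off to give the VPN time to reconnect.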
Solution 3:
I like TeraCopy. It works like a charm and can integrate with Windows Explorer as well. It's great for large files, much faster than the built-in Windows copy. It is free, with a paid version available.