fastest backup for large files

What is the fastest way to duplicate a folder containing a few files of a few hundred MB each, plus a number of small files of a few MB (say, /var/lib/mysql)? cp or tar or rsync or...??


Solution 1:

rsync in general will be faster than cp or tar for repeated backups, because rsync only transfers files that have changed, and its delta-transfer algorithm sends only the changed parts of a file when copying over a network. (Note that --partial does something different: it keeps partially transferred files around so an interrupted transfer can resume where it left off.)

Having said that, rsync works much better if you know what you're backing up and can arrange things so that rsync doesn't have to do as much work. For example, rotated log files in /var/log work much better if you rotate them to a filename with a date in it, instead of .0, .1, .2, and so on, because then a rotation doesn't rename every file and force rsync to re-transfer all of them.

One more note: in your question you mention /var/lib/mysql. Using rsync to back that up is a pretty bad idea; use mysqldump to get a reliable backup instead. If that backup is too large to transfer frequently, use MySQL replication and back up from a slave. (You should still do a full backup on the master periodically, though; replication can fail too.)

Solution 2:

Depends on a few factors. For a large number of small files it is generally better to use tar, since that turns thousands of per-file operations into one sequential stream. If it is a small number of large files, cp is fine in most instances.
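As a rough sketch of the tar approach (paths here are made up for illustration):

```shell
# Sketch: pack many small files into one compressed archive so the
# copy becomes a single sequential write instead of many per-file
# operations. /tmp/demo-many is a placeholder directory.
mkdir -p /tmp/demo-many
for i in 1 2 3; do echo "data $i" > "/tmp/demo-many/f$i.txt"; done
# -C /tmp archives paths relative to /tmp, keeping them portable.
tar -czf /tmp/demo-many.tar.gz -C /tmp demo-many
# List the archive contents to confirm what was captured.
tar -tzf /tmp/demo-many.tar.gz
```

To copy the data to another machine you then transfer just the one archive file, which is usually far faster than copying the files individually.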

If you have a large set of files but only a small portion of them changes between runs, rsync is more efficient.

Solution 3:

It depends on what you are backing up. You can't just back up a live database with rsync; you'll end up with a corrupted file. For databases like MySQL, you need to set up a cron job, for example, to run mysqldump and then rsync the resulting file. Also, make sure you include the date in the file name so you know when each file was backed up, and keep several backup copies just in case.
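A crontab fragment sketching that setup (the database name mydb and the /backup directory are assumptions, not from the question; note that % must be escaped as \% inside crontab entries):

```shell
# Sketch crontab entries (assumption: database "mydb", dir /backup exist).
# 02:30 nightly: dump with a consistent InnoDB snapshot, compress,
# and stamp the file name with today's date (e.g. mydb-2024-01-31.sql.gz).
30 2 * * * mysqldump --single-transaction mydb | gzip > /backup/mydb-$(date +\%F).sql.gz
# 02:40 nightly: prune dumps older than a week so copies don't pile up.
40 2 * * * find /backup -name 'mydb-*.sql.gz' -mtime +7 -delete
```

The dated files can then be picked up by rsync, which will only transfer the new day's dump.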