How to back up a full CentOS server?

I switched a few weeks ago from a dedicated server to a VPS. Now that everything is working well on the VPS, I would like to shut down the dedicated server and close my account with the hosting company.

For peace of mind, and to be on the safe side, I would like to do a full backup of the server before stopping it.

Ideally it would be a backup that I could browse later if I find that I need something from it.

What would be the best solution from the command line?

Update:

Medium: Network


The best tool to use for this is probably dump, which is a standard Linux tool and will give you the whole filesystem. I would do something like this:

/sbin/dump -0uan -f - / | gzip -2 | ssh -c blowfish user@backupserver.example.com dd of=/backup/server-full-backup-`date '+%d-%B-%Y'`.dump.gz

This will do a filesystem dump of / (make sure you don't need to dump any other mounts!), compress it with gzip, and ssh it to a remote server (backupserver.example.com), storing it in /backup/. The -c blowfish option merely selects a fast cipher; recent OpenSSH releases have removed blowfish, so you may need to drop that option. If you later need to browse the backup you use restore:

restore -i
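For example, assuming the file name produced by the command above (the date part is whatever day you ran it), you can browse the dump interactively by feeding it to restore on standard input:

gunzip -c /backup/server-full-backup-01-January-2011.dump.gz | restore -i -f -

In the interactive shell, ls and cd navigate the dumped filesystem, add marks files and directories, and extract pulls them out.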

Another option, if you don't have access to dump, is to use tar and do something like:

tar -zcvpf /backup/full-backup-`date '+%d-%B-%Y'`.tar.gz --directory / --exclude=mnt --exclude=proc --exclude=tmp .

But tar does not handle changes in the filesystem as well as dump does.
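Since the goal is a backup you can browse, note that you can list the tar archive or extract a single file later. The member paths are relative (./etc/fstab rather than /etc/fstab) because the archive was created from / with --directory; the file name below is illustrative:

tar -ztvf /backup/full-backup-01-January-2011.tar.gz
tar -zxvf /backup/full-backup-01-January-2011.tar.gz ./etc/fstab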


If you want to back up from Linux to Linux I wouldn't use dump, because it's inconvenient when you need to access something inside the backup file. Just using rsync over SSH to do a full system backup should be fine in most cases:

rsync -aAXv --delete-after --exclude={"/dev/*","/proc/*","/sys/*","/tmp/*","/run/*","/mnt/*","/media/*","/lost+found"} / user@server:backup-folder

This will keep everything important and let you browse the backup without additional steps.

The command above already includes --delete-after, which is useful if you are running it multiple times to the same backup folder: files removed from the server are also removed from the backup. In that case make sure that the source path does not end with /*, or the deletion will only have effect on files inside the subdirectories of the source directory, and will have no effect on files residing directly inside the source directory.
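Because the backup is just a plain directory tree on the remote machine, restoring a single file later is simply another rsync (or scp) in the opposite direction; the paths here are illustrative:

rsync -aAXv user@server:backup-folder/etc/fstab /etc/fstab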


What medium are you going to be storing the backup on? If you're backing up over the network, I would sshfs- or NFS-mount the destination on the source server and run something like:

tar cvjf /<remote_mnt>/<point>/source-030810-full.tar.bz2 /* --exclude=/proc --exclude=/dev --exclude=/sys --exclude=/tmp --exclude=/<remote_mnt>

Note that this has not been tested; it is just my general thinking, and you may want to exclude more or less than that.
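A minimal sketch of the sshfs variant, assuming sshfs is installed; the host name and mount point are placeholders. Mounting under /mnt means the existing --exclude=/mnt also keeps the archive from trying to include itself:

mkdir -p /mnt/backupserver
sshfs user@backupserver.example.com:/backup /mnt/backupserver
tar cvjf /mnt/backupserver/source-030810-full.tar.bz2 /* --exclude=/proc --exclude=/dev --exclude=/sys --exclude=/tmp --exclude=/mnt
fusermount -u /mnt/backupserver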


I use the command described above by pehrs, but modified for FTP upload. Crontab sample:

30 3 1 * * sudo /sbin/dump -0uan -f /<path_to_backup_file>/server-full-backup-root-`date '+\%d-\%B-\%Y'`.dump / && gzip -1 /<path_to_backup_file>/server-full-backup-root-`date '+\%d-\%B-\%Y'`.dump
50 * * * * lftp -f upload.x

upload.x contains the FTP credentials and the rules for the upload:

open -u user,password -p 21 192.168.1.1
mirror -c -e -R /<path_to_backup_folder> /<path_to_remote_folder_without_trailing_slash>
exit
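Since upload.x stores the FTP password in plain text, it is worth restricting its permissions so only its owner can read it:

chmod 600 upload.x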

Note 1: lftp may cause high CPU usage when the destination is unreachable and it keeps trying to reconnect. Because many hosting providers will reset a VPS in such cases, I suggest monitoring the CPU load and killing the lftp process, to prevent your server being shut down by the host owner. Below is an example that triggers at a load average above 1.33 and kills lftp (and dropbox). Unfortunately, I do not remember the source of the initial code; thanks to its author:

Crontab: */5 * * * * /home/cms/cron/loadmon.sh

#!/bin/bash
# Kill lftp/dropbox when the 1-minute load average exceeds the trigger,
# restart dropbox at low priority, and mail a load report.
trigger=1.33
load=`awk '{print $1}' /proc/loadavg`
response=`echo | awk -v T=$trigger -v L=$load 'BEGIN{if (L > T) print "greater"}'`
if [[ $response = "greater" ]]
then
  killall dropbox lftp
  nice -n 19 sh /cms/.dropbox-dist/dropboxd
  sar -q | mailx -s "High load on server - [ $load ]" r***[email protected]
fi

Note 2: the dump utility may not work on an OpenVZ VPS or some other virtual servers.
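A quick way to check whether dump can read your filesystem at all is its size-estimate mode, which prints the estimated dump size in bytes and exits without writing anything; on containers where dump cannot access the underlying filesystem it should fail immediately:

/sbin/dump -S /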


Have you heard of Bacula?

Bacula is a set of Open Source, enterprise ready, computer programs that permit you (or the system administrator) to manage backup, recovery, and verification of computer data across a network of computers of different kinds. Bacula is relatively easy to use and efficient, while offering many advanced storage management features that make it easy to find and recover lost or damaged files. In technical terms, it is an Open Source, enterprise ready, network based backup program.
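If you want to try it on CentOS, the components are packaged (via the EPEL repository on some releases); a hypothetical starting point for a single-machine setup:

yum install bacula-director bacula-storage bacula-console bacula-client

The daemons are then configured under /etc/bacula/. For a one-off pre-shutdown backup this is heavier than dump, tar, or rsync, but it pays off for recurring scheduled backups.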