Easiest way to back up a VPS
My situation:
I have a VPS (Ubuntu) with 2 websites on it. As I am new to VPSes (I recently moved from shared hosting), I spent almost two weeks learning how to configure the VPS to run all the sites the way I wanted. I made a lot of mistakes while configuring it, so I learned by reinstalling / configuring / making mistakes / reinstalling / configuring, and so on.
Now everything is OK: the VPS is running, all the services I needed are working, and a lot of lessons have been learned.
Questions:
What is the easiest way of making a backup of the whole VPS?
Can I make a VPS backup simply using WinSCP? I mean, connect to the root (/) directory and download all the VPS files from there?
If yes, how would I restore it later if the whole VPS ever needs to be reinstalled from scratch?
Additional info about my situation:
- databases will be backed up by hand (possibly through phpMyAdmin, or from the console; see the sketch after this list)
- there is no backup tool in my VPS admin panel (SolusVM)
- I am not using any GUI panel (cPanel/DirectAdmin/Virtualmin etc.), just the console
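A minimal console alternative to phpMyAdmin for the database part, assuming MySQL/MariaDB and a root database user (credentials and filename are placeholders):

# Dump every database into one file; prompts for the password
mysqldump -u root -p --all-databases > all-databases.sql

# Restore later with:
mysql -u root -p < all-databases.sql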
Solution 1:
My preferred way of backing up a remote virtual server is to simply copy files - the kernel is on the host anyway. Of course, you will have to exercise some care when restoring, and it may not be possible to do a full restore (unlike a disk image). On the other hand, it's easier to migrate to another server, such as a local one for testing.
Now, downloading lots of little files over a remote connection is actually quite slow, due to per-file overhead. This is more significant if you are physically far away from the server. Normally, I prefer to bundle everything into a single tar archive and then compress it for a smaller size.
The command for this is tar cpzf filename.tar.gz / (where / means back up everything from the root, recursively). You may wish to exclude existing backup files; if you plan to keep a lot of backups, it is easier to put them all in one directory and exclude that directory. Then you just need to download the archive with any method you like.
Restoring is done with tar xpzf filename.tar.gz while in the root directory (a modern GNU tar will also auto-detect the compression with plain tar xpf). You can also specify the target directory with -C.
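A fuller sketch of such a run, assuming a /root/backups directory for the archives (excluded so the backup doesn't swallow itself) and skipping the pseudo-filesystems, which would fail or produce junk if archived:

# Create the directory that will hold finished archives
mkdir -p /root/backups

# Archive the whole system, excluding the backup directory
# and the kernel-provided pseudo-filesystems
tar cpzf /root/backups/vps-$(date +%F).tar.gz \
    --exclude=/root/backups \
    --exclude=/proc --exclude=/sys --exclude=/dev \
    --exclude=/run --exclude=/tmp \
    /

# Restore later by unpacking into / (e.g. from a rescue system)
tar xpzf vps-2024-05-01.tar.gz -C /

The dated filename is just an example; any naming scheme works.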
Solution 2:
I've answered an identical question on Server Fault, and while it covers some of the same ground as bob's answer, I use slightly different tools with a slightly different focus. I suggest creating a package list and using rsync (since it preserves permissions and is pretty efficient at file transfer).
If all else fails, there's the old-fashioned way: use dpkg --get-selections to dump a list of installed packages, and reinstall them later with dpkg --set-selections. Create the same users as on the source system if necessary - cat /etc/passwd will list them, and you can diff the two lists to check that they are identical.
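A sketch of that workflow, assuming both machines run the same Ubuntu release (apt-get dselect-upgrade installs whatever the restored selections request):

# On the old server: dump the list of installed packages
dpkg --get-selections > packages.txt

# On the new server: restore the selections and install them
dpkg --set-selections < packages.txt
apt-get dselect-upgrade

# Compare the user lists saved from both systems
diff old-passwd.txt new-passwd.txt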
Then use rsync to duplicate your /etc/ folder for settings, the various /home/ folders for users (check permissions here), and other folders like /var/www/. Test, make sure everything is there, and you're done. It takes me less time than setting up a fresh server.
Sometimes the old, simple ways are the best. Once you have worked this out manually, simply write a script that replicates it off your current server automatically.
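A minimal pull-style script along those lines, run from the machine that keeps the backups; SERVER and DEST are placeholders for your VPS address and local backup directory:

#!/bin/bash
# Sketch: pull the package list, user list, and key directories
# from the VPS onto the backup machine.
SERVER=root@your-vps
DEST=/backups/my-vps

mkdir -p "$DEST"
ssh "$SERVER" 'dpkg --get-selections' > "$DEST/packages.txt"
ssh "$SERVER" 'cat /etc/passwd' > "$DEST/passwd.txt"

# -a preserves permissions and ownership, -H keeps hard links
for dir in /etc /home /var/www; do
    rsync -aH --delete "$SERVER:$dir/" "$DEST$dir/"
done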
Solution 3:
Use rsync for easy incremental backups. You can run it from the command line, or script it for more sophisticated scenarios.
An example you can start with:
$ options="--stats -aHh --delete"
$ echo == RUNNING with options: $options
$ time rsync -e ssh $options root@your-server-ip-or-hostname:/ /destination/my-server-backup \
    --exclude /proc --exclude /dev --exclude /sys --exclude /run/udev
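For true incrementals you can extend this with rsync's --link-dest, which hard-links unchanged files against the previous snapshot so each run only stores what changed. A sketch, with hypothetical /backups paths:

# Each dated snapshot hard-links unchanged files against the
# previous one, so only changed files consume new space
today=$(date +%F)
rsync -aH --delete \
    --link-dest=/backups/latest \
    --exclude /proc --exclude /dev --exclude /sys --exclude /run/udev \
    root@your-server-ip-or-hostname:/ /backups/$today

# Point "latest" at the newest snapshot for the next run
# (the very first run just warns that the link target is missing)
ln -sfn /backups/$today /backups/latest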
You can find more examples in the script I put together; although it was originally written for Windows, it can be applied directly on Linux: https://github.com/paravz/windows-rsync-backup