I am looking for a strategy for backing up data. I have an Ubuntu 10.04 box that essentially acts as a server. I work on multiple UNIX systems, and every day I back up critical databases, directories, files, etc. via a combination of tar, bzip2, sshfs, and scp. As a result I have daily "snapshots", which are beginning to take their toll on my storage space. I'm okay with keeping snapshots, but I was hoping to find out whether there is a better way.
Any ideas or pointers are greatly appreciated. One thing I am not familiar with and haven't studied is the idea of using tar, etc. to "sync" to an existing tar file. Maybe that's a better way, so I'd appreciate it if anyone could point me in the right direction on that as well.
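On the "sync to an existing tar file" idea: GNU tar does have an incremental mode, `--listed-incremental`, which records file state in a snapshot file so that later runs archive only what changed. A minimal sketch using throwaway paths under /tmp (add `-j` if you want bzip2 compression as in your current workflow):

```shell
# Fresh demo tree (paths are placeholders for this sketch)
rm -rf /tmp/tardemo && mkdir -p /tmp/tardemo/src
echo "v1" > /tmp/tardemo/src/a.txt

# Level-0 (full) backup; snapshot.snar records what was archived
tar --listed-incremental=/tmp/tardemo/snapshot.snar \
    -cf /tmp/tardemo/full.tar -C /tmp/tardemo src

echo "v2" > /tmp/tardemo/src/b.txt

# Level-1 (incremental) backup: contains only the new/changed files
tar --listed-incremental=/tmp/tardemo/snapshot.snar \
    -cf /tmp/tardemo/incr.tar -C /tmp/tardemo src
```

The incremental archive holds only b.txt (plus directory entries), so daily archives stay small as long as little changes per day.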
asked 28 May '10, 15:00
I've used two strategies depending upon my need. For both I have an online RAID 6 array containing a directory for each machine I'm backing up. Currently I'm using a Thecus N5200, which has been reliable for a few years:
1) Rsync is FAST and will help your snapshot problem a bit. It overwrites files that have the same name with the newer version, but it does NOT delete files that were removed on the original machine, which may be a plus or a minus depending on whether your goal is archiving or restoring.
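A small local demo of that behaviour (throwaway paths; in practice the destination would be a remote host, e.g. `rsync -avz /home/user/data/ backupserver:/backups/data/`):

```shell
rm -rf /tmp/rsdemo && mkdir -p /tmp/rsdemo/src /tmp/rsdemo/dst
echo "current" > /tmp/rsdemo/src/keep.txt
echo "stale"   > /tmp/rsdemo/dst/removed.txt   # exists only on the backup side
if command -v rsync >/dev/null 2>&1; then
    # -a (archive) copies recursively and preserves times/permissions.
    # Because --delete is NOT given, removed.txt survives on the backup.
    rsync -a /tmp/rsdemo/src/ /tmp/rsdemo/dst/
fi
```

After the run, keep.txt has been copied across, while removed.txt is still present on the backup side.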
2) For snapshots using an efficient diff scheme to minimize duplication, I use rdiff-backup. It's much slower than Rsync but has allowed me to restore working machines after they've been totally clobbered. It also has a feature that lets you delete old snapshots. Read all about it (after installing it from Synaptic or Applications/Ubuntu Software Center) at:
man 1 rdiff-backup
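A sketch of typical invocations, using local throwaway paths (guarded, since rdiff-backup may not be installed; in practice the destination is usually `user@host::/path`):

```shell
rm -rf /tmp/rb && mkdir -p /tmp/rb/src
echo "important" > /tmp/rb/src/notes.txt
if command -v rdiff-backup >/dev/null 2>&1; then
    # Take a snapshot: /tmp/rb/dst gets a current mirror plus an
    # rdiff-backup-data/ directory holding reverse increments.
    rdiff-backup /tmp/rb/src /tmp/rb/dst
    # Restore the most recent state into a fresh directory:
    rdiff-backup -r now /tmp/rb/dst /tmp/rb/restored
    # Pruning old snapshots, as mentioned above (not run here):
    #   rdiff-backup --remove-older-than 4W /tmp/rb/dst
fi
```

Because only reverse diffs are kept for older states, a month of daily snapshots costs far less space than thirty full tarballs.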
answered 28 May '10, 15:39
rsync is a good option; I also find simple-backup and luckybackup useful in their own ways. Obviously, if you JUST want to back up folders and files, that is different from backing up entire systems, for which you would use a different toolset, such as Clonezilla.
rsnapshot will give you the benefits of daily snapshots without taking much more space than a single snapshot (at least for normal usage patterns; if you have a ton of data changing on a daily basis, you'll want to explore other options).
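rsnapshot gets that space efficiency by hard-linking unchanged files between snapshots. A fragment of a hypothetical /etc/rsnapshot.conf (paths are examples; fields must be separated by TABs, not spaces):

```
snapshot_root	/backups/snapshots/
interval	daily	7
interval	weekly	4
backup	/home/	localhost/
backup	/etc/	localhost/
```

You then schedule `rsnapshot daily` (and `rsnapshot weekly`) from cron, and it keeps the configured number of rotations automatically.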
answered 11 Jun '10, 02:16
Tar is not meant for incremental backups unless you're using tape. Rsync is the best solution I've found. I have a script that has worked well for me for years, which you can download from
It's licensed under the GPL.
Keep in mind that experts (i.e. the paranoid) recommend keeping backups on multiple media and in multiple locations, e.g. in case your house burns down or some other catastrophe destroys your whole system (including the external hard drive you are mirroring to).
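This is not the GPL script referenced above, just a minimal sketch of the rsync hard-link rotation idea that many such scripts are built on (paths and the function name are made up; Linux assumed):

```shell
rm -rf /tmp/snap && mkdir -p /tmp/snap/src /tmp/snap/backup
echo "data" > /tmp/snap/src/file.txt

take_snapshot() {
    stamp=$(date +%Y-%m-%d-%H%M%S)
    dest="/tmp/snap/backup/$stamp"
    last=$(ls -1d /tmp/snap/backup/* 2>/dev/null | tail -n 1)
    if [ -n "$last" ] && command -v rsync >/dev/null 2>&1; then
        # --link-dest hard-links unchanged files into the previous
        # snapshot, so each new snapshot costs only the changed files.
        rsync -a --delete --link-dest="$last" /tmp/snap/src/ "$dest/"
    else
        mkdir -p "$dest"
        cp -a /tmp/snap/src/. "$dest/"   # first run: plain full copy
    fi
}
take_snapshot
```

Run from cron once a day, each dated directory is a complete-looking mirror you can browse and restore from directly.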
answered 19 Jun '10, 19:49
If you are a Perl-oriented person, then http://search.cpan.org/~lbrocard/Dackup-0.44/lib/Dackup.pm is your friend. I use it for everything and anything. Just write a small backup application (20 lines of code), schedule it under cron, and all your problems are solved (for me, at least).
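For the cron part, a hypothetical crontab entry (`crontab -e`; the script path and log location are made up, adjust to taste and permissions) that runs the backup nightly at 02:30:

```
# m  h  dom mon dow  command
30   2  *   *   *    /home/user/bin/backup.pl >> /var/log/backup.log 2>&1
```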
answered 17 Aug '10, 07:53