


I'm looking for a backup tool for my (Arch) Linux system.

On the Arch Linux wiki I found this list: backup programs. There are so many programs that I don't know which one to choose, and when I search for reviews on the web I keep finding several others that look great.

I'm looking for something that does incremental backups (to save disk space and time) and makes it easy to recover and manipulate the backups. I'd like something easily customisable (with a bash script, maybe) to do things like: "save my system every week, delete after two months"; "save my home folder every day except the following folders, and don't follow links (to another partition or elsewhere) but keep them"; "do an incremental backup every x and a full backup every y"; "save x every day, delete after a week, but keep one backup each at one week, two weeks, one month and three months old"; and so on.

I use only Linux, so I don't need a cross-platform solution.

I tried rsync some time ago. It's a great tool, but I had problems preserving users and permissions. Also, rsync doesn't let you recover anything other than the last backup (correct me if I'm wrong).

I've heard a lot about rdiff-backup but never tried it. Its advantage is being able to recover previous backups.

The wiki also lists link-backup. I'd never heard of it, but it looks great and I may test it. Does anyone know it?

Unison: I've seen some good reviews. It does bidirectional synchronisation (a feature I don't really need).

rdup: another program unknown to me (based on hdup, like duplicity, but it looks more powerful). I like the program's spirit of "not reinventing the wheel again and again": instead of doing the backup itself, it delegates to other Unix tools. It can do compression and encryption. The problem is that it copies whole files rather than differences (though that means a failed backup matters less). If someone has tested it, I'd really like to hear their comments.

What do you use, and why? Please explain your choice and the program's main features compared to the alternatives.

asked 24 Apr '10, 08:39 by martvefun (edited 24 Apr '10, 13:02)

Write your own backup routine as a bash script. The tar command is about as versatile as a Swiss Army knife for this task and can be wrapped with some minimal scripting to do exactly what you want. Otherwise you are picking from one of a million backup utilities that all may do something slightly different from what you really need.
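(A minimal sketch of such a routine; the paths, exclude and 60-day retention are assumptions to adapt, not a definitive implementation:)

```shell
#!/bin/bash
# Simple tar-based backup: one compressed, dated archive per run.
SRC="$HOME"
DEST="/mnt/backup"            # assumed mount point for backups
STAMP=$(date +%Y-%m-%d)

mkdir -p "$DEST"
# -c create, -z gzip-compress, -p preserve permissions, -f output file
tar -czpf "$DEST/home-$STAMP.tar.gz" --exclude="$SRC/.cache" "$SRC"

# "Delete after two months": drop archives older than 60 days.
find "$DEST" -name 'home-*.tar.gz' -mtime +60 -delete
```

Run it from cron (e.g. weekly) and the retention rule takes care of itself.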

(05 May '10, 08:50) shreddies


It sounds like rsnapshot should satisfy most of your needs:

rsnapshot is a Perl-based utility for saving snapshots of local and remote filesystems. It uses rsync and hard links to create multiple, full filesystem backups, yet only requires slightly more disk space than a single snapshot plus incremental archives.



answered 24 Apr '10, 20:02 by jeremy ♦♦

Thank you, but why not use rsync directly?

(25 Apr '10, 11:48) martvefun

rsnapshot is both easier to set up and gives you additional functionality that you'd need to replicate manually, or go without, if you used rsync alone.


(25 Apr '10, 16:20) jeremy ♦♦

I would recommend sbackup:

It is very easy to configure (provides a configuration GUI), and some of its features are:
- manual or automated backups
- purging of older backups
- a simple interface for configuring when to backup (uses crontab)
- include/exclude files and folders using paths, file types, file size, or regular expressions


answered 25 Apr '10, 16:06 by Jazz ♦

I use it for my Ubuntu server; very easy and convenient.

(27 Apr '10, 21:18) atilla

sbackup is now unsupported and doesn't work on the latest Ubuntu. It has been replaced by nssbackup (Not So Simple Backup). I looked at many options before settling on backintime. One to keep an eye on is Time Drive (

(13 May '10, 12:26) PJO

rsync seems to fit the bill. If you prefer a GUI front-end to rsync, have a look at Back In Time.


answered 03 May '10, 09:44 by beachboy2

The biggest problem with backintime is that it does ONLY incremental backups. If I lose my computer, an incremental backup without the first full backup is useless.

(15 May '10, 08:46) martvefun

I prefer writing a good shell script that uses rsync to send your backed-up data elsewhere. You can configure an array of directories and a network target URI, then write a script that, in this order:

  1. Begins a loop over each entry in the directory array.
  2. Adds the contents of the current directory to an archive and compresses it.
  3. Loops to the next directory in the array.
  4. Once the array is done, timestamps the backup's name.
  5. Opens an rsync connection to the specified network target and transfers the compressed archive to the specified destination.
  6. Deletes the oldest archive (a maximum number of archives could even be made configurable), as it's not likely to be useful anymore.

This design is intended mainly for cron jobs, firing off anywhere from every other week to once a month, and is set up so that all the user has to do is configure what is backed up and where the backups are sent. It could be a good fit for a home file server, but it can even send backups across the Internet if you so desire.


answered 03 May '10, 16:45 by Yaro Kasear

There are basically four ways to back up data (let's say we have 1 GB of data to back up):

1/ One backup of the data using rsync. Pros: fast; only 1 GB of space needed for the backup. Cons: only one backup.

2/ Multiple copies of the data (using rsync, cp, tar or zip). Let's say we keep the last 4 weeks. Pros: multiple aged backups; each backup has the full directory structure of the data. Cons: 4 GB of space needed.

3/ Incremental backups (using tar or zip). Let's say we keep 1 full backup and 3 incrementals. Pros: multiple aged backups; a bit more than 1 GB of space needed. Cons: the incremental backups contain only the modified files, so it's quite difficult to find the files you want to restore.

4/ rsync + hard links (the best way, imo). Let's say we keep the last 4 weeks. Pros: multiple aged backups; a bit more than 1 GB of space needed; each backup contains the full directory structure of the data. Cons: slower.

How (4) works: it takes multiple full backups but, by using hard links between files in backup N and files from backup N-1, it creates the illusion of multiple full backups. In reality the data is stored only once; later backups are mostly links, plus the differences (files added or changed between backups).

rsnapshot (command line) and Back In Time (GUI) work this way (4).


answered 03 May '10, 22:41 by rndmerle

I can recommend Luckybackup:

Very easy to configure via its GUI, and based on rsync. It automatically creates cron jobs.


answered 05 May '10, 08:23 by Dion

seconded and agreed. I was going to recommend this too, but you beat me to it.

(25 Aug '10, 14:45) Ron ♦

Dear friend, you can also use DAR, which can take both differential and full backups.

See the following web links:


answered 05 May '10, 09:30 by rahuldevalone

From what I've read, DAR seems good. Thanks.

(05 May '10, 11:08) martvefun

I've been liking Back In Time on my desktop very much. On the headless server(s) rsnapshot gets my vote.


answered 03 May '10, 18:51 by Kevin

I recommend a dedicated backup tool (which suggests a commercial product) since you mentioned incrementals. I use a product from Acronis which has worked great for me in the past. It is intuitive and has many features, including the ability to back up and restore dissimilar OSes for those who have a dual-boot setup. I hesitate to recommend it now since it has gotten kind of pricey: the home backup product costs $50 and the plus pack is another $30, as opposed to the $35 I paid for both products about 3 years ago. If you consider your data very important, that $80 may well be worth it.

FYI: generally speaking, backing up data is relatively easy but the recovery can be a bitch if you don't manage it properly.


answered 03 May '10, 19:42 by jpvrla

One solution that seems to have been overlooked here is Amanda (. It's quite popular and feature-rich, and its development is also supported for enterprise-level systems.

There is always a trade-off in customisation if you use an off-the-shelf product like the many suggested in these answers. On the other hand, you could get by with a simple set of scripts to deploy your own backup system, once you are familiar with the basic rotation concepts, like GFS (grandfather-father-son): full monthly backups, partial weekly ones and incremental daily ones.

Compressors like zip or rar allow you to process only the modified/new files and save space at the same time. Also remember to keep a copy of really important data off-site, for instance on a CD at a friend's place, or using one of those online (cloud) file storage services like JungleDisk.


answered 12 May '10, 10:13 by pmarini

Seen: 11,714 times

Last updated: 03 Mar '13, 20:03