Since the last time I went looking for backup software, I’ve still been using rdiff-backup.
It’s nice, except for one thing: it always keeps an uncompressed copy of your current state on the disk. This is becoming increasingly annoying.
I did some tests with dar and BackupPC, and both saved considerable disk space over rdiff-backup. The problem with dar, or with compressed full/incrementals using tar, is that eventually you have to make a new full backup. You have to make the new full first, *then* delete all your old fulls and incrementals, so there will be times when you’re storing two full backups at once.
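To make that rotation concrete, here is a minimal sketch of a compressed full/incremental cycle using GNU tar driven from Python. The paths and file names are made up, and this is just an illustration of the cycle, not a recommended script:

```python
import subprocess
from pathlib import Path

SOURCE = "/home"                        # what to back up (hypothetical path)
BACKUP_DIR = Path("/srv/backups")       # where the archives land (hypothetical)

def run_tar(archive: Path, snapshot: Path) -> None:
    # GNU tar's --listed-incremental does a full (level-0) dump the first
    # time the snapshot file is used, and incrementals on later runs.
    subprocess.run(
        ["tar", "--create", "--gzip",
         f"--listed-incremental={snapshot}",
         f"--file={archive}", SOURCE],
        check=True,
    )

snap = BACKUP_DIR / "cycle2.snar"
run_tar(BACKUP_DIR / "cycle2-full.tar.gz", snap)   # new full backup
run_tar(BACKUP_DIR / "cycle2-inc1.tar.gz", snap)   # later runs: incrementals

# Only after cycle2-full.tar.gz exists is it safe to delete the cycle-1
# full and its incrementals, and that window is when two fulls coexist.
```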
The hardlinking approach sounds good, but it has a few problems of its own. One is that it can lose metadata about, ironically enough, hard links. Another is that few of the hard-linking programs offer a compressed on-disk format.
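For reference, here’s roughly what the hard-link snapshot approach boils down to. This is a minimal sketch in Python over two local directory trees with made-up names; a real tool also has to handle permissions, symlinks, deletions, and so on:

```python
import os
import shutil
from pathlib import Path

def snapshot(source: Path, prev_snap: Path, new_snap: Path) -> None:
    """Write a new snapshot of *source*, hard-linking unchanged files to
    the previous snapshot so only changed files consume new space."""
    for dirpath, _dirnames, filenames in os.walk(source):
        rel = Path(dirpath).relative_to(source)
        (new_snap / rel).mkdir(parents=True, exist_ok=True)
        for name in filenames:
            src = Path(dirpath) / name
            old = prev_snap / rel / name
            new = new_snap / rel / name
            st = src.stat()
            # "Unchanged" here is just size + mtime; real tools check more.
            if old.exists() and (st.st_size, int(st.st_mtime)) == \
                    (old.stat().st_size, int(old.stat().st_mtime)):
                os.link(old, new)       # share the previous snapshot's copy
            else:
                shutil.copy2(src, new)  # store a fresh copy

# Note what's missing: nothing records that two files in the source were
# hard links to each other, and nothing is compressed; those are exactly
# the two problems above.
```

Here’s what I’ve been looking at: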
BackupPC
Nice on the surface. I’m a bit annoyed that it’s web-driven rather than command-line-driven, but I can look past that. I can also look past the fact that it won’t let me clamp down on ssh access as much as I’d like.
BackupPC writes metadata to disk alongside files, so it can restore hard links, symlinks, device entries, and the like. It also has the nice feature of being able to hard link identical files across machines, so if you’re backing up /usr on a bunch of machines and have the same files installed, you save space. Nice.
BackupPC can also compress the files it stores on disk. It uses pre-compression md5sums to identify files to hard link, which is nice.
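The pooling idea itself is simple enough to sketch. This is not BackupPC’s actual on-disk layout, just a minimal Python illustration of hashing a file before compression and hard-linking identical content into a shared pool (the paths are made up):

```python
import hashlib
import os
import zlib
from pathlib import Path

POOL = Path("/var/backups/pool")   # shared, content-addressed pool (hypothetical)

def store(path: Path, dest: Path) -> None:
    """Store *path* at *dest*, sharing storage with any identical file
    already in the pool. The hash is taken over the uncompressed data,
    so identical files from different machines still match."""
    data = path.read_bytes()
    digest = hashlib.md5(data).hexdigest()
    pooled = POOL / digest[:2] / digest
    if not pooled.exists():
        pooled.parent.mkdir(parents=True, exist_ok=True)
        pooled.write_bytes(zlib.compress(data))  # the pool holds compressed data
    dest.parent.mkdir(parents=True, exist_ok=True)
    os.link(pooled, dest)   # every backup of this content shares one inode
```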
Here’s where I get nervous.
BackupPC doesn’t just use regular compression from, say, gzip or bzip2. It uses its own low-level scheme built around the Perl deflate library, and it does it in a nonstandard way owing to a supposed memory issue with zlib. Why they don’t just pipe the data through gzip or equivalent is beyond me.
This means that, first off, it’s using a nonstandard compression format, which makes me nervous to begin with. If that weren’t annoying enough, you have to install Perl plus a bunch of modules to extract the thing. This makes me nervous too.
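To make the compatibility worry concrete: gzip and a raw deflate stream use the same underlying algorithm, but only the former can be read by standard tools. A small Python illustration (this shows the general problem, not BackupPC’s actual file format):

```python
import gzip
import zlib

data = b"some file contents\n" * 1000

# Standard gzip container (magic bytes, header, CRC): gzip, zcat, anything
# can read this back.
gz = gzip.compress(data)
assert gzip.decompress(gz) == data

# Raw deflate stream, no header: same algorithm, but `gzip -d` refuses it
# and a plain zlib.decompress() fails too; you need the right library call
# with the right parameters to get your data back.
co = zlib.compressobj(wbits=-zlib.MAX_WBITS)   # negative wbits = headerless
raw = co.compress(data) + co.flush()
assert zlib.decompress(raw, wbits=-zlib.MAX_WBITS) == data
```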
Dirvish
Doesn’t support compression.
faubackup
Doesn’t support compression.
rdup
Supports compression and encryption. Does not preserve ownership unless the destination filesystem does it for you (meaning you must run as root to store your backups).
The killer missing feature: it does not record which files were hardlinked on the source system, so when you restore your backup, all hard links are lost. Epic fail.
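For what it’s worth, noticing hard links on the source isn’t hard; a tool just has to record which paths share a device and inode number. A quick Python sketch (the directory name is made up) of the information rdup apparently throws away:

```python
import os
from collections import defaultdict

def find_hardlink_groups(root: str) -> list[list[str]]:
    """Group paths that are hard links to the same underlying file,
    keyed on (st_dev, st_ino). A backup tool that records these groups
    can recreate the links at restore time instead of duplicating data."""
    groups: dict[tuple[int, int], list[str]] = defaultdict(list)
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            st = os.lstat(path)
            if st.st_nlink > 1:                      # linked more than once
                groups[(st.st_dev, st.st_ino)].append(path)
    return [paths for paths in groups.values() if len(paths) > 1]

# e.g. find_hardlink_groups("/usr") will turn up a few groups on most systems.
```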
rsnapshot
Doesn’t support compression.
StoreBackup
Does support compression, and appears to restore metadata in a sane way. Supports backing up to a different machine on the LAN, but only if you set up NFS, which makes it look inappropriate for doing backups over a VPN. The manual is comprehensive, though confusing. Overall, it looks like an oddball design with an oddball manual.
So, any suggestions?