Offsite Backup

Rick contacted me recently to propose a mutually beneficial technical configuration: use one another’s NSLU2 network storage devices as an offsite backup solution. That is, Rick will send his backups to me, and I’ll send my backups to him. I’ve been exceedingly lax in my backup strategy of late, so this seemed like an excellent opportunity to remedy that situation.

I’m running Debian on my slug, and Rick is running Unslung. We created accounts for one another on our respective slugs, configured port forwarding on our firewalls, and yesterday I sat down with Rick to help him get ssh keys working for passwordless logins. Rick sent over a few dozen photos for storage on my slug, and it all worked perfectly.
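For anyone setting up something similar, the key exchange boils down to a few commands. This is a minimal sketch, not our actual details; the account name and hostname below are placeholders:

```bash
# Generate a key pair. An empty passphrase lets unattended cron jobs
# log in; a passphrase plus ssh-agent is the safer choice for
# interactive use.
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa

# Install the public key in the remote account's authorized_keys.
ssh-copy-id backup@rickslug.example.com

# Confirm that login now happens without a password prompt.
ssh backup@rickslug.example.com true
```

If ssh-copy-id isn’t available on one end, appending the contents of ~/.ssh/id_rsa.pub to the remote ~/.ssh/authorized_keys by hand does the same thing.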

When I got home, I set to configuring my backups. My goal was a little more complex, because I want to back up my web/mail server in addition to my desktop data. I created a daily cron job on my server to dump all the MySQL databases, compress them with gzip, and then encrypt them using GnuPG. The script first empties out my backup directory, removing any previous backups, and then dumps everything fresh into it. I also created a weekly cron job to archive all my users’ mail spools, tar and gzip them, and finally encrypt them with GnuPG. Since the weekly cron task executes after the daily one, the mail backups will coexist with the database backups for one day, and will be purged the following morning.
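Here’s roughly what those two jobs look like. This is a sketch rather than my actual scripts: the backup directory, the GPG recipient, and the location of the mail spools are all placeholders, and it assumes MySQL credentials live in ~/.my.cnf so no password appears on the command line.

```bash
#!/bin/sh
# Daily job: dump each MySQL database, compress, and encrypt.
BACKUP_DIR=/var/backups/offsite     # placeholder path
RECIPIENT=backups@example.com       # placeholder GPG recipient

# Start fresh: purge any previous backups.
rm -f "$BACKUP_DIR"/*

for db in $(mysql -N -B -e 'SHOW DATABASES' | grep -v '^information_schema$'); do
    mysqldump "$db" | gzip |
        gpg --encrypt --recipient "$RECIPIENT" \
            --output "$BACKUP_DIR/$db-$(date +%F).sql.gz.gpg"
done
```

The weekly job is the same idea applied to the mail spools:

```bash
#!/bin/sh
# Weekly job: archive the mail spools, compress, and encrypt.
BACKUP_DIR=/var/backups/offsite     # placeholder path
RECIPIENT=backups@example.com       # placeholder GPG recipient

tar -czf - /var/mail |
    gpg --encrypt --recipient "$RECIPIENT" \
        --output "$BACKUP_DIR/mail-$(date +%F).tar.gz.gpg"
```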

On my slug, I created a cron job that runs at 8:15 AM every day (after both the normal daily and weekly tasks execute), to make sure the web server has finished all of its archiving. The slug uses scp to grab *.gpg from the backup directory on the server, and stores the files in its local backup directory. It then uses find to identify files in the backup directory that haven’t been modified in more than 30 days, and deletes them. Finally, the script invokes rsync to synchronize my slug’s backup directory with the backup directory on Rick’s slug: files that were deleted on my slug (due to age) are deleted from Rick’s slug, and new files are transferred over.
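The slug-side script is short. Again, a sketch with placeholder names rather than the real thing:

```bash
#!/bin/sh
# Run from cron at 8:15 AM, e.g.:
#   15 8 * * * /usr/local/bin/offsite-backup
SERVER=backup@myserver.example.com          # placeholder
LOCAL=/backup                               # placeholder
RICK=backup@rickslug.example.com:/backup    # placeholder

# Pull the encrypted archives down from the server.
scp "$SERVER:/var/backups/offsite/*.gpg" "$LOCAL/"

# Expire local copies not modified in more than 30 days.
find "$LOCAL" -name '*.gpg' -mtime +30 -exec rm -f {} \;

# Mirror to Rick's slug; --delete propagates the expirations so
# his copy matches mine.
rsync -av --delete -e ssh "$LOCAL/" "$RICK/"
```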

In this way, I have a month’s worth of daily database backups and weekly snapshots of mail stored in my house, and also stored offsite at Rick’s house. Everything is GPG-encrypted, so if anyone robs my house (or Rick’s), I won’t have to worry too much about any sensitive data contained in the database dumps or email messages.

While preparing these backup scripts, I was reminded of OfflineIMAP, a utility for synchronizing IMAP folders between multiple computers. It’s a pretty interesting package, and it might be a great way to keep more frequent backups of all the mail on my server. I don’t know if it offers anything unique over just a plain ol’ rsync of my server’s maildir, but it’s another tool in the toolbox. One of the neat tricks you can do with OfflineIMAP is to use ssh as a “preauth tunnel,” so that you don’t need to use your IMAP password at all, relying instead on key-based ssh logins. That’s pretty clever! I am, however, a little leery about keeping an unencrypted copy of all our mail locally on the slug, though I can’t adequately articulate why: if it’s acceptable to keep the mail on the VPS, on hardware outside my physical control, why should I be worried about a duplicate copy stored on a hard drive in my house? If you’re using OfflineIMAP, please let me know what you think in the comments.
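For the curious, the preauth tunnel is just a repository option in ~/.offlineimaprc. Here’s a minimal sketch, assuming Dovecot on the server; the account names, paths, and hostname are all guesses:

```ini
[general]
accounts = Server

[Account Server]
localrepository = LocalBackup
remoterepository = ServerIMAP

[Repository LocalBackup]
type = Maildir
localfolders = ~/mail-backup

[Repository ServerIMAP]
type = IMAP
# Spawn the IMAP daemon over ssh in preauthenticated mode: ssh keys
# handle authentication, so no IMAP password is stored or sent.
# The path to Dovecot's imap binary varies by distribution.
preauthtunnel = ssh -q user@server.example.com /usr/lib/dovecot/imap
```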

What backup strategies are you using? How have they worked for you so far?

