Geoff Shang wrote:


> I was thinking that I could perhaps make daily incremental tarballs, with a full backup once a month or something, as bandwidth is a bit limited at the office site (though exactly how much I'm about to find out). But I've not had experience making incremental backups so would like suggestions as to the best schemes to use, etc.


Simple answer: use duplicity.

As an example, here is the (somewhat simplified) script we run daily via cron. It makes a full backup once a week, with incremental backups on the days in between, slices the backup into 100 MB volumes, uploads it to Amazon S3, encrypts everything with GPG in symmetric-cipher mode, and keeps roughly a year of backups.

#!/bin/bash

# Credentials for the GPG symmetric encryption and for Amazon S3.
# These values are fake; substitute your own.
export PASSPHRASE='this password is fake'
export AWS_ACCESS_KEY_ID=EXAMPLE123
export AWS_SECRET_ACCESS_KEY=EXAMPLE567

# Incremental backup, promoted to a full backup once it is more than
# a week old, split into 100 MB volumes.
duplicity --volsize 100 --full-if-older-than 1W /back/me/up \
    s3+http://bucketname/

# Keep only the last 48 full backups (and their incrementals),
# i.e. roughly a year of weekly fulls.
duplicity remove-all-but-n-full 48 --force s3+http://bucketname/
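Since the script is meant to run unattended every day, something like the following crontab entry would drive it. The script path and log location are my assumptions, not part of the original post:

```shell
# Hypothetical crontab entry: run the backup script nightly at 02:30
# and append its output to a log file.
30 2 * * * /usr/local/bin/s3-backup.sh >> /var/log/s3-backup.log 2>&1
```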


Pretty cool, I think.

Of course, it's just a single example. Read Duplicity's fine manual for the details.
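For completeness, getting files back uses the same remote URL as the source. This is a sketch based on duplicity's documented interface, not part of the original post; the bucket name and paths are the placeholders from the example above, and PASSPHRASE must be exported the same way as for backup:

```shell
# Show what the remote archive contains (chains of fulls + incrementals).
duplicity collection-status s3+http://bucketname/

# Restore the latest backed-up state into a local directory.
duplicity s3+http://bucketname/ /restore/me/here

# Restore a single file as it was three days ago;
# --file-to-restore takes a path relative to the backup root.
duplicity --time 3D --file-to-restore etc/passwd \
    s3+http://bucketname/ /tmp/passwd.restored
```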
