On Wed, 27 Sep 2006 14:04:26 +0100, Steve [Gentoo] wrote:

> 3.  My home directory, subversion repositories and DBMS catalogues are
> backed up to a remote account.  I currently do this with a cron-job
> which takes dumps, creates tar files, AES-encrypts them and then
> uploads them over SSH to the remote site, which keeps a history of 3
> backups using a simple shell-script.  This works OK, but it is very
> ad hoc, and it won't scale, as every backup requires that I upload a
> new copy - even if I've only made a trivial change to my data.  It
> would be far better if an incremental update were possible - though
> I'm not willing to give up encryption of data I send off-site.

Does your remote site support rsync? I use Strongspace and have a
directory on the server set up with encfs. I tried mounting it with
sshfs and then encfs, but found it very slow, so what I do now is keep a
local directory mounted with encfs. I back up my home directory to that
with rsync, then rsync the underlying encrypted directory to the one on
the server, so only pre-encrypted files ever leave my machine and only
the changed ones are transferred. I can still mount the remote directory
with sshfs and encfs if I ever need to, and in that case the lack of
speed won't be my main concern.
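
In shell terms the whole thing boils down to something like this; the
directory names and the remote host are only placeholders, not my actual
layout:

    # mount the ciphertext directory ~/.crypt as a plaintext view at ~/backup
    # (encfs prompts for the passphrase; from cron you would need --extpass)
    encfs ~/.crypt ~/backup

    # stage the plaintext into the encrypted view, excluding the backup
    # directories themselves so they are not copied into themselves
    rsync -a --delete --exclude /.crypt --exclude /backup ~/ ~/backup/home/

    fusermount -u ~/backup

    # push only the changed ciphertext files to the remote account
    rsync -a --delete ~/.crypt/ user@strongspace.example:crypt/

Because encfs encrypts file by file, the second rsync only transfers the
ciphertext files that have actually changed, so the incremental behaviour
you want comes for free.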


-- 
Neil Bothwick

"Bother" said Rue, for no apparent reason
