[EMAIL PROTECTED] (Ken Williams) writes:

> The thing is, a homebrew backup scheme like this isn't really adequate. 
> For instance, what if you rsync to a remote server every night, and then 
> you accidentally delete a valuable file and realize it three days later? 
> This is why incremental non-clobbering snapshots were created.  You'd 
> have to do a lot of work with a system like rsync in order to get the 
> same effect.

Some years ago I hacked together a simple system that rsynced to a
remote box every day, rotating through ~10 directories one by one.

It required a lot of extra disk space, but at the time I had more
disks than real backup devices.

I made it copy the previous day's backup over the ~10-day-old one
first to save a bit of bandwidth (so each rsync only transferred the
changes since the day before).

But as you said: the real problem is that those tools don't know about
resource forks.  I think I left my PowerBook down in the car so I
can't look closer, but Doug's suggestion of using ditto (in particular
when it gets incremental backup features - thanks Doug!) sounds good.
 
> I'm also willing to pay money for a backup tool that /works/.  It makes 
> me feel better if some company's reputation will be hopelessly smeared if 
> they release a backup tool that doesn't work.  Presumably, people that 
> make backup tools know more about how to get it right than I do.

Except for when the backup hardware failed me (die, DAT, die), I've
been a happy user of Amanda.  (It uses tar or cpio, so it doesn't do
resource forks either.)  For the last few months I've been using a
small AIT autoloader thingy to back up my workstation boxes in the
office (and cvs.perl.org); it's great:
http://www.storagebysony.com/products/productmain.asp?id=128
(but doesn't have anything to do with the original question)


 - ask

-- 
ask bjoern hansen, http://ask.netcetera.dk/   !try; do();
