On Aug 12, 2005, at 11:16 AM, [EMAIL PROTECTED] wrote:


I'm trying to make a copy of a 65GB directory on another PC
and *keep* it updated.  ("poor man's mirroring")

I thought my ol' friend 'rsync -av --delete source/ target'
would be the answer.  However, with so much data, running this
command takes nearly as long as doing a whole copy again.

What gives?  Any way to speed it up?


When you're comparing 65GB of data on two sources and trying to ensure that they're both the same (or determine which bits need to be shipped back and forth to make them the same), rsync still has to walk both trees, compare every file's size and timestamp, and checksum anything that looks changed, so yes, it's going to take a while to do those computations.
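
If you want to see where the time is going before you commit to the full run, a dry run with stats is a cheap first step. Something like this (the hostname and paths here are just placeholders, so adjust to taste):

    # Dry run: walk both trees and report what rsync *would* send, without copying anything
    rsync -av --delete --dry-run --stats source/ user@otherpc:/path/to/target/

    # The real thing over ssh; -z adds compression, which helps on slow links
    rsync -avz --delete source/ user@otherpc:/path/to/target/

If the dry run itself takes ages, the time is going into scanning the trees; if it's quick, the time is going into actually moving changed data.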

To simplify your life, though, I'd strongly suggest using unison (http://www.cis.upenn.edu/~bcpierce/unison/) if you need to do two-way syncs. It uses rsync as its analysis and transport mechanism, but allows you to decide which changes go which way if there's no obvious solution.

I used it for quite a long time to keep the ~/src directories on my laptop, home system, and work system in sync. It never failed me, once I read the docs. :)
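
For reference, a minimal profile for that kind of setup (say ~/.unison/src.prf) could look something like this; the hostname and paths below are made up, so substitute your own:

    # Local replica
    root = /home/greg/src
    # Remote replica, reached over ssh (placeholder host and path)
    root = ssh://workbox//home/greg/src
    # Skip editor backups and build droppings
    ignore = Name *~
    ignore = Name *.o

Then "unison src" compares the two roots and prompts you about anything it can't reconcile on its own.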

Gregory

--
Gregory K. Ruiz-Ade <[EMAIL PROTECTED]>
OpenPGP Key ID: EAF4844B  keyserver: pgpkeys.mit.edu


