[EMAIL PROTECTED] wrote:
I'm trying to make a copy of a 65 GB directory on another PC
and *keep* it updated.  ("poor man's mirroring")

I thought my ol' friend 'rsync -av --delete source/ target'
would be the answer.  However, with this much data the command
takes nearly as long to run as it would to do a whole copy
again.

What gives?  Any way to speed it up?

Define "forever".

I can sync 165 GB of data from an old Sun E250 in about 2 hours. I would be *very* surprised if you are slower than that; I would expect somewhere around an hour.

However, it sounds like what you really need is a SAN (storage area network) with transactional semantics. Sorry, I don't know anything about the open-source implementations of that kind of thing.

-a


--
[email protected]
http://www.kernel-panic.org/cgi-bin/mailman/listinfo/kplug-list
