Wade Curry wrote:
> Frankly, I'm not yet sure how much work it will take, or if it's
> worth it for a small installation. If anyone knows more about AFS
> and can offer some insight as to how useful it would be for this
> kind of "planned caching for offline use" scenario, please do
> share.
I would look at the distributed version control systems (Mercurial,
bzr, darcs, etc.) and see whether you could bend one of them to your
purpose: they let you push and pull changes between machines, which
fits a planned offline workflow well.
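Just to make that concrete, here is a rough sketch of the
pull-before-you-leave / push-when-you-return workflow driven from
Python with Mercurial. The repository path and remote URL below are
only placeholders for whatever you'd actually use:

#!/usr/bin/env python3
"""Sketch: keep a local Mercurial clone in sync for offline use.

Assumes `hg` is installed and that ~/offline-cache is already a clone
of the (hypothetical) remote at https://example.org/hg/shared-docs.
"""
import subprocess
from pathlib import Path

REPO = Path.home() / "offline-cache"           # local working copy (placeholder)
REMOTE = "https://example.org/hg/shared-docs"  # hypothetical remote

def hg(*args):
    # Run an hg command inside the local repository.
    return subprocess.run(["hg", *args], cwd=REPO, check=True)

def before_going_offline():
    # Pull all new changesets and update the working copy, so the full
    # history is readable without a network connection.
    hg("pull", "--update", REMOTE)

def after_coming_back_online():
    # Push anything committed while disconnected back to the remote.
    hg("push", REMOTE)

if __name__ == "__main__":
    before_going_offline()

bzr or darcs would work the same way; the point is that the whole
history ends up in the local clone, so everything stays readable while
you're disconnected.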
AFS wasn't really designed for online/offline operation. It was meant
to bond together sites connected by slow, unreliable leased lines. The
lines would go up and down, but they rarely stayed disconnected for
extended lengths of time.
In addition, the files were generally much larger than could be quickly
transmitted at leased-line speeds; once a file had made it across,
though, the coherency system didn't have to transmit it again.
Given that even VPN connections over the Internet are extremely
reliable and that consumer download bandwidth dwarfs the size of a
typical file, AFS doesn't really solve any of your problems
particularly well.
If you are serious about AFS, though, you might want to take a look at
Coda instead. It is based on an older snapshot of the AFS code but adds
explicit support for disconnected (offline) operation.
http://coda.cs.cmu.edu/
-a