I've got several remote sites with slow links, and large
checkouts are too slow for their liking.

Here's my plan:

1) Use cvsup to create a read-only mirror that is local to them.
Update this at least once a day.  Prevent checkins here.
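For step 1, a supfile along these lines might drive cvsup -- note the
hostname, base, prefix, and especially the collection name are all
placeholders; the collection has to match whatever the master's cvsupd
is configured to serve:

```
*default host=cvs-master.example.com
*default base=/usr/local/etc/cvsup
*default prefix=/archive
*default release=cvs
*default delete use-rel-suffix compress
cvs-all
```

Run from cron once (or more) a day, and keep the mirror's repository
directory unwritable by local users so nobody commits against it.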

2) Create a wrapper around 'cvs checkout' which will take their
$CVSROOT from the environment or from a -d argument, change
the hostname from the master server to their local mirror,
and then pass that along as the new -d arg to 'cvs checkout'.

So ":method:[EMAIL PROTECTED]:/archive" becomes
":method:[EMAIL PROTECTED]:/archive", with only the hostname
portion rewritten from the master to the mirror.
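A minimal sketch of that wrapper in POSIX sh -- the MASTER and MIRROR
hostnames are placeholders for your real machines, and the script only
invokes cvs when a module was actually requested:

```shell
#!/bin/sh
# Hypothetical checkout wrapper: swap the master hostname in the
# CVSROOT for the local mirror's, then hand off to the real cvs.
MASTER=cvs-master.example.com
MIRROR=cvs-mirror.example.com

# Rewrite the hostname portion of a CVSROOT string.
rewrite_root() {
    printf '%s\n' "$1" | sed "s/$MASTER/$MIRROR/"
}

# Take the root from a -d argument if present, else from $CVSROOT.
root=${CVSROOT:-}
if [ "${1:-}" = "-d" ] && [ $# -ge 2 ]; then
    root=$2
    shift 2
fi

mirror_root=$(rewrite_root "$root")

# Only call cvs if a module was named (and cvs is installed).
if [ $# -gt 0 ] && command -v cvs >/dev/null 2>&1; then
    cvs -d "$mirror_root" checkout "$@"
fi
```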

After the checkout from the local cvsup mirror completes, the
wrapper then traverses the tree and rewrites all of the CVS/Root
files to point back at the master server.
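The traversal step could look like this -- a sketch assuming a POSIX
find, with MASTER_ROOT as a placeholder for your real master CVSROOT:

```shell
#!/bin/sh
# Point every CVS/Root file in a checked-out tree back at the master.
MASTER_ROOT=":pserver:user@cvs-master.example.com:/archive"

fix_roots() {
    # $1 is the top of the checked-out tree; every directory in a
    # CVS checkout contains a CVS/Root file naming the repository.
    find "$1" -type d -name CVS | while read -r dir; do
        printf '%s\n' "$MASTER_ROOT" > "$dir/Root"
    done
}
```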

3) Now all future cvs operations (update, log, commit, etc.)
will operate directly with the master, but the amount of data
going back and forth over the net is far lower.

Any holes in this?

It seems that since cvsup creates an exact duplicate of the
master, all of the CVS/Entries data should be in sync as well.
And the first update will pull over changes made since the mirror
was last updated.

+----------------------------------------------------------------+
| Chris Sharpe  [EMAIL PROTECTED]  KF4WVO  "TiVo, TV your way."  |
|          "PEZ - A treat to eat in a toy that's neat"           |
+----------------------------------------------------------------+



_______________________________________________
Info-cvs mailing list
[EMAIL PROTECTED]
http://mail.gnu.org/mailman/listinfo/info-cvs