On Sat, 29 Aug 2009, Bjoern Michaelsen wrote:
Guessing from the subject of your mail, I think you want a tarball of the repo with history. The most efficient way to get something like that would be a "hg bundle". That would be a 730MB download for DEV300 as of now. We probably will have such a bundle available after the pilot (ask Heiner). As of now, you will need to "hg clone" from
 http://hg.services.openoffice.org/hg/DEV300

The main problem with a direct 'clone' is that it cannot be resumed: if the network connection breaks, Mercurial rolls back everything it has received so far, not just the last incomplete changeset, and the whole download is lost.

Furthermore, I strongly suspect that 'clone' is a lot slower than downloading a binary file (be it a 'bundle' or a 'tarball') over 'http' or 'ftp' with a modern download manager that uses multiple connections at the same time.

For now - after reading an old thread on the Mercurial list about the 'clone' command not being resumable - I followed a workaround mentioned there: one can initialize an empty repository and then run a sequence of 'hg pull -r <rev>' commands, increasing <rev> by a modest number of changesets each time. If a pull fails, only the current increment is lost and that pull can simply be repeated.

The only problem is that one needs the long-format changeset IDs for this. I obtained them the hard, manual way through the web interface, which lets one jump forward by 1000, 10000 or 30000 changesets.
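Put together, the workaround looks roughly like the following sketch. The `<changeset-id-...>` values are placeholders for the long-format IDs read off the web interface, and the step size of ~1000 changesets is just the one I happened to use:

```shell
# Sketch of the resumable-pull workaround.
hg init DEV300
cd DEV300
# Pull up to a known changeset, then the next one, and so on.
hg pull -r <changeset-id-near-rev-1000> http://hg.services.openoffice.org/hg/DEV300
hg pull -r <changeset-id-near-rev-2000> http://hg.services.openoffice.org/hg/DEV300
# ...repeat in steps of ~1000 changesets; a failed pull only loses the
# current increment and can simply be re-run...
# Finally, fetch whatever remains up to tip:
hg pull http://hg.services.openoffice.org/hg/DEV300
```

Since 'pull' into an existing repository is incremental, each step either completes or is rolled back on its own, which is exactly what makes the sequence resumable.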

Looking at the SVN repository, I had originally expected many more, and much larger, changesets, but it seems DEV300 does not contain the complete history.

Looking at the CWSes, it also seems clear that getting a copy of a CWS requires special treatment in order not to download everything again.

This means a remote 'clone' is not an option here either: one should instead 'clone' or 'cp' an already existing local copy of DEV300 and then 'hg pull' the CWS into that copy; otherwise everything is transferred again and much disk space is wasted.
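As a sketch, assuming a local DEV300 copy already exists ('mycws' and the CWS repository URL are placeholders; the URL pattern is a guess modelled on the DEV300 one above):

```shell
# Sketch: reuse the local DEV300 copy instead of cloning the CWS remotely.
# A local 'hg clone' uses hardlinks where the filesystem supports them,
# so it is fast and cheap on disk.
hg clone DEV300 mycws
cd mycws
# Only the changesets unique to the CWS travel over the network:
hg pull http://hg.services.openoffice.org/hg/cws/<mycws>
hg update -C <cws-head>
```

This way each additional CWS costs only the delta against DEV300, not another full copy of the history.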

Regards

Guido

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
