First, read
http://gregoryszorc.com/blog/2015/05/29/faster-cloning-from-hg.mozilla.org-with-server-provided-bundles/

The <repo>/archive/<filename> URLs return a snapshot of all the files at a
specific revision. This is like a shallow clone, except there is no VCS
data at all. This operation will almost certainly be faster than a clone,
since no history is transferred.
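
For example, reusing the revision from Axel's message below, a snapshot
fetch looks like this (bz2, gzip, and zip archives are typical; exactly
which formats are offered depends on the server's archive configuration):

  wget -O- http://hg.mozilla.org/l10n-central/de/archive/0c8dd20094a0.tar.bz2 | tar -jx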

If all you need is a one-off snapshot, fetching these URLs is tolerable.
However, I'd generally recommend against it in automation: if you fetch
one revision, chances are you'll eventually want to fetch another. Once
you are fetching N discrete archives, the cumulative cost eclipses the
cost of an up-front clone plus incremental pulls. For automation, I
recommend cloning once and not deleting the local copy.
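
A sketch of the pattern I mean, using the same repo (the local directory
name is arbitrary):

  # one-time clone; keep the directory around between automation runs
  hg clone http://hg.mozilla.org/l10n-central/de/ de

  # subsequent runs only transfer the new changesets
  cd de
  hg pull
  hg update -r 0c8dd20094a0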

On Fri, Jun 5, 2015 at 6:52 AM, Axel Hecht <[email protected]> wrote:

> Hi,
>
> I just had a random idea, and wondered if anybody else had that before.
>
> wget -O- http://hg.mozilla.org/l10n-central/de/archive/0c8dd20094a0.tar.bz2 | tar -jx
>
> takes 2.5 seconds here over hotel wifi,
>
> hg clone -r 0c8dd20094a0 http://hg.mozilla.org/l10n-central/de/
>
> almost 7.
>
> Given how many repos we clone, in particular for repacks, would this be an
> interesting way to speed things up?
>
> Axel
_______________________________________________
dev-builds mailing list
[email protected]
https://lists.mozilla.org/listinfo/dev-builds
