On Sun, Aug 4, 2013 at 7:17 AM, Jörn Hees <d...@joernhees.de> wrote:
> Hi,
> On 4 Aug 2013, at 12:38, Antoine Pelisse <apeli...@gmail.com> wrote:
>> […]
>> I also decided to always clone local repositories because what Jörn Hees
>> said makes sense:
>> If you have a local clone of a big repository, and then want to add a slow
>> remote, you would have to reclone everything.
>> I think the trade-off is good, because clone from local should not be that
>> time expensive (maybe it can be on disk-space though).
> I was working on a similar patch in the meantime; this point was the only
> thing that kept me from submitting… Can either of you think of an easy way
> to do this lazily, on the first non-local remote being added?
> In case we don't have a non-local clone (a mercurial dir with a clone
> subdir) yet, we would try to go through the local mercurial remotes and
> then clone them… We would just need a way to get their URLs. I thought
> about going through all "git remote -v"

git config --get-regexp '^remote.*.url' is probably more appropriate.
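For illustration, a quick shell sketch of the difference (the remote names and URLs below are made up):

```shell
# Set up a throwaway repo with two hypothetical remotes.
tmp=$(mktemp -d)
git init -q "$tmp"
git -C "$tmp" remote add origin /srv/hg-mirror
git -C "$tmp" remote add backup ssh://example.org/repo.hg
# Prints one "remote.<name>.url <url>" line per remote -- easier to parse
# than the duplicated fetch/push lines of "git remote -v".
git -C "$tmp" config --get-regexp '^remote\..*\.url'
```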

Either way, I don't see why such a change should be in the same patch.

> This way we wouldn't need to copy by default (bad for big repos), but could
> still do it cheaply if a slow remote is added later on.
> Btw, is there any reason why we don't just use the local mercurial remotes
> as the shared repo? Because they're not under our git dir and might be
> deleted?

Yes. Or moved, or might be in an external drive, or many other reasons.

This is my solution:

--- a/contrib/remote-helpers/git-remote-hg.py
+++ b/contrib/remote-helpers/git-remote-hg.py
@@ -391,11 +391,22 @@ def get_repo(url, alias):
         shared_path = os.path.join(gitdir, 'hg')
-        if not os.path.exists(shared_path):
-            try:
-                hg.clone(myui, {}, url, shared_path, update=False, pull=True)
-            except:
-                die('Repository error')
+        # check and upgrade old organization
+        hg_path = os.path.join(shared_path, '.hg')
+        if os.path.exists(shared_path) and not os.path.exists(hg_path):
+            repos = os.listdir(shared_path)
+            for x in repos:
+                local_hg = os.path.join(shared_path, x, 'clone', '.hg')
+                if not os.path.exists(local_hg):
+                    continue
+                shutil.copytree(local_hg, hg_path)
+        # setup shared repo (if not there)
+        try:
+            hg.peer(myui, {}, shared_path, create=True)
+        except error.RepoError:
+            pass

         if not os.path.exists(dirname):
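To make the hunk easier to follow, here is the upgrade step as a standalone sketch. The function name is mine, not part of the patch, and I added a `break` after the first successful copy: `shutil.copytree` refuses to copy into an existing destination, so only one old clone can be migrated anyway.

```python
import os
import shutil

def upgrade_shared_layout(shared_path):
    """Migrate the old per-remote layout to a single shared repo.

    Old layout: shared_path/<alias>/clone/.hg (one clone per remote).
    New layout: shared_path/.hg (one shared Mercurial repo).
    Copies the first old clone found into the new location.
    """
    hg_path = os.path.join(shared_path, '.hg')
    if os.path.exists(shared_path) and not os.path.exists(hg_path):
        for alias in os.listdir(shared_path):
            local_hg = os.path.join(shared_path, alias, 'clone', '.hg')
            if not os.path.exists(local_hg):
                continue
            shutil.copytree(local_hg, hg_path)
            break  # copytree fails if hg_path already exists
    return os.path.exists(hg_path)
```

After this runs, the `hg.peer(..., create=True)` call in the patch either finds the migrated repo (and raises `error.RepoError`, which is ignored) or creates a fresh empty one.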

It should also work in all those cases, but without an extra unnecessary
clone while upgrading, and it doesn't sneak in any other unrelated changes.

You can see the changes on top of my previous patch that led to this
diff in my repo:



Felipe Contreras
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majord...@vger.kernel.org
More majordomo info at  http://vger.kernel.org/majordomo-info.html
