On Thu, Feb 28, 2008 at 09:07:51PM +0100, martin f krafft wrote:
> I have some minor comments on the code:
> 
> > for repo in .fgits/*; do
> 
> probably want to make this configurable...
Generally speaking, yes. But how? I mostly consider it a script for personal
use anyway ;)
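The simplest answer might be an environment variable with a default. Just a
sketch, and FGITS_DIR is a name I made up here:

```shell
#!/bin/sh
# Allow overriding the repo directory via the environment;
# fall back to .fgits if FGITS_DIR is unset or empty.
: "${FGITS_DIR:=.fgits}"

for repo in "$FGITS_DIR"/*; do
    echo "$repo"
done
```

Callers could then run `FGITS_DIR=~/.myrepos fgit ...` without touching the
script.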

> >     if GIT_DIR=$repo git ls-files $file | fgrep $file >/dev/null; then
> 
> fgrep -q :)
> 
> The only concern I have is scalability. Iterating all repos and running
> ls-files on them is quite bad. However, I can't think of a better way. Can
> you?

Do you really think I would have implemented it this way if I knew a better
one? :)

One could maybe generate a mapping list, but I don't think it's worth it. You
won't have a myriad of repos, and ls-files seems fast enough.
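If it ever did become a bottleneck, the mapping could be precomputed once per
change and then queried with a plain text lookup. A sketch, with a made-up
cache file name:

```shell
#!/bin/sh
# Sketch: build a "repo<TAB>file" map once, instead of running
# ls-files over every repo on each query. Rebuild whenever the
# repos change. The cache file name (.fgits-map) is made up.
cache=.fgits-map

: > "$cache"
for repo in .fgits/*; do
    # Prefix each tracked file with the repo that owns it.
    GIT_DIR=$repo git ls-files | awk -v r="$repo" '{ print r "\t" $0 }' >> "$cache"
done

# Lookup, e.g. in git-find-repo: which repo tracks $1?
# awk -F'\t' -v f="$1" '$2 == f { print $1 }' "$cache"
```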

I have been hacking a bit on an add-on tool that wraps certain git commands to
use the correct repo. For example, add would add each file to the correct repo,
commit and status would run on all of them, et cetera.
That should be more convenient than setting GIT_DIR all the time.

Here's a first preview (at the moment it's called git-guess-repo, but I'd like
git-multrepo or something similar. Suggestions?):

#!/bin/sh

action=$1

case $action in
    add)
        shift
        for file in "$@"; do
            GIT_DIR=$(./git-find-repo.sh "$file") git add "$file"
        done
        ;;
    commit)
        for repo in .fgits/*; do
            GIT_DIR=$repo git commit
        done
        ;;
    status)
        for repo in .fgits/*; do
            GIT_DIR=$repo git status
        done
        ;;
esac

(You don't have to suggest moving the redundant code into a function, I had
that idea myself.)

I wonder if a single merged-together repo would be easier to
handle...

mxey
_______________________________________________
vcs-home mailing list
vcs-home@lists.madduck.net
http://lists.madduck.net/listinfo/vcs-home
