Hello git devs!
We are currently trying to deal with the large-binaries-in-git problem,
and I found this conversation from 2011 on this mailing list:
I was also motivated by finding this git GSoC 2012 proposal:
> Git copies and stores every object from a remote repository when
> cloning. For large objects, this can consume a lot of bandwidth and
> disk space, especially for older versions of large objects which are
> unlikely to be accessed. Git could learn a new alternate repository
> format where these seldom-used objects are stored on a remote server
> and only accessed on demand.
What both this proposal and the email discussion suggest (among other
things), namely storing large binaries outside of git and, in
particular, fetching them from elsewhere only on demand, sounds like it
would solve our problem quite well.
My question (which I first asked in the git-devel IRC channel) is
whether there has been any further activity on this, and/or whether it
is on a roadmap or similar.
Sparse clone sounds like something similar, but it seems this has been
pushed far into the future.
I am aware of external mechanisms (e.g. git-annex, git-media), but we
would prefer something git-internal: our user base is heavily
cross-platform (including Windows), and there is no Windows support for
git-annex (which otherwise sounds like something we could use).
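For context, the closest git-internal hook today is the clean/smudge
filter mechanism (which tools like git-media build on); a minimal
sketch, assuming hypothetical "bigfile-store" and "bigfile-fetch" helper
scripts that move content to and from an external store:

```shell
# .gitattributes: route large binaries through a custom filter driver
#   *.bin filter=bigfile

# Configure the filter driver. The two helpers are hypothetical scripts,
# not part of git itself:
#   bigfile-store: reads file content on stdin, uploads it externally,
#                  writes a small pointer/stub to stdout
#   bigfile-fetch: reads the pointer on stdin, downloads the real
#                  content, writes it to stdout
git config filter.bigfile.clean  "bigfile-store"   # runs on 'git add'
git config filter.bigfile.smudge "bigfile-fetch"   # runs on checkout
```

This only approximates on-demand fetching at checkout time; the history
itself still stores whatever the clean filter emits, which is why the
alternate-repository-format idea in the proposal goes further.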
thank you for any input,
all the best,