On Fri, May 16, 2014 at 2:06 AM, Philip Oakley <philipoak...@iee.org> wrote:
> From: "John Fisher" <fishook2...@gmail.com>
>> I assert, based on one piece of evidence (a post from a Facebook dev), that
>> I now have the world's biggest and slowest git
>> repository, and I am not a happy guy. I used to have the world's biggest
>> CVS repository, but CVS can't handle multi-gigabyte
>> files. So I moved the repo to git, because we are using that for our
>> new projects.
>> [...] keep 150 GB of files (mostly binary), from tiny to over 8 GB, in a
>> version-control system.
I think your best bet so far is git-annex (or maybe bup) for dealing
with huge files. I plan to resurrect Junio's split-blob series to
make core git handle huge files better, but there's no ETA on that.
The problem here is about file size, not the number of files or
history depth, right?
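For the archives, a minimal sketch of the git-annex approach mentioned above (this assumes git-annex is installed; the guard at the top just skips cleanly where it is not, and the file names here are made up for illustration). The point is that annex keeps large-file content out of the git object database and commits only a small pointer, so ordinary git operations stay fast:

```shell
# Sketch: tracking a large binary with git-annex instead of core git.
set -e
command -v git-annex >/dev/null || { echo "git-annex not installed; skipping"; exit 0; }

repo=$(mktemp -d)/bigrepo
git init -q "$repo" && cd "$repo"
git config user.email you@example.com   # identity needed for commits
git config user.name "You"
git annex init "demo"

# Simulate one large binary (the real repo has files up to 8 GB):
dd if=/dev/zero of=big.bin bs=1M count=16 2>/dev/null

git annex add big.bin                   # content goes under .git/annex/,
git commit -q -m "add big.bin via git-annex"   # git itself tracks only a pointer
git annex whereis big.bin               # shows which repositories hold the content
```

Content then moves between clones with `git annex get` / `git annex drop` rather than being copied on every `git clone`.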
>> git is absurdly slow, think hours, on fast hardware.
Probably known issues. But some elaboration would be nice (e.g. which
operation is slow, how slow, and some more detailed characteristics of
the repo...) in case new problems pop up.
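Roughly the kind of elaboration that helps: the repo's shape and a timed trace of the slow operation. A small sketch against a throwaway repo (the repo contents here are placeholders; `GIT_TRACE_PERFORMANCE` requires a reasonably recent git):

```shell
# Sketch: gathering the numbers that make a "git is slow" report actionable.
set -e
repo=$(mktemp -d)/repo
git init -q "$repo" && cd "$repo"
git config user.email you@example.com
git config user.name "You"
echo hello > f.txt && git add f.txt && git commit -q -m init

# Repository shape: loose/packed object counts and on-disk sizes.
git count-objects -v

# Time a specific operation, letting git report its internal timings.
GIT_TRACE_PERFORMANCE=1 git status
```

Running the slow command (`git status`, `git add`, `git gc`, ...) this way on the real 150 GB repo shows where the hours actually go.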