[git-users] malloc fails when dealing with huge files

2008-09-09 Thread Jonathan
I'm using Git for a project that contains huge (multi-gigabyte) files. I need to track these files, but with some of the really big ones, git-add aborts with the message "fatal: Out of memory, malloc failed". Also, git-gc sometimes fails because it can't allocate enough memory. I've been using
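A common mitigation for this class of out-of-memory failure is to cap the memory git uses when delta-compressing packs. The settings below are a sketch, not the thread's confirmed fix; the values are hypothetical and should be tuned to the machine, and `core.bigFileThreshold` only exists in git 1.7.6 and later (newer than the 1.5.x series discussed here):

```shell
# Cap memory used per thread during delta compression
# (values are illustrative; tune for your machine).
git config pack.windowMemory 256m
git config pack.packSizeLimit 2g
git config pack.threads 1

# On git 1.7.6+, files larger than this are stored without delta
# compression, so git avoids holding the whole file in memory to pack it.
git config core.bigFileThreshold 512m
```

These are repository-local settings; add `--global` to apply them to every repository for the current user.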

[git-users] Re: malloc fails when dealing with huge files

2008-09-09 Thread Jonathan
I forgot to mention, I'm using git 1.5.6.2 on Debian 4.0. I'm using the binary download (.tar.bz2) from the git website. On Sep 9, 4:22 pm, Jonathan <[EMAIL PROTECTED]> wrote: > I'm using Git for a project that contains huge (multi-gigabyte) > files.  I need to track

[git-users] Re: How hard would it be to implement sparse fetching/pulling?

2017-12-01 Thread Jonathan Nieder
Hi, Jeff Hostetler wrote: > On 11/30/2017 3:03 PM, Jonathan Nieder wrote: >> One piece of missing functionality that looks interesting to me: that >> series batches fetches of the missing blobs involved in a "git >> checkout" command: >> >> https://p
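The sparse fetching discussed in this thread later shipped as git's partial clone feature. As a sketch of the batched-blob-fetch behavior the series describes, assuming git 2.19 or newer, a filter-capable server, and a placeholder URL:

```shell
# Clone without any blobs: only commits and trees are transferred.
# The URL below is a placeholder, not a repository from the thread.
git clone --filter=blob:none --no-checkout https://example.com/repo.git
cd repo

# Checkout then fetches the missing blobs needed to populate the
# working tree for that commit, rather than the full history's blobs.
git checkout main
```

The `--filter=blob:none` clone records the remote as a promisor, so later commands that touch missing objects fetch them on demand.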