> From: krishna chaitanya kurnala <kkc...@gmail.com>
> I observed that in some cases, Git clone is not working, It either Hangs or
> it will take hours to clone some repos
> remote: Compressing objects: 100% (69/69), done.
> fatal: Out of memory, malloc failed (tried to allocate 2305676104 bytes)
> fatal: unpack-objects failed
The central problem is that Git doesn't know how much memory it can
allocate on the machine, so if there are really big files in the
repository, it assumes it can handle them entirely in memory.
The solution is to set various configuration parameters so that Git
doesn't try to handle very large chunks of data in memory. If you
google "Git 'fatal: Out of memory, malloc failed (tried to allocate'",
you will find a number of examples.
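As a sketch, the commonly suggested settings cap Git's pack and delta
memory use. The values below are illustrative assumptions, not
authoritative numbers; tune them to the machine's actual RAM:

```shell
# Cap memory used when reading packfiles into memory.
git config --global core.packedGitLimit 512m
git config --global core.packedGitWindowSize 512m

# Cap memory used for delta compression when (re)packing.
git config --global pack.windowMemory 256m
git config --global pack.deltaCacheSize 128m
git config --global pack.packSizeLimit 512m

# Fewer pack threads means lower peak memory use.
git config --global pack.threads 1

# Don't attempt delta compression on blobs above this size.
git config --global core.bigFileThreshold 50m
```

These only limit how much Git tries to do in memory at once; they don't
shrink the repository itself.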
> *What are the Tools or Commands that we can use on bare repo located on
> central server to clean up meta-data*. I observed that git filter-branch or
> bfg tool can be run on a cloned copy and we can push the cleaned up repo
> after running them. The Same commands are not having any effect when run
> directly on the central repo
The problem is that you want to delete all references to the large
file(s) in the central repository. Deleting all references in a
working repository and then pushing some branch heads to the central
repository doesn't delete all the references in the central
repository; you have to run the appropriate tool on the central
repository itself. You may also have to run the tools on every clone
at the same time, because if a clone still holds a reference, pushing
from that clone can copy the reference back to the central repository,
carrying the large file(s) with it.
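As a sketch of running the cleanup directly on the bare central
repository (the path and the filename `big.bin` are placeholders, and
BFG is usually faster, but `git filter-branch` ships with Git):

```shell
# Run inside the bare central repository (placeholder path).
cd /path/to/central/repo.git

# Rewrite every branch and tag, dropping the file from all commits.
git filter-branch --index-filter \
    'git rm --cached --ignore-unmatch big.bin' \
    --tag-name-filter cat -- --all

# Delete the backup refs filter-branch leaves behind, expire any
# reflogs, and repack so the old objects are actually pruned.
git for-each-ref --format='%(refname)' refs/original/ |
    xargs -r -n1 git update-ref -d
git reflog expire --expire=now --all
git gc --prune=now --aggressive
```

After this, every clone must be re-cloned (or cleaned the same way),
or its next push can reintroduce the large objects.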
You received this message because you are subscribed to the Google Groups "Git
for human beings" group.