Hello all,

I have a few questions about cleaning up huge files that live in a Git 
repo's history (metadata). We have a central Git server that we all push 
to, and I have observed that in some cases git clone does not work: it 
either hangs or takes hours to clone some repos, e.g.:

remote: Compressing objects: 100% (69/69), done.
fatal: Out of memory, malloc failed (tried to allocate 2305676104 bytes)
fatal: unpack-objects failed 

*Can someone suggest how to deal with scenarios where we can't clone from 
the central Git server? Should we increase RAM or storage space on the 
server, perhaps?*
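Would a shallow clone be a reasonable stopgap until the history is cleaned? Here is a minimal, self-contained sketch of what I mean (the repo here is a throwaway stand-in, not our actual server; the real command would just be `git clone --depth 1 <central-url>`):

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Stand-in repo with two commits of history.
git init -q src
git -C src -c user.email=me@example.com -c user.name=me \
    commit -q --allow-empty -m "old history"
git -C src -c user.email=me@example.com -c user.name=me \
    commit -q --allow-empty -m "tip"

# --depth 1 fetches only the tip commit, so packs for old commits
# (and any huge blobs buried in them) are never downloaded.
# file:// forces the network transport; a plain local path ignores --depth.
git clone -q --depth 1 "file://$tmp/src" shallow
```

This doesn't fix the repo, of course, but it might let people keep working while the history itself gets cleaned.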

*What are the tools or commands that we can use on the bare repo located on 
the central server to clean up this metadata?* I observed that git 
filter-branch or the BFG tool can be run on a cloned copy, and we can push 
the cleaned-up repo after running them. The same commands have no effect 
when run directly on the central repo.
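To make sure I'm describing the clone-then-push workflow correctly, here is a self-contained sketch of it using git filter-branch (the repo names and the huge file are made up for the example; BFG would replace the filter-branch step with something like `java -jar bfg.jar --delete-files huge.bin`):

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Stand-in for the central bare repo.
git init -q --bare central.git

# Seed it from a working clone: one commit with a big file, one without.
git clone -q central.git work
cd work
git config user.email me@example.com
git config user.name me
dd if=/dev/zero of=huge.bin bs=1024 count=10 2>/dev/null
git add huge.bin && git commit -q -m "add huge file"
echo ok > keep.txt && git add keep.txt && git commit -q -m "add keep"
git push -q origin HEAD

# Rewrite every ref in the CLONE to drop the big file from all commits.
FILTER_BRANCH_SQUELCH_WARNING=1 git filter-branch --force --index-filter \
    'git rm --cached --ignore-unmatch huge.bin' \
    --prune-empty --tag-name-filter cat -- --all

# Push the rewritten history back, then shrink the bare repo so the
# old objects actually go away on the server side.
git push -q origin --force --all
cd ../central.git
git reflog expire --expire=now --all
git gc --prune=now --aggressive
```

The last two commands are the part I'm unsure about: they have to run on the central bare repo itself, otherwise the old packs stick around even after the force-push.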


You received this message because you are subscribed to the Google Groups "Git 
for human beings" group.
