> From: Gergely Polonkai <[email protected]>
>
> to dump all the repo into a new Git repository. It ran for around 24
> hours, only to report an out-of-memory error at the end. Although I
> have realised since then that this is a somewhat bad idea, I still
> don't think it should happen this way. Or should it?
It looks like some data structure is kept in memory, and that data
structure grows very large when cloning/converting your repository. In a
perfect world, the Git code would be improved to allow the data
structure to be put on disk. But in order to finish your task, the
fastest solution is probably to add a large swap file/partition to your
system temporarily, and then run the process again.

Dale
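For example, on a typical Linux system a temporary swap file can be
created and enabled roughly like this (a minimal sketch, assuming root
access; the 8 GB size and the /swapfile path are placeholders to adjust
for your repository):

    # Create an 8 GB file of zeroes to use as swap space.
    dd if=/dev/zero of=/swapfile bs=1M count=8192
    # Restrict permissions; swapon warns if the file is world-readable.
    chmod 600 /swapfile
    # Format the file as swap and enable it.
    mkswap /swapfile
    swapon /swapfile

Once the conversion has finished, "swapoff /swapfile && rm /swapfile"
removes it again.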
