I've got a Git repository that I use to log updates to system files.
Not things in /var that change every day, but configuration files in
/etc, binaries in /usr, and so on. Naturally, the repository has
grown large: it's 9 or 10 GB now.
I've just discovered that setting core.bigFileThreshold = 10k speeds
up garbage collection and repacking tremendously: in my case, from
about 9 hours to half an hour. Files above the threshold are stored
whole, without attempting delta compression, which works well in this
situation because files larger than 10k are almost all binaries and
wouldn't benefit from delta compression anyway. And it turns out that
delta compression takes a lot of time when you have a lot of data.
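If you want to try the same thing, something like the following
should do it (10k is just the value that suits my mix of files; tune
it to your own):

    # set once per repository; the value accepts k/m/g suffixes
    git config core.bigFileThreshold 10k
    # subsequent repacks skip delta compression for files over the threshold
    git gc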