On Thu, May 15, 2014 at 10:22:14AM -0700, John Fisher wrote:
> I assert, based on one piece of evidence (a post from a Facebook
> dev), that I now have the world's biggest and slowest git repository,
> and I am not a happy guy. I used to have the world's biggest CVS
> repository, but CVS can't handle multi-GB files. So I moved the
> repo to git, because we are using that for our new projects.
> goal:
> keep 150 GB of files (mostly binary), from tiny to over 8 GB, in a
> version-control system.
> problem:
> git is absurdly slow, think hours, on fast hardware.
> question:
> any suggestions beyond these-
> http://git-annex.branchable.com/
> https://github.com/jedbrown/git-fat
> https://github.com/schacon/git-media
> http://code.google.com/p/boar/
> subversion 
> ?

I think the general consensus is that git is for version control of
source, i.e. text.  Putting large binary files into a DVCS is
generally a bad idea, since every clone contains ALL versions of ALL
files, and large binaries delta-compress poorly.  That makes for a
lot of used space in every clone!

Maybe a backup tool is what you actually need: https://github.com/bup/bup
Then take another look at git-annex and how it can be used as a client
to bup.
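For the git-annex-as-client-to-bup route, git-annex supports bup as a
"special remote", so git tracks only small symlinks while the bulk data
lives in one bup repository.  A rough sketch (the remote name, host, and
paths below are placeholders of my own, not from your setup):

```shell
# One-time setup: let git-annex manage large file contents
# instead of storing them directly as git objects.
git annex init "build-server"

# Register a bup special remote; buprepo points at a host/path
# of your choosing (placeholder shown here).
git annex initremote mybup type=bup encryption=none \
    buprepo=backup.example.com:/srv/bup

# Files added via git-annex are committed as symlinks;
# the multi-GB content stays out of normal git history.
git annex add big-image.iso
git commit -m "Add ISO via annex"

# Ship the actual content to the bup remote, then drop the
# local copy if disk is tight (annex tracks where copies live).
git annex copy big-image.iso --to mybup
git annex drop big-image.iso

# Any clone can fetch the content back on demand:
git annex get big-image.iso
```

That way a fresh clone is fast and small, and you only pay the transfer
cost for the specific files a given machine actually needs.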


Magnus Therning                      OpenPGP: 0xAB4DFBA4 
email: mag...@therning.org   jabber: mag...@therning.org
twitter: magthe               http://therning.org/magnus

Code as if whoever maintains your program is a violent psychopath who knows
where you live.
     -- Anonymous
