On Tue, Jan 02, 2018 at 12:15:55AM -0800, 
antonio.poma...@external.thalesgroup.com wrote:

> > I am trying to find some official information about the limit size of a 
> > repository. As I have read git has a limitation of 2GB for each repo 
> > because if you increment the size it starts to decrease the overall 
> > performance. But this limitation is mainly explained in the web based 
> > servers like GitHub or Bitbucket Cloud. So my main question is if this also 
> > happens *when you use a self hosted server* like Bitbucket Server or 
> > GitHub enterprise. However, what I am trying to find is official 
> > documentation that explains this.
[...]
> On the other hand, to clarify my conditions as you said Konstantin
> Khomoutov, I am using Git for Windows. I cannot understand that Git
> for Windows has a practical limit of 2GB on the size of the objects
> Git is able to manipulate in memory, because I was able to do commits
> and checkouts with 3.3 GB.

That's because there is a difference between the total size of a
repository and the sizes of the individual objects Git manipulates when
it prepares a new commit or reads an existing one.

To put it as simply as I can: you may have a repository whose history
consists of commits each containing several million small files. The
repository's total size may well exceed 2 GiB, and that is no problem
for the GfW size limit. On the other hand, should you try to commit a
single file slightly larger than 2 GiB, even into an otherwise empty
repository, Git for Windows would refuse to do that, failing with a
rather cryptic error message.
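For what it's worth, here is one way to see the distinction from the
command line (just a sketch of a common recipe, nothing GfW-specific):
the first command reports the repository's total on-disk size, while the
pipeline lists the sizes of the largest individual blobs, which is the
figure the 2 GiB limit actually applies to.

```shell
# Total on-disk size of the repository (sizes reported in KiB);
# this figure can grow well past 2 GiB without hitting the limit:
git count-objects -v

# Per-object sizes are what matter: list the five largest blobs
# reachable from any ref, with their sizes in bytes and their paths.
git rev-list --objects --all |
  git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
  awk '$1 == "blob" { print $3, $4 }' |
  sort -rn |
  head -5
```

If any line in the second listing approaches 2147483648 bytes, that
object, not the repository as a whole, is what would trip the limit.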

-- 
You received this message because you are subscribed to the Google Groups "Git 
for human beings" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to git-users+unsubscr...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.