On Thu, Sep 22, 2022 at 3:59 PM Sean McBride <s...@rogue-research.com> wrote:
> Hi all,
> Our svn repo is about 110 GB for a full checkout. It's larger on the server of 
> course, with all history, weighing in at about 142 GB.
> There haven't been any performance issues, it's working great.
> But now some users are interested in committing an additional 200 GB of 
> mostly large binary files.
> I worry about it becoming "too big".  At what point does that happen?  
> Terabytes?  Petabytes?  100s of GB?
> Thanks,
> Sean

It occurs to me that we don't have a FAQ or other easy-to-find
documentation on size limits, such as the maximum file size.

The largest publicly accessible SVN repository of which I am aware is
the Apache.org one in which Subversion's own sources (as well as those
of numerous other projects) are housed. That repository contains
approximately 1.9 million revisions, and according to [1] its dump
expands to over 65 gigabytes.

But that seems to be a drop in the ocean when Aleksa writes:

On Fri, Sep 23, 2022 at 3:45 AM Aleksa Todorović <alexi...@gmail.com> wrote:
> I can confirm that Subversion can handle repositories with 100,000+ 
> revisions, size of committed files ranging from few bytes to several GBs, and 
> total repo size of up to 20TB.

It is possible that others here are aware of even larger repositories.

My biggest concern mirrors what Mark said about administrative burden:
the size of backups and the time it takes to make them. Mark addressed
that point quite well. Whatever you do, you must have good backups!
(My $dayjob does backups 3 different ways: the filesystem on which the
repository is stored is backed up regularly; in addition, we take
periodic 'hotcopy' backups and periodic full 'dump' backups. Obviously,
as a repository grows, all of this takes longer and requires more
storage.)

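For concreteness, the hotcopy and dump halves of that scheme can be
sketched with svnadmin's standard subcommands. The repository path,
backup destination, and naming convention below are hypothetical
placeholders, and the script only prints the commands (a dry run) so it
reads as a template rather than something to execute blindly:

```shell
#!/bin/sh
# Sketch of the backup scheme described above. REPO and BACKUP_DIR are
# hypothetical paths; adjust to your layout. Commands are echoed, not run.
REPO=/var/svn/myrepo          # hypothetical repository path
BACKUP_DIR=/backups/svn       # hypothetical backup destination
STAMP=$(date +%Y%m%d)         # e.g. 20220923

# 1. Filesystem-level backup of $REPO is assumed to happen outside this
#    script (snapshots, rsync of the volume, etc.).

# 2. Periodic hotcopy: an exact, ready-to-serve copy that svnadmin can
#    take safely while the repository is live.
echo "svnadmin hotcopy $REPO $BACKUP_DIR/hotcopy-$STAMP"

# 3. Periodic full dump: a portable, replayable record of every revision.
#    Compress it, since dump output is typically larger than the repo.
echo "svnadmin dump --quiet $REPO | gzip -c > $BACKUP_DIR/full-$STAMP.dump.gz"
```

A hotcopy can be served directly if the original is lost, while a dump
file survives repository-format changes and can be restored into a
fresh repository with 'svnadmin load'.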
[1] http://svn-dump.apache.org
