On Fri, Sep 23, 2022 at 7:43 AM Mark Phippard <markp...@gmail.com> wrote:
> On Thu, Sep 22, 2022 at 3:59 PM Sean McBride <s...@rogue-research.com> wrote:
> >
> > Hi all,
> >
> > Our svn repo is about 110 GB for a full checkout. Larger on the server of 
> > course, with all history, weighing about 142 GB.
> >
> > There haven't been any performance issues, it's working great.
> >
> > But now some users are interested in committing an additional 200 GB of 
> > mostly large binary files.
> >
> > I worry about it becoming "too big".  At what point does that happen?  
> > Terabytes?  Petabytes?  100s of GB?
> Assuming you have the disk space then there is no real upper limit.

There are practical limits, though. The files holding years or decades
of irrelevant history accumulate. Bulky accidental commits, such as
large binary objects, pile up and become a burden for backup and
high-availability tooling. And keeping old tags around that haven't
been used in years invites re-introducing obsolete APIs, old bugs, or
security flaws.
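For what it's worth, the usual (offline) escape hatch when bulky
commits have accumulated is a dump/filter/load cycle with
svndumpfilter. A sketch only -- the repository paths and the
/big-binaries path below are made up, and the whole new repository
must be re-checked-out afterwards:

```shell
# Hypothetical paths -- adjust to your layout.
REPO=/var/svn/myrepo
NEWREPO=/var/svn/myrepo-trimmed

# Dump the full history, drop the bulky path, load into a fresh repo.
svnadmin dump "$REPO" > full.dump
svndumpfilter exclude /big-binaries --drop-empty-revs --renumber-revs \
    < full.dump > trimmed.dump
svnadmin create "$NEWREPO"
svnadmin load "$NEWREPO" < trimmed.dump
```

Note this rewrites revision numbers, so it is a flag-day operation,
not routine maintenance.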

> That said ... do not discount the administrative burden. Are you
> backing up your repository? Whether using dump/load, svnsync or
> hotcopy the bigger the repository the more of a burden it will be on
> these tools.
>
> If this is just about storing binary files why not consider solutions
> that were meant for that like an object storage platform like S3 or
> minio or a package manager like Maven, Nuget etc.
>
> A big negative of Subversion repositories is you cannot ever delete
> anything. Do you really need to keep all these binaries forever?
>
> Mark
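Seconding the backup point: at this size an incremental mirror with
svnsync is usually lighter-weight than repeated full dumps. A sketch
only -- the mirror path and repository URL below are invented:

```shell
# Hypothetical mirror path and source URL.
MIRROR=/var/svn/mirror

svnadmin create "$MIRROR"

# svnsync must be allowed to set revision properties on the mirror,
# so install a pre-revprop-change hook that always succeeds.
printf '#!/bin/sh\nexit 0\n' > "$MIRROR/hooks/pre-revprop-change"
chmod +x "$MIRROR/hooks/pre-revprop-change"

svnsync initialize "file://$MIRROR" https://svn.example.com/repos/myrepo
svnsync synchronize "file://$MIRROR"   # incremental on later runs
```

After the initial sync, a cron job rerunning `svnsync synchronize`
only transfers new revisions.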
