Re: How much is too much data in an svn repository?

2022-09-26 Thread Doug Robinson
Sean: On Thu, Sep 22, 2022 at 3:59 PM Sean McBride wrote: > Our svn repo is about 110 GB for a full checkout. Larger on the server of > course, with all history, weighing about 142 GB. > > There haven't been any performance issues, it's working great. > > But now some users are interested in ...

Re: How much is too much data in an svn repository?

2022-09-23 Thread Jeffrey Walton
On Thu, Sep 22, 2022 at 3:59 PM Sean McBride wrote: > > Our svn repo is about 110 GB for a full checkout. Larger on the server of > course, with all history, weighing about 142 GB. > > There haven't been any performance issues, it's working great. > > But now some users are interested in ...

Re: How much is too much data in an svn repository?

2022-09-23 Thread Nico Kadel-Garcia
On Fri, Sep 23, 2022 at 7:43 AM Mark Phippard wrote: > > On Thu, Sep 22, 2022 at 3:59 PM Sean McBride wrote: > > > > Hi all, > > > > Our svn repo is about 110 GB for a full checkout. Larger on the server of > > course, with all history, weighing about 142 GB. > > > > There haven't been any ...

Re: How much is too much data in an svn repository?

2022-09-23 Thread Nathan Hartman
On Thu, Sep 22, 2022 at 3:59 PM Sean McBride wrote: > > Hi all, > > Our svn repo is about 110 GB for a full checkout. Larger on the server of > course, with all history, weighing about 142 GB. > > There haven't been any performance issues, it's working great. > > But now some users are ...

Re: How much is too much data in an svn repository?

2022-09-23 Thread Daniel Sahlberg
Hi, In addition to all other responses, I'd like to advertise the "pristines on demand" feature that got some traction in the spring. Subversion normally stores all files twice on the client side (in the "working copy"): once as the actual file and once as a "pristine", i.e. as the file was ...
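As a sketch of why a working copy roughly doubles the disk footprint: every versioned file has a base ("pristine") copy kept under `.svn/pristine`. The following minimal local demonstration assumes `svn` and `svnadmin` are on the PATH; the commented-out `--store-pristine=no` flag is the pristines-on-demand opt-out and assumes a Subversion build (1.15+) that includes the feature.

```shell
set -e
REPO=/tmp/svn-demo-repo
WC=/tmp/svn-demo-wc
rm -rf "$REPO" "$WC"

# create a throwaway repository and a working copy of it
svnadmin create "$REPO"
svn -q checkout "file://$REPO" "$WC"

# commit one file; the client also keeps a pristine (base) copy of it
echo "hello" > "$WC/file.txt"
svn -q add "$WC/file.txt"
svn -q commit -m "add file" "$WC"

# the pristine store is what doubles client-side disk use for large files
du -sh "$WC/.svn/pristine"

# with a pristines-on-demand build, the base copies can be skipped:
# svn checkout --store-pristine=no "file://$REPO" "$WC"
```

The pristine copies are what make `svn diff` and `svn revert` work offline, which is exactly the trade-off the "pristines on demand" work addresses for large binaries.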

Re: How much is too much data in an svn repository?

2022-09-23 Thread Graham Leggett via users
On 23 Sep 2022, at 13:42, Mark Phippard wrote: > A big negative of Subversion repositories is you cannot ever delete > anything. Do you really need to keep all these binaries forever? In our regulated world that is an important feature. Once the repos get too big we start new ones. In the ...
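Starting a new repository does not have to mean abandoning history: a path can be carved out of an oversized repository with `svnadmin dump` and `svndumpfilter`. A minimal self-contained sketch, using hypothetical `/tmp` paths and a made-up `/project-a` layout:

```shell
set -e
SRC=/tmp/svn-big-repo
DST=/tmp/svn-split-repo
rm -rf "$SRC" "$DST"

# stand-in for the oversized repository, with two top-level projects
svnadmin create "$SRC"
svn -q mkdir -m "initial layout" "file://$SRC/project-a" "file://$SRC/project-b"

# dump the full history, keep only /project-a, load it into a new repository
svnadmin create "$DST"
svnadmin dump -q "$SRC" | svndumpfilter --quiet include /project-a | svnadmin load -q "$DST"

# the new repository contains only the filtered path (and its history)
svn ls "file://$DST"
```

Note that `svndumpfilter` works on whole paths, so this suits repositories laid out with independent top-level projects; revisions touching only excluded paths become empty unless dropped.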

Re: How much is too much data in an svn repository?

2022-09-23 Thread Mark Phippard
On Thu, Sep 22, 2022 at 3:59 PM Sean McBride wrote: > > Hi all, > > Our svn repo is about 110 GB for a full checkout. Larger on the server of > course, with all history, weighing about 142 GB. > > There haven't been any performance issues, it's working great. > > But now some users are ...

Re: How much is too much data in an svn repository?

2022-09-23 Thread Graham Leggett via users
On 22 Sep 2022, at 21:59, Sean McBride wrote: > Our svn repo is about 110 GB for a full checkout. Larger on the server of > course, with all history, weighing about 142 GB. > > There haven't been any performance issues, it's working great. > > But now some users are interested in committing ...

Re: How much is too much data in an svn repository?

2022-09-23 Thread Aleksa Todorović
Hi all, I can confirm that Subversion can handle repositories with 100,000+ revisions, committed file sizes ranging from a few bytes to several GB, and a total repo size of up to 20 TB. The speed issues I'm seeing are mostly related to hard-drive operations, but they do not prevent efficient work. The ...

Re: How much is too much data in an svn repository?

2022-09-23 Thread Justin MASSIOT | Zentek
Hello Sean, I don't have enough experience to answer your question, but I'm very concerned about large binary files, even though I have a more split-up structure of repositories myself. I'm following this discussion ;-) Can anyone share some input on this topic? Justin MASSIOT | Zentek On Thu, 22 Sept ...

How much is too much data in an svn repository?

2022-09-22 Thread Sean McBride
Hi all, Our svn repo is about 110 GB for a full checkout. Larger on the server of course, with all history, weighing about 142 GB. There haven't been any performance issues, it's working great. But now some users are interested in committing an additional 200 GB of mostly large binary files. ...