Re: upgrading server

2017-07-25 Thread Ryan Schmidt
Don't forget that svnadmin dump/load and svnsync don't preserve hook scripts or 
other ancillary data that might be in your repository directory on the server, 
so if you have hook scripts, authorization rules or other customizations, carry 
those over to the new server manually. And of course any configuration files 
for httpd or svnserve.
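
For example, something along these lines would carry over the repository-local
pieces (hostnames and paths are just placeholders; adjust to your actual layout):

    # copy hooks and repository config (authz/passwd files etc.) that
    # dump/load and svnsync will not carry over for you
    scp -r olduser@oldserver:/var/svn/repo/hooks/. /var/svn/repo/hooks/
    scp -r olduser@oldserver:/var/svn/repo/conf/.  /var/svn/repo/conf/
    # plus whatever httpd.conf / svnserve.conf customizations live
    # outside the repository directory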



RE: upgrading server

2017-07-25 Thread Andrew Reedick
>> Does anyone know how long it would take to export a repository of this 
>> size?  This will give us an estimate of how much downtime to schedule and 
>> when to set the cut-off time.

Svnsync is the easy option.

If you insist on doing a dump/load, then a) you can time a test run of a 
dump/load, and b) "svnadmin dump" allows you to specify revision ranges, which 
means you can do incremental dumps.  You can dump/load the bulk of the repo now, 
and then during the migration run another dump/load to catch any new commits.
http://svnbook.red-bean.com/nightly/en/svn.reposadmin.maint.html#svn.reposadmin.maint.migrate
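
As a rough sketch (revision numbers and paths are made up):

    # dump the bulk of the history ahead of time
    svnadmin dump /var/svn/repo -r 0:50000 > repo-0-50000.dump
    svnadmin load /var/svn/newrepo < repo-0-50000.dump
    # then, during the cutover window, dump only the new revisions
    svnadmin dump /var/svn/repo -r 50001:HEAD --incremental > repo-tail.dump
    svnadmin load /var/svn/newrepo < repo-tail.dump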

The only caveat I can think of is if someone changes revision properties (e.g. 
the commit message) between the time of the initial dump and the migration.  
But you can track/prevent those changes with a hook script.  (And that's 
another reason why svnsync is preferred.)
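
For instance, a bare-bones pre-revprop-change hook that simply refuses revprop
edits during the migration window could look like this (purely illustrative;
note that if no such hook exists, Subversion already rejects revprop changes
by default):

    #!/bin/sh
    # pre-revprop-change: block all revision property edits while the
    # dump/load migration is in progress
    echo "Revision property changes are disabled during the server migration." >&2
    exit 1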




Re: upgrading server

2017-07-25 Thread David Chapman

On 7/25/2017 11:00 AM, Andy So wrote:


We have an old Subversion, version 1.4.3 (r23084), running on Solaris.

We would like to upgrade to new hardware running a Linux-based OS (CentOS 
6.9), possibly with Subversion 1.8.x or 1.9.x


Our plan is to install and configure the latest SVN on CentOS 6.9, 
then go through a dump and load of the repository as described in 
various online posts and documentation.  The repository is quite 
large… we're guessing the size to be on the order of 20-40 GB


Before we start undertaking such tasks:

1. Does anyone know if there are any problems/gotchas in migrating 
the repository?


2. Does anyone know how long it would take to export a repository of 
this size?  This will give us an estimate of how much downtime to 
schedule and when to set the cut-off time.


Thanks for any insight.



I recommend CentOS 7.x; CentOS 6.x is nearing the end of its support 
lifetime.  I also installed an SSD on my new machine - much faster than 
rotating media, and Subversion's "write once" philosophy (old revisions 
are essentially immutable) works well on an SSD.


--
David Chapman  dcchap...@acm.org
Chapman Consulting -- San Jose, CA
Software Development Done Right.
www.chapman-consulting-sj.com



Re: upgrading server

2017-07-25 Thread Mark Phippard
On Tue, Jul 25, 2017 at 2:00 PM, Andy So  wrote:

> We have an old Subversion, version 1.4.3 (r23084), running on Solaris.
>
> We would like to upgrade to new hardware running a Linux-based OS (CentOS
> 6.9), possibly with Subversion 1.8.x or 1.9.x
>
>
>
> Our plan is to install and configure the latest SVN on CentOS 6.9, then
> go through a dump and load of the repository as described in various online
> posts and documentation.  The repository is quite large… we're guessing the
> size to be on the order of 20-40 GB
>
>
>
> Before we start undertaking such tasks:
>
> 1.  Does anyone know if there are any problems/gotchas in
> migrating the repository?
>
> 2.  Does anyone know how long it would take to export a repository
> of this size?  This will give us an estimate of how much downtime to schedule
> and when to set the cut-off time.
>
>
>
> Thanks for any insight.
>


Dump and load is a good idea because it lets the repository be rewritten
using the newer code and repository format, which will give you a smaller
repository that runs a little faster.  That said, you do not HAVE to
dump/load.  You have other options:

1. Just move the repository folder to the new server, perhaps using tar
and then moving the archive (see the sketch after this list).

2. Instead of using dump/load, use svnsync.  This gives all the benefits of
the dump/load but allows you to shrink your downtime to almost nothing.
Just svnsync the repository to the new server.  This will probably take a
long time, but that does not matter since the original server can keep
running while it happens.  At a time of your choosing, do a final svnsync,
then shut down the old server and use the new one (again, see the sketch
after this list).

3. Do option 1 now and then do a dump/load or svnsync at some future time
that is more convenient for downtime.  It will probably run faster too,
since it will be on new and better hardware.
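
Very roughly, options 1 and 2 look something like this (hostnames and paths
are placeholders, not a recipe):

    # Option 1: copy the repository as-is
    # (take it offline first, or copy from an "svnadmin hotcopy")
    tar czf repo.tar.gz -C /var/svn repo
    scp repo.tar.gz newserver:/var/svn/
    ssh newserver 'tar xzf /var/svn/repo.tar.gz -C /var/svn'

    # Option 2: mirror with svnsync while the old server stays live
    svnadmin create /var/svn/repo               # on the new server
    # svnsync needs to set revprops on the mirror, so allow that:
    printf '#!/bin/sh\nexit 0\n' > /var/svn/repo/hooks/pre-revprop-change
    chmod +x /var/svn/repo/hooks/pre-revprop-change
    svnsync initialize file:///var/svn/repo http://oldserver/svn/repo
    svnsync synchronize file:///var/svn/repo    # long initial run
    # ...later, during the cutover, a quick final catch-up:
    svnsync synchronize file:///var/svn/repo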

There are some gotchas no matter what:

1.  Does the new server have a new hostname, or do you intend to update DNS
to point at the new server?  If you are not doing the latter, then all of your
existing working copies and scripts have to be modified for the new
server.  Any use of the svn:externals property also has to be updated (see
the examples after gotcha 2).

2. With an old repository there is a good chance you will run into bugs in
your data that cause svnsync or load to fail.  There are workarounds for
different failures, but be prepared to run into them and budget time for
finding the solutions.  SVN 1.8 and 1.9 have added various options to let
you work around some of the known bugs.  A common problem is having
svn: properties that are not UTF-8 encoded or do not have LF line endings.
svnsync has this option to work around the encoding problem:

  --source-prop-encoding ARG : convert translatable properties from encoding ARG
                               to UTF-8. If not specified, then properties are
                               presumed to be encoded in UTF-8.

And it automatically fixes the LF problems.

svnadmin load does not have any options to fix the problems, but you can
add the --bypass-prop-validation option to ignore them and just carry
the problems into your new repository (rough examples of both workarounds
follow below).
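
For what it's worth, here is a sketch of the workarounds mentioned above
(URLs, paths, and the source encoding are placeholders):

    # Gotcha 1: point existing working copies at the new host (1.7+ client)
    svn relocate http://oldserver/svn/repo http://newserver/svn/repo
    # (older clients: svn switch --relocate OLD-URL NEW-URL)

    # Gotcha 2: re-encode non-UTF-8 revision properties while syncing,
    # assuming here that the old properties are Latin-1
    svnsync synchronize --source-prop-encoding ISO-8859-1 \
        file:///var/svn/repo http://oldserver/svn/repo

    # or, with dump/load, carry the invalid properties over unchanged
    svnadmin load --bypass-prop-validation /var/svn/repo < repo.dump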



-- 
Thanks

Mark Phippard
http://markphip.blogspot.com/


upgrading server

2017-07-25 Thread Andy So
We have an old Subversion, version 1.4.3 (r23084), running on Solaris.
We would like to upgrade to new hardware running a Linux-based OS (CentOS 6.9), 
possibly with Subversion 1.8.x or 1.9.x

Our plan is to install and configure the latest SVN on CentOS 6.9, then go 
through a dump and load of the repository as described in various online posts 
and documentation.  The repository is quite large… we're guessing the size to 
be on the order of 20-40 GB

Before we start undertaking such tasks:

1.  Does anyone know if there are any problems/gotchas in migrating the 
repository?

2.  Does anyone know how long it would take to export a repository of 
this size?  This will give us an estimate of how much downtime to schedule and 
when to set the cut-off time.

Thanks for any insight.


Re: svn vs. git

2017-07-25 Thread Nathan Hartman
On Tue, Jul 25, 2017 at 2:48 AM, Andreas Krey  wrote:

> On Thu, 20 Jul 2017 17:38:38 +, Nathan Hartman wrote:
> ...
> > Is that such a big deal?
>
> The big deal is a slightly different point. Making commits 'offline'
> not only allows me to make commits while in the middle of nowhere
> (and here in Germany this easily includes trains).
>

Quite a few people seem to be working on trains during their commute. I
have to say that I've never been able to concentrate with so much noise,
motion, and distraction going on, and I don't see myself able to produce
one coherent commit, let alone more than one. I once worked in a car for an
hour to solve a customer emergency. For some reason, looking at my computer
screen while in a moving vehicle made me dizzy for hours afterwards. The
code I wrote worked (miraculously), but on further examination after the
fact, it was some of the most atrocious code I've ever seen. Kudos to all
who can get real work done while on board a plane, train, or automobile. :-)


Re: svn vs. git

2017-07-25 Thread Stefan Sperling
On Tue, Jul 25, 2017 at 01:59:29PM +0200, Johan Corveleyn wrote:
> It even has internal libraries with stable APIs that
> allow writing plugins and GUIs on top rather than having them drive a
> command-line utility.

This is something the git developers I know personally *really want*.
Unfortunately, it's hard to get there when starting from a pile of
Perl scripts and tiny C utilities with main() functions :-/

Nothing is perfect.


Re: svn vs. git

2017-07-25 Thread Johan Corveleyn
On 25 Jul 2017 9:48 a.m., "Andreas Krey"  wrote:

On Thu, 20 Jul 2017 17:38:38 +, Nathan Hartman wrote:
...

> Subversion is a very good system. It doesn't get the credit it deserves,

Please. git managed to be faster in providing actually working
(i.e. tracked) merges than subversion, and then there was
the --reintegrate debacle that took another five years to
sort out.


Please. svn managed to be faster in providing granular access control,
sparse checkouts, and handling of large repositories, and it is *vastly*
simpler to use than git. It even has internal libraries with stable APIs that
allow writing plugins and GUIs on top rather than having them drive a
command-line utility.

So ... it depends on what you're after. git being faster at having tracked
merges doesn't make Subversion a bad system.

-- 
Johan


Re: svn vs. git

2017-07-25 Thread Stefan Sperling
On Tue, Jul 25, 2017 at 08:48:07AM +0200, Andreas Krey wrote:
> This also means that I can't maintain a patched version of
> svn (or anything in an svn repo) without having commit
> privilege to the source repo

This is obviously true, and a reason why the Subversion project
itself has a very relaxed commit access policy. Need a branch?
Just show a nice diff to any full committer and they can make it happen:
https://subversion.apache.org/docs/community-guide/roles.html#partial-commit-access

Granted, this is still an extra hoop you might not want to jump through,
in which case using a git mirror simply makes more sense.

And other projects using SVN may be less open about this, of course.
The point that people don't need to be walled off still stands.
It is a process problem, not a technical one, unless you're working
at Linux kernel scale with a community of thousands across various
organizations, which Subversion was never intended for.
You might remember the page on subversion.tigris.org around the time
git was first created, which asked people to stop suggesting that Linus
should use SVN: 
https://svn.apache.org/repos/asf/subversion/branches/1.2.x/www/subversion-linus.html


Re: svn vs. git

2017-07-25 Thread Andreas Krey
On Thu, 20 Jul 2017 17:38:38 +, Nathan Hartman wrote:
...
> One myth that is not mentioned on that page is the famous, "But you can't
> work offline!" Being able to work "offline" is supposed to be the biggest
> selling point of a DVCS over a CVCS. Okay... I'm calling that a myth. First
> of all, there is nothing inherent to Subversion that "prevents" you from
> working offline. You can work, you just can't do server side operations.

In other words, you have a versioning system that you can't use offline,
except for local diffs.

> Is that such a big deal?

The big deal is a slightly different point. Making commits 'offline'
not only allows me to make commits while in the middle of nowhere
(and here in Germany this easily includes trains).

It also allows me to make commits for a repo that I don't have
commit privileges for - I can commit my work in a meaningful way,
and later convince one of the official committers to pull them.

This also means that I can't maintain a patched version of
svn (or anything in an svn repo) without having commit
privilege to the source repo, or without doing the
vendor branch dance (which in itself is unnecessarily
annoying in svn).

> And if it is, do you mean to tell me that in this day
> and age of cloud services and IoT, where every single thing you do requires
> Internet access, that you're ever really "offline" for long enough that it
> matters?

It also makes a difference in speed - git log usually outputs data
faster than anybody can open an SSL connection in EDGE land.

...

> And even if you are planning to spend a year alone on a deserted
> island, nothing stops you from doing "svnadmin create" on your local
> computer and making as many commits as you want.

You know that that is a straw man. There is no in-svn way to
reconcile that with another repo, and also I don't want to
start from an empty repo, but, say, from the current svn trunk.

> But that doesn't make
> sense, because the longer you work in isolation, the less likely it is that
> your work will merge cleanly when you get back.

That is also orthogonal. Being offline does not mean that other people
are working in the same repo, let alone in the same area of the product.
Also, the alternative, namely working on trunk and doing frequent updates,
is worse in comparison, because then you end up with conflicts in your
sandbox and have no way of backing out and trying again later.

> Even the smartest and
> greatest DVCS in the world can't solve that problem.

Neither can non-distributed systems. The question is how long
you let work diverge, not where you do it.

> Subversion is a very good system. It doesn't get the credit it deserves,

Please. git managed to be faster in providing actually working
(i.e. tracked) merges than subversion, and then there was
the --reintegrate debacle that took another five years to
sort out.

Andreas

-- 
"Totally trivial. Famous last words."
From: Linus Torvalds 
Date: Fri, 22 Jan 2010 07:29:21 -0800