On 03/05/2011 06:53 AM, Susan Joslyn wrote:
Regarding your snippets below:



Would this really be something you want?  Your ongoing work merged into
everyone else's ongoing work every day - before your work is finished and
tested?

I think the idea is that you move discrete, completed patches to the main location.

If 2 people apply completed patches to separate areas of the same code, some caution is warranted, but I wouldn't call everything to a halt "because I have the src right now". At this point, communication should occur to make sure all parties are aware of each other's work.

If 2 people apply patches to the same lines, that is a more serious condition. Testing needs to occur repeatedly as the parties reconcile their projects.
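To make the "same lines" case concrete, here is a hedged sketch using git for illustration (hg behaves the same way); the file name, branch name, and program contents are all made up. The merge stops on the conflicting lines and forces the parties to reconcile by hand, then retest and commit the agreed result.

```shell
#!/bin/sh
# Illustrative only: two devs change the same line, and the merge halts.
set -e
work=$(mktemp -d)
cd "$work"
git init -q .
# helper: commit as a given (made-up) identity
ci() { git -c user.name="$1" -c user.email="$1@example.com" commit -qm "$2"; }

echo "CRT 'HELLO'" > prog.b
git add prog.b
ci lead "initial program"

git checkout -q -b devB                 # devB changes the line...
echo "CRT 'HELLO FROM B'" > prog.b
git add prog.b
ci devB "devB's change"

git checkout -q -                       # ...and devA changes the same line
echo "CRT 'HELLO FROM A'" > prog.b
git add prog.b
ci devA "devA's change"

# The merge stops with a conflict; nothing is committed automatically.
git merge devB || echo "conflict: the parties must reconcile and retest"
git status --short                      # shows 'UU prog.b' (unmerged)

# After the devs agree on a merged version, they retest and commit it.
echo "CRT 'HELLO FROM A AND B'" > prog.b
git add prog.b
ci devA "reconcile devA and devB changes"
```

The key point is that the tool refuses to guess: overlapping edits always stop for a human decision, which is exactly where the repeated testing comes in.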

The "code reservation" model is, I think, an older paradigm. Nowadays projects work in parallel, and a responsible party watches the stream of patches. Every dev is aware that they are not working in a vacuum.
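As a hedged sketch of that parallel workflow (git shown for illustration; hg and bzr have the same shape, and every repo, file, and identity name here is made up): each dev works in a private clone, discrete completed patches go to the main location, and the other dev pulls them in as they appear.

```shell
#!/bin/sh
# Illustrative only: a shared "main location" plus two developer clones.
set -e
work=$(mktemp -d)
cd "$work"

# Seed the shared repo that both devs clone from.
git init -q seed
cd seed
echo "base" > base.txt
git add base.txt
git -c user.name=lead -c user.email=lead@example.com commit -qm "initial"
cd ..
git clone -q --bare seed main.git

git clone -q main.git devA
git clone -q main.git devB

# devA finishes a discrete, tested patch and publishes it upstream.
cd devA
echo "feature A" > a.txt
git add a.txt
git -c user.name=devA -c user.email=devA@example.com commit -qm "Add feature A"
git push -q
cd ..

# devB, working in parallel, pulls the completed patch before continuing.
cd devB
git pull -q
ls                       # a.txt is now in devB's working copy
```

Nobody reserved anything; the stream of pushed patches is what the responsible party watches.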

As for end-users, I think they are happier and better served because projects can occur in parallel and delivery _can_ happen faster. I know I have said many times in my shop, "We are waiting on project X to finish because they have resources we need."

Some of the tools have specific features to handle particular workflows. They are not all equal, and the chore is picking the correct one for your organization.

This is just my opinion here.  Please set me right if I am off base.

Git - I personally think git is best for very large workgroups working on a single large codebase (i.e. an OS kernel). It is primarily geared to people comfortable in the shell. It does _not_ have good win32 support, so it is a non-starter for me.

Subversion - Very popular, but declining. The leader in the "centralized repo" group; it replaced cvs as the de facto standard during the 2000s. There is only one copy of the repository, and to record _any_ change you must make a network connection. A shortcoming I see is that if this repo is damaged, all of your work is in one place. Google "subversion merge hell" and see how many hits you get. Subversion also suffers from speed problems with large repos. All three major DVCSs (git, bazaar (bzr), and mercurial (hg)) were designed to combat these problems.
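To illustrate the contrast, here is a minimal hedged sketch (git shown; `hg init` and `hg commit` would be the equivalent, and the file contents are made up): with a DVCS the entire repository lives on your own disk, so recording a change needs no server connection at all.

```shell
#!/bin/sh
# Illustrative only: a DVCS commit is a purely local operation.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .                 # a complete repository, created locally
echo "SUBROUTINE HELLO" > hello.b
git add hello.b
git -c user.name=dev -c user.email=dev@example.com commit -qm "first cut"
git log --oneline             # history is recorded with no network at all
```

With svn, that same `commit` would have required a round trip to the one central server.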

DVCS's are much faster.  Here are some benchmarks.

https://git.wiki.kernel.org/index.php/GitBenchmarks#bzr.2C_git.2C_and_hg_performance_on_the_Linux_tree

DVCS's are much safer.  Here are some links.
http://stackoverflow.com/questions/4592740/mercurial-compared-to-private-branches-in-svn/4594085#4594085

http://stackoverflow.com/q/2518779

This one is interesting because he includes massive numbers of binary files, i.e. jpegs and word docs. Note that svn quickly gets into the 20-minute range where all 3 DVCSs stay in the 1-2 minute range for the same activity. I have at least one workflow that requires good performance on binary files (optio dcl files).
http://joshcarter.com/productivity/svn_hg_git_for_home_directory

For safety, if anything happens to any of the local dev repos, all that is lost are the changes that have not been pushed upstream. Operating in a my-repo, your-repo, master-repo environment, devB and I are aware of each other's work, and I am pulling her new changes into my repo, so we basically have a 3-way backup going on. If you count the repo on my laptop, there is a 4th backup. If there is a massive fire at HQ and the server and both devs' machines are smoldering heaps, my laptop repo may still contain 99% of all the src; since it is not a main machine, I am likely not pulling changes as frequently.
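That "every copy is a backup" property can be sketched like this (git for illustration; all names made up): a clone carries the complete history, not just the latest snapshot, so even if the original server is destroyed, any surviving clone can restore the whole project.

```shell
#!/bin/sh
# Illustrative only: a clone survives the loss of the original repo.
set -e
work=$(mktemp -d)
cd "$work"

# The "server" repo accumulates some history.
git init -q server
cd server
for n in 1 2 3; do
  echo "change $n" > file.txt
  git add file.txt
  git -c user.name=dev -c user.email=dev@example.com commit -qm "change $n"
done
cd ..

git clone -q server laptop    # the laptop copy: full history, not a snapshot

rm -rf server                 # the smoldering heap at HQ...
git -C laptop log --oneline   # ...but the laptop still has every commit
```

Restoring the "server" is then just another clone, this time from the laptop.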

Bazaar is very similar to git and hg. It seems to be primarily tied to Canonical's projects, with not a lot of activity outside that world. I used it for a short time but found hg to meet my needs better.

_______________________________________________
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users
