From: "Mike Stump" <>
Sent: Friday, August 01, 2014 11:10 PM
On Aug 1, 2014, at 11:57 AM, Philip Oakley <> wrote:
But that goes both ways, and is a philosophical issue about what is to be expected in various cases.

The problem is, users expect merge to merge. No user expects it to scramble the source code, because the command is called merge, not scramble.

Unfortunately we are back at the problem of what 'merge' means. Git uses a snapshot model, while most other version control systems use a changeset model (which has been used since before the Titanic was built, for various reasons [1]) for their storage. Thus most VCS users see merge as the addition of a delta "A + delta -> B", while Git sees merge as the union of snapshots "A + G -> Q".
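To make the snapshot view concrete, here is a toy sketch (file-level only; real Git merges file contents line by line and does rename detection, none of which is shown) of combining two snapshots relative to their common ancestor:

```python
# A minimal sketch (NOT Git's actual implementation) of merging as a
# union of snapshots. A "snapshot" is a whole tree (path -> content);
# a three-way merge compares each side against the common ancestor.

def three_way_merge(base, ours, theirs):
    """Merge two snapshots given their common ancestor snapshot.

    Returns (merged, conflicts). Each snapshot is a dict of path -> content.
    """
    merged, conflicts = {}, []
    for path in set(base) | set(ours) | set(theirs):
        b, o, t = base.get(path), ours.get(path), theirs.get(path)
        if o == t:                 # both sides agree (incl. both deleted)
            if o is not None:
                merged[path] = o
        elif o == b:               # only "theirs" changed this path
            if t is not None:
                merged[path] = t
        elif t == b:               # only "ours" changed this path
            if o is not None:
                merged[path] = o
        else:                      # both changed it differently
            conflicts.append(path)
    return merged, conflicts

base   = {"a.txt": "1", "b.txt": "2"}
ours   = {"a.txt": "1", "b.txt": "2-ours"}
theirs = {"a.txt": "1-theirs", "b.txt": "2"}
merged, conflicts = three_way_merge(base, ours, theirs)
# merged == {"a.txt": "1-theirs", "b.txt": "2-ours"}, no conflicts
```

The point is that no delta is stored anywhere: the "change" each side made is rediscovered at merge time by comparing whole snapshots against the ancestor.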

The cherry-pick and rebase methods determine the change deltas between adjacent snapshots (i.e. patches) and then apply them to a different snapshot. In those cases we are not merging snapshots, but rather applying patches (note my changed use of 'merge').
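A rough sketch of that distinction (again file-level, where real Git works on line-level hunks): compute the delta a commit introduced relative to its parent, then apply that delta to an unrelated snapshot, as cherry-pick and rebase do.

```python
# Hypothetical helpers for illustration only -- not Git's machinery.

def delta(parent, child):
    """The change introduced by `child` relative to `parent` (file level)."""
    changes = {}
    for path in set(parent) | set(child):
        if parent.get(path) != child.get(path):
            changes[path] = child.get(path)   # None means "deleted"
    return changes

def apply_delta(snapshot, changes):
    """Apply a previously computed delta to a *different* snapshot."""
    result = dict(snapshot)
    for path, content in changes.items():
        if content is None:
            result.pop(path, None)
        else:
            result[path] = content
    return result

parent = {"a.txt": "old"}
child  = {"a.txt": "new", "c.txt": "added"}
target = {"a.txt": "old", "b.txt": "unrelated"}
picked = apply_delta(target, delta(parent, child))
# picked == {"a.txt": "new", "b.txt": "unrelated", "c.txt": "added"}
```

Here the delta is the first-class object being moved around, which is exactly the changeset mindset; the snapshot `target` is just what it happens to land on.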

That word has semantics that were not invented by your project. You cannot change the semantics of the word. Merge has a nice mathematical definition. Merging a branch into master means placing into master the work from that branch. Git already does this 99% correctly; it is missing one corner case.

This is not a philosophical issue.  It is a definitional one.

For some centrally controlled working styles, the ideas behind _distributed_ version control are anathema, and Git just grinds away at the policies that are expected.

This is irrelevant to the issue at hand.

That said, Git doesn't claim to be perfect

Again, irrelevant.

(and can't because

Do you mean, and can’t be? If so, you are wrong in the case at hand. svn is an existence proof that you are wrong.

of the 'relativity' that comes with being distributed - truth has to give way to a web of trust). Also, the artefacts that Git validates are at a different level of abstraction, i.e. the whole project as a commit, rather than just one or a few files at a time.

<snipped remainder because of time limitations>


[1] Way back when blueprints really were blue, and real drawings were on kaolin & linen drawing sheets with indian ink, the 'master' drawings were the most prized items, which, if damaged, would stop production. So there were many layers of drawing office controls to avoid touching them (e.g. tracers to copy drawings) and keep the master drawings pristine.

From that, the ideas of Change requests, Change orders, Approved changes
etc. became the norm. It was the changes that were recorded. These processes are still in place today in most engineering companies and the military, in fact anyone with real artefacts. These techniques were copied into the computing world when it was young.

The big change has been that computing has reduced the cost of production (duplication, replication) to zero, so now the problem is in trying to validate and verify which alleged copy is actually the 'master' - especially in a fully distributed environment, such as the many different Linux distributions and applications. It was into this gap that Git stepped, using the SHA-1 to both validate the history chain and verify any specific snapshot. It doesn't get hung up on what the change was; that can be determined easily after the fact by a simple diff between adjacent snapshots, though having lots of small changes and crisp commit messages does help.
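As a small illustration of that content addressing: a Git blob's object id is the SHA-1 of a short header ("blob <size>" plus a NUL byte) followed by the raw content, so identical bytes always verify to the same id, no matter which distributed copy they came from.

```python
# Reproduces Git's blob hashing scheme (the same id `git hash-object`
# prints for a file with this content).
import hashlib

def blob_id(content: bytes) -> str:
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

print(blob_id(b"hello\n"))
# ce013625030ba8dba906f756967f9e9ca394464a
```

Commits then hash trees, which hash blobs, so one commit id transitively verifies the whole project snapshot and, via the parent pointers, the whole history chain behind it.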

That's the way I view it anyway. (When I first started work, they still had blueprints, but had moved to microfiche and melinex (polyester) drawing sheets, and the 8088 / IBM PC was new and powerful! I still work in engineering, where the old VC model is ubiquitous.)
