On Wednesday, 19 August 2015 at 10:09:33 UTC, Chris wrote:
Well, maybe that's exactly what the designers of C did, they didn't slavishly follow the convention that the result of the computation is notated to the right. Maybe they thought, 'Uh, actually, wouldn't it be handier to see immediately what type it is?'.

Algol has "INTEGER A".

Simula has "INTEGER A".
The Simula successor BETA has "A : @integer".

C has "int a".
The C successor Go has "a int".

Has the argument that type-to-the-right is easier for beginners ever been proven?

Type-to-the-right is much easier to read when you have longer types. Old languages tended not to have such long types (libraries and programs were smaller).

If you support templates, it is a pain to have types on the left.
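A minimal Go sketch of the readability point (the names `handlers` and `keys` are made up for illustration): with the type on the right, the identifier stays in front even when the type grows long, and a generic signature still leads with the function name.

```go
package main

import "fmt"

// The variable name comes first and stays visible
// even though the type is long.
var handlers map[string]func(event string, payload []byte) error

// In a generic (template-like) signature, the function name is
// still the first thing you read; the parameterized types follow.
func keys[K comparable, V any](m map[K]V) []K {
	out := make([]K, 0, len(m))
	for k := range m {
		out = append(out, k)
	}
	return out
}

func main() {
	m := map[string]int{"a": 1, "b": 2}
	fmt.Println(keys(m))
}
```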

Type-to-the-right also makes type deduction much more natural and consistent: just omit the type, with no silly "auto".
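A small sketch of this in Go, where the deduced form needs no placeholder keyword at all:

```go
package main

import "fmt"

func main() {
	// Explicit type, written on the right:
	var n int = 42

	// Type omitted and deduced; no extra keyword needed:
	m := 42

	fmt.Println(n, m)
}
```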


#3 Here we go again... I wonder what's the problem with this. I still think it's a very handy shorthand for the cumbersome `x = x + 1`, or even for `x += 1`. And no, it's not confusing, because it is well defined as incrementing the value by 1. In fact, I don't like Python's patronizing insistence on having to write `x = x + 1`.

Python supports "+=".
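As an aside, Go (mentioned earlier) splits the difference: it keeps `++` but only as a statement, which sidesteps the confusing expression forms. A small sketch of all three spellings:

```go
package main

import "fmt"

func main() {
	x := 0
	x = x + 1 // the verbose form
	x += 1    // the shorthand, also valid Python
	x++       // in Go this is a statement, not an expression,
	//           so confusing uses like y = x++ are rejected
	fmt.Println(x) // 3
}
```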

As long as it's well defined, there's no problem. It's like spelling "colour" or "color": it doesn't really matter.

I write "farge" (Norwegian for "colour")… ;-)
