On Wednesday, 19 August 2015 at 11:42:54 UTC, Ola Fosheim Grøstad wrote:
On Wednesday, 19 August 2015 at 10:09:33 UTC, Chris wrote:
Well, maybe that's exactly what the designers of C did: they didn't slavishly follow the convention that the result of a computation is notated on the right. Maybe they thought, 'Uh, actually, wouldn't it be handier to see immediately what type it is?'

Algol has "INTEGER A".

Simula has "INTEGER A".
The Simula successor BETA has "A : @integer".

C has "int a".
The C successor Go has "a int".

Has the argument that type-to-the-right is easier for beginners ever been proven?

It is much easier to read when you have longer types. Older languages tended not to have such long types (libraries and programs were smaller).

If you support templates it is a pain to have types on the left.

It also makes it much more natural/consistent when you use type deduction. Just omit the type, no silly "auto".
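
For illustration, this is roughly how that reads in Go (one of the languages listed above); the identifiers are made up, but the point is that the name always comes first, the longer type trails behind it, and with := there is no keyword at all:

    package main

    import "fmt"

    // The name comes first; the longer type trails behind it.
    var handlers map[string][]func(int) error

    // Generic code follows the same pattern: every type, including
    // the type parameter T, sits to the right of the name it describes.
    func first[T any](items []T) T {
            return items[0]
    }

    func main() {
            // With := the type is deduced outright; no "auto" keyword.
            count := first([]int{1, 2, 3})
            fmt.Println(count, handlers == nil)
    }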


#3 Here we go again... I wonder what's the problem with this. I still think it's a very handy shorthand for the cumbersome `x = x + 1` or even `x += 1`. And no, it's not confusing, because it is well defined as incrementing the value by 1. In fact, I don't like Python's patronizing insistence on having to write `x = x + 1`.

Python supports "+=".

Yes, I forgot, it does. But why not `x++`? I never understood why. As if most people were too stoooopid to grasp the concept that `x++` is the same as `x += 1` (which is intellectually as 'challenging' as `x++`, by the way).
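
For what it's worth, Go (cited above as a C successor) did keep `++`, though only as a statement; a small sketch of how the three spellings compare:

    package main

    import "fmt"

    func main() {
            x := 0
            x = x + 1      // the long form
            x += 1         // the shorthand Python also has
            x++            // the increment Python leaves out; a statement in Go, not an expression
            fmt.Println(x) // prints 3
    }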

As long as it's well defined, there's no problem. It's like spelling "colour" or "color"; it doesn't really matter.

I write "farge"… ;-)

farge / farve / färg / Farbe - still the same thing ;)
