On Wednesday, 15 November 2017 at 02:05:27 UTC, codephantom wrote:
On Tuesday, 14 November 2017 at 16:38:58 UTC, Ola Fosheim Grostad wrote:
It [C] is flawed... ESR got that right, not sure how anyone can disagree.

Well I 'can' disagree ;-)

Right… :-)

Languages are just part of an evolutionary chain.

Right, and C is part of this chain: BCPL -> B -> C. The funny thing is that BCPL was never meant to be used as a language beyond bootstrapping CPL, but given the limited computers of the day, BCPL and later C became the default systems programming language exactly because it wasn't much of a language and fit rather well with the CPUs of the day (among other things, you could hold the whole compiler in main memory ;-).

But C doesn't fit well with the underlying hardware anymore, even though CPU makers are benchmarking against existing C code and making provisions for the C model. That argument can be used against C++, D and Rust too. :-P

So, if the abstraction no longer matches the concrete hardware well, then it will make less and less sense to use it. Overall, I think we will over time see growth in higher-level languages designed to map well onto the hardware at the optimization stage. Current languages aren't quite there yet, though, and frankly, neither are the CPUs. I think we are in a transition period (wide pipelines in the CPU + GPU are an indicator).

No part of the chain should be considered flawed - unless it was actually flawed - in that it didn't meet the demands of the environment in which it was initially conceived.

Right, but the performance bottleneck of serial code is now forcing changes in the environment.

In that circumstance, it must be considered flawed, and evolutionary forces will quickly take care of that.

Not quickly; you have the whole issue of the installed base, the center of gravity… the mindset of people…

But a programming language is not flawed simply because people use it in an environment where it was not designed to operate.

Ok, fair enough. BCPL was meant to bootstrap CPL and C was a hack to implement Unix… ;^)

Corporate needs/strategy skews one's view of the larger environment and infects language design. I think it's infected Go, from the get-Go. I am glad D is not being designed by a corporation; otherwise D would be something very different, and far less interesting.

I don't think Go is much affected by the corporate side… The Go designers appear to be strong-headed, and the language design is in line with their prior language designs. I believe they also made the same mistake as D with reallocating buffers "randomly" when extending slices, such that you end up with two arrays because the other slices aren't updated. Not very reassuring when a team of people make such correctness blunders. And what about their "exceptions", a dirty hack only added because they are hellbent on not having exceptions… It is all about the mentality of the designers… put blame where blame is due…
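
To make the slice point concrete, here is a minimal Go sketch of my own (a toy example, not taken from the Go docs or anyone's real code), relying only on the standard behaviour of append: once an append exceeds the capacity, the appended slice gets a fresh backing array while every other slice keeps pointing at the old one.

package main

import "fmt"

func main() {
	// a and b start out as two slices over the same backing array.
	a := []int{1, 2, 3} // len == cap == 3, i.e. no spare capacity
	b := a

	// a has no spare capacity, so append allocates a new, larger array,
	// copies the elements, and repoints a at it. b is left pointing at
	// the old array -- nothing updates it.
	a = append(a, 4)

	a[0] = 99 // only visible through a from here on

	fmt.Println(a) // [99 2 3 4]
	fmt.Println(b) // [1 2 3]  -- the slices have silently diverged
}

D's built-in slices can diverge the same way when ~= has to reallocate, which is why I call it the same mistake.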

The design of Dart was affected by corporate requirements, according to the designer, who called it a "bland language". I think he would have preferred something closer to Smalltalk :-). Dart is probably more important to Google than Go, as their business frontend depends on it. And Dart is now getting static typing, to the dismay of the original designer (who is in the dynamic camp).

But yeah, it can be interesting to think about how a mix of personalities and external pressure affect language design.

This is where Eric got it wrong, in my opinion. He's looking for the language that can best fix the flaws of C.

Seems to me that he is looking for something that is easier to deal with than C, but where he can retain his C mindset without having performance issues related to GC. So the bare-bones semantics of Go, combined with a decent runtime and libraries geared towards network programming, probably fit his use case (NTP something?).

In fact C has barely had to evolve (which is not a sign of something that is flawed), because it works just fine in the environments for which it was designed. And those environments still exist today. They will still exist tomorrow... and the next day... and the next... and...

They exist in embedded devices/SoCs etc. Not sure if it is reasonable to say that they exist on the desktop anymore beyond a mere hygienic backwards-compatibility mentality among CPU designers.

Ola.
