On 11/1/2014 6:41 PM, bearophile wrote:
Walter Bright:
Thank you for your answers.
D removes very few bounds checks. No data flow analysis is used for this.
This is false.
Oh, good, what are the bounds checks removed by the D front end?
It does some flow analysis based on previous bounds checks.
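A sketch of the kind of elision being described, written in C since the thread contains no code of its own (the function and its name are hypothetical; in D the check on `a[i]` is implicit rather than hand-written):

```c
#include <stddef.h>

/* Hypothetical sketch: D inserts an implicit bounds check on every
 * a[i]. When an explicit test like the `if` below dominates the
 * access, flow analysis can prove the implicit check always passes
 * and drop it, leaving a single comparison at run time. */
int get_or_default(const int *a, size_t len, size_t i, int def)
{
    if (i < len)
        return a[i];  /* an implicit check here would be provably redundant */
    return def;
}
```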
This is on purpose, because otherwise about half of what enums are used for
would no longer be possible - such as bit flags.
On the other hand we could argue that bit flags are a sufficiently different
purpose to justify an annotation (as in C#) or a Phobos struct (like the one for
bitfields) that uses a mixin to implement them (there is a pull request for
Phobos, but I don't know how good it is).
More annotations => more annoyance for programmers. Jonathan Blow characterizes
this as "friction" and he's got a very good point. Programmers have a limited
tolerance for friction, and D must be very careful not to step over the line
into being a "bondage and discipline" language that nobody uses.
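For context, the bit-flag use of enums that both posts refer to is the classic idiom below (shown in C, the common ancestor; the member names are illustrative). The point is that OR-ing two members produces a value that names no member, which a stricter "every value must be a named member" rule would forbid:

```c
/* Classic bit-flag enum: each member is a distinct power of two so
 * members can be OR-ed together. READ | WRITE has the value 3, which
 * corresponds to no named member - exactly the case a stricter enum
 * rule (or an annotation, as with C#'s [Flags]) would have to allow
 * specially. */
enum FileMode {
    MODE_NONE  = 0,
    MODE_READ  = 1 << 0,
    MODE_WRITE = 1 << 1,
    MODE_EXEC  = 1 << 2
};
```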
D's module system has holes like Swiss cheese. And its design is rather
simplistic.
Oh come on.
ML modules are vastly more refined than D modules (and more refined than modules
in most other languages). I am not asking to put ML-style modules in D (because
ML modules are too complex for C++/Python programmers, and probably unnecessary
given the kind of template-based generics that D has), but arguing that D
modules are refined is unsustainable. (And I still hope Kenji fixes some of
their larger holes.)
I didn't say they were "refined", whatever that means. I did take issue with
your characterization. I don't buy the notion that more complex is better.
Simple and effective is the sweet spot.
- no implicit type conversions
D keeps a large part of the bad implicit type conversions of C.
D has removed implicit conversions that result in data loss. Removing the rest
would force programs to use casting instead, which is far worse.
This is a complex situation: there are several things that are suboptimal in
D's management of implicit casts (one example is the signed/unsigned comparison
situation).
It is not suboptimal. There are a lot of tradeoffs with this, and it has been
discussed extensively. D is at a reasonable optimum point for this. The
implication that this was thoughtlessly thrown together against all reason is
just not correct.
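The signed/unsigned comparison situation being argued over is the one D inherits from C's usual arithmetic conversions. A minimal illustration (the helper function is ours, added only to make the rule visible):

```c
/* When an int is compared with an unsigned int, C's (and D's) usual
 * arithmetic conversions convert the int to unsigned first, so -1
 * becomes UINT_MAX and compares greater than any small unsigned
 * value. */
int signed_less_than_unsigned(int a, unsigned b)
{
    return a < b;  /* a is converted to unsigned before comparing */
}
```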
I think a size cast that loses bits is still regarded as safe.
It is memory safe.
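To make that last exchange concrete: a narrowing cast discards high bits, so it loses data, but it can never make a read or write go out of bounds, which is why it counts as memory safe. A minimal sketch in C (the D equivalent would be `cast(byte)`, which behaves the same way on the value):

```c
/* A narrowing cast keeps only the low byte: data is lost
 * (0x1234 -> 0x34), but no memory is accessed out of bounds,
 * so the operation is memory safe even though it is not
 * value-preserving. */
unsigned char truncate_to_byte(int x)
{
    return (unsigned char)x;
}
```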