On Tue, Aug 05, 2014 at 11:35:14AM -0700, Walter Bright via Digitalmars-d wrote:
> (limited connectivity for me)
>
> For some perspective, recently gcc and clang have introduced
> optimizations based on undefined behavior in C/C++. The undefined
> behavior has been interpreted by modern optimizers as "these cases
> will never happen". This has wound up breaking a significant amount of
> existing code. There have been a number of articles about these, with
> detailed explanations about how they come about and the new, more
> correct, way to write code.
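
To make the kind of breakage concrete, here is the textbook case (an
illustrative C sketch, not any specific code gcc or clang broke): a
signed-overflow check that the optimizer is entitled to delete, because
signed overflow is undefined behavior and therefore "never happens":

#include <limits.h>
#include <stdio.h>

/* Intended as a portable overflow check -- but signed overflow is
 * undefined behavior in C, so a modern optimizer may conclude that
 * "x + 1 < x" can never be true and delete the branch entirely. */
int increment_saturating(int x)
{
    if (x + 1 < x)      /* "cannot happen", per the standard */
        return INT_MAX; /* saturate on overflow -- may be removed */
    return x + 1;
}

int main(void)
{
    /* Unoptimized this prints INT_MAX; with gcc or clang at -O2 it
     * may print INT_MIN, because the check above was deleted. */
    printf("%d\n", increment_saturating(INT_MAX));
    return 0;
}

The "new, more correct" way is to test *before* the arithmetic, e.g.
if (x == INT_MAX), which involves no undefined behavior at all.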
And I'd like to emphasize that code *should* have been written in this
new, more correct way in the first place. Yes, it's a pain to have to
update legacy code, but where would progress be if we were continually
hampered by the fear of breaking what was *already* broken to begin
with?

> The emerging consensus is that the code breakage is worth it for the
> performance gains. That said, I do hear what people are saying about
> potential code breakage and agree that we need to address this
> properly.

The way I see it, we need to educate D users to use 'assert' with its
proper meaning, and to replace all other usages with alternatives
(perhaps a Phobos function that does what they want without the full
implications of assert -- i.e., "breaking" behaviour like influencing
the optimizer, etc.). Once reasonable notice and time have been given,
I'm all for introducing optimizer hinting with asserts. I think that in
the long run this will turn out to be an important, even revolutionary,
development, not just in D but in programming languages in general.
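
To make "optimizer hinting with asserts" concrete: gcc and clang
already expose the underlying mechanism, so here is a minimal C sketch
of the semantics being proposed for D's assert (the assume() macro name
is purely illustrative, not an existing API):

#include <assert.h>

/* Today, assert() is a runtime check in debug builds and a no-op under
 * -DNDEBUG. "Optimizer hinting" would additionally let the compiler
 * assume the condition holds. gcc/clang can express that today: */
#ifdef NDEBUG
#  define assume(cond) ((cond) ? (void)0 : __builtin_unreachable())
#else
#  define assume(cond) assert(cond)
#endif

int divide(int a, int b)
{
    assume(b != 0);  /* the optimizer may now discard any b == 0 path */
    return a / b;
}

The Phobos alternative mentioned above would be the exact opposite of
assume(): a check that always executes and is guaranteed never to feed
the optimizer.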

T

--
Ignorance is bliss... until you suffer the consequences!