On 8/3/2014 7:31 PM, John Carter wrote:
Compiler users always blame the optimizer long before they blame their crappy 
code.

Watching the gcc mailing list over the years, those guys bend over backwards to
prevent that happening.

But since an optimization has to be based on additional hard information, they
have, with every new version of gcc, used that information both for warnings and
optimization.

Recent optimization improvements in gcc and clang have also broken existing code that had worked fine for decades.

In particular, overflow checks often now get optimized out, because the check relied on what is, strictly speaking, undefined behavior.
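
For example (a hypothetical C sketch, not taken from any particular project), a wrap-around test like this is legally removable, because signed overflow is undefined:

    #include <stdio.h>

    /* Hypothetical wrap-around check: signed overflow is undefined
     * behavior in C, so the compiler may assume x + 100 never wraps,
     * fold the test to false, and delete the branch at -O2. */
    int bump(int x)
    {
        if (x + 100 < x) {                 /* intended overflow check */
            fprintf(stderr, "overflow\n"); /* may never be emitted */
            return -1;
        }
        return x + 100;
    }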

This is why D has added the core.checkedint module, to provide overflow checks that are guaranteed to work.
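
For comparison, a check that doesn't rely on undefined behavior has to test before doing the arithmetic. Roughly (again a C sketch, not the D API itself):

    #include <limits.h>
    #include <stdbool.h>

    /* Test for overflow *before* doing the addition, so no undefined
     * behavior ever occurs and the check cannot be legally removed. */
    bool add_would_overflow(int a, int b)
    {
        if (b > 0)
            return a > INT_MAX - b;
        return a < INT_MIN - b;
    }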

Another optimization that has broken existing code is dead store elimination, which has removed the code in crypto libraries that overwrites passwords after use. It's also why D now has volatileStore() and volatileLoad(), if only someone will pull them.
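
The pattern that gets broken looks roughly like this (hypothetical C; under the as-if rule the final memset is a dead store and may be elided):

    #include <string.h>

    /* Hypothetical crypto-style cleanup: pw is dead after the memset,
     * so the as-if rule lets the compiler drop the wipe entirely,
     * leaving the password in memory. */
    void handle_password(void)
    {
        char pw[64];
        /* ... fill pw and use it ... */
        memset(pw, 0, sizeof pw);   /* dead store: may be elided */
    }

A volatile store is one the optimizer is not allowed to remove, which is the guarantee volatileStore() is meant to give.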

I.e. silent breakage of existing, working code is hardly unknown in the C/C++ 
world.
