On Tue, May 09, 2017 at 09:19:08PM -0400, Nick Sabalausky (Abscissa) via Digitalmars-d wrote:
> On 05/09/2017 08:30 PM, H. S. Teoh via Digitalmars-d wrote:
> > In this sense I agree with Walter that warnings are basically
> > useless, because they're not enforced. Either something is correct
> > and compiles, or it should be an error that stops compilation.
> > Anything else, and you start having people ignore warnings.
> 
> Not 100% useless. I'd much rather risk a warning getting ignored than
> NOT be informed of something the compiler noticed but decided "Nah,
> some people ignore warnings so I'll just look the other way and keep
> my mouth shut". (Hogan's Compiler Heroes: "I see NUH-TING!!")
I'd much rather the compiler say "Hey, you! This piece of code is
probably wrong, so please fix it! If it was intentional, please write
it another way that makes that clear!" - and abort with a compile
error.

This is actually one of the things I like about D. For example, if you
wrote:

	switch (e) {
	case 1: return "blah";
	case 2: return "bluh";
	}

the compiler will refuse to compile the code until you either add a
default case, or make it a final switch (in which case the compiler
will refuse to compile the code unless every possible case is in fact
covered).

Now imagine if this was merely a warning that people could just
ignore. Yep, we're squarely back in good ole C/C++ land, where an
unexpected value of e causes the code to amble down an unexpected
path, with the consequent hilarity that ensues.

IOW, it should not be possible to write tricky stuff by default; you
should need to ask for it explicitly so that intent is clear. Another
switch example:

	switch (e) {
	case 1: x = 2;
	case 2: x = 3;
	default: x = 4;
	}

In C, the compiler happily compiles the code for you. In D, at least
the latest dmd will give you deprecation warnings (and presumably, in
the future, actual compile errors) for forgetting to write `break;`.
But if the fallthrough was intentional, you document that with an
explicit `goto case ...`. IOW, the default behaviour is the safe one
(no fallthrough), and the non-default behaviour (fallthrough) has to
be explicitly asked for. Much, much better.

> And then the flip side is that some code smells are just too
> pedantic to justify breaking the build while the programmer is in
> the middle of some debugging or refactoring or some such.
> 
> That puts me strongly in the philosophy of "Code containing
> warnings: Allowed while compiling, disallowed when committing (with
> allowances for mitigating circumstances)."

I'm on the fence about the former.
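For the record, here's a minimal C sketch of the silent-fallthrough
hazard described above. The function name `classify` and the specific
values are hypothetical; the point is only that every missing `break;`
lets control bleed into the next case, so all three calls end up
returning the default value:

	#include <stdio.h>
	
	/* In C, a missing `break` silently falls through, so each
	 * assignment below is overwritten by the ones after it. */
	static int classify(int e)
	{
	    int x = 0;
	    switch (e) {
	    case 1: x = 2;   /* no break: falls through to case 2 */
	    case 2: x = 3;   /* no break: falls through to default */
	    default: x = 4;
	    }
	    return x;
	}
	
	int main(void)
	{
	    /* All three calls return 4, intended or not. */
	    printf("%d %d %d\n", classify(1), classify(2), classify(99));
	    return 0;
	}

This compiles cleanly with a plain `cc file.c` invocation -- exactly
the "happily compiles" behaviour complained about above.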
My current theory is that being forced to write "proper" code even
while refactoring actually helps the quality of the resulting code.
But I definitely agree that code with warnings should never make it
into the code repo. The problem is that it's not enforced by the
compiler, so *somebody* somewhere will inevitably bypass it.

> C/C++ doesn't demonstrate that warnings are doomed to be useless and
> "always" ignored. What it demonstrates is that warnings are NOT an
> appropriate strategy for fixing language problems.

Point. I suppose YMMV, but IME unless warnings are enforced with
-Werror or equivalent, after a while people just stop paying attention
to them, at least where I work. It's entirely possible that it's a
bias specific to my job, but somehow I have a suspicion that this
isn't completely the case. Humans tend to be lazy, and ignoring
compiler warnings is rather high up on the list of things lazy people
tend to do. The likelihood increases with the presence of other
factors like looming deadlines, unreasonable customer requests,
ambiguous feature specs handed down from the PTBs, or just plain
having too much on your plate to be bothering with "trivialities" like
fixing compiler warnings.

That's why my eventual conclusion is that anything short of
enforcement will ultimately fail. Unless there is no way you can
actually get an executable out of badly-written code, there will
always be *somebody* out there who will write bad code. And by
Murphy's Law, that somebody will eventually be someone on your team,
and chances are you'll be the one cleaning up the mess afterwards. Not
something I envy doing (I've already had to do too much of that).

[...]
> The moral of this story: Sometimes, breaking people's code is GOOD!
> ;)

Tell that to Walter / Andrei. ;-)

[...]
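As an aside, the -Werror enforcement mentioned above is easy to
demonstrate with any C compiler. The file name and the particular
warning (an unused variable, which -Wall flags in both gcc and clang)
are just illustrative:

	# Use whichever C compiler is available (cc/gcc/clang).
	CC=$(command -v cc || command -v gcc || command -v clang)
	
	# A file that triggers a -Wall warning but is otherwise valid C.
	cat > warn_demo.c <<'EOF'
	int f(void)
	{
	    int unused;   /* -Wall: unused variable */
	    return 0;
	}
	EOF
	
	# Without -Werror the warning is advisory: the build succeeds.
	$CC -c -Wall warn_demo.c 2>/dev/null && echo "warnings only: build OK"
	
	# With -Werror the very same warning stops the build.
	$CC -c -Wall -Werror warn_demo.c 2>/dev/null \
	    || echo "-Werror: build stopped"

That asymmetry is the whole problem: the first invocation is what lazy
humans under deadline pressure actually run.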
> > (Nevermind the elephant in the room that 80-90% of the
> > "optimizations" C/C++ coders -- including myself -- have
> > programmed into their finger reflexes are actually irrelevant at
> > best, because either compilers already do those optimizations for
> > you, or the hot spot simply isn't where we'd like to believe it
> > is; or outright de-optimizing at worst, because we've successfully
> > defeated the compiler's optimizer by writing inscrutable code.)
> 
> C++'s fundamental paradigm has always been "Premature-optimization
> oriented programming". C++ promotes POOP.

LOL!!

Perhaps I'm just being cynical, but my current unfounded hypothesis is
that the majority of C/C++ programmers don't use a profiler, and don't
*want* to use a profiler, because they're either ignorant that such
things exist (unlikely), or they're too dang proud to admit that their
painfully-accumulated preconceptions about optimization might possibly
be wrong.

Or maybe my perceptions are just heavily colored by the supposedly
"expert" C coders I've met, who wrote supposedly better code that I
eventually realized was actually not better, but in many ways actually
worse -- less readable, less maintainable, more error-prone to write,
and at the end of the day arguably less performant, because it
ultimately led to far too much boilerplate and other sources of code
bloat, excessive string copying, too much indirection (cache
unfriendliness), and other such symptoms that C coders often overlook.

(And meanwhile, the mere mention of the two letters "G C" and they
instantly recoil, and rattle off an interminable list of
20-years-outdated GC-phobic excuses, preferring rather to die the
death of a thousand pointer bugs (and memory leaks, and overrun
buffers) than succumb to the Java of the early 90's with its klunky,
poorly-performing GC of spotted repute that has long since been
surpassed.
And of course, any mention of any evidence that Java *might* actually
perform better than poorly-written C code in some cases will incite
instant vehement denial. After all, how can an "interpreted" language
possibly outperform poorly-designed, over-engineered C scaffolding
that necessitates far too much excessive buffer copying and destroys
cache coherence with far too many unnecessary indirections?
Inconceivable!)

> > That's another fundamental problem with the C/C++ world: coding by
> > convention. We all know all too well that *if* we'd only abide by
> > such-and-such coding guidelines and recommendations, our code
> > would actually stand a chance of being correct, safe, non-leaking,
> > etc.
> 
> Luckily, there IS a way to enforce that proper coding conventions
> are actually adhered to: It's called "compile-time error". :)

Exactly. Not compiler warnings... :-D

T

-- 
You have to expect the unexpected. -- RL