Date: Thu, 06 May 2021 12:52:36 -0700
From: "Greg A. Woods" <wo...@planix.ca>
Message-ID: <m1lek32-0036urC@more.local>
  | Yeah, "Undefined Behaviour" should be undefined -- i.e. removed from the
  | spec -- i.e. become either fully defined or at least implementation
  | defined.  It is not helpful at all -- it was a very VERY bad idea.

Not really possible.  To become implementation defined, the implementation
needs to be able to specify what happens (even if different from what
other implementations specify for the same thing).  Sometimes that's not
possible, and what happens depends upon things outside the control of
the implementation.

Eg: accessing an array out of bounds might just return random data from
some other data structure, or it might generate a segmentation violation -
it all depends upon how far out of bounds the access was, and where in
the memory map the array in question happened to be placed.  There's no
way to define what will happen - even worse, on an embedded system,
running with no memory management or privilege separation, the access
might hit memory mapped I/O control, or CPU control registers, and do
almost anything.  (There's a contrived sketch of this at the end of
this message.)

  | E.g. for ctype.h interfaces the spec should just say that values outside
  | the recognized range will simply be truncated as if by assignment to an
  | unsigned char.

That might have been a good idea, perhaps, if it had been specified that
way initially - only perhaps, because it means penalizing good code with
meaningless extra checks or no-op data manipulations (&0xFF or whatever)
that do nothing for it except make the code run slower, just so bad code
behaves in some kind of predictable (but probably still incorrect) way.
(The second sketch below shows the idiom that good code already uses.)

But it wasn't specified like that.  And standards bodies are not
legislatures - they don't (or shouldn't) go defining how things should
be, and then attempt to force implementations to obey.  Rather, they set
out what is known to work on all implementations (just omitting ones
with admitted bugs which should be fixed), so that applications will
know what they need to do to correctly use the interfaces provided, and
what they should not do, as the results would either be unspecified (or
implementation defined) or even simply undefined.  They also make it
clear what a new implementation needs to implement in order to be
compatible with the other existing implementations, so that applications
which work with other implementations will also work with the new one.

  | What I am pretty sure of though is that there's a vast difference
  | between the massive number of warnings spit out by the compiler vs. the
  | relatively low number of actual cases of passing values outside of -1..255.
  | We certainly wouldn't want to claim UB and abort for all of the warnings!

It is certainly true that the compiler is guessing when it issues one of
these warnings: in some cases it cannot know what the range of values
will be at run time, in others its analysis functionality is simply not
up to the task.  So a lot of false warnings occur - for some of the
warnings the vast majority look to be bogus (which is annoying), for
others a warning most commonly means a problem exists.
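To make the out-of-bounds point concrete, here is the contrived sketch
promised above - the array names and the layout described in the
comments are invented for illustration, and what actually happens on
any given run depends entirely upon the toolchain and target:

    #include <stdio.h>

    int before[4] = { 1, 2, 3, 4 };
    int arr[4]    = { 5, 6, 7, 8 };
    int after[4]  = { 9, 10, 11, 12 };

    int
    main(void)
    {
        /*
         * One element past the end: this might read an element of
         * "before" or "after", or something else entirely - it
         * depends upon how the linker happened to lay out the
         * data segment.
         */
        printf("%d\n", arr[4]);

        /*
         * Far out of bounds: on a hosted system this will most
         * likely fault (SIGSEGV); on an embedded target with no
         * MMU it might instead read a device register or a CPU
         * control register.
         */
        printf("%d\n", arr[100000]);
        return 0;
    }

No implementation can promise what those two reads produce, which is
why the standard cannot sensibly call them anything but undefined.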
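And for the ctype.h case, the conventional idiom which correct code
already uses (count_alpha is just an invented example function):

    #include <ctype.h>
    #include <stdio.h>

    /*
     * The ctype functions accept only EOF or values representable
     * as an unsigned char, so a plain char (which may be signed)
     * must be converted before the call.
     */
    int
    count_alpha(const char *s)
    {
        int n = 0;

        for (; *s != '\0'; s++) {
            if (isalpha((unsigned char)*s))    /* correct */
                n++;
            /*
             * isalpha(*s) here would be the bad code: where
             * char is signed, a byte like 0xE9 arrives as -23,
             * outside -1..255 - and that is also the kind of
             * call the warnings discussed above are aimed at.
             */
        }
        return n;
    }

    int
    main(void)
    {
        printf("%d\n", count_alpha("caf\xe9 au lait"));
        return 0;
    }

Code written that way gains nothing from a mandated truncation - it
would simply pay for masking the implementation would then have to
perform on every call.

kre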