On 12/16/2016 10:27 AM, Jakub Jelinek wrote:
> On Fri, Dec 16, 2016 at 10:10:00AM -0700, Martin Sebor wrote:
>>> No. The first call to sm_read_sector just doesn't exit. So it is
>>> warning about dead code.
>> If the code is dead then GCC should eliminate it. With it eliminated
> There is code (resulting especially from jump threading, but not limited
> to that; other optimizations may produce it too) that even a very smart
> optimizing compiler isn't able to prove is dead.
>> the warning would be gone. The warning isn't smart and it doesn't
>> try to be. It only works with what GCC gives it. In this case the
>> dump shows that GCC thinks the code is reachable. If it isn't, that
>> would seem to highlight a missed optimization opportunity, not a need
>> to make the warning smarter than the optimizer.
> No, it highlights that the warning is done in the wrong place, where it
> suffers from too many false positives.
I don't inherently see this as generating "too many false positives".
And as Martin says, the warning works with precisely what it is presented.
I think the particular stumbling point is that path isolation has, at some
point, resulted in an explicit NULL in calls at various places. That is a
*GOOD* thing to detect and warn against as it represents cases that are
often well hidden and often difficult for a human to analyze (based on
my work with NULL pointer dereference warnings).
>>> None look like real bugs to me.
>> They do to me. There are calls in gengtype.c to a function decorated
>> with attribute nonnull (lbasename) that pass to it a pointer that's
>> potentially null. For example below. get_input_file_name returns
> Most pointers passed to functions with nonnull attributes are, from the
> compiler's POV, potentially NULL. Usually the compiler just can't prove
> that the pointer can't be NULL; it is the exception when it can.
True. But what's happening here IIUC is that there is an explicit NULL
for those arguments in the IL, most likely due to path isolation.
I would agree that a random pointer which we know nothing about could be
NULL, but we shouldn't warn for it. That would generate too many false
positives.
What those guidelines do mean is that various transformations and
optimizations may make warnings appear or disappear seemingly randomly.
That's unfortunate, but inherent in this class of problems until we have
a real static analyzer.
jeff