On 3/15/22 10:40, Siddhesh Poyarekar wrote:
On 15/03/2022 21:09, Martin Sebor wrote:
The strncmp function takes arrays as arguments (not necessarily
strings).  The main purpose of the -Wstringop-overread warning
for calls to it is to detect calls where one of the arrays is
not a nul-terminated string and the bound is larger than the size
of the array.  For example:

   char a[4], b[4];

   int f (void)
   {
     return strncmp (a, b, 8);   // -Wstringop-overread
   }

Such a call is suspect: if one of the arrays isn't nul-terminated
the call is undefined.  Otherwise, if both are nul-terminated there

Isn't "suspect" too harsh a description though?  The bound does not specify the size of a or b, it specifies the maximum extent to which to compare a and b, the extent being any application-specific limit.  In fact the limit could be the size of some arbitrary third buffer that the contents of a or b must be copied to, truncating to the bound.

The intended use of the strncmp bound is to limit the comparison to
at most the size of the arrays or (in a subset of cases) the length
of an initial substring. Providing an arbitrary bound that's not
related to the sizes as you describe sounds very much like a misuse.
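
For example, both of these stay within that contract (a sketch; the names are invented for illustration):

   #include <string.h>

   char c[8], d[8];

   int g (const char *s)
   {
     // Bound no greater than the size of the arrays being compared.
     if (strncmp (c, d, sizeof c) != 0)
       return 0;
     // Bound equal to the length of the initial substring of interest.
     return strncmp (s, "PATH=", 5) == 0;
   }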

As a historical note, strncmp was first introduced in UNIX v7 where
its purpose, alongside strncpy, was to manipulate (potentially)
unterminated character arrays like file names stored in fixed size
arrays (typically 14 bytes).  Strncpy would fill the buffers with
ASCII data up to their size and pad the rest with nuls only if there
was room.

Strncmp was then used to compare these potentially unterminated
character arrays (e.g., archive headers in ld and ranlib).  The bound
was the size of the fixed size array.  Its other use case was to compare
leading portions of strings (e.g., when looking for an environment
variable or when stripping "./" from path names).
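
A sketch of that historical pattern (not the actual v7 sources; the 14-byte field mirrors the directory-entry size mentioned above):

   #include <string.h>

   struct member { char name[14]; };   // fixed-size, possibly unterminated

   void set_name (struct member *m, const char *s)
   {
     // Copies at most 14 bytes; pads with nuls only if s is shorter.
     strncpy (m->name, s, sizeof m->name);
   }

   int same_name (const struct member *x, const struct member *y)
   {
     // The bound is the size of the fixed-size field.
     return strncmp (x->name, y->name, sizeof x->name) == 0;
   }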

Since the early UNIX days, both strncpy and to a lesser extent strncmp
have been widely misused and, along with many other functions in
<string.h>, a frequent source of bugs due to common misunderstanding
of their intended purpose.  The aim of these warnings is to detect
the common (and sometimes less common) misuses and bugs.

I agree the call is undefined if one of the arrays is not nul-terminated, and that's the thing: nothing about the bound is undefined in this context; it's the NUL termination that is key.

is no point in calling strncmp with a bound greater than their sizes.

There is, when the bound describes something else, e.g. the size of a third destination buffer into which one of the input buffers may get copied.  Or when the bound describes the maximum length of a set of strings where only a subset of the strings are reachable in the current function and ranger sees it, allowing us to reduce our input string size estimate.  The bound being the maximum of the lengths of the two input strings is just one of many possibilities.
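
A sketch of the first case (the buffer and names are invented for illustration):

   #include <string.h>

   char dest[32];                          // third buffer; its size is the limit
   static const char tag[8] = "config";    // nul-terminated, smaller than dest

   int matches_tag (const char *s)
   {
     // The bound reflects sizeof dest (where s may later be copied,
     // truncated), not the size of tag.  Both arguments are nul-terminated,
     // so the call is well defined even though the bound exceeds sizeof tag.
     return strncmp (s, tag, sizeof dest) == 0;
   }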

With no evidence that this warning is ever harmful I'd consider

There is: the false positives were seen in Fedora/RHEL builds.

I haven't seen these so I can't very well comment on them.  But I can
assure you that warning for the code above is intentional.  Whether
or not the arrays are nul-terminated, the expected way to call
the function is with a bound no greater than their size (some coding
guidelines are explicit about this; see for example the CERT C Secure
Coding standard rule ARR38-C).

(Granted, the manual makes it sound like -Wstringop-overread only
detects provable past-the-end reads.  That's a mistake in
the documentation that should be fixed.  The warning was never quite
so limited, nor was it intended to be.)

Martin


suppressing it a regression.  Since the warning is a deliberate
feature in a released compiler and GCC is now in a regression
fixing stage, this patch is out of scope even if a case where
the warning wasn't helpful did turn up (none has been reported
so far).

Wait, I just reported an issue and it's across multiple packages in Fedora/RHEL :)

I think this is a regression since gcc 11, caused by misunderstanding the specification and assuming too strong a relationship between the size argument of strncmp (and indeed strnlen and strndup) and the size of the objects being passed to it.  Compliant code relies on the compiler to do the right thing here, i.e., optimize the strncmp call to strcmp and not panic about the size argument being larger than the input buffer size.  If such a diagnostic needs to stay at all, it ought to go into the analyzer, where such looser heuristic suggestions are more acceptable and sometimes even appreciated.
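
A minimal sketch of the kind of compliant pattern meant here (not taken from any of the packages mentioned; the names and the bound are invented):

   #include <string.h>

   static const char name[8] = "tmpfs";   // always nul-terminated

   int is_tmpfs (const char *fstype)
   {
     // Both arguments are nul-terminated, so the comparison stops at the
     // terminator well before the bound is reached; the call could just
     // as well be folded to strcmp.  The bound (64) exceeds sizeof name
     // (8), which may trigger -Wstringop-overread even though the read
     // never goes past the terminator.
     return strncmp (fstype, name, 64) == 0;
   }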

FWIW, I'm open to splitting the warning levels as you suggested if that's the consensus, since it at least provides a way to make these warnings saner.  However, I still haven't found the rationale presented so far compelling enough to justify these false positives; I just don't see a proportionate reward.  Hopefully more people can chime in with their perspective on this.

Thanks,
Siddhesh

