#3794: Compiler warnings on OS X 10.11.1 with Xcode 7.1.1
---------------------+---------------------
Reporter: chdiza | Owner: brendan
Type: defect | Status: new
Priority: minor | Milestone:
Component: SMTP | Version:
Resolution: | Keywords:
---------------------+---------------------
Comment (by vinc17):
Replying to [comment:11 code@…]:
> {{{
> A char is signed on every platform.
> }}}
No, a {{{char}}} is '''not''' signed on every platform. This is
implementation-defined. On PowerPC, {{{char}}} is unsigned.
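For reference, a portable way to check what a given implementation uses (an
illustrative sketch, not part of any proposed patch): {{{CHAR_MIN}}} is 0
exactly when plain {{{char}}} is unsigned.
{{{
#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* CHAR_MIN is 0 when plain char is unsigned, negative when signed. */
#if CHAR_MIN == 0
    printf("plain char is unsigned on this implementation\n");
#else
    printf("plain char is signed on this implementation\n");
#endif
    return 0;
}
}}}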
> {{{
> This is not a harmless warning, IT IS A BUG.
> }}}
If you mean that the warning is a bug, yes. Note that the initial code
({{{*a > 0 && *a < 128}}}) was correct whether {{{char}}} is signed or
not.
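To spell out why (an aside, not code from the ticket): both operands are
promoted to {{{int}}} before the comparison, so the test rejects non-ASCII
bytes whatever the signedness of plain {{{char}}} is, even though a compiler
may still emit the kind of tautological-comparison warning this ticket is
about.
{{{
#include <stdio.h>

int main(void)
{
    signed char   s = -128; /* a byte >= 0x80 as seen when plain char is signed */
    unsigned char u = 200;  /* the same kind of byte when plain char is unsigned */

    /* Both operands are promoted to int, so the test rejects the byte
     * either way: s fails "> 0" and u fails "< 128".  (A compiler may
     * still warn that "s < 128" is always true.) */
    printf("%d\n", s > 0 && s < 128); /* 0 */
    printf("%d\n", u > 0 && u < 128); /* 0 */
    return 0;
}
}}}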
> {{{
> This is illustrated by the following tiny program:
>
> #include <stdio.h>
> int main(int argc, char **argv)
> {
>     char x;
>     x = 128;
>     printf("%d\n", (int)x);
>     return 0;
> }
>
> On any sane platform, this will print -128, not 128.
> }}}
If {{{char}}} is unsigned, it will print 128. If {{{char}}} is signed, the
result is implementation-defined because in such a case, 128 is not
representable in a {{{char}}} (assuming {{{CHAR_BIT = 8}}}).
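The difference between the two cases, as an illustrative sketch (not from the
ticket): conversion to an unsigned type is defined modulo 2^N, whereas an
out-of-range conversion to a signed type is implementation-defined.
{{{
#include <stdio.h>

int main(void)
{
    unsigned char u = 128; /* well-defined: value is 128 */
    signed char   s = 128; /* out of range, so the result is
                              implementation-defined (commonly -128
                              on two's-complement machines) */

    printf("%d %d\n", (int) u, (int) s);
    return 0;
}
}}}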
> {{{
> That's because the value of x is, in fact, -128; the compiler has
> performed an implicit cast
> }}}
The correct term is "conversion" (a cast is an explicit conversion).
> {{{
> Vincent's proposed fix is technically correct, but is pretty
> unreadable, possibly obscuring the intent to the reader.
> }}}
My proposal was just based on Kevin's in [comment:7]. I had forgotten that
the conversion to {{{unsigned char}}} also made the comparison with 128
valid again (the original intent of that conversion was only to fix Kevin's
code).
> {{{
> A better fix
> that retains the intent would have been (almost) what Petr suggested,
> except he had the sign wrong, and there's no longer any reason to
> compare to 0, since unsigned chars can never be negative:
> }}}
You still need to reject the value 0.
> {{{
> while (a && *((unsigned char *)a) < 128)
> }}}
This should be
{{{
while (*((unsigned char *)a) && *((unsigned char *)a) < 128)
}}}
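In context, a minimal sketch of such a scan (the function name and test
strings are illustrative only, not mutt's actual code):
{{{
#include <stdio.h>

/* Illustrative helper: scan a string and report whether it is pure ASCII.
 * Reading the bytes through an unsigned char * keeps the "< 128" test
 * meaningful regardless of the signedness of plain char, and the first
 * operand still rejects the terminating NUL. */
static int is_ascii_string(const char *a)
{
    while (*((const unsigned char *) a) && *((const unsigned char *) a) < 128)
        a++;
    return *a == '\0';
}

int main(void)
{
    printf("%d\n", is_ascii_string("hello"));     /* 1 */
    printf("%d\n", is_ascii_string("h\xC3\xA9")); /* 0: contains bytes >= 128 */
    return 0;
}
}}}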
> {{{
> However, if you're going to be comparing chars to numeric literal
> ASCII codes, you should really consider whether the data should
> actually be unsigned char instead of just char. The main reason NOT
> to do that is if you have to use it with old POSIX API calls that got
> the sign wrong, expecting a signed char* rather than an unsigned char*,
> }}}
I don't think any POSIX API expects a {{{signed char*}}}.
> {{{
> of which there are a number (like strcmp() et al.).
> }}}
No, that's just {{{char *}}}. {{{char}}} and {{{signed char}}} are not
compatible types, even when {{{char}}} is signed.
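For illustration (a sketch under that assumption, not from the ticket): the
three character types are distinct, so passing an {{{unsigned char *}}} to an
interface declared with {{{char *}}}, such as {{{strcmp()}}}, needs a cast
even though the representation is the same.
{{{
#include <stdio.h>
#include <string.h>

int main(void)
{
    unsigned char buf[] = "abc";

    /* strcmp() is declared as taking const char *; passing buf directly
     * would draw a pointer-sign diagnostic because unsigned char * and
     * char * are not compatible types, whatever the signedness of plain
     * char.  A cast makes the intent explicit. */
    if (strcmp((const char *) buf, "abc") == 0)
        printf("equal\n");
    return 0;
}
}}}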
--
Ticket URL: <http://dev.mutt.org/trac/ticket/3794#comment:14>
Mutt <http://www.mutt.org/>
The Mutt mail user agent