[Neal Norwitz]
> It looks like %zd of a negative number is treated as an unsigned
> number on OS X, even though the man page says it should be signed.
>
> """
> The z modifier, when applied to a d or i conversion, indicates that
> the argument is of a signed type equivalent in size to a size_t.
> """
It's not just some man page ;-), this is required by the C99 standard
(which introduced the `z` length modifier).  It's the `d` or `i` here
that implies `signed`; `z` is only supposed to specify the width of the
integer type, and can also be applied to unsigned integer conversions,
like %zu and %zx.

> The program below returns -123 on Linux and 4294967173 on OS X.
>
> n
> --
>
> #include <stdio.h>
> int main()
> {
>     char buffer[256];
>     if (sprintf(buffer, "%zd", (size_t)-123) < 0)
>         return 1;
>     printf("%s\n", buffer);
>     return 0;
> }

Well, to be strictly anal, while the result of (size_t)-123 is defined,
the result of casting /that/ back to a signed type of the same width is
not defined.  Maybe your compiler was "doing you a favor" ;-)
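To make the first point concrete, here is a minimal standalone sketch
(mine, not Neal's program) showing that `z` only fixes the operand's
width, while the conversion letter picks the signed or unsigned
interpretation.  It assumes POSIX ssize_t as the signed counterpart of
size_t, since C99 gives that type no name of its own:

#include <stdio.h>
#include <sys/types.h>   /* ssize_t is POSIX, not C99 */

int main(void)
{
    ssize_t sval = -123;           /* signed type the same width as size_t  */
    size_t  uval = (size_t)-123;   /* well-defined: wraps to SIZE_MAX - 122 */

    printf("%zd\n", sval);         /* signed conversion:    -123            */
    printf("%zu\n", uval);         /* unsigned conversion:  SIZE_MAX - 122  */
    printf("%zx\n", uval);         /* same bits, unsigned hex               */
    return 0;
}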
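And a hedged sketch of one way to recover the signed value without
casting an out-of-range size_t back to a signed type: check whether the
value fits in the signed range and otherwise undo the modular
wraparound arithmetically.  size_to_signed is a hypothetical helper
name, and the code assumes size_t is no wider than intmax_t:

#include <stdio.h>
#include <stdint.h>

/* Hypothetical helper (not from the thread): recover a negative value
   whose bit pattern was stored in a size_t by the well-defined unsigned
   wraparound, without relying on what an out-of-range signed cast does.
   Assumes size_t is no wider than intmax_t. */
static intmax_t size_to_signed(size_t u)
{
    if (u <= SIZE_MAX / 2)
        return (intmax_t)u;                   /* in signed range: exact  */
    return -(intmax_t)(SIZE_MAX - u) - 1;     /* undo the modular wrap   */
}

int main(void)
{
    size_t wrapped = (size_t)-123;            /* defined: SIZE_MAX - 122 */
    printf("%jd\n", size_to_signed(wrapped)); /* prints -123             */
    return 0;
}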