Petr Viktorin <[email protected]> added the comment:
The call:
struct.unpack('>?', b'\xf0')
means to unpack a "native bool", i.e. native size and alignment. Internally,
this does:
static PyObject *
nu_bool(const char *p, const formatdef *f)
{
    _Bool x;
    memcpy((char *)&x, p, sizeof x);
    return PyBool_FromLong(x != 0);
}
i.e., it copies "sizeof x" (1 byte) of raw memory into a temporary x, and then
reads that memory as a _Bool.
While I don't have the C standard at hand, I believe it says that converting
any nonzero value to _Bool yields the unique "true" value (1). It seems that if
a byte doesn't hold the exact bit pattern for true or false, reinterpreting it
as _Bool is undefined behavior. Is that correct?
Clang 10 on s390x seems to take advantage of this: it apparently inspects only
the low bit(s), so a _Bool with the bit pattern 0xf0 comes out false.
But the tests assume that 0xf0 should unpack to True.
----------
nosy: +petr.viktorin
_______________________________________
Python tracker <[email protected]>
<https://bugs.python.org/issue39689>
_______________________________________