| Issue | 56443 |
| --- | --- |
| Summary | Behaviour of converting pointers to larger signed integers differs between GCC and Clang. |
| Labels | new issue |
| Assignees | |
| Reporter | CharlesLoveman |
Compiling the following program produces different outputs between GCC 11 and Clang 14.
```
#include <stdint.h>
#include <stdio.h>

int main(void) {
    int32_t *p = (int32_t *) 0xFFFFFFFFFFFFFFFF;
    __int128_t z = (__int128_t) p;
    unsigned char *zp = (unsigned char *) &z;
    /* Dump the raw bytes of z (little-endian byte order on x86-64). */
    for (size_t i = 0; i < sizeof(__int128_t); ++i) {
        printf("%02hhx", zp[i]);
    }
    return 0;
}
```
GCC output:
```
ffffffffffffffffffffffffffffffff
```
Clang output:
```
ffffffffffffffff0000000000000000
```
GCC sign-extends the 64-bit pointer value when converting it to an __int128_t, whereas Clang zero-extends it, filling the upper 64 bits with zeros.
Is this difference in implementation-defined behaviour intentional?
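For reference, a minimal sketch (not part of the original report) that reproduces the two interpretations directly on the integer value, assuming a 64-bit little-endian target; `as_signed` and `as_unsigned` stand in for the pointer's bit pattern:
```
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* The pointer's 64-bit pattern: all ones. */
    int64_t  as_signed   = -1;                    /* 0xFFFFFFFFFFFFFFFF */
    uint64_t as_unsigned = 0xFFFFFFFFFFFFFFFFu;

    /* GCC 11's output corresponds to sign-extending the 64-bit value... */
    __int128_t sign_extended = (__int128_t) as_signed;
    /* ...while Clang 14's output corresponds to zero-extending it. */
    __int128_t zero_extended = (__int128_t) as_unsigned;

    printf("sign-extended upper 64 bits: %016llx\n",
           (unsigned long long) ((__uint128_t) sign_extended >> 64));
    printf("zero-extended upper 64 bits: %016llx\n",
           (unsigned long long) ((__uint128_t) zero_extended >> 64));
    return 0;
}
```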