This one's an absolute GEM. Subtle enough to be the sort of thing I'd
expect someone to do if they wanted to intentionally sabotage a C compiler.
Assume, for a second, you have this function:
void puthexl(unsigned long val)
which prints an 8-digit (32-bit) hex value to a display of some
kind (in my case, a serial port).
Now let's try some tests.
puthexl(0x80000000);
puthexl(0x40000000);
puthexl(0x20000000);
puthexl(0x10000000);
puthexl(0x08000000);
puthexl(0x04000000);
puthexl(0x02000000);
puthexl(0x01000000);
puthexl(0x00800000);
puthexl(0x00400000);
puthexl(0x00200000);
puthexl(0x00100000);
puthexl(0x12345678);
You would expect this to produce the output:
8000_0000
4000_0000
2000_0000
1000_0000
0800_0000
0400_0000
0200_0000
0100_0000
0080_0000
0040_0000
0020_0000
0010_0000
1234_5678
But it doesn't. It actually outputs:
8000_0000
0000_0000
2000_0000
1000_0000
0800_0000
0400_0000
0200_0000
0100_0000
0080_0000
0040_0000
0020_0000
0010_0000
1234_5678
This test code runs fine on x86-32 and x86-64 (I haven't had a chance to
try ARMEB or MIPSEL yet). So what's so special about 0x4000_0000?
The failing toolchain is GCC v4.4.0 with the lm32 patch, Binutils
2.20.1.20100303.
Part of me wonders if this has something to do with all the Linux issues
on the LM32... and it's making me all the more tempted to use a MIPS
core instead.
--
Phil.
[email protected]
http://www.philpem.me.uk/
_______________________________________________
http://lists.milkymist.org/listinfo.cgi/devel-milkymist.org
IRC: #milkym...@freenode
Webchat: www.milkymist.org/irc.html
Wiki: www.milkymist.org/wiki