Mark Frazer <[EMAIL PROTECTED]> [05/08/02 09:32]:
> Richard Guenther <[EMAIL PROTECTED]> [05/08/02 09:29]:
> > Try -fno-strict-aliasing. This may be related to PR23192.
>
> -fno-strict-aliasing does indeed make the problem go away.
Changing the de-serialization function to:

    double parse_double(uint &offset, vector<uint8> const &bytecode)
        throw (std::invalid_argument)
    {
        union {
            uint64 ival;
            double dval;
        } rtn;

        rtn.ival  = uint64(next_byte(offset, bytecode)) << 56;
        rtn.ival |= uint64(next_byte(offset, bytecode)) << 48;
        rtn.ival |= uint64(next_byte(offset, bytecode)) << 40;
        rtn.ival |= uint64(next_byte(offset, bytecode)) << 32;
        rtn.ival |= uint64(next_byte(offset, bytecode)) << 24;
        rtn.ival |= uint64(next_byte(offset, bytecode)) << 16;
        rtn.ival |= uint64(next_byte(offset, bytecode)) << 8;
        rtn.ival |= uint64(next_byte(offset, bytecode));
        return rtn.dval;
    }

allows the strict-aliasing optimization to be left enabled. So it seems
the bug was mine, not gcc's. I'm off to search for other
reinterpret_cast abuses in my code...

cheers
-mark
-- 
To Captain Bender! He's the best!
...at being a big jerk who's stupid and his big ugly face is as dumb as
a butt! - Fry
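For reference, a minimal sketch of the kind of reinterpret_cast
type-punning that trips the strict-aliasing rules and that the union
version above avoids. This is an assumed reconstruction of the pattern
being described, not the original function; next_byte and the uint,
uint8 and uint64 typedefs are taken to be the same helpers used in the
snippet above:

    // Assumed illustration only: a uint64 object is read through a
    // double lvalue via reinterpret_cast.  Under -fstrict-aliasing the
    // compiler may assume the uint64 stores and the double load do not
    // alias, and reorder or drop them.
    double parse_double_punned(uint &offset, vector<uint8> const &bytecode)
        throw (std::invalid_argument)
    {
        uint64 bits = 0;
        for (int shift = 56; shift >= 0; shift -= 8)
            bits |= uint64(next_byte(offset, bytecode)) << shift;
        return *reinterpret_cast<double const *>(&bits);  // undefined behaviour
    }

The union trick above is documented by gcc as supported; copying the
uint64 into a double with std::memcpy is the strictly portable
alternative.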