Hello everybody! I've run into a very strange problem while compiling the AutoDetectProxy unit test in libjingle for iOS. It looks like the compiler calculates wrong offsets for the members of the AutoDetectProxy class.
The AutoDetectProxy class is defined like this:

    class AutoDetectProxy : public SignalThread {
      ...                       // method declarations and definitions
      std::string agent_;       // 1st member of AutoDetectProxy
      std::string server_url_;
    };

Inside the AutoDetectProxy constructor, GDB agrees with itself:

    (gdb) p /x sizeof(SignalThread)
    $1 = 0x188
    (gdb) p /x (int)&this->agent_ - (int)this
    $2 = 0x188
    (gdb) p /x sizeof(*this)
    $3 = 0x1d4

But outside of AutoDetectProxy's own code -- specifically in the AutoDetectProxyTest::Create() method, which dynamically allocates an AutoDetectProxy object and stores the pointer in the auto_detect_proxy_ member -- the same expressions give different answers:

    (gdb) p /x (int)&auto_detect_proxy_->agent_ - (int)auto_detect_proxy_
    $4 = 0x180
    (gdb) p /x sizeof(*auto_detect_proxy_)
    $5 = 0x1cc

At the same time, 'p /x sizeof(SignalThread)' and 'p /x sizeof(*(SignalThread*)auto_detect_proxy_)' still print 0x188. So the two pieces of code disagree by 8 bytes about where AutoDetectProxy's own members start, even though they agree on the size of the base class.

I'm puzzled and have no idea how to fix this. :( Any ideas, please?

This is with llvm-gcc 4.2 on Mac OS X.
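In case it helps anyone reproduce the symptom: this family of inconsistency (one object, two different member offsets depending on which .cc file GDB is stopped in) is what you typically get when two translation units compile against different definitions of one of the classes, e.g. a member guarded by a preprocessor define that only some files set. Here is a contrived two-file sketch of the mechanism -- Base, Derived, EXTRA_STATE, and the function names are made-up illustration names, not libjingle's actual headers, and it is not claimed to reproduce the exact 0x188/0x180 numbers above:

    // layout.h -- shared header whose layout depends on a macro
    #ifndef LAYOUT_H_
    #define LAYOUT_H_
    #include <string>

    class Base {
     public:
      virtual ~Base() {}
    #ifdef EXTRA_STATE        // defined in only some translation units
      long debug_flags_;      // shifts everything that follows
    #endif
    };

    class Derived : public Base {
     public:
      std::string agent_;     // offset depends on EXTRA_STATE above
      std::string server_url_;
    };
    #endif  // LAYOUT_H_

    // tu_a.cc is compiled with -DEXTRA_STATE, tu_b.cc without it;
    // each file carries its own copy of this function under a unique name.
    #include <cstdio>
    #include "layout.h"

    void ReportLayoutA() {    // named ReportLayoutB() in tu_b.cc
      Derived d;
      std::printf("sizeof(Derived)=%lu, agent_ offset=%lu\n",
                  (unsigned long)sizeof(Derived),
                  (unsigned long)((char*)&d.agent_ - (char*)&d));
    }

Linking tu_a.o and tu_b.o into one binary is an ODR violation that neither the compiler nor the linker will flag: each function reports its own translation unit's idea of the layout. If something like this is going on in libjingle, the place to look would be per-target preprocessor defines (platform/feature macros) that differ between the library build and the unittest target.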
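And as a quick tripwire while hunting, a C++03-style compile-time assert can be dropped into both the library source and the unittest source, so that whichever translation unit disagrees fails to compile. llvm-gcc 4.2 has no static_assert; COMPILE_ASSERT below is the usual negative-array-size trick, not an existing libjingle macro, and the header path, talk_base namespace, and 0x1d4 value are assumptions based on the numbers above:

    #include "talk/base/autodetectproxy.h"

    #define COMPILE_ASSERT(expr, msg) typedef char msg[(expr) ? 1 : -1]

    // Fails to compile in any translation unit whose view of the class
    // differs in size from what the constructor's TU reported (0x1d4).
    COMPILE_ASSERT(sizeof(talk_base::AutoDetectProxy) == 0x1d4,
                   autodetectproxy_size_mismatch);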