On Tuesday, 24 July 2018 at 14:08:26 UTC, Daniel Kozak wrote:
I am not a C++ expert, so this seems weird to me:
(...)
#include <iostream>
#include <string>

int main(int argc, char **argv)
{
        char c = 0xFF;
        std::string sData = {c, c, c, c};
        unsigned int i = (((((sData[0] & 0xFF) * 256
                           + (sData[1] & 0xFF)) * 256)
                           + (sData[2] & 0xFF)) * 256
                           + (sData[3] & 0xFF));

        if (i != 0xFFFFFFFF) { // this is true -- why?
                // this prints 18446744073709551615, wow
                std::cout << "WTF: " << i << std::endl;
        }
        return 0;
}

Compiled with:
g++ -O2 -Wall -o "test" "test.cxx"
When compiled with -O0 it works as expected.

Vs. D: ....
So is this a codegen bug on the C++ side, or is there something wrong with the code?

The signedness of char in C++ is platform dependent; see https://en.cppreference.com/w/cpp/language/types ("char").
You seem to be running into "signed overflow is undefined behaviour" shenanigans: when plain char is signed, each sData[n] is -1, each (sData[n] & 0xFF) is the int 255, and the final multiplication by 256 produces a value above INT_MAX on 32-bit int, which is undefined behaviour.

With all optimizations enabled, clang gives a different result than gcc:
https://godbolt.org/g/Dz5djj

Generally, use unsigned char (or std::byte) when char means "raw memory", and prefer std::vector<unsigned char> over std::string in these cases as well.
