Hello all!

My setup: Linux x86 with gcc 3.4.4,
and xpcom-standalone compiled from the latest CVS sources.

I'm trying to do a simple conversion from UTF-16 to UTF-8, like this:

nsString test = NS_LITERAL_STRING("<xml>");
printf("unicode string size is: %u\n", test.Length());

nsCString strUTF8 = NS_ConvertUTF16toUTF8(test);
printf("utf8 string size is: %u\n", strUTF8.Length());
// Strangely, under Linux the nsCString seems to
// hold 2 bytes per character!
printf("utf8 string is: %s\n", strUTF8.get());


The buffer I get contains "<\0x\0m\0l\0>\0", so it seems the nsCString holds a double-byte string rather than a single-byte one...
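In case it helps with reproducing this, here is a minimal sketch (assuming only the nsCString API used above and PRUint32 from prtypes.h) that hex-dumps the raw bytes, so one can see directly whether the converted string holds one or two bytes per character:

#include <stdio.h>
// plus the same nsString/nsCString headers as in the snippet above

// Hedged sketch: print every byte of the converted string as hex,
// using only nsCString::get() and nsCString::Length().
static void DumpBytes(const nsCString& str)
{
    const char* data = str.get();
    for (PRUint32 i = 0; i < str.Length(); ++i)
        printf("%02x ", (unsigned)(unsigned char)data[i]);
    printf("(%u bytes)\n", str.Length());
}

Calling DumpBytes(strUTF8) after the conversion should print "3c 78 6d 6c 3e (5 bytes)" if the conversion is correct, and "3c 00 78 00 ..." in the double-byte case described above.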

The same example works fine under Windows...

Do you have any idea why this happens?

With best regards,
Alexey Kakunin
