https://gcc.gnu.org/bugzilla/show_bug.cgi?id=66855

            Bug ID: 66855
           Summary: codecvt wrong endianness in UTF-16 conversions
           Product: gcc
           Version: 5.1.1
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: libstdc++
          Assignee: unassigned at gcc dot gnu.org
          Reporter: delrieutheo at gmail dot com
  Target Milestone: ---

Created attachment 35965
  --> https://gcc.gnu.org/bugzilla/attachment.cgi?id=35965&action=edit
The buggy one

There is a problem with codecvt: with its default template arguments, the
facet codecvt_utf8_utf16 should perform big-endian conversions.
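
The attached test case is not reproduced inline, but a minimal sketch along
the following lines triggers the problem (the string literal
u"\uB098\uB294\uD0DC\uC624" and the printing code are my reconstruction,
not necessarily the attached source):

// Rough reconstruction of the attached test: round-trips a UTF-16
// string through UTF-8 and back using codecvt_utf8_utf16 with its
// default template arguments.
#include <codecvt>
#include <cstdio>
#include <locale>
#include <string>

int main()
{
    // U+B098 U+B294 U+D0DC U+C624 -- the code units shown below
    std::u16string utf16 = u"\uB098\uB294\uD0DC\uC624";

    std::printf("UTF-16\n\n");
    for (char16_t c : utf16)
        std::printf("[%04x] ", static_cast<unsigned>(c));

    // Default template arguments: the char16_t side of this facet is a
    // sequence of code units, so no byte swapping should happen there.
    std::wstring_convert<std::codecvt_utf8_utf16<char16_t>, char16_t> conv;

    std::printf("\n\nUTF-16 to UTF-8\n\n");
    std::string utf8 = conv.to_bytes(utf16);
    for (unsigned char c : utf8)
        std::printf("[%02x] ", static_cast<unsigned>(c));

    std::printf("\n\nConverting back to UTF-16\n\n");
    std::u16string back = conv.from_bytes(utf8);
    for (char16_t c : back)
        std::printf("[%04x] ", static_cast<unsigned>(c));
    std::printf("\n");
}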

However, this is the output I get when linking with libstdc++-5.1.1:

UTF-16

[b098] [b294] [d0dc] [c624] 

UTF-16 to UTF-8

[eb] [82] [98] [eb] [8a] [94] [ed] [83] [9c] [ec] [98] [a4] 

Converting back to UTF-16

[98b0] [94b2] [dcd0] [24c6]

The same code gives the expected result on OS X with libc++ and on Windows.

When I specify std::little_endian as the third template argument of
codecvt_utf8_utf16, I get the expected output.
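
For reference, the workaround looks like this (a sketch; 0x10FFFF is just
the default Maxcode spelled out so that the third argument can be reached):

// Passing std::little_endian explicitly cancels the unwanted byte swap
// on this libstdc++ build. Portable code should not need it, since the
// char16_t interface is defined in terms of code units, not bytes.
std::wstring_convert<
    std::codecvt_utf8_utf16<char16_t, 0x10FFFF, std::little_endian>,
    char16_t> conv;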
