@@ -618,7 +618,7 @@ template <class InternT, class ExternT>
 void utf8_to_utf16_in_error(const std::codecvt<InternT, ExternT, mbstate_t>& cvt) {
   // UTF-8 string of 1-byte CP, 2-byte CP, 3-byte CP, 4-byte CP
   const unsigned char input[] = "b\u0448\uD700\U0010AAAA";
-  const char16_t expected[]   = {'b', 0x0448, 0xD700, 0xDBEA, 0xDEAA, 0};
+  const InternT expected[]    = {0x62, 0x0448, 0xD700, 0xDBEA, 0xDEAA, 0};
----------------
EricWF wrote:

Also, the point of having the literal `b` in the expected output is to match 
the `b` in the input,
which gets totally lost with this change.

Would `static_cast<InternT>('b')` work instead?

On second thought, why is the change from `b` to 0x62 needed? That will never
narrow or change meaning, right?


https://github.com/llvm/llvm-project/pull/138708
_______________________________________________
cfe-commits mailing list
cfe-commits@lists.llvm.org
https://lists.llvm.org/cgi-bin/mailman/listinfo/cfe-commits
