http://gcc.gnu.org/bugzilla/show_bug.cgi?id=47997

--- Comment #22 from Iain Sandoe <iains at gcc dot gnu.org> 2011-07-21 10:36:02 UTC ---
hmm, comment #21 is not the right solution ...  (even if it works)

... the right solution is either
   (a) to handle arrays of arbitrary-sized ints in fix_string_type () (without
assuming that they are wchar when the element type is not explicitly set to
char{,16,32}_array_type_node)

   or

   (b) to force the type of CPP_OBJC_STRING to be char{,16,32}_array_type_node
as appropriate.

(a) might look something like:

Index: gcc/c-family/c-common.c
===================================================================
--- gcc/c-family/c-common.c     (revision 176554)
+++ gcc/c-family/c-common.c     (working copy)
@@ -911,6 +911,32 @@ fix_string_type (tree value)
       nchars = length / (TYPE_PRECISION (char32_type_node) / BITS_PER_UNIT);
       e_type = char32_type_node;
     }
+  else if (TREE_TYPE (value) && TREE_CODE (TREE_TYPE (value)) == ARRAY_TYPE)
+    {
+      int prec;
+
+      if (TREE_TYPE (TREE_TYPE (value)))
+        prec = TYPE_PRECISION (TREE_TYPE (TREE_TYPE (value)));
+      else
+        prec = TYPE_PRECISION (wchar_type_node);
+
+      nchars = length / (prec / BITS_PER_UNIT);
+      switch (prec)
+        {
+        case BITS_PER_UNIT:
+          e_type = char_type_node;
+          break;
+        case 16:
+          e_type = char16_type_node;
+          break;
+        case 32:
+          e_type = char32_type_node;
+          break;
+        default:
+          e_type = wchar_type_node;
+          break;
+        }
+    }
   else
     {
       nchars = length / (TYPE_PRECISION (wchar_type_node) / BITS_PER_UNIT);

(b) needs some more investigation.

thoughts?
