On Wed, 6 Aug 2025, David Faust wrote:

> So IMO the best option will be to reject wide strings in the attribute
> handler. (The alternative, I guess, is to ensure the argument is
> always exported to UTF-8 before being written (?))
>
> I see that we can get the character size from the TREE_TYPE of the
> STRING_CST node. Am I correct in thinking that it will be sufficient
> to reject any string argument using characters larger than 8 bits?
I'd be inclined to express it as a check for the element type being
char_type_node or char8_type_node (so like the logic for what strings
can initialize an array of character type, for example).

-- 
Joseph S. Myers
josmy...@redhat.com
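[Editor's note: the check Joseph describes might look roughly like the
following inside an attribute handler. This is an illustrative,
non-compilable sketch against GCC's internal tree API; the helper name
`narrow_string_cst_p` is hypothetical, not an existing GCC function.]

```c
/* Hypothetical helper: return true if STR is a STRING_CST whose
   element type is plain char or char8_t, i.e. a narrow string
   acceptable as an attribute argument.  Sketch only; assumes GCC
   internal macros (TREE_CODE, TREE_TYPE, TYPE_MAIN_VARIANT) and the
   global type nodes char_type_node / char8_type_node.  */
static bool
narrow_string_cst_p (tree str)
{
  if (TREE_CODE (str) != STRING_CST)
    return false;
  /* TREE_TYPE of a STRING_CST is the array type; its TREE_TYPE in
     turn is the element type of that array.  */
  tree elt = TYPE_MAIN_VARIANT (TREE_TYPE (TREE_TYPE (str)));
  return elt == char_type_node || elt == char8_type_node;
}
```

An attribute handler could then warn and drop the attribute when this
returns false, which would reject wide-string arguments such as
`L"..."`, `u"..."`, or `U"..."`.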