https://gcc.gnu.org/bugzilla/show_bug.cgi?id=123201
anlauf at gcc dot gnu.org changed:
           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |anlauf at gcc dot gnu.org
--- Comment #3 from anlauf at gcc dot gnu.org ---
Given comment #0 and the range of commits, a likely candidate is r16-3499.
Here's a reduced testcase (please confirm!):
program p
  implicit none
  call val ("U++", "U--")
contains
  subroutine val (x, c)
    character(kind=1), intent(in) :: x   ! control: pass by reference
    character(kind=1), value      :: c
    print *, "x,c = ", x, c
    if (c /= x) stop 1
  end
end
On x86_64 this prints the expected result (and the check passes):
x,c = UU
Running f951 under gdb with a breakpoint in the newly added code (my
breakpoint is at line 6655; yours may differ), using the pge and ptc
macros from gcc's gdbinit:
6651              /* Truncate actual string argument.  */
6652              gfc_conv_expr (parmse, e);
6653              parmse->expr = gfc_build_wide_string_const (e->ts.kind, flen,
6654                                                          e->value.character.string);
6655              parmse->string_length = build_int_cst (gfc_charlen_type_node, flen);
(gdb) pge parmse->expr
"U"
(gdb) ptc parmse->expr
STRING_CST
(gdb) p (char) e->value.character.string[0]
$1 = 85 'U'
(gdb) p flen
$2 = 1
So the input expression e, whose first character is 'U', is converted to a
string constant of length 1, namely "U".  Do you get the same or something
different?
I looked at gfc_build_wide_string_const, which calls gfc_encode_character,
which in turn calls native_encode_expr.
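From that call chain, the string bytes should be produced by encoding each
character as a kind-byte integer in target byte order.  Here is a standalone
toy model of just that step, to make the endianness question below concrete
(my own sketch for illustration; encode_char is not a gcc function):

#include <stdio.h>

/* Write character c into buf as a kind-byte integer in the given
   byte order.  For kind=1 the result is a single byte, so byte
   order cannot change it; for kind=4 it would.  */
static void
encode_char (unsigned int c, unsigned char *buf, int kind, int big_endian)
{
  for (int i = 0; i < kind; i++)
    {
      int shift = big_endian ? 8 * (kind - 1 - i) : 8 * i;
      buf[i] = (c >> shift) & 0xff;
    }
}

int
main (void)
{
  unsigned char buf[4];

  encode_char ('U', buf, 1, 1);  /* kind=1: one byte either way */
  printf ("kind=1:    %02x\n", buf[0]);

  encode_char ('U', buf, 4, 1);  /* kind=4 big-endian: 00 00 00 55 */
  printf ("kind=4 BE: %02x %02x %02x %02x\n", buf[0], buf[1], buf[2], buf[3]);

  encode_char ('U', buf, 4, 0);  /* kind=4 little-endian: 55 00 00 00 */
  printf ("kind=4 LE: %02x %02x %02x %02x\n", buf[0], buf[1], buf[2], buf[3]);

  return 0;
}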
My best guess so far is that endianness could explain a difference from x86
(32-bit SPARC is big-endian, unlike x86_64).  Or some ABI problem with
passing a single character by value that was latent before and is now
exposed.  Or something completely different.
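To make the ABI guess concrete, here is a minimal sketch of the kind of
mismatch I have in mind (an assumed failure mode, not verified against the
32-bit SPARC calling convention; not gcc code): a char promoted into a
word-sized argument slot sits at different byte offsets on little- and
big-endian targets, so a naive read of the slot's lowest-addressed byte
works on x86 but returns padding on SPARC:

#include <stdio.h>
#include <string.h>

int
main (void)
{
  unsigned int slot = 'U';         /* char promoted into a word-sized slot */
  unsigned char first_byte;

  /* Naive read of the byte at the slot's lowest address.  */
  memcpy (&first_byte, &slot, 1);

  /* little-endian: prints 85 ('U'); big-endian: prints 0.  */
  printf ("byte at slot's lowest address: %d\n", first_byte);
  return 0;
}

If something like this is going on, the by-reference argument x would be
unaffected, which is why the testcase keeps it as a control.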