https://gcc.gnu.org/bugzilla/show_bug.cgi?id=69604
--- Comment #8 from Harald Anlauf <anlauf at gmx dot de> ---
Independent of the modification in comment #7, there is an issue with
wrong code for the native complex type.  Consider:

program p
  complex :: z[*], w = 1
  real    :: x[*], y = 2
  type t
     complex :: c
     real    :: r
  end type t
  type(t) :: u[*], v
  !--- Using derived types:
  v% c = 42
  v% r = 42
  u = v                ! OK
  print *, u
  print *, v
  u = t(w, y)          ! OK
  print *, u
  print *, t(w, y)
  !--- Using native types:
  z = (1.0, 0.0)       ! Bad result
  w = (1.0, 0.0)
  print *, w, z
! z = w                ! ICE
  x = y
  z = y                ! Bad result
  print *, y, x, z
end

Running this code gives:

 (  42.0000000    ,  0.00000000    )   42.0000000    
 (  42.0000000    ,  0.00000000    )   42.0000000    
 (  1.00000000    ,  0.00000000    )   2.00000000    
 (  1.00000000    ,  0.00000000    )   2.00000000    
 (  1.00000000    ,  0.00000000    ) (  3.36311631E-44, -1.02218151    )
   2.00000000       2.00000000     (  3.36311631E-44, -1.02218151    )

Looking at the dump tree, the assignments to the native complex z are
converted to

  SAVE_EXPR <z> = __complex__ (1.0e+0, 0.0);
  SAVE_EXPR <z> = COMPLEX_EXPR <y, 0.0>;

while I do not see anything like this for the assignments of the
derived-type instances.