Compiling genautomata with -O2 on ARM produces a binary that segfaults when run. The problem is in output_min_issue_delay_table. What appears to be happening is that the RTL loop-invariant motion pass identifies a loop invariant and hoists it. However, the particular instance of the invariant being hoisted carries a REG_EQUAL (const_int 0) note, so when the invariant is spilled, the value 0 is substituted in place of the address of the array. In essence, the loop consists of:
  x = malloc(...);
  for (i = ...)
    {
      if (x != 0)
        t1 = x;
      else
        t1 = 0;
      t1[i] = yyy;
    }

which gets optimized to

  x = malloc(...);
  for (i = ...)
    {
      if (x != 0)
        t1 = x;
      else
        t1 = x;    // NOTE REG_EQUAL (0)
      t1[i] = yyy;
    }

which is further transformed to

  x = malloc(...);
  t1 = x;          // NOTE REG_EQUAL (0)
  for (i = ...)
    t1[i] = yyy;

I've attached the following files:

  foo.c: source code that shows the problem -- look for
      compressed_min_issue_delay_vect.
  foo.s: generated assembly, with the invalid instruction annotated.
  foo.c.139r.loop2_invariant: RTL dump showing the hoisting of the
      invariant.

This regression is caused by the fix for PR rtl-optimization/31360.

-- 
           Summary: [4.3 regression] Invalid loop optimization causes
                    bootstrap failure in genautomata
           Product: gcc
           Version: 4.3.0
            Status: UNCONFIRMED
          Severity: critical
          Priority: P3
         Component: rtl-optimization
        AssignedTo: unassigned at gcc dot gnu dot org
        ReportedBy: rearnsha at gcc dot gnu dot org
GCC target triplet: arm-netbsdelf2

http://gcc.gnu.org/bugzilla/show_bug.cgi?id=31848