https://gcc.gnu.org/bugzilla/show_bug.cgi?id=93582
Jakub Jelinek <jakub at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
             Status|RESOLVED                    |REOPENED
   Last reconfirmed|                            |2020-02-05
                 CC|                            |jakub at gcc dot gnu.org
         Resolution|INVALID                     |---
     Ever confirmed|0                           |1

--- Comment #12 from Jakub Jelinek <jakub at gcc dot gnu.org> ---
The reduced tests don't really show what is happening, though. Here is a
better testcase, with the false-positive warning even on x86_64 at e.g. -O2:

struct S
{
  unsigned int s1:1;
  unsigned int s2:1;
  unsigned int s3:1;
  unsigned int s4:1;
  unsigned int s5:4;
  unsigned char s6;
  unsigned short s7;
  unsigned short s8;
};

struct T
{
  int t1;
  int t2;
};

static inline int
bar (struct S *x)
{
  if (x->s4)
    return ((struct T *)(x + 1))->t1 + ((struct T *)(x + 1))->t2;
  else
    return 0;
}

int
foo (int x, int y)
{
  struct S s;
  s.s6 = x;
  s.s7 = y & 0x1FFF;
  s.s4 = 0;
  return bar (&s);
}

The warning is in dead code, but due to the optimize_bit_field_compare
"optimization" we aren't able to find that out until combine. We have:

  s.s4 = 0;
  _4 = BIT_FIELD_REF <s, 8, 0>;
  _6 = _4 & 8;

I think this is generally something that FRE/PRE is able to optimize, at
least if it is

  s.s4 = 0;
  _6 = s.s4;

but the BIT_FIELD_REF in there prevents that. In this case the full 8 bits
are actually undefined; there is just one defined bit in there. It isn't
optimized even if I add constant initialization of the other bitfields
(s1-s5), though. Could FRE/PRE perhaps handle this at the point of a
BIT_AND_EXPR with BIT_FIELD_REF and constant operands, and only for the
needed bits try to look them all up?