https://gcc.gnu.org/bugzilla/show_bug.cgi?id=94631

            Bug ID: 94631
           Summary: Wrong codegen for arithmetic on bitfields
           Product: gcc
           Version: unknown
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: c
          Assignee: unassigned at gcc dot gnu.org
          Reporter: bugdal at aerifal dot cx
  Target Milestone: ---

Test case:

struct foo {
        unsigned long long low:12, hi:52;
};
unsigned long long bar(struct foo *p)
{
        return p->hi*4096;
}

This should generate only a mask off of the low 12 bits, but gcc generates code
to mask off both the low 12 bits and the high 12 bits (reducing the result to
52 bits). Presumably GCC is interpreting the expression p->hi as having a
phantom type that's only 52 bits wide, rather than type unsigned long long.

clang/LLVM compiles it correctly.

I don't believe there's any language in the standard supporting what GCC is
doing here.
