> For the case in question offset is (D.2640_7 + -1) * 20 + 16. I wonder
> why DECL_FIELD_OFFSET of the outermost COMPONENT_REF is not added
> to bitpos by get_inner_reference (that is what get_bit_range assumes ...).
DECL_FIELD_OFFSET is added to offset and DECL_FIELD_BIT_OFFSET to bitpos.
> So, how would you make sure this works? Match the fact that
> get_inner_reference does _not_ add DECL_FIELD_OFFSET to bitpos,
> and, if DECL_FIELD_OFFSET is an INTEGER_CST simply subtract that
> from offset and add it to bitpos? I suppose that would work.
Yes, but the amount is simply the negative bitstart (which is a multiple of
BITS_PER_UNIT). This is the same kind of adjustment now done at the end of
get_inner_reference to avoid negative bit positions there too.
> Though doing that in get_inner_reference for DECL_BIT_FIELD_TYPE
> fields may make sense as well.
That would be more complicated, as we would need to split the offset into
variable and fixed part.
Tentative patch attached, regtested for Ada on x86 and x86-64. I'll do a full
testing cycle if it is approved.
* expr.c (get_bit_range): Add OFFSET parameter and adjust BITPOS.
Change type of BITOFFSET to signed. Make sure the lower bound of
the computed range is non-negative by adjusting OFFSET and BITPOS.
(expand_assignment): Adjust call to get_bit_range.
--- expr.c (revision 186054)
+++ expr.c (working copy)
@@ -4431,19 +4431,22 @@ optimize_bitfield_assignment_op (unsigne
/* In the C++ memory model, consecutive bit fields in a structure are
considered one memory location.
- Given a COMPONENT_REF EXP at bit position BITPOS, this function
+ Given a COMPONENT_REF EXP at position (OFFSET, BITPOS), this function
returns the bit range of consecutive bits in which this COMPONENT_REF
- belongs in. The values are returned in *BITSTART and *BITEND.
+ belongs in. The values are returned in *BITSTART and *BITEND. Both
+ OFFSET and BITPOS may be adjusted in the process.
If the access does not need to be restricted 0 is returned in
*BITSTART and *BITEND. */
get_bit_range (unsigned HOST_WIDE_INT *bitstart,
unsigned HOST_WIDE_INT *bitend,
- tree exp,
- HOST_WIDE_INT bitpos)
+ tree *offset,
+ HOST_WIDE_INT *bitpos,
+ tree exp)
{
- unsigned HOST_WIDE_INT bitoffset;
+ HOST_WIDE_INT bitoffset;
tree field, repr;
gcc_assert (TREE_CODE (exp) == COMPONENT_REF);
@@ -4490,7 +4493,25 @@ get_bit_range (unsigned HOST_WIDE_INT *b
bitoffset += (tree_low_cst (DECL_FIELD_BIT_OFFSET (field), 1)
- tree_low_cst (DECL_FIELD_BIT_OFFSET (repr), 1));
-  *bitstart = bitpos - bitoffset;
+  /* If the adjustment is larger than bitpos, we would have a negative bit
+     position for the lower bound and this may wreak havoc later.  This can
+     occur only if we have a non-null offset, so adjust offset and bitpos
+     to make the lower bound non-negative.  */
+  if (bitoffset > *bitpos)
+    {
+      HOST_WIDE_INT adjust = bitoffset - *bitpos;
+      gcc_assert ((adjust % BITS_PER_UNIT) == 0);
+      gcc_assert (*offset != NULL_TREE);
+
+      *bitpos += adjust;
+      *offset
+        = size_binop (MINUS_EXPR, *offset, size_int (adjust / BITS_PER_UNIT));
+      *bitstart = 0;
+    }
+  else
+    *bitstart = *bitpos - bitoffset;
*bitend = *bitstart + tree_low_cst (DECL_SIZE (repr), 1) - 1;
@@ -4595,7 +4616,7 @@ expand_assignment (tree to, tree from, b
if (TREE_CODE (to) == COMPONENT_REF
&& DECL_BIT_FIELD_TYPE (TREE_OPERAND (to, 1)))
- get_bit_range (&bitregion_start, &bitregion_end, to, bitpos);
+ get_bit_range (&bitregion_start, &bitregion_end, &offset, &bitpos, to);
/* If we are going to use store_bit_field and extract_bit_field,
make sure to_rtx will be safe for multiple use. */