https://gcc.gnu.org/bugzilla/show_bug.cgi?id=69400
--- Comment #6 from Richard Biener <rguenth at gcc dot gnu.org> ---
The dividend is val = {-1, 0}, len = 2 (bah, we need a debug_wide_int
for gdb use!)

We do

  n = divisor_blocks_needed;
  while (n > 1 && b_divisor[n - 1] == 0)
    n--;

stripping the leading zeros, and feed that n to

  if (remainder)
    {
      wi_pack ((unsigned HOST_WIDE_INT *) remainder, b_remainder, n);
      *remainder_len = canonize (remainder, (n + 1) / 2, dividend_prec);
      /* The remainder is always the same sign as the dividend.  */
      if (dividend_neg)
        *remainder_len = wi::sub_large (remainder, zeros, 1, remainder,
                                        *remainder_len, dividend_prec,
                                        UNSIGNED, 0);

so in int_const_binop we end up with val = {-2}, len = 1 (that's still
correct, I think, given the wide-int rules).

Interestingly, things go downhill in wide_int_to_tree, where we compute
ext_len as 3 and take the build_new_int_cst path, which does

  if (len < ext_len)
    {
      --ext_len;
      TREE_INT_CST_ELT (nt, ext_len)
        = zext_hwi (-1, cst.get_precision () % HOST_BITS_PER_WIDE_INT);
      for (unsigned int i = len; i < ext_len; ++i)
        TREE_INT_CST_ELT (nt, i) = -1;

and thus puts { -2, -1, 0 } into the INTEGER_CST storage (instead of
{ -2, 0, 0 }).  I think it needs to put 0 there for TYPE_UNSIGNED?
But then I am not sure why it computes ext_len as 3 instead of 2 for
this case anyway... it does

  if (TYPE_UNSIGNED (type) && wi::neg_p (cst))
    return cst.get_precision () / HOST_BITS_PER_WIDE_INT + 1;

but I'd have expected

  return cst.get_len () + 1;

Well, the three kinds of lengths are somewhat odd, so I simply assume
3 is correct for now.

Index: gcc/tree.c
===================================================================
--- gcc/tree.c  (revision 232666)
+++ gcc/tree.c  (working copy)
@@ -1267,7 +1267,7 @@ build_new_int_cst (tree type, const wide
       TREE_INT_CST_ELT (nt, ext_len)
         = zext_hwi (-1, cst.get_precision () % HOST_BITS_PER_WIDE_INT);
       for (unsigned int i = len; i < ext_len; ++i)
-        TREE_INT_CST_ELT (nt, i) = -1;
+        TREE_INT_CST_ELT (nt, i) = TYPE_UNSIGNED (type) ? 0 : -1;
     }
   else if (TYPE_UNSIGNED (type)
            && cst.get_precision () < len * HOST_BITS_PER_WIDE_INT)

This doesn't fix it, unfortunately, so the bug must be elsewhere
(possibly -2 with len = 1 is not a correct result from divmod in the
first place).