On Mon, Jan 22, 2024 at 11:27:52AM +0100, Richard Biener wrote:
> We run into
> 
> static tree
> native_interpret_int (tree type, const unsigned char *ptr, int len)
> { 
> ...
>   if (total_bytes > len
>       || total_bytes * BITS_PER_UNIT > HOST_BITS_PER_DOUBLE_INT)
>     return NULL_TREE;
> 
> OTOH using a V_C_E to "truncate" a _BitInt looks wrong?  OTOH the
> check doesn't really handle native_encode_expr using the "proper"
> wide_int encoding however that's exactly handled.  So it might be
> a pre-existing issue that's only uncovered by large _BitInts
> (__int128 might show similar issues?)

I guess the || total_bytes * BITS_PER_UNIT > HOST_BITS_PER_DOUBLE_INT
condition makes no sense; all we care about is whether the value fits in
the buffer or not.
But then there is
fold_view_convert_expr
(and other spots) which use
  /* We support up to 1024-bit values (for GCN/RISC-V V128QImode).  */
  unsigned char buffer[128];
or something similar.
Perhaps we could use XALLOCAVEC there instead (or use it only for the
larger sizes and keep the static buffer for the common case).

        Jakub
