https://gcc.gnu.org/bugzilla/show_bug.cgi?id=102989

--- Comment #26 from Richard Biener <rguenth at gcc dot gnu.org> ---
Some random comments.

I wouldn't go with a new tree code; since the semantics are those of an integer
type, it should be an INTEGER_TYPE.  The TYPE_PRECISION issue is real - we have
16 spare bits in tree_type_common, so we could possibly afford to make it 16
bits.  Does the C standard limit the number of bits?  Does it allow
implementation-defined limits?

As for the SSA representation and "lowering", this feels much like the
Middle-End Array Expressions work in the end.  I agree that first and foremost
we should have the types as registers, but then we can simply lower early to a
representation supported by the target?  That is, turn _BitInt(199) into
intfast_t[n] with an appropriate 'n', lower all accesses, and do the arithmetic
either via builtins or internal functions on the whole object.
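To illustrate (purely a sketch on my side - the 64-bit limb width, the limb
count and the bitint_add name standing in for a builtin/internal function are
all made up), such a lowering of _BitInt(199) to a limb array with whole-object
arithmetic could look roughly like this:

#include <stdint.h>

#define BITINT199_LIMBS 4        /* ceil (199 / 64) */

typedef uint64_t limb_t;

/* Stand-in for a builtin/internal function doing a whole-object add:
   r = a + b on BITS-bit values stored least significant limb first.  */
static void
bitint_add (limb_t *r, const limb_t *a, const limb_t *b, unsigned bits)
{
  unsigned nlimbs = (bits + 63) / 64;
  limb_t carry = 0;
  for (unsigned i = 0; i < nlimbs; i++)
    {
      limb_t s = a[i] + carry;
      carry = s < carry;          /* carry out of a[i] + carry */
      r[i] = s + b[i];
      carry += r[i] < b[i];       /* carry out of s + b[i] */
    }
  /* Truncate the most significant limb to BITS bits.  */
  if (bits % 64)
    r[nlimbs - 1] &= ((limb_t) 1 << (bits % 64)) - 1;
}

/* What "_BitInt(199) r = a + b;" would lower to once the type has been
   turned into intfast_t[n].  */
void
example (limb_t r[BITINT199_LIMBS],
         const limb_t a[BITINT199_LIMBS],
         const limb_t b[BITINT199_LIMBS])
{
  bitint_add (r, a, b, 199);
}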

Constants are tricky indeed, but I suppose there's no way to write a
199-bit integer constant in source?  We can always resort to constants
in the intfast_t[n] representation (aka a CTOR).
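For example (again only a sketch, with an arbitrary value and a little-endian
64-bit limb layout, reusing limb_t and BITINT199_LIMBS from the sketch above),
a 199-bit constant such as 2^150 + 12345 would become an initializer over the
limb array:

/* 2^150 + 12345 as four 64-bit limbs, least significant first - the
   CTOR-style constant of the intfast_t[n] representation.  */
static const limb_t bitint199_cst[BITINT199_LIMBS] = {
  12345u,                /* bits   0 .. 63  */
  0,                     /* bits  64 .. 127 */
  (limb_t) 1 << 22,      /* bits 128 .. 191: bit 150 is bit 22 here */
  0                      /* bits 192 .. 198 */
};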

That said, if C allows us to limit _BitInt to 128 bits then let's do that for
now.  32-bit targets will still see all the complication when we give that a
stab.
