> It does make sense to give the target control over the mode used for
> sizetype. Of course a global change of the default (for example to
> use Pmode as Ada did) will require testing each affected target,
> so I think it makes sense to keep the default as-is.
No disagreement here.
> Btw, we still have the issue on which _precision_ we should use for
> sizetype -- if we expect modulo-semantics of arithmetic using it
> (thus basically sign-less arithmetic) then the precision has to match
> the expectation the C frontend (and other frontends) assume how pointer
> offsets are handled. Currently the C frontend does not get this
> correct, which means negative offsets will not be handled correctly.
Is this theoretical or practical? Are you talking about GET_MODE_BITSIZE vs.
GET_MODE_PRECISION wrt TYPE_PRECISION?
> Similar issues arise from the mode/precision chosen for the bitsize
> types. We choose a way too wide precision for them, so the
> modulo-semantics assumption does not usually hold for bitsize
Again because of GET_MODE_PRECISION vs. GET_MODE_BITSIZE? Otherwise, we have
rounded up the precision since GCC 4.5, so there should be no more weird
precisions.