> If you find a particular use of TYPE_SIZE is using a size that isn't
> correct for your type whose precision is not a multiple of
> BITS_PER_UNIT, then in my model the correct fix is to change that
> use of TYPE_SIZE rather than to change the value of TYPE_SIZE for
> that type - and such a change (to use TYPE_PRECISION, maybe) would
> also be incremental progress towards eliminating TYPE_SIZE.
What about DECL_SIZE? It's set from TYPE_SIZE, but there is no DECL_PRECISION. Also, TYPE_PRECISION is an integer, whereas TYPE_SIZE is a tree, so the two aren't universally swappable code-wise. And I noticed that sizetype and bitsizetype are not really any different any more either, so why have both?