Hello,
On Fri, 13 Jul 2001, Joseph S. Myers wrote:
> Think in the C standard context, not the Unicode or Java context. Why is
> the specification of uint_least16_t inappropriate here? Why should your
> specification not be supported on systems with 9-bit or 32-bit char? The
> C standard does not address communication with Java; while, as a quality-
> of-implementation issue, an implementation might find it desirable to make
> it 16 bits, why should you prohibit an implementation on a Cray, say, from
> choosing a wider type if 16-bit accesses are unavailable or inefficient
> and memory is not so critical?
I am thinking in the context of software architectures like J2EE or .NET, where
C is one part of the puzzle and several programming languages and components
are combined to build the whole picture. In such an environment, sharing the
same data representation across programming languages avoids time-consuming
conversions; conversions that change the length of the data and therefore
require buffer allocation are especially critical.
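As a sketch of what I mean (assuming a platform with 8-bit char where
uint_least16_t is exactly 16 bits and layout-compatible with JNI's jchar),
a UTF-16 buffer produced in C could then be handed to Java without a
converting copy:

    #include <jni.h>
    #include <stdint.h>

    /* Hand a UTF-16 buffer built in C directly to Java. No width
     * conversion and no reallocation is needed when uint_least16_t
     * matches jchar; otherwise a converting copy would be required. */
    jstring make_java_string(JNIEnv *env,
                             const uint_least16_t *utf16, jsize len)
    {
        return (*env)->NewString(env, (const jchar *)utf16, len);
    }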
I accept that the specification does not make sense if type char has 9 or 32
bits. So I suggest saying instead that type uint_least16_t should have double
the size of type char.
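For illustration, such a requirement could be checked at compile time
(the utf16_t name below is just hypothetical shorthand for the proposed type):

    #include <stdint.h>

    typedef uint_least16_t utf16_t;

    /* C90-style compile-time check: the array size becomes negative,
     * and the translation unit fails to compile, on an implementation
     * where the type is not exactly twice the size of char (e.g. a
     * Cray that maps uint_least16_t to a 64-bit word). */
    typedef char utf16_is_double_char[
        (sizeof(utf16_t) == 2 * sizeof(char)) ? 1 : -1];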
> > - if concatenated with narrow or wide strings the result should be
> > the largest occurring string type
> What exactly do you mean by "largest"?
The string type whose underlying character type has the largest number
of bits.
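This is analogous to the pattern C99 already uses for mixed concatenation of
narrow and wide string literals; the u"..." spelling below for the proposed
16-bit literals is only hypothetical:

    #include <wchar.h>

    /* C99: if any token is a wide string literal, the result is wide. */
    const wchar_t *w = "abc" L"def";

    /* Proposed, by analogy:                                           */
    /*   "abc" u"def"  ->  16-bit string (16-bit char type is wider)   */
    /*  u"abc" L"def"  ->  wide string, if wchar_t has more bits       */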
Greetings Markus Eble
-
Linux-UTF8: i18n of Linux on all levels
Archive: http://mail.nl.linux.org/linux-utf8/