What exactly _would_ be wrong with calling UNICODE a thirty-two-bit encoding?
If I have a 32-bit integer type holding a Unicode code point, I have
11 bits left over to hold other data (a code point is at most U+10FFFF,
which fits in 21 bits). That's worth knowing.
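
A minimal sketch of that, assuming a plain uint32_t and a made-up 11-bit
"extra" field -- the field name and the high/low layout are only
illustrative, not anything from this thread:

    #include <stdint.h>
    #include <stdio.h>

    /* A Unicode code point is at most U+10FFFF, which fits in 21 bits,
     * so a 32-bit word has 11 bits to spare.  This layout is only an
     * illustration: extra data in the high bits, code point in the low. */
    #define CP_BITS  21
    #define CP_MASK  ((UINT32_C(1) << CP_BITS) - 1)   /* 0x1FFFFF */

    static uint32_t pack(uint32_t cp, uint32_t extra)  /* extra: 0..2047 */
    {
        return (extra << CP_BITS) | (cp & CP_MASK);
    }

    int main(void)
    {
        uint32_t word = pack(0x10FFFF, 0x7FF);   /* both fields at maximum */
        printf("word      = 0x%08X\n", (unsigned)word);
        printf("codepoint = U+%04X\n", (unsigned)(word & CP_MASK));
        printf("extra     = 0x%03X\n", (unsigned)(word >> CP_BITS));
        return 0;
    }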
BTW, saying it is approximately 20.087 bits (am I calculating that
right -- log2(17 * 65536)?) causes many people to think they are
just being teased.
They'll get over it.
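
(A quick sanity check of that arithmetic: 17 planes of 65,536 code points
give 1,114,112 values, and log2 of that is indeed about 20.087.  Nothing
here beyond standard C99 math:

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        /* 17 planes x 65,536 code points = 1,114,112 possible values. */
        printf("log2(17 * 65536) = %.3f\n", log2(17.0 * 65536.0));  /* 20.087 */
        return 0;
    }

Link with -lm on most systems.)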
Thomas Lord
regexps.com

