Peter Constable:
> On 02/20/2001 03:34:28 AM Marco Cimarosti wrote:
> > "Unicode is now a 32-bit character encoding standard,
> > although only about one million of codes actually exist,
> > [...]
>
> Well, it's probably a better answer to say that Unicode is a 20.1-bit
> encoding since the direct encoding of characters is the coded [...]

Your explanation is quite correct. This is precisely how I used to start my endless explanations to those colleagues :-) And they invariably interrupted the explanation by asking: "So, how many bits does it have?"

That's why I wanted to simplify even more, saying something like: "It is 32 bits (yes, 32, like 4 bytes, OK?) but, as not all combinations are used, there are techniques to shrink it down a lot."

_ Marco
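As a side note, the arithmetic behind the "20.1-bit" figure, and the way variable-length encodings "shrink" the 32-bit view, can be sketched in a few lines of Python (the `utf8_len` helper below is illustrative, not a standard-library function):

```python
import math

# Unicode code points run from U+0000 to U+10FFFF: 0x110000 values in all.
MAX_CODE_POINT = 0x10FFFF
total = MAX_CODE_POINT + 1  # 1,114,112 code points

# Bits needed to index every code point: log2(0x110000) ~ 20.09,
# which is where the "20.1-bit" figure comes from.
print(f"{math.log2(total):.2f} bits")  # about 20.09

# Encodings like UTF-8 then "shrink it down a lot" in practice:
# each code point takes 1-4 bytes depending on its magnitude.
def utf8_len(cp: int) -> int:
    """Number of bytes UTF-8 uses for code point cp."""
    if cp < 0x80:
        return 1
    if cp < 0x800:
        return 2
    if cp < 0x10000:
        return 3
    return 4

print(utf8_len(ord("A")))   # 1 byte for ASCII
print(utf8_len(0x10FFFF))   # 4 bytes at the top of the code space
```

So "32 bits" describes the widest fixed-size unit (UTF-32), while the actual code space needs only a little over 20 bits, and common text usually occupies far fewer bytes than 4 per character.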
- RE: Perception that Unicode is 16-bit (was: Re: Surrogate ... Marco Cimarosti
- Re: Perception that Unicode is 16-bit (was: Re: Surro... Antoine Leca
- RE: Perception that Unicode is 16-bit (was: Re: Surro... Peter_Constable
- RE: Perception that Unicode is 16-bit (was: Re: Surro... Cathy Wissink
- Re: Perception that Unicode is 16-bit (was: Re: Surro... Marco Cimarosti
- Re: Perception that Unicode is 16-bit (was: Re: Surro... Joel Rees
- Re: Perception that Unicode is 16-bit (was: Re: Surro... Tom Lord
- RE: Perception that Unicode is 16-bit (was: Re: Surro... Marco Cimarosti
- Re: Perception that Unicode is 16-bit (was: Re: Surro... Peter_Constable
- Re: Perception that Unicode is 16-bit (was: Re: Surro... Peter_Constable
- Re: Perception that Unicode is 16-bit (was: Re: Surro... Peter_Constable
- RE: Perception that Unicode is 16-bit (was: Re: Surro... Carl W. Brown
- Re: Perception that Unicode is 16-bit (was: Re: Surro... Tom Lord
- Re: Perception that Unicode is 16-bit (was: Re: Surro... Tex Texin
- Re: Perception that Unicode is 16-bit (was: Re: Surro... Peter_Constable