On 25.07.08, Steve Litt wrote:

> I know nothing about unicode, so I'd need to be brought up to speed on that 
> before writing the program. 


    * The Absolute Minimum Every Software Developer Absolutely,
      Positively Must Know About Unicode and Character Sets (No Excuses!)

    * On the Goodness of Unicode
    * Wikipedia article on Unicode

> I assume unicode is a 16 bit representation of characters.

Actually, Unicode is a character <--> number mapping; the numbers (code
points) currently run up to U+10FFFF, far more than 16 bits can hold.
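You can poke at that mapping directly; here's a minimal sketch in Python
(the characters chosen are just examples):

```python
# Unicode assigns every character a number, its "code point".
# ord() gives the number for a character, chr() goes the other way.
print(ord("A"))        # 65, same as in ASCII
print(ord("\u20ac"))   # 8364, the euro sign U+20AC
print(chr(0x20AC))     # the character assigned to code point 0x20AC
```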

Once upon a time, 16 bits were enough to represent all defined Unicode
characters, but even then several different encodings into a
computer-readable format existed. Programs that relied on Unicode == 16
bits (including LaTeX) now have problems with the higher code points.
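The 16-bit assumption breaks on exactly those higher code points; a small
Python sketch (U+1D11E, the musical G clef, is just one example of a
character beyond 16 bits):

```python
# Code points above U+FFFF don't fit in a single 16-bit unit.
ch = "\U0001D11E"                    # MUSICAL SYMBOL G CLEF
print(hex(ord(ch)))                  # 0x1d11e, larger than 0xffff
print(len(ch.encode("utf-16-be")))   # 4 bytes: UTF-16 needs a surrogate pair
print(len("A".encode("utf-16-be")))  # 2 bytes: BMP characters fit in one unit
print(len(ch.encode("utf-8")))       # 4 bytes in UTF-8 as well
```

A program that treats every character as one 16-bit unit will see the
surrogate pair above as two separate "characters".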
