----- Original Message -----
From: "Doug Ewell" <[EMAIL PROTECTED]>
To: "Unicode Mailing List" <[EMAIL PROTECTED]>
Cc: "Chris Jacobs" <[EMAIL PROTECTED]>; "Pim Blokland" <[EMAIL PROTECTED]>
Sent: Monday, March 17, 2003 12:05 AM
Subject: Re: Custom fonts (was: Tolkien wanta-be)
> Chris Jacobs <c dot t dot m dot jacobs at hccnet dot nl> wrote:
>
> > A codepoint in itself does not specify a character.
> > Font + codepoint does specify a character.
> > Charset + codepoint also can specify a character.
>
> All true for non-Unicode fonts. But then one is left to wonder why we
> are discussing this on the Unicode list.
>
> > Say font A has on E000 an apple symbol, while font B has there a
> > banana.
> > Say for this reason I gave font B an offset of 0100
> >
> > Then on my system U+E000 in plaintext should indeed display an apple
> > symbol and U+E100 a banana symbol.
> > But if there are more fonts with an apple symbol U+E000 does not
> > specify the font to use.
>
> This isn't conformant and won't work.

Which rule in The Unicode Standard Version 3.0 exactly is this not conformant with?
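For illustration only (not from the original thread), here is a minimal Python sketch of the kind of private agreement being described: Private Use Area sub-ranges are assigned to particular fonts by a purely local convention, so a code point like U+E000 only identifies "apple" once you know that convention. The names FontA, FontB, and font_for_codepoint are hypothetical.

    # Local, private-agreement mapping of PUA sub-ranges to fonts
    # (hypothetical names; this is only a sketch of the scheme above).
    PUA_FONT_RANGES = {
        "FontA": range(0xE000, 0xE100),  # apple symbol lives at U+E000
        "FontB": range(0xE100, 0xE200),  # banana symbol shifted to U+E100 (offset 0x0100)
    }

    def font_for_codepoint(cp):
        """Return the font a private-use code point is locally assigned to, if any."""
        for font, cps in PUA_FONT_RANGES.items():
            if cp in cps:
                return font
        return None

    print(font_for_codepoint(0xE000))  # FontA
    print(font_for_codepoint(0xE100))  # FontB

The point of the sketch is that nothing in the code points themselves carries this assignment; it exists only in the local table, which is exactly the "private agreement" the PUA presupposes.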

