At 04:34 +0000 2002-09-27, [EMAIL PROTECTED] wrote:

>Some apps won't display a glyph from a specified font if its corresponding
>Unicode Ranges Supported bit in the OS/2 table isn't set.  So, font
>developers producing fonts intended to be used with such apps set the
>corresponding bit even if only one glyph from the entire range is
>present in the font.

Good heavens.

In Mac OS X, if you have a glyph with a Unicode address attached to 
it, it will display in any application that can display Unicode 
characters. I don't understand why a particular bit has to be set in 
some table. Why can't the OS just accept what's in the font?
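For anyone who wants to see what a given font actually declares, here
is a minimal sketch using the fontTools Python library; it compares the
ulUnicodeRange bits set in the OS/2 table against the code points the
cmap really maps to glyphs. The font path is a placeholder:

    # A minimal sketch, assuming fontTools is installed; "MyFont.ttf"
    # is a hypothetical path to an OpenType/TrueType font file.
    from fontTools.ttLib import TTFont

    font = TTFont("MyFont.ttf")
    os2 = font["OS/2"]

    # The four 32-bit ulUnicodeRange fields together hold the
    # 128 "Unicode Ranges Supported" bits.
    fields = [os2.ulUnicodeRange1, os2.ulUnicodeRange2,
              os2.ulUnicodeRange3, os2.ulUnicodeRange4]
    set_bits = [32 * i + b
                for i, field in enumerate(fields)
                for b in range(32)
                if field & (1 << b)]
    print("Unicode Range bits set:", sorted(set_bits))

    # Compare against what the cmap actually covers: a range bit may
    # be set even if only one code point in that range has a glyph.
    cmap = font.getBestCmap()
    print("Code points mapped to glyphs:", len(cmap))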
-- 
Michael Everson * * Everson Typography *  * http://www.evertype.com
48B Gleann na Carraige; Cill Fhionntain; Baile Átha Cliath 13; Éire
Telephone +353 86 807 9169 * * Fax +353 1 832 2189 (by arrangement)
