On Saturday, September 28, 2002, at 03:19 PM, David Starner wrote:
On Sat, Sep 28, 2002 at 01:19:58PM -0700, Murray Sargent wrote:
Michael Everson said:
I don't understand why a particular bit has to be set in
some table. Why can't the OS just accept what's in the font?
The main reason is

John H. Jenkins scripsit:
This just seems wildly inefficient to me, but then I'm coming from an
OS where this isn't done.
As a cross-platform app, Mozilla can't count on very much from the platform.
--
John Cowan [EMAIL PROTECTED] http://www.reutershealth.com
Mr. Lane, if you

Subject: Re: script or block detection needed for Unicode fonts
John Jenkins wrote:
"This just seems wildly inefficient to me, but then I'm coming
from an OS where this isn't done. The app doesn't keep track of
whether or not a particular font can draw a particular character;
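
To picture the model Jenkins describes, here is a minimal sketch, with
hypothetical types rather than any real platform API, of an OS that keeps
the coverage bookkeeping out of the app entirely:

    #include <stddef.h>
    #include <stdint.h>

    /* Hypothetical font handle: each font can answer "can you draw
     * this character?" from its own cmap. */
    typedef struct Font {
        const char *name;
        int (*cmap_has)(const struct Font *f, uint32_t cp);
    } Font;

    /* The OS-side loop: the app just draws text, and the system walks
     * its fallback list to find a font that covers the character. */
    const Font *fallback_font(const Font *const *list, size_t n, uint32_t cp)
    {
        for (size_t i = 0; i < n; i++)
            if (list[i]->cmap_has(list[i], cp))
                return list[i];
        return NULL;   /* no installed font covers cp: draw .notdef */
    }
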
At 04:34 + 2002-09-27, [EMAIL PROTECTED] wrote:
Some apps won't display a glyph from a specified font if its corresponding
Unicode Ranges Supported bit in the OS/2 table isn't set. So, font
developers producing fonts intended to be used with such apps set the
corresponding bit even if only
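
For concreteness, a pared-down sketch of reading those Unicode Ranges
Supported bits, the ulUnicodeRange1..4 fields that begin at offset 42 of
the OS/2 table per the OpenType specification. The function name is
illustrative and error handling is minimal:

    #include <stdio.h>
    #include <stdint.h>

    static uint32_t rd32(FILE *f)
    {
        unsigned char b[4];
        if (fread(b, 1, 4, f) != 4) return 0;
        return ((uint32_t)b[0] << 24) | ((uint32_t)b[1] << 16)
             | ((uint32_t)b[2] << 8) | b[3];
    }

    static uint16_t rd16(FILE *f)
    {
        unsigned char b[2];
        if (fread(b, 1, 2, f) != 2) return 0;
        return (uint16_t)((b[0] << 8) | b[1]);
    }

    /* Return 1 if ulUnicodeRange bit 0..127 is set, 0 if clear,
     * -1 if the file or its OS/2 table can't be read. */
    int os2_range_bit(const char *path, int bit)
    {
        if (bit < 0 || bit > 127) return -1;
        FILE *f = fopen(path, "rb");
        if (!f) return -1;
        rd32(f);                         /* sfnt version */
        uint16_t numTables = rd16(f);
        rd16(f); rd16(f); rd16(f);       /* binary-search helper fields */
        uint32_t os2_off = 0;
        for (uint16_t i = 0; i < numTables; i++) {
            uint32_t tag = rd32(f);
            rd32(f);                     /* checksum */
            uint32_t off = rd32(f);
            rd32(f);                     /* length */
            if (tag == 0x4F532F32)       /* 'OS/2' */
                os2_off = off;
        }
        if (os2_off == 0) { fclose(f); return -1; }
        fseek(f, (long)(os2_off + 42), SEEK_SET);  /* ulUnicodeRange1..4 */
        uint32_t range[4];
        for (int i = 0; i < 4; i++) range[i] = rd32(f);
        fclose(f);
        return (int)((range[bit / 32] >> (bit % 32)) & 1);
    }

Bit 0, for instance, is Basic Latin; an app of the strict kind described
above would decline to use the font for U+0041 if that bit were clear,
regardless of what the cmap says.
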
Michael Everson said:
I don't understand why a particular bit has to be set in
some table. Why can't the OS just accept what's in the font?
The main reason is performance. If an application has to check the font
cmap for every character in a file, it slows down reading the file.
Accordingly
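
To make the cost concrete: the range bits are read once and cached, while
a cmap check is a binary search over the format 4 segments for every
single character. A sketch, assuming the big-endian table has already
been byte-swapped into the arrays below but kept in its file layout, so
glyphIdArray still follows idRangeOffset contiguously:

    #include <stdint.h>

    /* Parsed cmap format 4 data plus cached OS/2 range bits. */
    typedef struct {
        uint16_t segCount;
        const uint16_t *endCode, *startCode, *idRangeOffset;
        const int16_t  *idDelta;
        uint32_t ulUnicodeRange[4];   /* read once per font */
    } FontCmap;

    /* Fast path: one cached bit test per range, not per character. */
    static int range_bit_set(const FontCmap *fc, int bit)
    {
        return (fc->ulUnicodeRange[bit / 32] >> (bit % 32)) & 1;
    }

    /* Slow path: a segment search for every character; running this
     * over every character of a large file is the cost at issue. */
    static uint16_t cmap4_lookup(const FontCmap *fc, uint16_t c)
    {
        int lo = 0, hi = fc->segCount - 1;
        while (lo < hi) {             /* first segment with endCode >= c */
            int mid = (lo + hi) / 2;
            if (fc->endCode[mid] < c) lo = mid + 1; else hi = mid;
        }
        if (c < fc->startCode[lo])
            return 0;                 /* .notdef */
        if (fc->idRangeOffset[lo] == 0)
            return (uint16_t)(c + fc->idDelta[lo]);
        /* idRangeOffset is a byte offset from its own slot; dividing
         * by two converts it to uint16_t units into glyphIdArray. */
        const uint16_t *p = &fc->idRangeOffset[lo]
                            + fc->idRangeOffset[lo] / 2
                            + (c - fc->startCode[lo]);
        return *p ? (uint16_t)(*p + fc->idDelta[lo]) : 0;
    }
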
On Sat, Sep 28, 2002 at 01:19:58PM -0700, Murray Sargent wrote:
Michael Everson said:
I don't understand why a particular bit has to be set in
some table. Why can't the OS just accept what's in the font?
The main reason is performance. If an application has to check the font
cmap for

Cc: [EMAIL PROTECTED]
Sent: Saturday, September 28, 2002 13:19
Subject: RE: script or block detection needed for Unicode fonts
Michael Everson said:
I don't understand why a particular bit has to be set in
some table. Why can't the OS just accept what's in the font?
The main reason

On 09/26/2002 11:34:42 PM jameskass wrote:
Some apps won't display a glyph from a specified font if its corresponding
Unicode Ranges Supported bit in the OS/2 table isn't set. So, font
developers producing fonts intended to be used with such apps set the
corresponding bit even if only one

I was reading the thread "glyph selection for Unicode in browsers" and
wanted to pass on my reply to Mark's email. Since my query is related to
the topic of glyph selection, I wanted to know if anyone in this mail group
can tell me if doing a text scan based on block ranges is more appropriate
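
One way to read "a text scan based on block ranges": a single pass that
records which ranges the text touches, for matching against a font's
declared bits. The block table below is a small illustrative subset of
the OS/2 bit assignments, not the full list:

    #include <stddef.h>
    #include <stdint.h>

    typedef struct { uint32_t first, last; int bit; } Block;

    static const Block blocks[] = {
        { 0x0000, 0x007F,  0 },   /* Basic Latin */
        { 0x0370, 0x03FF,  7 },   /* Greek and Coptic */
        { 0x0400, 0x04FF,  9 },   /* Cyrillic */
        { 0x4E00, 0x9FFF, 59 },   /* CJK Unified Ideographs */
    };

    /* Fill a 4-word bit set with the ranges the text needs; a font
     * whose ulUnicodeRange words cover all of them is a candidate. */
    void scan_ranges(const uint32_t *text, size_t len, uint32_t needed[4])
    {
        needed[0] = needed[1] = needed[2] = needed[3] = 0;
        for (size_t i = 0; i < len; i++)
            for (size_t b = 0; b < sizeof blocks / sizeof blocks[0]; b++)
                if (text[i] >= blocks[b].first && text[i] <= blocks[b].last) {
                    needed[blocks[b].bit / 32] |= 1u << (blocks[b].bit % 32);
                    break;
                }
    }
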
On 09/26/2002 02:54:55 PM chuck clemens wrote:
It appears as I mentioned in email to
Mark that Unicode fonts use block ranges. Can someone verify this?
The TrueType font format allows a vendor to indicate which Unicode ranges
a font supports. In Windows, there are APIs and data structures

Peter Constable wrote,
It appears as I mentioned in email to
Mark that Unicode fonts use block ranges. Can someone verify this?
The TrueType font format allows a vendor to indicate which Unicode ranges
a font supports. In Windows, there are APIs and data structures for making
this
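
One of those APIs is GDI's GetFontUnicodeRanges, which fills a GLYPHSET
structure with the character ranges the selected font's cmap covers (the
OS/2 bits themselves surface as the fsUsb field of FONTSIGNATURE, via
GetTextCharsetInfo). A minimal sketch:

    #include <windows.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Print the Unicode ranges covered by the font currently selected
     * into the device context.  Link with gdi32. */
    void list_ranges(HDC hdc)
    {
        DWORD size = GetFontUnicodeRanges(hdc, NULL);  /* query size */
        if (size == 0) return;
        GLYPHSET *gs = (GLYPHSET *)malloc(size);
        if (!gs) return;
        gs->cbThis = size;
        if (GetFontUnicodeRanges(hdc, gs)) {
            for (DWORD i = 0; i < gs->cRanges; i++)
                printf("U+%04X..U+%04X (%u glyphs)\n",
                       gs->ranges[i].wcLow,
                       gs->ranges[i].wcLow + gs->ranges[i].cGlyphs - 1,
                       gs->ranges[i].cGlyphs);
        }
        free(gs);
    }
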