On 2014-03-16 07:25 (GMT) Philip Taylor composed:

Felix, your answer is very helpful and very informative, but
there are places where (to me) it seems to make no sense at
all.  May I ask you to expand on the following, please ?

DPI is often used interchangeably with display resolution

DPI is a single number (1-dimensional) whilst display resolution
is an ordered pair (2-dimensional); how can they be used
interchangeably ?

I didn't mean they were literally interchangeable, only that people don't always understand the difference, or even know that there is a difference, and often use one where the other would be uniquely correct or more appropriate.
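
To illustrate the distinction with some made-up but typical numbers: resolution is the pixel grid itself, while density is that grid divided by the panel's physical size.

    resolution: 1920 x 1080                  (an ordered pair)
    density:    sqrt(1920^2 + 1080^2) / 23   (diagonal pixels / diagonal inches)
             ~= 2203 / 23 ~= 96 PPI          (a single number)

The same 1920 x 1080 grid on an 11.6 inch panel works out to roughly 190 PPI: identical resolution, about double the density.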

DEs for desktop systems (and the software that runs on them,
including web browsers) almost universally by default assume a
display density of 96 DPI (or PPI), same as the CSS reference px
unit.

It has never been clear to me why CSS has a reference px unit.
What is the point of CSS including a unit such as "px", which
should mean "one pixel", and then giving it an entirely arbitrary
meaning that does not (other than by pure chance) map to one
pixel at all ?

I don't recall the history. It seems to me mostly a rationalization for allowing px as a legal CSS length. Since a device pixel varies in physical size from one display to another, I suppose something that can be reduced to a predictable physical measurement is a necessary part of a complete specification for defining object sizes. The angular definition isn't entirely arbitrary either: it derives from a practice that was already common in the early days of the web, as established by Microsoft.

FWIW, http://blogs.msdn.com/b/fontblog/archive/2005/11/08/490490.aspx provides a logical explanation for 96.
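
As an illustration of how the fixed relationships among CSS absolute units play out (the arithmetic is mine, but the 1in = 96px = 72pt equivalence is what the spec fixes):

    1in = 96px = 72pt = 2.54cm
    1px = 1/96in
    16px x (72pt/in / 96px/in) = 12pt

    html { font-size: 16px; }  /* physically 12pt only when the
                                  display really is 96 DPI */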

In Geckos, variations in assumed DPI have no impact on its 16px OEM
default size, which physically speaking is 12pt whenever the DPI is
in fact 96.

Why ?  Also, has this always been the case, or did it change between
the version of Gecko used until Seamonkey 2.17.1 (the last version
that I regard as usable, because of the aberrant behaviour regarding
font scaling that occurred post that release) and subsequent versions?

I'm assuming your "why" refers to the fact that Gecko's default is not affected by DPI.

It has been thus for as long as I've been using Mozilla, more than 13 years, since sometime before Netscape 6.2 (and maybe before 6.0) and Mozilla 1.0 were released. Whether it was ever otherwise I don't remember, but I think it never has been. As to why, I don't remember either, but searching bugzilla.mozilla.org's early years for the attempts to make the default font size behave as it does in IE (and KHTML) would no doubt provide an answer.

To get bigger fonts in px from Geckos requires either settings
personalization, zoom, or bigger CSS size declarations, all of which
are effective in IE as well, but are often unnecessary there because
IE's default automatically goes up as DE DPI goes up, e.g., the
default @120 DPI being 20px instead of the 16px it is @96, or
24px when DPI is 144.

Why does Gecko not emulate this eminently sensible behaviour ?

In part because its default is specified in pixels, not points. In another part because it's FOSS, which means cross-platform compromises dictate choices made in internal design. Safari behaves the same as Gecko. Little as I understand about Chrome, IIUC it behaves like Safari, since at the outset at least it ran on the exact same WebKit rendering engine, which was originally forked from KHTML. More recently WebKit itself was forked into Blink for use in Chrome, with minimized control from Apple.
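
For the record, the IE behavior quoted farther up amounts to scaling its default by the ratio of system DPI to 96; this is my own summary of the numbers, not anything from documentation:

    default = 16px x (system DPI / 96)
       96 DPI -> 16px
      120 DPI -> 16 x 120/96 = 20px
      144 DPI -> 16 x 144/96 = 24px

Gecko and WebKit instead keep the 16px default at any DPI unless the user or the page author changes it.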

The reason it's ostensibly a good thing is because there is positive
non-zero rational relationship between a CSS size declaration in em,
and to optimal as reflected by the browser default size, which does
not exist for a px unit.

I don't understand the above starting at "and to optimal"; is there
possibly a typo somewhere in there that is confusing me ?

I use optimal as a descriptive term for the browser default. 1em at the root (html; 1rem) equals the browser default 1:1. The browser default is presumptively either optimal as shipped by its vendor, or optimal as personalized by its user, when the OEM setting was divorced enough from his own environment's optimum to cause him to change it.
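
A minimal CSS sketch of what I mean (the selectors and sizes are just examples I made up):

    html { font-size: 100%; } /* 1rem == the browser/user default */
    h1   { font-size: 2em; }  /* tracks it: 32px at a 16px default,
                                 40px at a 20px default */
    pre  { font-size: 13px; } /* ignores it: 13px no matter what */

That tracking relationship to the default is what I mean by "optimal"; a px declaration has no such relationship.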

as display density increases, the px unit decreases in physical
size (to a point, after which it doubles, and then at another point,
after which it in effect will have tripled, etc.),

What causes it to "suddenly" double, triple, etc ?

I'm not up to speed on any advances that may exist within any DEs themselves.

WRT browser engines, available hardware is not up to the task of dealing with non-integer device pixel to logical pixel ratios. So 96 is used until physical DPI has reached a doubling from 96, at 192, where it remains until the tripling point at 3X96=288. I doubt the 4X point has been reached in available hardware, but it would be 384.
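
In CSS terms that integer ratio is what the resolution media feature (and WebKit's device-pixel-ratio) exposes. A rough sketch, with the breakpoints being the doubling/tripling points described above rather than anything a spec mandates:

    /* below ~192 DPI, 1 CSS px maps to 1 device px */
    @media (min-resolution: 192dpi), (-webkit-min-device-pixel-ratio: 2) {
      /* here 1 CSS px maps to a 2x2 block of device pixels */
    }
    @media (min-resolution: 288dpi), (-webkit-min-device-pixel-ratio: 3) {
      /* and here to a 3x3 block */
    }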

Many thanks in advance for any light that you can shed on the above.

Hopefully at least the bulk of this is on-topic in the eyes of our list mom. :-)
--
"The wise are known for their understanding, and pleasant
words are persuasive." Proverbs 16:21 (New Living Translation)

 Team OS/2 ** Reg. Linux User #211409 ** a11y rocks!

Felix Miata  ***  http://fm.no-ip.com/
