Re: setting X server DPI

2009-03-11 Thread Tim
On Tue, 2009-03-10 at 17:40 -0500, Michael Hennebry wrote:
 Ideally, the X server has the correct DPIs
 and the application is written to use them.
 The application can discover the number of pixels in a 12pt font
 and enlarge or not depending on the answer and the purpose.
 Given that many applications don't do that,
 lying about the DPIs is a perfectly sensible thing to do.
 That isn't good for applications that would otherwise do the right
 thing.

The problem is that when you abuse one setting to fake another, to suit
someone who doesn't know how to do things right, you break things that
everyone else *needs* to work properly.

Because it's done WRONG, we can't use it right.  We can't specify our
DPI correctly, and get fonts drawn correctly.  We have to bodge
*everything*, and *nothing* is correct.

-- 
[...@localhost ~]$ uname -r
2.6.27.19-78.2.30.fc9.i686

Don't send private replies to my address, the mailbox is ignored.  I
read messages from the public lists.





Re: setting X server DPI

2009-03-11 Thread Tim
Tim:
 i.e. 12 point text is the same size whether printed on 2 inches of
 paper, or 20 inches of paper.

Tom Horsley:
 Absolutely true, and absolutely the point. If you specify a 12 point
 font on a 46-inch 1920x1080 display, you will wind up drawing some
 random smudge of bits that is indeed able to fit on a line that
 is 12/72 of an inch high, but there aren't enough frigging pixels
 to render the font in any fashion that makes it remotely possible
 to discern what the character actually represents.
 
 As long as all the software in the universe insists on defaulting
 to things like 9 or 12 point fonts for menu items and login screens,
 no sane person would want the default DPI to actually match the
 hardware, because they couldn't possibly read the screen well enough
 even to find the dialog box they need to fix it.

So fix up the things that do it wrong: DON'T specify stupidly small
fonts.  Don't BUGGER UP EVERYTHING else.

If your experience is so limited that you cannot understand that
stuffing up measurement systems means *nothing* works right, then
you're not in a position to argue this point.

This is computing: we need the computer to do what it's told, to be
100% predictable, so you can control it under all circumstances (print
on screen, paper, A4 sheets, posters, etc.).  You, or the system,
cannot be taken seriously when you have to tell a system to use 64
point text to actually print at 12 point size.


-- 
[...@localhost ~]$ uname -r
2.6.27.19-78.2.30.fc9.i686

Don't send private replies to my address, the mailbox is ignored.  I
read messages from the public lists.





Re: setting X server DPI

2009-03-10 Thread Tim
On Sun, 2009-03-08 at 11:44 -0600, Petrus de Calguarium wrote:
 96x96 should be the default. I don't know why it isn't.

No.  The DPI should be set to the values that actually represent the
hardware.

Font sizing, and the like, should be set by picking the font size you
want, not by buggering up the DPI.  Set that wrong, and you make it
impossible for other things to do their job right.  Graphics card pixel
counts are, by definition, the number of dots per scan line that you
have available to draw something with.  Visual display device DPI is
the tiniest noticeable detail that can be resolved.  The two are
interrelated, but people often get one thing wrong: with CRTs it's
still quite good to draw to the display with more detail than it can
resolve, since you get smoother results.  For LCDs, you really need a
1:1 mapping between graphics card and display device.

GUI sizing, an important and often overlooked thing, should be set as
an independent value.  That used to be easily done on window managers
that I've seen in the past.  Tiny buttons are a major pain, and so is
the converse - absurdly huge buttons wasting space on low resolution
displays.

Anyone who thinks that increasing the resolution *should* create smaller
fonts, or GUI gadgets, has got it extremely wrong.  And that includes
all the programmers who stupidly do that.


-- 
[...@localhost ~]$ uname -r
2.6.27.15-78.2.23.fc9.i686

Don't send private replies to my address, the mailbox is ignored.  I
read messages from the public lists.





Re: setting X server DPI

2009-03-10 Thread Bill Crawford
On Tuesday 10 March 2009 11:31:39 Tim wrote:

 Anyone who thinks that increasing the resolution *should* create smaller
 fonts, or GUI gadgets, has got it extremely wrong.  And that includes
 all the programmers who stupidly do that.

Except that, if you *do* want smaller fonts because you want more
screen real estate, it is often the only way to get them :o)

Which isn't to concede that you have a point; I should be able to easily tune 
the size of things with a scale, and the thing should be drawn at the size I 
want.



Re: setting X server DPI

2009-03-10 Thread Tom Horsley
On Tue, 10 Mar 2009 22:01:39 +1030
Tim wrote:

  96x96 should be the default. I don't know why it isn't.  
 
 No.  The DPI should be set to the values that actually represent the
 hardware.

Actually, that attitude is the one that is utter nonsense. If you
want to get slavish about actual representation, then you need to
know the distance of the viewer and specify font sizes by the angular
diameter the viewer will experience with the font :-).

The reason 96 should be the default is that 96 can at least be read
on every display device I've ever seen, so you'll at least be able
to see what you are doing while getting things set the way you
actually want them.

The actual representation of a 9 point font (a perfectly readable
size on a laser printer) makes about 3 or 4 pixels available to render
lower case letters on a 46 inch 1920x1080 HD display. Great fun trying
to navigate to the font settings dialog when all the menu items
are 4 pixels high.

Or consider Samsung HD displays. The one I just got has EDID info
that claims it is 160 x 90 millimeters - that gives 305 DPI as the
actual representation which makes the fonts so large the font
settings dialog won't fit on the screen.
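
The arithmetic is easy to check; a quick Python sketch, using the
figures above:

    # DPI implied by an EDID physical size: pixels / (millimetres / 25.4).
    MM_PER_INCH = 25.4
    width_px, height_px = 1920, 1080
    claimed_mm_w, claimed_mm_h = 160, 90   # the bogus EDID dimensions

    dpi_x = width_px / (claimed_mm_w / MM_PER_INCH)
    dpi_y = height_px / (claimed_mm_h / MM_PER_INCH)
    print(f"EDID-implied DPI: {dpi_x:.0f} x {dpi_y:.0f}")   # 305 x 305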

Then there are projectors. You can't possibly tell what the
actual representation is because it depends on how far away the
screen is.

By all means let the actual representation fanatics set the DPI
to the actual representation if that is what they want, but for
God's sake don't make it the default setting. Make the default
something everyone will be able to read.



Re: setting X server DPI

2009-03-10 Thread Tim
Tim:
 No.  The DPI should be set to the values that actually represent the
 hardware.

Tom Horsley
 Actually, that attitude is the one that is utter nonsense. If you
 want to get slavish about actual representation, then you need to
 know the distance of the viewer and specify font sizes by the angular
 diameter the viewer will experience with the font :-).

No, you don't.  Certainly not from the point you'd advocate, of giving
some false meaning to font sizes.

Yes, if designing wall posters, or the like, you'd work out how it'd be
viewed, then pick a font size that's appropriate to the thing.  You
wouldn't redefine 12 points to be something else so you could say
that's 12 point text up there (when it most definitely is NOT), just
because it looks the same size as 12 point text on a sheet of A4 held
in your hands.

In the case of posters, you might know that it's going to be read at
five feet away, mostly front on, and usability guides might say that you
should have 3 inch high text for certain parts of the text.  You'd
specify the height, and it'd work out the point size to use, the real
point size, not some artificially made up thing.

 The reason 96 should be the default is that 96 can at least be read
 on every display device I've ever seen, so you'll at least be able
 to see what you are doing while getting things set the way you
 actually want them.

The reason is just cargo cult mentality.

 The actual representation of a 9 point font (a perfectly readable
 size on a laser printer) makes about 3 or 4 pixels available to render
 lower case letters on a 46 inch 1920x1080 HD display. Great fun trying
 to navigate to the font settings dialog when all the menu items
 are 4 pixels high.

And that's what happens when you use measurement systems
inappropriately.  DPI has a real meaning, and so do point sizes.
When you misuse one, then another, you compound the problem.

Point sizes are *absolute*, specific fractions of an inch, if you want
a simple one-phrase explanation.  The computer uses your DPI, combined
with a description of the size of the display medium, to work out how
many dots to use to get text at the specified point size.

i.e. 12 point text is the same size whether printed on 2 inches of
paper, or 20 inches of paper.
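
That relationship is simple enough to write down; a minimal sketch:

    # A point is 1/72 of an inch, so: pixels = points / 72 * DPI.
    def points_to_pixels(points: float, dpi: float) -> float:
        return points / 72.0 * dpi

    print(points_to_pixels(12, 96))   # 16.0 pixels on a true 96 DPI screen
    print(points_to_pixels(12, 300))  # 50.0 pixels at a printer's 300 DPI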

If you want to scale fonts to be readable at certain distances, at an
*apparent* size, but not actually the same size on the different
displays, THEN YOU DON'T SPECIFY SIZES IN POINTS!

Specifying fonts in pixel sizes is the wrong way to go about it, for the
same reasons.  You can only use such font sizing schemes when designing
graphics for a fixed size display.

 Or consider Samsung HD displays. The one I just got has EDID info
 that claims it is 160 x 90 millimeters - that gives 305 DPI as the
 actual representation which makes the fonts so large the font
 settings dialog won't fit on the screen.

You're attempting to use a broken ruler to support broken facts.

 Then there are projectors. You can't possibly tell what the
 actual representation is because it depends on how far away the
 screen is.

That's where you're wrong.  Firstly, you can tell the distance from the
screen (given semi-decent hardware which actually takes note of the
focus settings - focus is a distance-dependent thing).  Knowing the
optics and the distance, you can know the screen size (and therefore
know the actual DPI - where you're changing the size, rather than the
number of dots, this time).
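
A rough sketch of that calculation, assuming you know the projector's
throw ratio (throw distance divided by image width, a standard spec);
the figures below are made up:

    # Image width follows from distance and throw ratio; DPI follows
    # from the pixel count across that width.
    def projected_dpi(width_px: int, distance_in: float,
                      throw_ratio: float) -> float:
        image_width_in = distance_in / throw_ratio
        return width_px / image_width_in

    # 1024 pixels across, projected from 120 inches, 1.5 throw ratio:
    print(round(projected_dpi(1024, 120.0, 1.5)))  # ~13 DPI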

But still, as I said above, if you're expecting certain point size text
to display at different sizes, then you're doing it wrong.  You've got
two choices for doing it right: not using point sizes, or not doing
something stupid like specifying 12 point text when you're going to do
a projected display.

 By all means let the actual representation fanatics set the DPI
 to the actual representation if that is what they want, but for
 God's sake don't make it the default setting. Make the default
 something everyone will be able to read.

Make the default settings stop *misusing* printing systems.  It's just
pandering to the ignorant to call something 12 point text when it's not
12 point text, simply because people are used to it.  Even when misused
in the way that most people expect, it doesn't work how you want it to
work.  12 point text is different from one thing to another, whether
you measure it with a ruler or try playing scaling games for viewing
distance.  It's as bogus as wattage stated in PMPO.

As it stands, on computer software, when you pick 12 points, or even 12
pixels, for text, the unit is actually meaningless.  It's 12 variable
*somethings*.  You might as well call it centimetres, then claim that we
use different centimetres than everyone else.

-- 
[...@localhost ~]$ uname -r
2.6.27.15-78.2.23.fc9.i686

Don't send private replies to my address, the mailbox is ignored.  I
read messages from the public lists.




Re: setting X server DPI

2009-03-10 Thread Tom Horsley
On Wed, 11 Mar 2009 02:24:39 +1030
Tim wrote:

 i.e. 12 point text is the same size whether printed on 2 inches of
 paper, or 20 inches of paper.

Absolutely true, and absolutely the point. If you specify a 12 point
font on a 46-inch 1920x1080 display, you will wind up drawing some
random smudge of bits that is indeed able to fit on a line that
is 12/72 of an inch high, but there aren't enough frigging pixels
to render the font in any fashion that makes it remotely possible
to discern what the character actually represents.

As long as all the software in the universe insists on defaulting
to things like 9 or 12 point fonts for menu items and login screens,
no sane person would want the default DPI to actually match the
hardware, because they couldn't possibly read the screen well enough
even to find the dialog box they need to fix it.

If some poor soul suffers from OCD so badly that he goes into
uncontrollable tremors and breaks out in a sweat if the measured
size of the font doesn't match the requested size, then by all means
let him check the box that says render actual size, but I'm
not willing to set my screen to actual size just because he
breaks into a cold sweat knowing that I have it set wrong.
I need to be able to read what it says.

A default setting that can be read is vastly superior to a
pedantically correct setting that results in everything
being tiny little screen smudges.



Re: setting X server DPI

2009-03-10 Thread Patrick O'Callaghan
On Tue, 2009-03-10 at 13:27 -0400, Tom Horsley wrote:
 Absolutely true, and absolutely the point. If you specify a 12 point
 font on a 46-inch 1920x1080 display, you will wind up drawing some
 random smudge of bits that is indeed able to fit on a line that
 is 12/72 of an inch high, but there aren't enough frigging pixels
 to render the font in any fashion that makes it remotely possible
 to discern what the character actually represents.

This is true, but it shouldn't be.  It's true because the sizes of
things in X are defined in terms of pixels, and it's wrong because
12pt type is 12pt, no matter what medium it's on.  It's an absolute
size, not a given number of pixels.

The fault is with how X works.  Probably no-one remembers it now, but
the NeWS display server, developed by Sun and based on PostScript, did
actually manage real sizes, not pixel dimensions.

poc



Re: setting X server DPI

2009-03-10 Thread Tom Horsley
On Tue, 10 Mar 2009 13:35:51 -0430
Patrick O'Callaghan wrote:

 This is true, but it shouldn't be.  It's true because the sizes of
 things in X are defined in terms of pixels, and it's wrong because
 12pt type is 12pt, no matter what medium it's on.  It's an absolute
 size, not a given number of pixels.

No they aren't.  All the font rendering libraries these days take
point sizes as the primary means of specifying font size, but the point
(he-he :-) is that the physical device eventually renders the fonts
by turning pixels on or off.  If you only have 40 pixels per inch, then
a 12 point font is going to be rendered on that device in (12/72)*40
pixels (which comes out to 7 only if you round up); considering that
the lower case characters are only about half height, you have a grand
total of 3 or 4 pixels available to render the entire set of glyphs in
a font.  No can read :-).  Even with anti-aliasing and greyscale values
for the pixels, rendering readable characters cannot be done.
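
That arithmetic, as a minimal sketch:

    import math

    # Pixels available at a given point size and DPI; a point is 1/72
    # of an inch, and lower case letters get roughly half the em height.
    def glyph_pixels(points, dpi):
        em = points / 72.0 * dpi
        return math.ceil(em), math.ceil(em / 2)

    em_px, lower_px = glyph_pixels(12, 40)
    print(em_px, lower_px)   # 7 pixels per line, 4 for lower case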

Often an unreadable rendering is exactly what you want.  If you are
preparing a print preview, unreadable little smudges that give you an
idea of where the line goes are perfectly OK, but if you are trying to
read menus and dialog boxes and get work done it is hopeless.

You could re-write every application in the universe to carefully
deduce some readability factor based on the available pixels
to render the requested font size, and request a different size
font for things it intends to be readable, or you can lie about
the DPI and not rewrite every single app in the universe.  Guess
which one is more practical :-).

Just like you can calculate orbits with an earth-centric model
and epicycles within epicycles, or you could use a sun-centric
model and a simple ellipse. They both get the same result, but
one is a heck of a lot simpler than the other.



Re: setting X server DPI

2009-03-10 Thread Michael Hennebry

Ideally, the X server has the correct DPIs
and the application is written to use them.
The application can discover the number of pixels in a 12pt font
and enlarge or not depending on the answer and the purpose.
Given that many applications don't do that,
lying about the DPIs is a perfectly sensible thing to do.
That isn't good for applications that would otherwise do the right thing.
There probably isn't a good answer for letting existing applications coexist.
Perhaps new applications could look for resources like REAL_DPIX and REAL_DPIY.
If they are there and contain numbers, they would be used instead of
the lies told to the X server.
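
A minimal sketch of that lookup (REAL_DPIX/REAL_DPIY are hypothetical
resource names from the suggestion above; the fallback parses the
resolution line that xdpyinfo prints):

    import re
    import subprocess

    def real_dpi():
        # Prefer the hypothetical REAL_DPIX/REAL_DPIY resources, if set.
        xrdb = subprocess.run(["xrdb", "-query"],
                              capture_output=True, text=True)
        found = dict(re.findall(r"^(REAL_DPI[XY]):\s*([\d.]+)",
                                xrdb.stdout, re.M))
        if "REAL_DPIX" in found and "REAL_DPIY" in found:
            return float(found["REAL_DPIX"]), float(found["REAL_DPIY"])
        # Fall back to the server's (possibly faked) idea of resolution.
        out = subprocess.run(["xdpyinfo"],
                             capture_output=True, text=True).stdout
        m = re.search(r"resolution:\s+(\d+)x(\d+) dots per inch", out)
        return (float(m.group(1)), float(m.group(2))) if m else (96.0, 96.0)

    print(real_dpi())

A session script would publish the real values with something like:
echo 'REAL_DPIX: 101' | xrdb -merge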

--
Michael   henne...@web.cs.ndsu.nodak.edu
Pessimist: The glass is half empty.
Optimist:   The glass is half full.
Engineer:   The glass is twice as big as it needs to be.



setting X server DPI

2009-03-08 Thread David Hláčik

Hello guys, how do I configure the X server's DPI on Fedora 10?

I have the DPI in GNOME set to 96, but when I check Xorg.log I see that
it is 75x75 DPI, which is the reason why my fonts are so blurry.
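
For what it's worth, the running server's idea of the resolution can
also be queried directly; the output looks something like this:

    $ xdpyinfo | grep resolution
      resolution:    75x75 dots per inch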


Thanks for help,

D.




Re: setting X server DPI

2009-03-08 Thread Petrus de Calguarium
David Hláčik wrote:

 Thanks for help,
 
96x96 should be the default. I don't know why it isn't. I have tried
it on an old 1992 CRT monitor and 96x96 worked splendidly, so I don't
know what kind of archaic hardware the present default is set for.

To change it, edit /etc/kde/kdm/kdmrc and append ' -dpi 96' (no quotes,
of course) to the ServerArgsLocal line.
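
For example, assuming the stock line is still in place (the existing
arguments on yours may differ), it would end up looking like this:

    ServerArgsLocal=-nolisten tcp -dpi 96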





Re: setting X server DPI

2009-03-08 Thread Tom Horsley
On Sun, 08 Mar 2009 11:44:44 -0600
Petrus de Calguarium wrote:

 To change it, edit /etc/kde/kdm/kdmrc and append ' -dpi 96' (no quotes, of 
 course) to the ServerArgsLocal line.

Which works only if you are using KDM and not GDM.

I've got a long rant about DPI on a website I'm working on, with all
my Linux info dumped.  See:

http://braindump.home.att.net/dpi.html
