On Wed, 4 May 2016 19:01:09 +0200
Alberto Salvia Novella <[email protected]> wrote:

> Mattias Andrée:
> > What's wrong with dots per inch?  
> 
> How can an application reliably know which is the current
> pixel density of the desktop?
> 
> 

Well, you cannot know anything reliably. The EDID
does contain all the information you need for DPI,
albeit with limited precision, whereas X.org itself
reports a bogus DPI. But if we pretend that all
monitors' dimensions are in whole centimetres, then
the number of pixels per centimetre can be calculated:

  ppc_x = output_width_px(monitor) / output_width_cm(monitor);
  ppc_y = output_height_px(monitor) / output_height_cm(monitor);
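
For example, a hypothetical 1920x1080 panel whose screen
measures 52 cm by 29 cm would give ppc_x = 1920 / 52 ≈ 36.9
and ppc_y = 1080 / 29 ≈ 37.2.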

Notice that this is easier to calculate than the pixels per inch:

  ppi_x = output_width_px(monitor) / output_width_cm(monitor) * 2.540;
  ppi_y = output_height_px(monitor) / output_height_cm(monitor) * 2.540;
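
On X.org, these per-output values can be queried through the
XRandR extension, which exposes the EDID-derived physical size
in millimetres. A minimal sketch, assuming Xlib and libXrandr,
ignoring output rotation, and keeping in mind that the reported
mm_width/mm_height may be zero or bogus on some monitors:

  /* cc ppc.c -lX11 -lXrandr */
  #include <stdio.h>
  #include <X11/Xlib.h>
  #include <X11/extensions/Xrandr.h>

  int main(void)
  {
      Display *dpy = XOpenDisplay(NULL);
      if (!dpy)
          return 1;

      XRRScreenResources *res =
          XRRGetScreenResources(dpy, DefaultRootWindow(dpy));

      for (int i = 0; i < res->noutput; i++) {
          XRROutputInfo *out =
              XRRGetOutputInfo(dpy, res, res->outputs[i]);
          /* skip disconnected outputs and outputs without a
             usable physical size in the EDID */
          if (out->connection == RR_Connected && out->crtc &&
              out->mm_width && out->mm_height) {
              XRRCrtcInfo *crtc = XRRGetCrtcInfo(dpy, res, out->crtc);
              /* the EDID size is in millimetres, hence / 10 */
              double ppc_x = crtc->width  / (out->mm_width  / 10.0);
              double ppc_y = crtc->height / (out->mm_height / 10.0);
              printf("%s: %.1f x %.1f pixels per centimetre\n",
                     out->name, ppc_x, ppc_y);
              XRRFreeCrtcInfo(crtc);
          }
          XRRFreeOutputInfo(out);
      }

      XRRFreeScreenResources(res);
      XCloseDisplay(dpy);
      return 0;
  }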

But why is "pixels" preferred over "dots"?


