On 20 August 2015 at 20:44, Alan W. Irwin <ir...@beluga.phys.uvic.ca> wrote:
> On 2015-08-20 14:43+0100 Phil Rosenberg wrote:
>
>>>> 1) Text size. As far as I am aware the text size in wxWidgets is
>>>> entirely correct - I haven't double-checked this today, but it was
>>>> correct the last time I checked. The docs state that the text size
>>>> provided by the user is in mm, and I use a value of 90 pixels per inch
>>>> (converted to metric) to convert to pixels. Counting pixels on the
>>>> screen, the output appears to be as I intended. The pixels per inch
>>>> value I used is the same as that used by Inkscape, so the wxWidgets
>>>> output should have similarly sized text to the svg output rendered by
>>>> Inkscape. However, as you can find online (e.g.
>>>> http://stackoverflow.com/questions/1346922/svg-and-dpi-absolute-units-and-user-units-inkscape-vs-firefox-vs-imagemagick),
>>>> different svg renderers assume different pixels per inch in their
>>>> rendering. However, if other drivers are producing text that is 2.2
>>>> times the size of that in wxWidgets, they must be assuming around 40
>>>> pixels per inch, which is much lower than any other standard I have
>>>> seen.
>>>> Basically, where I am going with this is that we should set a standard
>>>> value for pixels per inch for all raster drivers. It would potentially
>>>> be useful to have an API call to set the pixels per inch too. Then
>>>> each driver can be checked to make sure it conforms - perhaps with the
>>>> Peace example being used to check.
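>>>>
>>>> To be concrete, the conversion I currently do in the wxWidgets driver
>>>> is along these lines (a rough sketch only, not the literal driver
>>>> code; the names are just for illustration):
>>>>
>>>> // Assumed screen resolution, matching Inkscape's convention.
>>>> #define ASSUMED_PPI 90.0
>>>> // Convert a text height given in mm to device pixels.
>>>> static double mm_to_pixels( double height_mm )
>>>> {
>>>>     return height_mm * ASSUMED_PPI / 25.4; // 25.4 mm per inch
>>>> }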
>>>
>>>
>>>
>>> I am an old-school scientist on this issue on the one hand, and
>>> completely semiempirical on the other.  :-)
>>>
>>> I. Old school:
>>>
>>> Pixels per mm (and therefore pixels per inch or DPI) should not be an
>>> assumption.  Your monitor has a definite resolution in pixels and a
>>> definite size in mm, so it has a known value for pixels per mm and
>>> DPI in both X and Y.
>>>
>>> However, there are some important caveats about this for the
>>> Linux/X11 case; see
>>>
>>> <http://unix.stackexchange.com/questions/75344/how-does-x-server-calculate-dpi>.
>>>
>>> That reference does not mention xrandr, which is one way to determine
>>> the actual DPI for the Linux/X11 case.
>>>
>>> irwin@raven> xrandr |grep " connected"
>>> HDMI-1 connected 1440x900+0+0 (normal left inverted right x axis y
>>> axis) 410mm x 256mm
>>>
>>> I verified those dimensions with a meter stick since my monitor manual
>>> does not include the actual display dimensions.
>>>
>>> Those results imply actual XDPI is
>>>
>>> irwin@raven> echo "1440*25.4/410" |bc -l
>>> 89.20975609756097560975
>>>
>>> and YDPI is
>>>
>>> irwin@raven> echo "900*25.4/256" |bc -l
>>> 89.29687500000000000000
>>>
>>> So those appear to be reasonably reliable numbers for DPI for my
>>> monitor, but as the discussion in the above URL states, the xdpyinfo
>>> application does _not_ provide reliable DPI information.
>>>
>>> irwin@raven> xdpyinfo | grep -E 'dimensions|resolution'
>>>   dimensions:    1440x900 pixels (381x238 millimeters)
>>>   resolution:    96x96 dots per inch
>>>
>>> That is, that app does report the correct pixel dimensions for the
>>> monitor, but it arbitrarily fakes the size in millimeters so that it
>>> always returns fake DPI values of 96!  I have read web assertions
>>> that these fake values are simply an attempt to follow a fake DPI
>>> "standard" set by Microsoft Windows, but I don't know whether that is
>>> the case.
>>>
>>> So PLplot could in theory (at least for the Linux/X11 xrandr case)
>>> interrogate the operating system to determine the actual pixels per
>>> mm values and then use those values for all rendering that is done in
>>> units of mm.  But just because the xrandr-derived value was correct
>>> in my case does not mean it will be correct in all cases, and there
>>> is also the concern about what to do for other platforms.  So I doubt
>>> very much we will go there.
>>>
>>> Another approach might be to assume some reasonable DPI value and
>>> then attempt to enforce that for each of our device drivers.  But
>>> that might be tricky for device drivers like qt and cairo, which
>>> depend on external libraries that might have their own independent
>>> standards about what DPI value to assume.  So for now, and perhaps
>>> indefinitely, I think we are stuck with the semiempirical approach
>>> below.
>>>
>>> II.  Semiempirical.
>>>
>>> The semiempirical approach is simply to scale the character size for
>>> each device driver so that it agrees reasonably with the others on
>>> the second page of example 2.  That is, the numbers on that page
>>> should be fairly closely inscribed by the inner square.
>>>
>>> I have just looked at that page for a wide variety of our devices,
>>> and they all pass this test more or less, except for wxwidgets, where
>>> the numbers are roughly half the size of the inner square.  So it is
>>> on that basis that I suggest you approximately double the character
>>> size to match more closely what the other devices do here.
>>>
>>>
>>
>> Hi Alan, in some ways I am fine with that. I think wxWidgets actually
>> has a cross-platform display class from which the real screen
>> dimensions can be grabbed.
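>>
>> For example, something like the following, run from within the driver,
>> should give the real screen dimensions and hence the real pixels per
>> inch (a sketch only, using the wxWidgets global display helpers rather
>> than the display class itself):
>>
>> #include <wx/gdicmn.h>
>>
>> // Query the primary display size in pixels and in mm, then derive
>> // the effective pixels per inch in each direction.
>> wxSize sizePx = wxGetDisplaySize();
>> wxSize sizeMM = wxGetDisplaySizeMM();
>> double xPPI = 25.4 * sizePx.GetWidth()  / sizeMM.GetWidth();
>> double yPPI = 25.4 * sizePx.GetHeight() / sizeMM.GetHeight();
>>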
>> I did read a while ago that on Windows there is a requirement, for an
>> app to get into the App Store, that it notice high density (i.e.
>> phone/tablet) screens and scale text appropriately, but I can't
>> remember the details.
>>
>> However, if we decide we have no interest in matching the size of text
>> to an actual metric like mm, then we should remove that from the
>> documentation. The problem then becomes: what do we put in its place?
>> Having something like "plplot uses an arbitrary and undocumented font
>> scaling which is approximately the same for all drivers but is not
>> linked to any physical dimension of the plot" obviously isn't
>> satisfactory, but that is basically what we are doing.
>
>
> Good point.  The semiempirical approach tries to avoid opening this
> can of worms, but you are likely correct that we should clean up this
> mess.  Since using actual DPI values for each monitor used by our
> users is impossible for the reasons I stated, the documented goal
> (although the documentation should state this is not yet realized for
> all device drivers) should probably be that PLplot does pixel-to-metric
> conversions based on the xdpi and ydpi specified by the user (via the
> existing -dpi command-line option or equivalent plspage call) or by
> the default dpi value set by PLplot.
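>
> For example, a user who wanted 100 DPI could already request that with
> something like the following call before plinit (a sketch; if I recall
> the plspage semantics correctly, arguments passed as zero are simply
> left at their defaults):
>
> // Set xdpi = ydpi = 100.; leave the page length and offset alone.
> plspage( 100., 100., 0, 0, 0, 0 );
> plinit();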
>
> To implement that last point, I suggest that the default value (note,
> using 90 for that default seems OK to me) should be specified as
> follows:
>
> #define PLPLOT_DEFAULT_DPI 90
>
> in include/plplot.h and
>
> plsc->xdpi = PLPLOT_DEFAULT_DPI;
> plsc->ydpi = PLPLOT_DEFAULT_DPI;
>
> in plstrm_init (which initialises all streams).
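>
> A raster driver would then do its mm to pixel conversions with the
> stream values rather than a hard-coded constant of its own, roughly
> like this (a sketch only, written in terms of the driver's PLStream
> pointer pls):
>
> // Convert lengths in mm to device pixels using the stream's dpi.
> double x_pixels = x_mm * pls->xdpi / 25.4;
> double y_pixels = y_mm * pls->ydpi / 25.4;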
>
> If you agree with this approach for setting the default PLplot-wide
> plsc->xdpi and plsc->ydpi values, will you please push a commit to
> this effect to get this preliminary step out of the way?

This seems perfectly sensible to me. I will make this change and commit it.

Phil
