>> 1) Text size. As far as I am aware the text size in wxWidgets is
>> entirely correct - I haven't double-checked this today, but this was
>> the case last time I checked. The docs state that the text size
>> provided by the user is in mm, and I use a value of 90 pixels per inch
>> (converted to metric) to convert to pixels. Counting pixels on the
>> screen, the output appears to be as I intended. The pixels-per-inch
>> value I used is the same as that used by Inkscape, so the wxWidgets
>> output should have similarly sized text to the SVG output rendered by
>> Inkscape. However, as you can find online (e.g.
>>
>> http://stackoverflow.com/questions/1346922/svg-and-dpi-absolute-units-and-user-units-inkscape-vs-firefox-vs-imagemagick)
>>
>> different SVG renderers assume different pixels per inch in their
>> rendering. If other drivers are producing text 2.2 times the size of
>> that in wxWidgets, they must be assuming around 40 pixels per inch,
>> which is much lower than any other standard I have seen.
>>
>> Basically, where I am going with this is that we should set a standard
>> pixels-per-inch value for all raster drivers. It would potentially be
>> useful to have an API call to set the pixels per inch too. Then each
>> driver could be checked to make sure it conforms - perhaps using the
>> Peace example as the check.
>
> I am an old-school scientist on this issue on the one hand, and
> completely semiempirical on the other. :-)
>
> I. Old school:
>
> Pixels per mm (and therefore pixels per inch, or DPI) should not be an
> assumption. Your monitor has a definite resolution in pixels and a
> definite size in mm, so it has a known value of pixels per mm and DPI
> in both X and Y.
>
> However, there are some important caveats about this for the
> Linux/X11 case; see
> <http://unix.stackexchange.com/questions/75344/how-does-x-server-calculate-dpi>.
> That reference does not mention xrandr, which is one way to determine
> the actual DPI for the Linux/X11 case.
>
> irwin@raven> xrandr |grep " connected"
> HDMI-1 connected 1440x900+0+0 (normal left inverted right x axis y axis) 410mm x 256mm
>
> I verified those dimensions with a meter stick, since my monitor manual
> does not include the actual display dimensions.
>
> Those results imply the actual XDPI is
>
> irwin@raven> echo "1440*25.4/410" |bc -l
> 89.20975609756097560975
>
> and the YDPI is
>
> irwin@raven> echo "900*25.4/256" |bc -l
> 89.29687500000000000000
>
> So those appear to be reasonably reliable DPI numbers for my monitor,
> but as the discussion in the above URL states, the xdpyinfo
> application does _not_ provide reliable DPI information.
>
> irwin@raven> xdpyinfo | grep -E 'dimensions|resolution'
> dimensions: 1440x900 pixels (381x238 millimeters)
> resolution: 96x96 dots per inch
>
> That is, that app does report the correct pixel dimensions for the
> monitor, but arbitrarily fakes the size in millimeters so as to always
> return fake DPI values of 96! I have read web assertions that these
> fake values are simply an attempt to follow a fake DPI "standard" set
> by Microsoft Windows, but I don't know whether that is the case.
>
> So PLplot could in theory (at least for the Linux/X11 xrandr case)
> interrogate the operating system to determine the actual pixels-per-mm
> values and then use those values for all rendering that is done in
> units of mm. But just because the xrandr-derived value was correct in
> my case does not mean it will be correct in all cases, and there is
> also the question of what to do on other platforms. So I doubt very
> much that we will go there.
>
> Another approach might be to assume some reasonable DPI value and then
> attempt to enforce that for each of our device drivers.
But that might
> be tricky for device drivers like qt and cairo, which depend on
> external libraries that might have their own independent conventions
> about what DPI value to assume. So for now, and perhaps indefinitely,
> I think we are stuck with the semiempirical approach below.
>
> II. Semiempirical:
>
> The semiempirical approach is simply to multiply the character size
> for each device driver so that it agrees reasonably with the others on
> the second page of example 2. That is, the numbers on that page should
> be fairly closely inscribed by the inner square.
>
> I have just looked at that page for a wide variety of our devices, and
> they all pass this test more or less, except for wxwidgets, where the
> numbers are roughly half the size of the inner square. It is on that
> basis that I suggest you approximately double the character size to
> match more closely what the other devices do here.
>
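[Alan's xrandr-based DPI arithmetic above can be automated with a short shell sketch. This assumes the exact "connected" line layout shown in his transcript; real xrandr output can contain extra tokens (e.g. "primary"), so the field positions here are illustrative only:]

```shell
# Sketch: compute X/Y DPI from an xrandr "connected" line.
# Using the sample line from the transcript above; for real use, replace it with:
#   line=$(xrandr | grep " connected")
line='HDMI-1 connected 1440x900+0+0 (normal left inverted right x axis y axis) 410mm x 256mm'
dpi_out=$(printf '%s\n' "$line" | awk '{
    split($3, res, /[x+]/)            # res[1] = width in px, res[2] = height in px
    wmm = $(NF-2); hmm = $NF          # "410mm" and "256mm"
    sub(/mm/, "", wmm); sub(/mm/, "", hmm)
    if (wmm + 0 > 0 && hmm + 0 > 0)
        printf "%s: XDPI=%.2f YDPI=%.2f\n", $1, res[1] * 25.4 / wmm, res[2] * 25.4 / hmm
}')
echo "$dpi_out"    # -> HDMI-1: XDPI=89.21 YDPI=89.30
```

[This reproduces the 89.21/89.30 values from the bc calculations above.]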
Hi Alan,

In some ways I am fine with that. I think wxWidgets actually has a
cross-platform display class from which the display dimensions can be
grabbed. I did read a while ago about a Windows App Store requirement
that an app notice high-density (i.e. phone/tablet) screens and scale
text appropriately, but I can't remember the details.

However, if we decide we have no interest in matching the size of text
to an actual metric like mm, then we should remove that claim from the
documentation. The problem then becomes: what do we put in its place?
Something like "PLplot uses an arbitrary and undocumented font scaling
which is approximately the same for all drivers but is not linked to
any physical dimension of the plot" obviously isn't satisfactory, but
that is basically what we are doing.

One item of note - and this is based on memory, not recent checking -
I think that on initialisation the scale parameter for text is set as a
nonlinear function of the plot size. It could be that the disparities
are due to the plots being different sizes combined with this
nonlinearity, rather than either driver doing anything wrong.

Phil

------------------------------------------------------------------------------
_______________________________________________
Plplot-devel mailing list
Plplot-devel@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/plplot-devel