Public bug reported:

The recently merged set-hidpi-scale-factor branch provides extremely
valuable support for hidpi displays, but it introduces a few easily
fixed regressions because it anchors the scale factor too rigidly to a
140 DPI target.

This works well on a 15" Full HD or 4K screen, but sets a non-integer
scale factor on 14" and 17" Full HD or 4K screens.  Most toolkits don't
support non-integer scaling factors very well, and even if they did, it
would still result in a loss of sharpness.
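
For illustration, here is a minimal Python sketch of the arithmetic.
The 140 DPI target is from the branch itself; the exact formula and the
quantization to eighths are assumptions made for the sake of the
example, not the actual code from the merge:

    import math

    TARGET_DPI = 140.0  # the DPI the branch tries to simulate

    def scale_for_panel(px_w, px_h, diagonal_inches, step=1/8):
        """Guess at the factor the branch would pick: physical DPI over the
        140 DPI target, snapped to the nearest eighth (assumed quantization)."""
        dpi = math.hypot(px_w, px_h) / diagonal_inches
        return round((dpi / TARGET_DPI) / step) * step

    # 15.6" panels land on an integer, 14" and 17" panels do not:
    print(scale_for_panel(1920, 1080, 15.6))  # ~1.0
    print(scale_for_panel(1920, 1080, 14.0))  # ~1.125
    print(scale_for_panel(3840, 2160, 17.0))  # ~1.875
    print(scale_for_panel(3840, 2160, 15.6))  # ~2.0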

The displays mentioned above are perfectly usable at the nearest integer
scaling factor, and windows render much more cleanly when set to 1x or
2x than to 1.125x or 1.875x.  The ±1/8 difference in size compared to
1x or 2x isn't worth breaking pixel-perfect widget rendering.  It would
be a good idea to default to integer scaling factors whenever
reasonable.  Otherwise most people will end up with mangled widgets
unless they spend the time to set the scale factor themselves.
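
One possible heuristic, sketched below under the same assumptions as
the example above (the 1/8 tolerance and the helper name are
illustrative, not what the branch does): if the computed factor is
within 1/8 of a whole number, snap to that whole number instead of
keeping the fractional value.

    def prefer_integer_scale(raw_scale, tolerance=1/8):
        """Snap to the nearest whole scale factor when close enough;
        otherwise keep the fractional value."""
        nearest = round(raw_scale)
        if nearest >= 1 and abs(raw_scale - nearest) <= tolerance:
            return float(nearest)
        return raw_scale

    print(prefer_integer_scale(1.125))  # 1.0 -- 14" Full HD renders at 1x
    print(prefer_integer_scale(1.875))  # 2.0 -- 17" 4K renders at 2x
    print(prefer_integer_scale(1.5))    # 1.5 -- genuinely in-between stays fractional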

Here's the relevant merge:
https://code.launchpad.net/~kaihengfeng/unity/set-hidpi-scale-factor/+merge/299380

** Affects: unity (Ubuntu)
     Importance: Undecided
         Status: New


** Tags: hidpi

