I see lots of discussion has happened on that front already; sorry for being late to the party.

I'm currently wrestling with a Dell UP3214Q, which seems to exhibit all of the problems you refer to. There are many existing binary games out there that will most likely never be touched again; they are RandR clients that discover the monitor topology and then either maximize themselves to one of the displays or ask the window manager to do it for them. Some of these games ship their own libXrandr, some statically link against it, and some inline the protocol code to pick up bug fixes that weren't widespread in distros at the time they were packaged. It's pretty bad. Given that, the only viable solution seems to lie completely on the X server side of things.

It seems the X server should be responsible both for making sure the two halves are laid out properly (right now they're in reverse order) and for hiding them from clients. Such a database of quirks would ideally live in the server itself, where it can easily be leveraged by everyone shipping DDXen.
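Something like a static table keyed on EDID identity might be enough. A very rough sketch of what I have in mind (the structure and names are made up for illustration; nothing like this exists in the server today):

/* Hypothetical quirk table for tiled panels -- illustrative only,
 * not an existing X server structure. */
typedef struct {
    const char *vendor;   /* PNP ID from the EDID, e.g. "DEL" */
    const char *model;    /* monitor name descriptor, e.g. "DELL UP2414Q" */
    int         htiles;   /* tiles across */
    int         vtiles;   /* tiles down */
} TiledPanelQuirk;

static const TiledPanelQuirk tiled_panel_quirks[] = {
    { "DEL", "DELL UP2414Q", 2, 1 },
    { "DEL", "DELL UP3214Q", 2, 1 },
};

The server would consult the table when it sees both halves of a matching panel, lay them out in the right order, and group them for clients.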

Thanks,
 - Pierre-Loup

On 01/16/2014 11:11 AM, Aaron Plattner wrote:
So, monitor manufacturers are starting to make high-resolution displays that
consist of one LCD panel that appears to the PC as two.  The one I've got is a
Dell UP2414Q.  It shows up to the PC as two DisplayPort 1.2 multistream devices
that have the same GUID but different EDIDs.  There's an extension block in the
EDID that's supposed to indicate which side is the left tile and which is the
right, though I haven't tried to decode it yet.
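In case it helps, here's roughly what I expect locating that block to involve -- completely untested, and the tag values and offsets are just from my reading of DisplayID 1.3, so treat it as a sketch:

/* Rough sketch: walk the EDID extension blocks looking for a DisplayID
 * extension (tag 0x70) carrying a tiled-display-topology data block
 * (tag 0x12 in my reading of DisplayID 1.3).  Untested assumptions
 * until I actually decode the thing. */
#include <stdint.h>
#include <stddef.h>

static const uint8_t *find_tile_topology_block(const uint8_t *edid, size_t len)
{
    size_t nblocks = len / 128;

    for (size_t b = 1; b < nblocks; b++) {
        const uint8_t *ext = edid + b * 128;
        if (ext[0] != 0x70)              /* not a DisplayID extension */
            continue;

        /* The DisplayID section follows the extension tag: a 4-byte
         * header (revision, payload bytes, product type, extension
         * count), then a series of data blocks. */
        const uint8_t *blk = ext + 1 + 4;
        const uint8_t *end = blk + ext[2];

        while (blk + 3 <= end) {
            if (blk[0] == 0x12)          /* tiled display topology */
                return blk;
            blk += 3 + blk[2];           /* tag, revision, length, payload */
        }
    }
    return NULL;
}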

The problem, obviously, is that applications (including some games) treat the
two tiles as if they were completely separate monitors.  Windows maximize to
only half of the screen.  My question is, how do we want to deal with these
monitors?

As I see it, we have four options:

  1. Hide the presence of the second tile in the X server.

     Somehow combine the two tiles into a single logical output at the RandR
     protocol level.  The X server would be responsible for setting up the right
     configuration to drive the logical output using the correct physical
     resources.

  2. Hide the presence of the second tile in libXrandr.

     This would allow interested applications to query the real state of the
     hardware while also making it easier to do modesets on a per-monitor level
     rather than per-output.

     This could be exposed either as a new "simple" modeset API in libXrandr or
     similar, or by modifying the existing interface and adding a new interface
     to punch through the façade and get at the real configuration, for clients
     that care.  (A rough sketch of what such an API might look like follows
     this list.)

  3. Update every application that uses RandR 1.2.

     Applications can detect the presence of these monitors and deal with them
     themselves, but this might have poor adoption because programmers are a
     lazy bunch in general.

  4. Do nothing and hope the problem goes away.

     Hopefully, the situation with current 4k monitors is temporary and we'll
     start seeing single-tile 4k displays soon, fixing the problem "forever".
     Until we get 8k tiled displays.
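To make option 2 concrete, here's the flavor of "simple" API I have in mind. None of this exists today; the names and signatures are hypothetical:

/* Hypothetical libXrandr additions -- nothing here exists yet; this is
 * just the shape of an interface that hides tiling from ordinary clients. */
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

typedef struct {
    char      name[128];     /* logical monitor name, e.g. "DP-1+DP-2" */
    int       x, y;          /* position within the screen */
    unsigned  width, height; /* size of the combined tiles */
    int       noutput;       /* physical outputs backing this monitor */
    RROutput *outputs;
} XRRLogicalMonitor;

/* Enumerate logical monitors; a tiled panel shows up exactly once. */
XRRLogicalMonitor *XRRGetLogicalMonitors(Display *dpy, Window root, int *n);

/* Set a mode on a logical monitor; the library (or server) picks the
 * crtcs and per-tile modes needed to make it happen. */
Status XRRSetLogicalMonitorMode(Display *dpy, Window root,
                                XRRLogicalMonitor *monitor,
                                unsigned width, unsigned height);

/* Escape hatch for clients that want the real per-tile configuration. */
XRRScreenResources *XRRGetScreenResourcesReal(Display *dpy, Window root);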

If the real output devices are still exposed through the protocol, it might make
sense to add new properties describing their relative positions to make it
easier for clients to lay them out in the right order.  This might be useful for
power-walls too.

The problem with the first two options is that driving these monitors consumes
two crtcs.  If we present them as a single output to applications, they'll make
the assumption that they can just assign a single crtc to that output and use
the remaining crtcs for something else.  I suspect that deleting crtcs or
otherwise marking them as used as a side effect of setting a mode on a different
crtc is going to explode a lot of existing applications.
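For example, a lot of existing clients find a "spare" crtc with logic along these lines (a sketch of the common pattern, not taken from any particular application), and that logic silently breaks if assigning a mode to one output eats a second crtc behind the scenes:

/* Typical existing-client pattern: assume any crtc not currently driving
 * an output is free for the taking.  If one logical output secretly
 * consumes two crtcs, this picks a crtc that's already spoken for. */
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

static RRCrtc find_free_crtc(Display *dpy, Window root)
{
    XRRScreenResources *res = XRRGetScreenResources(dpy, root);
    RRCrtc free_crtc = None;

    for (int i = 0; i < res->ncrtc && free_crtc == None; i++) {
        XRRCrtcInfo *crtc = XRRGetCrtcInfo(dpy, res, res->crtcs[i]);
        if (crtc->noutput == 0 && crtc->mode == None)
            free_crtc = res->crtcs[i];
        XRRFreeCrtcInfo(crtc);
    }

    XRRFreeScreenResources(res);
    return free_crtc;
}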

~~

Regardless of what we do about the current crop of 4k monitors, one feature I
would like to add is a standardized OutputGroup property.  Multiple outputs with
the same value of OutputGroup should be considered (both by clients and the
server) as a single logical monitor.  This would affect the Xinerama information
presented by rrxinerama.c, and window managers that use RandR 1.2 directly would
be encouraged to consider output groups in their UI behavior.
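In client terms, the query side might look something like this -- purely illustrative, since neither the property nor its format exists yet; I'm assuming a single 32-bit CARDINAL group id per output:

/* Sketch of how a window manager might read the proposed OutputGroup
 * property.  The property and its format are only a proposal. */
#include <X11/Xlib.h>
#include <X11/Xatom.h>
#include <X11/extensions/Xrandr.h>

static long get_output_group(Display *dpy, RROutput output)
{
    Atom prop = XInternAtom(dpy, "OutputGroup", True);
    Atom actual_type;
    int actual_format;
    unsigned long nitems, bytes_after;
    unsigned char *data = NULL;
    long group = -1;    /* -1: not part of any group */

    if (prop == None)
        return -1;

    if (XRRGetOutputProperty(dpy, output, prop, 0, 1, False, False,
                             XA_CARDINAL, &actual_type, &actual_format,
                             &nitems, &bytes_after, &data) == Success &&
        actual_type == XA_CARDINAL && actual_format == 32 && nitems == 1)
        group = *(long *)data;

    if (data)
        XFree(data);
    return group;
}

Outputs returning the same non-negative group id would be treated as one logical monitor when computing maximization or fullscreen bounds.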

The X server could configure OutputGroups automatically when setting up the
initial configuration based on the presence of tiled displays, and clients could
reconfigure the groups at runtime to get different behavior if desired.
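Reconfiguring would amount to a property change on each output in the group, along these lines (again purely hypothetical):

/* Sketch: put two outputs into the same (proposed) OutputGroup. */
#include <X11/Xlib.h>
#include <X11/Xatom.h>
#include <X11/extensions/Xrandr.h>

static void set_output_group(Display *dpy, RROutput a, RROutput b, long group)
{
    Atom prop = XInternAtom(dpy, "OutputGroup", False);

    XRRChangeOutputProperty(dpy, a, prop, XA_CARDINAL, 32, PropModeReplace,
                            (unsigned char *)&group, 1);
    XRRChangeOutputProperty(dpy, b, prop, XA_CARDINAL, 32, PropModeReplace,
                            (unsigned char *)&group, 1);
}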

Does this sound like a reasonable extension to RandR?

