Re: xserver and splitting ultra-wide monitors
Hello Keith,

I have verified that the current implementation actually works exactly as suggested: initially, an auto-generated monitor exists; adding a user-defined monitor replaces it; removing the user-defined monitor restores the auto-generated one. I have refined the RandR spec regarding this, and I also changed the definition to allow more than one monitor per output. The change can be found at https://gitlab.freedesktop.org/xorg/proto/xorgproto/-/merge_requests/64

After I got some advice from Oliver on how to improve my PR, I updated the description there and added my findings as well as an estimate of what side effects this change could have. The PR to xorg-server is at https://gitlab.freedesktop.org/xorg/xserver/-/merge_requests/981

Please let me know if there is anything missing or if some more discussion is required to get this merged.

Kind regards,
Michael.
xserver and splitting ultra-wide monitors
Hello xorg developers,

sorry to ask this again, but I have not received any response yet. I have created a patch for xorg that allows multiple virtual monitors on one display: https://gitlab.freedesktop.org/xorg/xserver/-/merge_requests/981. This makes it possible to properly set up a split-screen environment on an ultra-wide screen. Along with a small gtk3 patch, it works very well, e.g. on Xfce. There is a discussion at https://gitlab.gnome.org/GNOME/gtk/-/issues/2013#note_1564376 where multiple users confirm that it works as expected. I am also using it daily on my workstation.

What must be done so that the patch can be merged into xorg?

Kind regards,
Michael.
Re: xserver and splitting ultra-wide monitors
Hello xorg developers,

I have created an MR for this feature at https://gitlab.freedesktop.org/xorg/xserver/-/merge_requests/981. What do you think about it? Could it be merged?

Kind regards,
Michael.

On 01.10.22 at 09:52, Michael Wyraz wrote:
[...]

On 29.09.22 at 22:41, Keith Packard wrote:
[...]
Re: xserver and splitting ultra-wide monitors
Hello xorg developers,

I have attached my patch to xserver that removes the "one-monitor-per-output" restriction. My approach and the result are described in https://gitlab.gnome.org/GNOME/gtk/-/issues/2013#note_1564376 . The result is amazing: split-screen works flawlessly with all of my desktop applications, much as if I had two monitors. I'd be happy if this could make it into xserver. Should I create a PR from the patch in GitLab?

Kind regards,
Michael.

On 29.09.22 at 22:41, Keith Packard wrote:
[...]
--- xorg-server-21.1.4/randr/rrmonitor.c.orig	2022-09-30 00:09:40.458561832 +0200
+++ xorg-server-21.1.4/randr/rrmonitor.c	2022-09-30 00:09:46.298529786 +0200
@@ -528,27 +528,6 @@
             continue;
         }
-        /* For each output in 'info.outputs', each one is removed from all
-         * pre-existing Monitors. If removing the output causes the list
-         * of outputs for that Monitor to become empty, then that
-         * Monitor will be deleted as if RRDeleteMonitor were called.
-         */
-
-        for (eo = 0; eo < existing->numOutputs; eo++) {
-            for (o = 0; o < monitor->numOutputs; o++) {
-                if (monitor->outputs[o] == existing->outputs[eo]) {
-                    memmove(existing->outputs + eo, existing->outputs + eo + 1,
-                            (existing->numOutputs - (eo + 1)) * sizeof (RROutput));
-                    --existing->numOutputs;
-                    --eo;
-                    break;
-                }
-            }
-            if (existing->numOutputs == 0) {
-                (void) RRMonitorDelete(client, screen, existing->name);
-                break;
-            }
-        }
 
         if (monitor->primary)
             existing->primary = FALSE;
     }
Re: xserver and splitting ultra-wide monitors
Hello Keith,

I'm glad that you are taking a look at this. I first tried to contact you directly using the Intel address in https://gitlab.freedesktop.org/xorg/doc/xorg-docs/-/blob/master/MAINTAINERS but that address no longer seems to be valid.

I absolutely agree with what you wrote about user-specified vs. automatically created monitors. But the code I'd like to remove does not deal with these; it is only about having two monitors on one output.

https://gitlab.freedesktop.org/xorg/xserver/-/blob/master/randr/rrmonitor.c

Lines 526-529 should not be touched. If a monitor with the same name is added elsewhere, the existing one should be removed (this avoids having multiple monitors with the same name).

Lines 537-551 should be removed. They only enforce that there is not more than one monitor on one output.

I removed these on my local xserver and ran the following test:

# xrandr --listmonitors
Monitors: 1
 0: +*DisplayPort-1 3440/820x1440/346+0+0  DisplayPort-1
# xrandr --setmonitor VIRTUAL-LEFT 1720/0x1440/1+0+0 DisplayPort-1
# xrandr --listmonitors
Monitors: 1
 0: VIRTUAL-LEFT 1720/0x1440/1+0+0  DisplayPort-1
# xrandr --setmonitor VIRTUAL-RIGHT 1720/0x1440/1+1720+0 DisplayPort-1
# xrandr --listmonitors
Monitors: 2
 0: VIRTUAL-LEFT 1720/0x1440/1+0+0  DisplayPort-1
 1: VIRTUAL-RIGHT 1720/0x1440/1+1720+0  DisplayPort-1
# xrandr --delmonitor VIRTUAL-LEFT
# xrandr --delmonitor VIRTUAL-RIGHT
# xrandr --listmonitors
Monitors: 1
 0: +*DisplayPort-1 3440/820x1440/346+0+0  DisplayPort-1

As you can see, everything works fine: I can create two monitors on one output, and the handling of the auto-generated monitor still works as expected. When I run

# xrandr --fb 3440x1441; xrandr --fb 3440x1440

afterwards (this triggers an update in the display manager), I can use both monitors on Xfce (at least I can drag windows across both and maximize on the second, but that's a different issue within gtk).

Kind regards,
Michael.
On 29.09.22 at 22:41, Keith Packard wrote:
[...]
Re: xserver and splitting ultra-wide monitors
Michael Wyraz writes:

> For the second monitor, the output must be set to "none" which is
> obviously wrong since it is connected to a device. The reason why it is
> set to "none" is some code in xserver that removes a monitor if another
> one is added to the same output:

That's actually required in the RandR spec:

    For each output in 'info.outputs', each one is removed from all
    pre-existing Monitors. If removing the output causes the list of
    outputs for that Monitor to become empty, then that Monitor will be
    deleted as if RRDeleteMonitor were called.

The notion of splitting one physical output into multiple virtual monitors was not considered when this extension was defined, which is why it doesn't work. I don't see any particular reason for *not* supporting your use case. However, there are subtleties here. We want to remove any automatically created 'Monitor' objects when mapping user-specified monitors to them, and we want to re-generate automatically generated 'Monitors' when all virtual monitors associated with an output are removed.

I think what we want is:

 * If no user-specified Monitors map to a particular Output, then
   automatically create a Monitor for that Output.
 * If any user-specified Monitors map to a particular Output, then remove
   the automatically generated Monitor for that Output.

In the current spec, there's no real separation between user-specified and automatically-generated Monitors; I think that would be necessary to make this work.

-- 
-keith
xserver and splitting ultra-wide monitors
Hello Xorg developers,

about a year ago I investigated how to split ultra-wide monitors on Linux into multiple virtual monitors. While this is basically possible with xserver and xrandr, a combination of different issues prevents using this feature on most desktops. Unfortunately, the issues still exist, so I'd like to start another attempt to finally get this working. The first step would be xserver and xrandr.

It is easy to split a monitor into two with xrandr:

xrandr --setmonitor VIRTUAL-LEFT 2560/0x1440/1+0+0 DP-4
xrandr --setmonitor VIRTUAL-RIGHT 2560/1x1440/1+2560+0 none

For the second monitor, the output must be set to "none", which is obviously wrong since it is connected to a device. The reason why it is set to "none" is some code in xserver that removes a monitor if another one is added to the same output: https://gitlab.freedesktop.org/mwyraz/xserver/-/merge_requests/1/diffs

I believe this code should be removed entirely, for two reasons. The first is that two virtual monitors on one output are perfectly valid. The other is that the function is about adding a monitor; it should not silently delete another monitor. If there is a reason to delete one monitor when another is added, such functionality should be part of the tooling (like in xrandr); this would also allow implementing different behaviours (e.g. a warning, or an option to enforce having multiple monitors on one output).

What do you think about this? Could you help to get this solved? If so, and this can be fixed, my next step would be to address the related issue in gtk (https://gitlab.gnome.org/GNOME/gtk/-/issues/2013) so that gtk-based desktops would work properly on a split ultra-wide screen.

Kind regards,
Michael.