Re: UDL device cannot get its own screen

2019-11-14 Thread Böszörményi Zoltán

On 2019-11-13 19:08, Böszörményi Zoltán wrote:

On 2019-11-13 18:25, Ilia Mirkin wrote:


Have you looked at setting AutoAddGPU to false? AutoBindGPU is too
late -- that's when you already have a GPU, whether to bind it to the
primary device (/screen/whatever). You need to not have a GPU in the
first place.


Yes, I tried AutoAddGPU=false. Then the UDL device was not set up at all.

What I noticed in debugging Xorg via GDB is that the UDL device was
matched to the wrong platform device in xf86platformProbeDev.
[long details deleted]


Now the xserver MR is at
https://gitlab.freedesktop.org/xorg/xserver/merge_requests/335
with a commit message explaining the same as I wrote in the previous mail.

I have also created 
https://gitlab.freedesktop.org/xorg/xserver/merge_requests/336
to fix the same issue when using BusID for the UDL device.

Best regards,
Zoltán Böszörményi

Re: UDL device cannot get its own screen

2019-11-13 Thread Böszörményi Zoltán

On 2019-11-13 18:25, Ilia Mirkin wrote:

On Wed, Nov 13, 2019 at 11:59 AM Böszörményi Zoltán  wrote:


On 2019-11-12 17:41, Ilia Mirkin wrote:

On Tue, Nov 12, 2019 at 9:23 AM Böszörményi Zoltán  wrote:

But no, all GPU devices (now only one, the UDL device) have screen 0
(a.k.a. DISPLAY=:0.0) set when AutoBindGPU is true:

[  2444.576] xf86AutoConfigOutputDevices: xf86NumScreens 2 xf86NumGPUScreens 1
[  2444.576] xf86AutoConfigOutputDevices: GPU #0 driver 'modesetting' 'modeset' scrnIndex 256 origIndex 257 pScreen->myNum 256 confScreen->screennum 0 confScreen->device->identifier 'Intel0' confScreen->device->screen 0 confScreen->device->myScreenSection->screennum 0 confScreen->device->myScreenSection->device->screen 0

Somehow, Option "Device" should ensure that the UDL device is actually
treated as a framebuffer that can be rendered into (i.e. to be modeset(2)
instead of modeset(Gn)) and it should be woken up automatically.

This is what AutoBindGPU is supposed to do, isn't it?

But instead of assigning to screen 0, it should be assigned to whatever
screen number it is configured as.

I know it's not a common use case nowadays, but I really want separate
fullscreen apps on their independent screens, including a standalone UDL
device, instead of having the latter as a Xinerama extension to some
other device.


If you see a "G", that means it's being treated as a GPU device, which
is *not* what you want if you want separate screens. You need to try
to convince things to *not* set the devices up as GPU devices, but
instead put each device (and each one of its heads, via ZaphodHeads)
on a separate device, which in turn will have a separate screen.


I created a merge request that finally made what I wanted possible.

https://gitlab.freedesktop.org/xorg/xserver/merge_requests/334

Now, no matter whether I use the intel or modesetting drivers for the
Device sections using the Intel heads, or whether AutoBindGPU is set to
true or false, the UDL device is correctly matched via its Option "kmsdev"
setting to the platform device's device path.

This patch seems to be a slight layering violation, but since the
modesetting driver is built into the Xorg server sources, the patch
may get away with it.


Have you looked at setting AutoAddGPU to false? AutoBindGPU is too
late -- that's when you already have a GPU, whether to bind it to the
primary device (/screen/whatever). You need to not have a GPU in the
first place.


Yes, I tried AutoAddGPU=false. Then the UDL device was not set up at all.
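
(For reference, the ServerFlags form of that option; the exact snippet
wasn't quoted in this thread, so this is a reconstruction:)

Section "ServerFlags"
    Option  "AutoAddGPU"  "false"
EndSection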

What I noticed in debugging Xorg via GDB is that the UDL device was
matched to the wrong platform device in xf86platformProbeDev.

xf86_platform_devices[0] == Intel, /dev/dri/card1, primary platform device
xf86_platform_devices[1] == UDL, /dev/dri/card0

devList[0] == "Intel0"
devList[1] == "Intel1"
devList[2] == "UDL"
devList[3] == "Intel2" (GPU device)

Since the device path was not matched and the PCI ID did not match
(after all, the UDL device is NOT PCI), this code was executed:

else {
    /* for non-seat0 servers assume first device is the master */
    if (ServerIsNotSeat0())
        break;

    if (xf86IsPrimaryPlatform(&xf86_platform_devices[j]))
        break;
}

So, probeSingleDevice() was called with xf86_platform_devices[0] and
devList[2], resulting in the UDL device being set up as a GPU device and
not as a framebuffer in its own right.

My MR modifies this so that if there is an explicit Option "kmsdev"
setting, it is matched first. The final else branch is only executed in
the default case with no explicit configuration.
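
In outline the idea looks like this (an illustrative sketch only, not
the exact MR diff; xf86FindOptionValue() and the attribs->path field
are existing xserver APIs, but the exact placement differs in the MR):

/* i indexes the configured Device sections (devList),
 * j the detected platform devices. */
const char *kmsdev = xf86FindOptionValue(devList[i]->options, "kmsdev");

for (j = 0; j < xf86_num_platform_devices; j++) {
    if (kmsdev) {
        /* Explicit configuration: match the DRM device path first
         * and never fall back to the primary device. */
        if (xf86_platform_devices[j].attribs->path &&
            !strcmp(kmsdev, xf86_platform_devices[j].attribs->path))
            break;
        continue;
    }

    /* Default case with no explicit configuration (the old behaviour). */
    if (ServerIsNotSeat0())
        break;

    if (xf86IsPrimaryPlatform(&xf86_platform_devices[j]))
        break;
}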

With this MR, the explicit configuration for UDL works regardless of
the AutoBindGPU value.

Best regards,
Zoltán Böszörményi

Re: UDL device cannot get its own screen

2019-11-13 Thread Böszörményi Zoltán

On 2019-11-12 17:41, Ilia Mirkin wrote:

On Tue, Nov 12, 2019 at 9:23 AM Böszörményi Zoltán  wrote:

But no, all GPU devices (now only one, the UDL device) have screen 0
(a.k.a. DISPLAY=:0.0) set when AutoBindGPU is true:

[  2444.576] xf86AutoConfigOutputDevices: xf86NumScreens 2 xf86NumGPUScreens 1
[  2444.576] xf86AutoConfigOutputDevices: GPU #0 driver 'modesetting' 'modeset' scrnIndex 256 origIndex 257 pScreen->myNum 256 confScreen->screennum 0 confScreen->device->identifier 'Intel0' confScreen->device->screen 0 confScreen->device->myScreenSection->screennum 0 confScreen->device->myScreenSection->device->screen 0

Somehow, Option "Device" should ensure that the UDL device is actually
treated as a framebuffer that can be rendered into (i.e. to be modeset(2)
instead of modeset(Gn)) and it should be woken up automatically.

This is what AutoBindGPU is supposed to do, isn't it?

But instead of assigning to screen 0, it should be assigned to whatever
screen number it is configured as.

I know it's not a common use case nowadays, but I really want separate
fullscreen apps on their independent screens, including a standalone UDL
device, instead of having the latter as a Xinerama extension to some
other device.


If you see a "G", that means it's being treated as a GPU device, which
is *not* what you want if you want separate screens. You need to try
to convince things to *not* set the devices up as GPU devices, but
instead put each device (and each one of its heads, via ZaphodHeads)
on a separate device, which in turn will have a separate screen.


I created a merge request that finally made what I wanted possible.

https://gitlab.freedesktop.org/xorg/xserver/merge_requests/334

Now, no matter whether I use the intel or modesetting drivers for the
Device sections using the Intel heads, or whether AutoBindGPU is set to
true or false, the UDL device is correctly matched via its Option "kmsdev"
setting to the platform device's device path.

This patch seems to be a slight layering violation, but since the
modesetting driver is built into the Xorg server sources, the patch
may get away with it.

Best regards,
Zoltán Böszörményi

Re: UDL device cannot get its own screen

2019-11-12 Thread Böszörményi Zoltán

On 2019-11-05 15:22, Böszörményi Zoltán wrote:

Hi,

On 2019-10-23 15:32, Ilia Mirkin wrote:

On Wed, Oct 23, 2019 at 2:41 AM Böszörményi Zoltán  wrote:


On 2019-10-22 22:57, Ilia Mirkin wrote:

On Tue, Oct 22, 2019 at 11:50 AM Böszörményi Zoltán  wrote:

Section "Device"
  Identifier  "UDL"
  Driver  "modesetting"
  Option  "kmsdev" "/dev/dri/card0"
  Screen  2
  Option  "Monitor-DVI-I-1-1" "DVI-I-1-1"


I think you have an extra -1 in here (and the monitor name doesn't
exist as per above). And I think the "Screen" index is wrong -- it's
not what one tends to think it is, as I recall. I think you can just
drop these lines though.


Without "Screen N" lines, all the outputs are assigned to :0
so the screen layout setup in the ServerLayout section is not
applied properly.



As I remember it, the Screen here is for ZaphodHeads-type
configurations, and it indicates which head of the underlying device
you're supposed to use. My suggestion was to only remove it here, not
everywhere.


Okay, but it still doesn't create a working setup.


So, finally I got back into experimenting with this.

I have read "man 5 xorg.conf" more closely and found option
GPUDevice in Section "Screen". Here's the configuration I came up
with but it still doesn't work:

==
Section "ServerFlags"
Option  "AutoBindGPU" "false"
EndSection

Section "Monitor"
Identifier  "Monitor-DP-1"
Option  "AutoServerLayout" "on"
Option  "Rotate" "normal"
EndSection

Section "Monitor"
Identifier  "Monitor-VGA-1"
Option  "AutoServerLayout" "on"
Option  "Rotate" "normal"
EndSection

Section "Monitor"
Identifier  "Monitor-HDMI-1"
Option  "AutoServerLayout" "on"
Option  "Rotate" "normal"
EndSection

Section "Monitor"
Identifier  "Monitor-DVI-I-1"
Option  "AutoServerLayout" "on"
Option  "Rotate" "normal"
EndSection

Section "Device"
Identifier  "Intel0"
Driver  "modesetting"
BusID   "PCI:0:2:0"
Screen  0
Option  "Monitor-DP-1" "DP-1"
Option  "ZaphodHeads" "DP-1"
EndSection

Section "Device"
Identifier  "Intel1"
Driver  "modesetting"
BusID   "PCI:0:2:0"
Screen  1
Option  "Monitor-VGA-1" "VGA-1"
Option  "ZaphodHeads" "VGA-1"
EndSection

Section "Device"
Identifier  "Intel2"
Driver  "modesetting"
BusID   "PCI:0:2:0"
Screen  2
Option  "Monitor-HDMI-1" "HDMI-1"
Option  "ZaphodHeads" "HDMI-1"
EndSection

Section "Device"
Identifier  "UDL"
Driver  "modesetting"
Option  "kmsdev" "/dev/dri/card0"
# Suggestion of Ilia Mirkin: Don't set Screen here
#Screen 2
Option  "Monitor-DVI-I-1" "DVI-I-1"
Option  "ZaphodHeads" "DVI-I-1"
EndSection

Section "Screen"
Identifier  "SCREEN"
Option  "AutoServerLayout" "on"
Device  "Intel0"
Monitor "Monitor-DP1"
SubSection  "Display"
Modes   "1024x768"
Depth   24
EndSubSection
EndSection

Section "Screen"
Identifier  "SCREEN1"
Option  "AutoServerLayout" "on"
Device  "Intel1"
Monitor "Monitor-VGA1"
SubSection  "Display"
Modes   "1024x768"
Depth   24
EndSubSection
EndSection

Section "Screen"
Identifier  "SCREEN2"
Option  "AutoServerLayout" "on"
Device  "UDL"
GPUDevice   "Intel2"
Monitor "Monitor-DVI-I-1"

Re: [PATCH v3 0/4] drm/udl: Convert to SHMEM

2019-11-08 Thread Böszörményi Zoltán

Hi!

On 2019-11-08 8:36, Thomas Zimmermann wrote:

Hi Böszörményi


FYI, it's Zoltan, as it's my first name. :-)


On 2019-11-07 16:10, Böszörményi Zoltán wrote:
Have you tried to increase the buffer size? There's a command-line
option to control this setting. [1]


Yes, I did; it didn't help. I used swiotlb=49152 (96MB) and
swiotlb=65536 (128MB) vs the default 32768 (64MB).

This parameter controls the _number of slab slots_, not a single
contiguous size, as I read in the kernel sources.

With swiotlb=65536 I get the same error and dmesg lines:

[   97.671898] udl 2-1.2:1.0: swiotlb buffer is full (sz: 1978368 bytes), total 65536 (slots), used 28 (slots)
[  107.477068] udl 2-1.2:1.0: swiotlb buffer is full (sz: 524288 bytes), total 65536 (slots), used 584 (slots)
[  108.311947] udl 2-1.2:1.0: swiotlb buffer is full (sz: 2080768 bytes), total 65536 (slots), used 0 (slots)
[  110.330940] udl 2-1.2:1.0: swiotlb buffer is full (sz: 3031040 bytes), total 65536 (slots), used 56 (slots)
[  111.102755] udl 2-1.2:1.0: swiotlb buffer is full (sz: 3145728 bytes), total 65536 (slots), used 1536 (slots)


It turned out, it's the combination of IO_TLB_SEGSIZE * (1 << IO_TLB_SHIFT) that limits the size of a single mapping.
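
For context, the arithmetic (based on the kernel sources of that era,
not quoted in the original mail): IO_TLB_SHIFT is 11, so one slot is
2 KiB, which is why swiotlb=32768 equals 64 MiB in total. IO_TLB_SEGSIZE
is 128, so a single contiguous mapping is capped at 128 * 2 KiB = 256 KiB,
far below the 2-3 MiB buffers in the messages above. That is why
enlarging the total pool didn't help.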

Best regards,
Zoltán



Best regards
Thomas

[1] https://wiki.gentoo.org/wiki/IOMMU_SWIOTLB



[  133.320410] udl 2-1.2:1.0: overflow 0x0001199e4000+2211840 of DMA mask  bus mask 0
[  133.320424] WARNING: CPU: 0 PID: 739 at kernel/dma/direct.c:35 report_addr+0x3e/0x70
[  133.320425] Modules linked in: 8021q garp mrp stp llc intel_rapl_msr intel_rapl_common x86_pkg_temp_thermal intel_powerclamp coretemp kvm_intel snd_hda_codec_hdmi kvm snd_hda_codec_realtek snd_hda_codec_generic ledtrig_audio snd_hda_intel snd_hda_codec iTCO_wdt elo irqbypass iTCO_vendor_support intel_cstate snd_hda_core intel_uncore snd_hwdep intel_rapl_perf snd_pcm pcspkr i2c_i801 snd_timer e1000e snd joydev lpc_ich soundcore ip6t_REJECT nf_reject_ipv6 nf_log_ipv6 ip6table_filter ip6_tables nf_log_ipv4 nf_log_common xt_LOG xt_limit xt_multiport xt_conntrack iptable_nat nf_nat xt_connmark nf_conntrack nf_defrag_ipv6 nf_defrag_ipv4 libcrc32c iptable_mangle i915 udl i2c_algo_bit drm_kms_helper syscopyarea sysfillrect sysimgblt fb_sys_fops drm crc32_pclmul crc32c_intel serio_raw video
[  133.320463] CPU: 0 PID: 739 Comm: Xorg Not tainted 5.3.8 #1
[  133.320465] Hardware name: TOSHIBA 4852E70/Intel H61 Express Chipset, BIOS XBKT200 01/04/2017
[  133.320467] EIP: report_addr+0x3e/0x70
[  133.320470] Code: 00 89 4d f8 85 d2 74 44 8b 0a 8b 5a 04 ba fe ff ff ff 39 ca ba 00 00 00 00 19 da 73 17 80 3d 9c 16 14 d0 00 0f 84 24 09 00 00 <0f> 0b 8b 5d fc c9 c3 8d 76 00 8b 90 5c 01 00 00 0b 90 58 01 00 00
[  133.320472] EAX:  EBX:  ECX: f5b89e00 EDX: 0007
[  133.320473] ESI:  EDI: ecf3921c EBP: ec56bcf4 ESP: ec56bce8
[  133.320475] DS: 007b ES: 007b FS: 00d8 GS: 00e0 SS: 0068 EFLAGS: 00213286
[  133.320476] CR0: 80050033 CR2: b7236020 CR3: 2c72a000 CR4: 000406f0
[  133.320477] Call Trace:
[  133.320484]  dma_direct_map_page+0x158/0x180
[  133.320487]  dma_direct_map_sg+0x4f/0xa0
[  133.320564]  i915_gem_map_dma_buf+0x1b8/0x1d0 [i915]
[  133.320568]  dma_buf_map_attachment+0x4f/0x90
[  133.320572]  udl_gem_prime_import+0x43/0x12a [udl]
[  133.320607]  drm_gem_prime_fd_to_handle+0x97/0x180 [drm]
[  133.320625]  ? drm_gem_prime_export+0xa0/0xa0 [drm]
[  133.320642]  ? drm_gem_prime_import+0x20/0x20 [drm]
[  133.320658]  ? drm_prime_handle_to_fd_ioctl+0x70/0x70 [drm]
[  133.320673]  drm_prime_fd_to_handle_ioctl+0x2f/0x50 [drm]
[  133.320689]  drm_ioctl_kernel+0x8f/0xd0 [drm]
[  133.320706]  drm_ioctl+0x21c/0x3c0 [drm]
[  133.320721]  ? drm_prime_handle_to_fd_ioctl+0x70/0x70 [drm]
[  133.320726]  ? file_modified+0x30/0x30
[  133.320728]  ? file_update_time+0xfe/0x130
[  133.320731]  ? page_add_file_rmap+0x72/0xd0
[  133.320734]  ? fault_dirty_shared_page.isra.122+0x6d/0xb0
[  133.320750]  ? drm_version+0x80/0x80 [drm]
[  133.320753]  do_vfs_ioctl+0x9a/0x6c0
[  133.320757]  ksys_ioctl+0x56/0x80
[  133.320760]  sys_ioctl+0x16/0x20
[  133.320763]  do_fast_syscall_32+0x82/0x1c7
[  133.320766]  entry_SYSENTER_32+0x9f/0xf2
[  133.320768] EIP: 0xb7f84a75
[  133.320770] Code: e8 1c 00 00 00 89 d3 eb cf 8d 74 26 00 b8 40 42 0f 00 eb b5 8b 04 24 c3 8b 1c 24 c3 8b 3c 24 c3 90 51 52 55 89 e5 0f 34 cd 80 <5d> 5a 59 c3 90 90 90 90 8d 76 00 58 b8 77 00 00 00 cd 80 90 8d 76
[  133.320772] EAX: ffda EBX: 000c ECX: c00c642e EDX: bff26be0
[  133.320773] ESI: 0221ad20 EDI: c00c642e EBP: 000c ESP: bff26b88
[  133.320775] DS: 007b ES: 007b FS:  GS: 0033 SS: 007b EFLAGS: 00203296
[  133.320777] ---[ end trace 18cd4f77716f2f5f ]---

With your drm-next and your patch set, the call trace is obviously
different:

[   37.486584] udl 2-1.2:1.0: swiotlb buffer is full (sz: 536576 bytes), total 32768 (slots), used 1536 (slots)
[   37.486591] udl 2-1.2:1.0: overflow 0x00011a47d000+536576 of DMA mask  bus mask 0
[   37.486598] --

Re: [PATCH v3 0/4] drm/udl: Convert to SHMEM

2019-11-07 Thread Böszörményi Zoltán

On 2019-11-07 16:10, Böszörményi Zoltán wrote:

what's the trick to actually enable the UDL device?

With 5.3.8, 5.3.9 or 5.4-rc6 + drm-next and this patchset, I get this:
[long messages]


I didn't mention that the system is 32-bit, using a PAE kernel.
Is that a problem for swiotlb?

The machine has this CPU:

model name  : Intel(R) Celeron(R) CPU G540 @ 2.50GHz

Best regards,
Zoltán Böszörményi

Re: [PATCH v3 0/4] drm/udl: Convert to SHMEM

2019-11-07 Thread Böszörményi Zoltán

Hi,

On 2019-11-07 10:43, Thomas Zimmermann wrote:

Udl's GEM implementation is mostly SHMEM and we should attempt to
replace it with the latter.

Patches #1 and #2 update udl to simplify the conversion. In patch #3
the udl code is being replaced by SHMEM. The GEM object's mmap() and
free_object() functions are wrappers around their SHMEM counterparts.
For mmap() we fix-up the page-caching flags to distinguish between
write-combined and cached access. For free(), we have to unmap the
buffer's mapping that has been established by udl's fbdev code.
Patch #4 removes the obsolete udl code.

The patchset has been tested by running the fbdev console, X11 and
Weston on a DisplayLink adapter.


what's the trick to actually enable the UDL device?

With 5.3.8, 5.3.9 or 5.4-rc6 + drm-next and this patchset, I get this:

# DISPLAY=:0 xrandr --listproviders
Providers: number : 2
Provider 0: id: 0x76 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 2 outputs: 3 associated providers: 0 name:modesetting
Provider 1: id: 0x41 cap: 0x2, Sink Output crtcs: 1 outputs: 1 associated providers: 0 name:modesetting


# DISPLAY=:0 xrandr --setprovideroutputsource 0x41 0x76

# DISPLAY=:0 xrandr --listproviders
Providers: number : 2
Provider 0: id: 0x76 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 2 outputs: 3 associated providers: 1 name:modesetting
Provider 1: id: 0x41 cap: 0x2, Sink Output crtcs: 1 outputs: 1 associated providers: 1 name:modesetting


# DISPLAY=:0 xrandr
Screen 0: minimum 320 x 200, current 1024 x 768, maximum 8192 x 8192
VGA-1 connected primary 1024x768+0+0 (normal left inverted right x axis y axis) 376mm x 301mm
   1024x768  75.03*+  60.00
   1280x1024 60.02 +
   1152x864  75.00
   832x624   74.55
   800x600   75.00    60.32
   640x480   75.00    59.94
   720x400   70.08
HDMI-1 disconnected (normal left inverted right x axis y axis)
DP-1 connected 1024x768+0+0 (normal left inverted right x axis y axis) 304mm x 228mm
   1024x768  60.00*+
DVI-I-1-1 connected (normal left inverted right x axis y axis)
   1024x768  75.03 +  60.00
   1920x1080 60.00 +
   1680x1050 59.88
   1280x1024 75.02    60.02
   1440x900  74.98    59.90
   1280x720  60.00
   800x600   75.00    60.32
   640x480   75.00    72.81    66.67    59.94
   720x400   70.08
  1024x768 (0x42) 78.750MHz +HSync +VSync
        h: width  1024 start 1040 end 1136 total 1312 skew    0 clock  60.02KHz
        v: height  768 start  769 end  772 total  800           clock  75.03Hz
  1280x1024 (0x46) 108.000MHz +HSync +VSync
        h: width  1280 start 1328 end 1440 total 1688 skew    0 clock  63.98KHz
        v: height 1024 start 1025 end 1028 total 1066           clock  60.02Hz
  1024x768 (0x4a) 65.000MHz -HSync -VSync
        h: width  1024 start 1048 end 1184 total 1344 skew    0 clock  48.36KHz
        v: height  768 start  771 end  777 total  806           clock  60.00Hz
  800x600 (0x4b) 49.500MHz +HSync +VSync
        h: width   800 start  816 end  896 total 1056 skew    0 clock  46.88KHz
        v: height  600 start  601 end  604 total  625           clock  75.00Hz
  800x600 (0x4c) 40.000MHz +HSync +VSync
        h: width   800 start  840 end  968 total 1056 skew    0 clock  37.88KHz
        v: height  600 start  601 end  605 total  628           clock  60.32Hz
  640x480 (0x4d) 31.500MHz -HSync -VSync
        h: width   640 start  656 end  720 total  840 skew    0 clock  37.50KHz
        v: height  480 start  481 end  484 total  500           clock  75.00Hz
  640x480 (0x50) 25.175MHz -HSync -VSync
        h: width   640 start  656 end  752 total  800 skew    0 clock  31.47KHz
        v: height  480 start  490 end  492 total  525           clock  59.94Hz
  720x400 (0x51) 28.320MHz -HSync +VSync
        h: width   720 start  738 end  846 total  900 skew    0 clock  31.47KHz
        v: height  400 start  412 end  414 total  449           clock  70.08Hz

# DISPLAY=:0 xrandr --output DVI-I-1-1 --mode 1024x768 --right-of DP-1
xrandr: Configure crtc 2 failed

Even after the last command, my monitor says "no signal" from the UDL (DL-195)
device, and dmesg has a kernel warning now:

[  133.320404] udl 2-1.2:1.0: swiotlb buffer is full (sz: 2211840 bytes), total 32768 (slots), used 0 (slots)
[  133.320410] udl 2-1.2:1.0: overflow 0x0001199e4000+2211840 of DMA mask  bus mask 0

[  133.320424] WARNING: CPU: 0 PID: 739 at kernel/dma/direct.c:35 report_addr+0x3e/0x70
[  133.320425] Modules linked in: 8021q garp mrp stp llc intel_rapl_msr intel_rapl_common x86_pkg_temp_thermal intel_powerclamp coretemp kvm_intel snd_hda_codec_hdmi kvm snd_hda_codec_realtek snd_hda_codec_generic ledtrig_audio snd_hda_intel snd_hda_codec iTCO_wdt elo irqbypass iTCO_vendor_support intel_cstate snd_hda_core intel_uncore snd_hwdep intel_rapl_perf snd_pcm pcspkr i2c_i801 snd_timer e1000e snd joydev lpc_ich soundcore ip6t_REJECT 

Re: UDL device cannot get its own screen

2019-11-05 Thread Böszörményi Zoltán

Hi,

On 2019-10-23 15:32, Ilia Mirkin wrote:

On Wed, Oct 23, 2019 at 2:41 AM Böszörményi Zoltán  wrote:


On 2019-10-22 22:57, Ilia Mirkin wrote:

On Tue, Oct 22, 2019 at 11:50 AM Böszörményi Zoltán  wrote:

Section "Device"
  Identifier  "UDL"
  Driver  "modesetting"
  Option  "kmsdev" "/dev/dri/card0"
  Screen  2
  Option  "Monitor-DVI-I-1-1" "DVI-I-1-1"


I think you have an extra -1 in here (and the monitor name doesn't
exist as per above). And I think the "Screen" index is wrong -- it's
not what one tends to think it is, as I recall. I think you can just
drop these lines though.


Without "Screen N" lines, all the outputs are assigned to :0
so the screen layout setup in the ServerLayout section is not
applied properly.



As I remember it, the Screen here is for ZaphodHeads-type
configurations, and it indicates which head of the underlying device
you're supposed to use. My suggestion was to only remove it here, not
everywhere.


Okay, but it still doesn't create a working setup.

In the meantime I switched to the GIT version of Xorg, but
it didn't make a difference (for now).

I decided to start from a mostly clean configuration, whatever default
settings or drivers are used. It's modesetting across the board.

The configuration file has just this:

=
Section "ServerFlags"
Option "AutoBindGPU"  "true/false"
EndSection
=

Xorg.0.log has these lines (same as 1.20.4), regardless of the AutoBindGPU 
setting:

... all 3 monitors' EDID data is read and okay ...

[   879.136] (II) modeset(G0): Damage tracking initialized
[   879.140] (II) modeset(0): Damage tracking initialized
[   879.140] (II) modeset(0): Setting screen physical size to 609 x 270

modeset(G0) is UDL and there is no "screen physical size" set for it.

xrandr shows 3 outputs for Intel and 1 for UDL. UDL doesn't have an active mode 
set:

# DISPLAY=:0 xrandr
Screen 0: minimum 320 x 200, current 2304 x 1024, maximum 8192 x 8192
VGA-1 connected primary 1280x1024+0+0 (normal left inverted right x axis y axis) 376mm x 301mm
   1280x1024 60.02*+
   1152x864  75.00
   1024x768  75.03    60.00
   832x624   74.55
   800x600   75.00    60.32
   640x480   75.00    59.94
   720x400   70.08
HDMI-1 disconnected (normal left inverted right x axis y axis)
DP-1 connected 1024x768+1280+0 (normal left inverted right x axis y axis) 304mm x 228mm
   1024x768  60.00*+
DVI-I-1-1 connected (normal left inverted right x axis y axis)
   1920x1080 60.00 +
   1680x1050 59.88
   1280x1024 75.02    60.02
   1440x900  74.98    59.90
   1280x720  60.00
   1024x768  75.03    60.00
   800x600   75.00    60.32
   640x480   75.00    72.81    66.67    59.94
   720x400   70.08
  1280x1024 (0x44) 108.000MHz +HSync +VSync
        h: width  1280 start 1328 end 1440 total 1688 skew    0 clock  63.98KHz
        v: height 1024 start 1025 end 1028 total 1066           clock  60.02Hz
  1024x768 (0x48) 78.750MHz +HSync +VSync
        h: width  1024 start 1040 end 1136 total 1312 skew    0 clock  60.02KHz
        v: height  768 start  769 end  772 total  800           clock  75.03Hz
  1024x768 (0x49) 65.000MHz -HSync -VSync
        h: width  1024 start 1048 end 1184 total 1344 skew    0 clock  48.36KHz
        v: height  768 start  771 end  777 total  806           clock  60.00Hz
  800x600 (0x4a) 49.500MHz +HSync +VSync
        h: width   800 start  816 end  896 total 1056 skew    0 clock  46.88KHz
        v: height  600 start  601 end  604 total  625           clock  75.00Hz
  800x600 (0x4b) 40.000MHz +HSync +VSync
        h: width   800 start  840 end  968 total 1056 skew    0 clock  37.88KHz
        v: height  600 start  601 end  605 total  628           clock  60.32Hz
  640x480 (0x4c) 31.500MHz -HSync -VSync
        h: width   640 start  656 end  720 total  840 skew    0 clock  37.50KHz
        v: height  480 start  481 end  484 total  500           clock  75.00Hz
  640x480 (0x4f) 25.175MHz -HSync -VSync
        h: width   640 start  656 end  752 total  800 skew    0 clock  31.47KHz
        v: height  480 start  490 end  492 total  525           clock  59.94Hz
  720x400 (0x50) 28.320MHz -HSync +VSync
        h: width   720 start  738 end  846 total  900 skew    0 clock  31.47KHz
        v: height  400 start  412 end  414 total  449           clock  70.08Hz
#

I can't actually set a mode for it manually:

# DISPLAY=:0 xrandr --output DVI-I-1-1 --mode 1280x1024
xrandr: Configure crtc 2 failed

So, for some reason, while the output is there and the monitor is
detected via EDID, there is no CRTC assigned to it.

With AutoBindGPU=false, the UDL device is not actually activated,
despite the lines present about modeset(

Re: UDL device cannot get its own screen

2019-10-23 Thread Böszörményi Zoltán

On 2019-10-23 9:42, Pekka Paalanen wrote:

On Tue, 22 Oct 2019 17:50:21 +0200
Böszörményi Zoltán  wrote:


Hi,

I have the below configuration for an Intel based POS system that,
while it advertises 3 outputs (DP1, VGA1 and HDMI1 with xf86-video-intel),
has only two usable ones: DP1 for the built-in touchscreen and VGA1 for
the external VGA connector.

I wanted to use a USB DisplayLink device as the 3rd output, with each
of the three outputs using its own Screen number, i.e. :0.0, :0.1 and :0.2.


...


The third observation is that while I am using this configuration below,
so the UDL device should be assigned to :0.2 (and active!), it is really
assigned to :0[.0] as an inactive output. See that there's no "*" indicator
set for any of the supported modes on DVI-I-1-1.

How can I set up 3 different Screens correctly for 3 separate fullscreen
applications?

I am using Xorg 1.20.4 patched with the "autobind GPUs to the screen"
patch from Dave Airlie that at least wakes up the UDL device and makes
it visible without extra magic with providers/sinks.


Hi,

for your specific use case, auto-bind is exactly what you do not want.
So drop the patch or (since the patch is in upstream master already)
use the option it adds to stop auto-binding.


With Option "AutoBindGPU" "false" in effect (equivalent of backing the
patch out) the UDL device does not get assigned to ANY of the screens.

I want it to have its own :0.2, but that doesn't happen.




Thanks,
pq




Re: UDL device cannot get its own screen

2019-10-23 Thread Böszörményi Zoltán

On 2019-10-22 22:57, Ilia Mirkin wrote:

On Tue, Oct 22, 2019 at 11:50 AM Böszörményi Zoltán  wrote:


Hi,

I have the below configuration for an Intel based POS system that,
while it advertises 3 outputs (DP1, VGA1 and HDMI1 with xf86-video-intel),
has only two usable ones: DP1 for the built-in touchscreen and VGA1 for
the external VGA connector.

I wanted to use a USB DisplayLink device as the 3rd output, with each
of the three outputs using its own Screen number, i.e. :0.0, :0.1 and :0.2.

[...]

How can I set up 3 different Screens correctly for 3 separate fullscreen
applications?

I am using Xorg 1.20.4 patched with the "autobind GPUs to the screen"
patch from Dave Airlie that at least wakes up the UDL device and makes
it visible without extra magic with providers/sinks.


If it's being treated as a GPU, that's your first problem for this
kind of setup. You should see modeset(2) in your logs, but I suspect
you're seeing modeset(G0) (the "G" indicates "GPU").


modeset(2) is the unconnected HDMI-1 output advertised by the Intel chip.
modeset(G0) is UDL.





[...]
Section "Monitor"
 Identifier  "DVI-I-1-1"


The others are Monitor-*, this one isn't. You probably want this to be
DVI-I-1, as noted below. I guess you get the extra -1 from seeing it
as a slaved GPU's output in your current configuration.


Indeed. Fixed.




 Option  "AutoServerLayout" "on"
 Option  "Rotate" "normal"
EndSection

[...]


Section "Device"
 Identifier  "UDL"
 Driver  "modesetting"
 Option  "kmsdev" "/dev/dri/card0"
 Screen  2
 Option  "Monitor-DVI-I-1-1" "DVI-I-1-1"


I think you have an extra -1 in here (and the monitor name doesn't
exist as per above). And I think the "Screen" index is wrong -- it's
not what one tends to think it is, as I recall. I think you can just
drop these lines though.


Without "Screen N" lines, all the outputs are assigned to :0
so the screen layout setup in the ServerLayout section is not
applied properly.

I have read Dave Airlie's patch (that has been accepted into Xorg
1.21.0) more closely, and it indeed binds UDL to DISPLAY=:0.
I think it needs a followup patch so that it uses the screen ID
specified in the Device section.




EndSection

[...]

Section "ServerLayout"
 Identifier  "LAYOUT"
 Option  "AutoServerLayout" "on"
 Screen  0 "SCREEN"
 Screen  1 "SCREEN1" RightOf "SCREEN"
 Screen  2 "SCREEN2" RightOf "SCREEN1"
EndSection

Best regards,
Zoltán Böszörményi

UDL device cannot get its own screen

2019-10-22 Thread Böszörményi Zoltán

Hi,

I have the below configuration for an Intel based POS system that,
while it advertises 3 outputs (DP1, VGA1 and HDMI1 with xf86-video-intel),
has only two usable ones: DP1 for the built-in touchscreen and VGA1 for
the external VGA connector.

I wanted to use a USB DisplayLink device as the 3rd output, with each
of the three outputs using its own Screen number, i.e. :0.0, :0.1 and :0.2.

The first observation is that I can't seem to use the Intel DDX
driver in conjunction with the modesetting DDX that the UDL device
uses. The symptom is that two modesetting outputs are initialized,
one for UDL and one for the disconnected HDMI1 Intel output. At least
the X server no longer crashes as it did with Xorg 1.19.x on a similar attempt.

The second is that when the modesetting driver is used, the Intel outputs
are renamed from VGA1 to VGA-1 and so on, i.e. the outputs get an extra
"-" between the output type and the number, so porting the original
config from intel to modesetting needed extra typing.

The third observation is that while I am using this configuration below,
so the UDL device should be assigned to :0.2 (and active!), it is really
assigned to :0[.0] as an inactive output. See that there's no "*" indicator
set for any of the supported modes on DVI-I-1-1.

How can I set up 3 different Screens correctly for 3 separate fullscreen
applications?

I am using Xorg 1.20.4 patched with the "autobind GPUs to the screen"
patch from Dave Airlie that at least wakes up the UDL device and makes
it visible without extra magic with providers/sinks.

# DISPLAY=:0 xrandr
Screen 0: minimum 320 x 200, current 1024 x 768, maximum 8192 x 8192
DP-1 connected primary 1024x768+0+0 (normal left inverted right x axis y axis) 304mm x 228mm
   1024x768  60.00*+
DVI-I-1-1 connected (normal left inverted right x axis y axis)
   1024x768  75.03 +  60.00
   1920x1080 60.00 +
   1680x1050 59.88
   1280x1024 75.02    60.02
   1440x900  74.98    59.90
   1280x720  60.00
   800x600   75.00    60.32
   640x480   75.00    72.81    66.67    59.94
   720x400   70.08
  1024x768 (0x4a) 65.000MHz -HSync -VSync
        h: width  1024 start 1048 end 1184 total 1344 skew    0 clock  48.36KHz
        v: height  768 start  771 end  777 total  806           clock  60.00Hz

# cat /etc/X11/xorg.conf.d/videocard.conf
Section "Monitor"
Identifier  "Monitor-DP-1"
Option  "AutoServerLayout" "on"
Option  "Rotate" "normal"
EndSection

Section "Monitor"
Identifier  "Monitor-HDMI-1"
Option  "AutoServerLayout" "on"
Option  "Rotate" "normal"
EndSection

Section "Monitor"
Identifier  "Monitor-VGA-1"
Option  "AutoServerLayout" "on"
Option  "Rotate" "normal"
EndSection

Section "Monitor"
Identifier  "DVI-I-1-1"
Option  "AutoServerLayout" "on"
Option  "Rotate" "normal"
EndSection

Section "Device"
Identifier  "Intel0"
Driver  "modesetting"
Option  "kmsdev" "/dev/dri/card1"
Screen  0
Option  "Monitor-DP1" "DP-1"
Option  "ZaphodHeads" "DP-1"
EndSection

Section "Device"
Identifier  "Intel1"
Driver  "modesetting"
Option  "kmsdev" "/dev/dri/card1"
Screen  1
Option  "Monitor-VGA-1" "VGA-1"
Option  "ZaphodHeads" "VGA-1"
EndSection

# Intentionally not referenced in ServerLayout below
Section "Device"
Identifier  "Intel2"
Driver  "modesetting"
Option  "kmsdev" "/dev/dri/card1"
Option  "Monitor-HDMI-1" "HDMI-1"
Option  "ZaphodHeads" "HDMI-1"
EndSection

Section "Device"
Identifier  "UDL"
Driver  "modesetting"
Option  "kmsdev" "/dev/dri/card0"
Screen  2
Option  "Monitor-DVI-I-1-1" "DVI-I-1-1"
EndSection

Section "Screen"
Identifier  "SCREEN"
Option  "AutoServerLayout" "on"
Device  "Intel0"
Monitor "Monitor-DP-1"
SubSection  "Display"
Modes   "1024x768"
Depth   24
EndSubSection
EndSection

Section "Screen"
Identifier  "SCREEN1"
Option  "AutoServerLayout" "on"
Device  "Intel1"
Monitor "Monitor-VGA-1"
SubSection  "Display"
Modes   "1024x768"
Depth   24
EndSubSection
EndSection

Section "Screen"
Identifier  "SCREEN2"
Option  "AutoServerLayout" "on"
Device  "UDL"
Monitor "Monitor-DVI-I-1-1"
SubSection  "Display"
Modes   "1024x768"
Depth   24
EndSubSection
EndSection

Section "ServerLayout"
Identifier  

How to activate UDL DRM from xorg.conf?

2018-10-09 Thread Böszörményi Zoltán

Hi,

I need to set up a three-monitor system on an Intel based
touchscreen POS machine using separate X screens, so standalone
fullscreen apps can be shown on each screen.

Using the Intel driver with two Section "Device" entries with Driver
"intel" and different ZaphodHeads settings works, so I can have
DISPLAY=:0 and DISPLAY=:0.1 already.
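
For illustration, the working two-screen part looks roughly like this
(a reconstruction, since the actual config is not quoted in this mail;
note the intel driver names outputs without a dash, e.g. DP1, VGA1):

Section "Device"
    Identifier  "Intel0"
    Driver      "intel"
    Option      "ZaphodHeads" "DP1"
    Screen      0
EndSection

Section "Device"
    Identifier  "Intel1"
    Driver      "intel"
    Option      "ZaphodHeads" "VGA1"
    Screen      1
EndSection

Each Device is then referenced from its own Section "Screen", and the
two Screens are placed in Section "ServerLayout", yielding :0.0 and :0.1.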

What I can't seem to achieve is using a third Section "Device" that
would use the modesetting driver over the udl / udldrmfb kernel
driver. Setting

 Option "kmsdev" "/dev/dri/card1"

does not help. Also, /dev/fb1 does not appear after udldrmfb is
registered during boot. /dev/fb0 for i915 exists.
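
A quick way to double-check which DRM node belongs to which driver
(illustrative shell, not from the original mail; the output is an
example, and the numbering depends on probe order):

# for c in /sys/class/drm/card?; do
>     echo "$c -> $(basename $(readlink $c/device/driver))"
> done
/sys/class/drm/card0 -> i915
/sys/class/drm/card1 -> udl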

When Xorg attempts to use both the two Device sections with the
Intel driver AND the third Device section with the modesetting
driver, the modesetting driver also tries to drive the Intel chip
and the result is a crash in the server.

When I try with an empty configuration, Xorg picks up all
Intel outputs and it also sees the XRandr provider from UDL
but no output is activated for it.

The thing is that I need a fixed configuration with pre-set
resolutions for all monitors in xorg.conf so I can also
pre-calculate the touchscreen transformation matrix in advance.
And the reason I need a pre-set touchscreen configuration
(as opposed to using xinput map-to-output after X has started)
is that as soon as the monitor is turned off and on or the USB
connection is replugged, the xinput setup is lost from the X server.
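
A hedged sketch of pre-setting that matrix from xorg.conf: for an output
of size w x h at offset (x,y) in a W x H desktop, the matrix is
[ w/W 0 x/W ; 0 h/H y/H ; 0 0 1 ]. Assuming the touchscreen covers the
leftmost 1024x768 third of a 3072-pixel-wide, 768-high layout, and
matching the device by a hypothetical product string:

Section "InputClass"
    Identifier  "touchscreen matrix"
    MatchIsTouchscreen  "on"
    MatchProduct        "Elo"   # hypothetical match string
    # c0 = 1024/3072 = 0.333333, x-offset 0, full height
    Option  "TransformationMatrix" "0.333333 0 0 0 1 0 0 0 1"
EndSection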

So, how can I convince the modesetting driver to activate the
UDL KMS mode from xorg.conf?

I am using kernel 4.18.11 and Xorg 1.19.6.

Thanks in advance,
Zoltán Böszörményi
