On 30-06-2025 10:21, Laurent Pinchart wrote:
On Mon, Jun 30, 2025 at 10:03:16AM +0200, Mike Looijmans wrote:
On 27-06-2025 20:19, Laurent Pinchart wrote:
On Fri, Jun 27, 2025 at 04:50:46PM +0200, Mike Looijmans wrote:
XRGB8888 is the default mode that Xorg will want to use. Add support
for it to the ZynqMP DisplayPort driver so that applications can use
32-bit framebuffers. Without this, the X server fails to start unless
an xorg.conf is provided that sets DefaultDepth to 16.
Signed-off-by: Mike Looijmans <mike.looijm...@topic.nl>
---
drivers/gpu/drm/xlnx/zynqmp_disp.c | 5 +++++
1 file changed, 5 insertions(+)
diff --git a/drivers/gpu/drm/xlnx/zynqmp_disp.c b/drivers/gpu/drm/xlnx/zynqmp_disp.c
index 80d1e499a18d..501428437000 100644
--- a/drivers/gpu/drm/xlnx/zynqmp_disp.c
+++ b/drivers/gpu/drm/xlnx/zynqmp_disp.c
@@ -312,6 +312,11 @@ static const struct zynqmp_disp_format avbuf_gfx_fmts[] = {
.buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888,
.swap = true,
.sf = scaling_factors_888,
+ }, {
+ .drm_fmt = DRM_FORMAT_XRGB8888,
+ .buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_RGBA8888,
+ .swap = true,
+ .sf = scaling_factors_888,
I'm afraid that's not enough. There's a crucial difference between
DRM_FORMAT_ARGB8888 (already supported by this driver) and
DRM_FORMAT_XRGB8888: for the latter, the 'X' component must be ignored.
The graphics layer is blended on top of the video layer, and the blender
uses both a global alpha parameter and the alpha channel of the graphics
layer for 32-bit RGB formats. This will lead to incorrect operation when
the 'X' component is not set to full opacity.
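To make the difference concrete, here is a minimal userspace sketch of the problem. It is a simplified model of per-pixel blending with made-up helper names, not the actual hardware behaviour: if the blender honours the undefined 'X' byte of an XRGB8888 pixel as alpha, the blend result is wrong, while forcing full opacity gives the intended output.

```c
#include <assert.h>
#include <stdint.h>

/* Simplified stand-in for a per-pixel blend:
 * out = gfx * a/255 + video * (1 - a/255).
 * Hypothetical helpers; not the real blender implementation. */
static uint8_t blend_channel(uint8_t gfx, uint8_t video, uint8_t alpha)
{
	return (uint8_t)((gfx * alpha + video * (255 - alpha)) / 255);
}

/* Blend one 0xAARRGGBB graphics pixel over a video pixel. For
 * XRGB8888 the top byte is undefined, so honouring it is a bug;
 * force_opaque models the behaviour the format actually requires. */
static uint32_t blend_pixel(uint32_t gfx, uint32_t video, int force_opaque)
{
	uint8_t a = force_opaque ? 0xff : (uint8_t)(gfx >> 24);
	uint8_t r = blend_channel((gfx >> 16) & 0xff, (video >> 16) & 0xff, a);
	uint8_t g = blend_channel((gfx >> 8) & 0xff, (video >> 8) & 0xff, a);
	uint8_t b = blend_channel(gfx & 0xff, video & 0xff, a);

	return ((uint32_t)a << 24) | ((uint32_t)r << 16) |
	       ((uint32_t)g << 8) | b;
}
```

With an opaque red XRGB pixel whose X byte happens to be 0, honouring the alpha makes the graphics plane vanish entirely; forcing opacity yields the expected solid red over the video plane.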
I spent a few hours digging through the source code and what I could
find in the TRM and register maps, but there isn't enough information
there to explain how the blender works. The obvious "XRGB"
implementation would be to just disable the blender.
That won't work when using global alpha unfortunately :-(
What I found from experimenting so far is that the alpha component is
ignored anyway while the video path isn't active. So as long as one
isn't using the video blending path, the ARGB and XRGB modes behave
identically.
Correct, *if* global alpha is set to fully opaque, then you can disable
the blender. That could confuse userspace though: enabling the graphics
plane with XRGB would work, and then enabling the video plane with
global alpha would fail.
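That userspace-confusion scenario can be sketched as an atomic-check
style constraint. This is a plain C model with invented names, not the
real driver code, assuming XRGB is implemented by disabling blending
and that the hardware offers no bit to ignore the graphics alpha:

```c
#include <assert.h>
#include <stdbool.h>

/* Hypothetical constraint model: without a hardware bit to ignore the
 * graphics plane's alpha channel, an XRGB framebuffer is only safe
 * while the video (blending) path is inactive. */
enum gfx_format { FMT_ARGB8888, FMT_XRGB8888 };

static bool commit_allowed(enum gfx_format fmt, bool video_plane_active)
{
	if (fmt != FMT_XRGB8888)
		return true;
	/* The alpha component is ignored while the video path is
	 * inactive, so ARGB and XRGB behave identically here. */
	return !video_plane_active;
}
```

Under this model, enabling the graphics plane with XRGB succeeds, but a
later commit that also enables the video plane has to be rejected,
which is exactly the surprise for userspace described above.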
Guess I'll need assistance from AMD/Xilinx to completely implement the
XRGB modes.
The blender can ignore the alpha channel of the graphics plane for
formats that have no alpha channel. It would be nice if there was a bit
to force that behaviour for 32-bit RGB too, but I couldn't find any :-(
It's worth asking though.
Yes, my problem exactly.
(For our application, this patch is sufficient, as it solves issues
such as X11 not starting, OpenGL not working, and horrendously slow
scaling performance.)
}, {
.drm_fmt = DRM_FORMAT_RGBA8888,
.buf_fmt = ZYNQMP_DISP_AV_BUF_FMT_NL_GFX_ABGR8888,
--
Mike Looijmans
System Expert
TOPIC Embedded Products B.V.
Materiaalweg 4, 5681 RJ Best
The Netherlands
T: +31 (0) 499 33 69 69
E: mike.looijm...@topic.nl
W: www.topic.nl