As somebody who writes software for neuroscience research, I would find
this new property very useful.
Even if the 'link bpc' is not the perfectly accurate answer in presence of
dithering or display stream compression, I think it could provide an idea
about the minimum precision available, so a DRM client can at least make
sure its minimum requirements are met. E.g., a 'link bpc' of 10 would at
least guarantee 10 bpc, and effectively a bit more if spatial dithering is
applied in addition. That said, I don't have practical experience with
the effects of DSC, as I don't have any suitable hardware. AFAIK it is
not truly lossless, only (usually, and by design) perceptually lossless.
It would be great to have some property that informs clients whether DSC
is active, or allows some control over that.
Also, as somebody who has spent many hours of his life hunting down
sysfs or debugfs files that report such numbers, I'd like to do less of
that in the future. Those files are usually differently named, at
different paths, in different formats, or missing entirely, depending on
kernel version and GPU configuration.
In this field of application it is often very important to know about the
actual precision of "what goes out of the computer", e.g., visual test
stimuli. Scientists use different methods of varying rigor to verify
their experimental stimulation setups, including photometers,
colorimeters, etc. to measure the actual light emitted by a display.
Knowing whether things work, and where in the pipeline from app to
photon they break or degrade if they don't, is useful, and the more the
software can help with this, or warn about problems, the better for us.
Even if Wayland compositors don't pick this up quickly, if it is a DRM
connector property, I think it would be accessible under a native X11
X-Server via RandR output properties -- and my kind of application still
heavily relies on native X11, as the Wayland ecosystem currently is not
ready for the more demanding or non-trivial use cases in this field.
Also, such properties are read-accessible to non-root, non-DRM-master
clients, so applications like mine could read the property even with
leased DRM connectors (Vulkan/WSI/display, OpenGL/EGL/drm), or probably
even under a running Wayland desktop if the Wayland protocol lacks the
means to do so.
In the scenarios my app is used in, the app often knows what an optimal
'max bpc' setting, or expected value of such a 'link bpc' property,
would be, so it can use them to reconfigure things (under X11 RandR, or
as a DRM master), adapt to the situation, or at least warn users if they
are about to ruin their experimental data collection, and possibly guide
them a bit in troubleshooting.
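To make that flow concrete, here is a minimal sketch (in C) of the check
an application could do after fetching the connector's properties. The
property list is mocked as a plain array so the sketch is self-contained;
the property name "link bpc" is from this series, while the helper names
and the 10 bpc requirement are purely illustrative:

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Mock of the (name, value) pairs an app would collect by walking
 * drmModeObjectGetProperties() and resolving each property id with
 * drmModeGetProperty(). Purely illustrative. */
struct prop {
	const char *name;
	uint64_t value;
};

/* Look up a property by name; returns 0 on success, -1 if absent. */
static int get_prop(const struct prop *props, size_t n,
		    const char *name, uint64_t *out)
{
	for (size_t i = 0; i < n; i++) {
		if (strcmp(props[i].name, name) == 0) {
			*out = props[i].value;
			return 0;
		}
	}
	return -1;
}

/* Returns 1 if the achieved link bpc meets the app's requirement,
 * 0 if the link degraded below it, -1 if the property is absent
 * (e.g. an older kernel without this series). */
static int check_link_bpc(const struct prop *props, size_t n,
			  uint64_t required_bpc)
{
	uint64_t link_bpc;

	if (get_prop(props, n, "link bpc", &link_bpc) < 0)
		return -1;
	if (link_bpc < required_bpc) {
		fprintf(stderr,
			"warning: link degraded to %llu bpc, need %llu\n",
			(unsigned long long)link_bpc,
			(unsigned long long)required_bpc);
		return 0;
	}
	return 1;
}
```

On a real system, the array would be filled from
drmModeObjectGetProperties(fd, connector_id, DRM_MODE_OBJECT_CONNECTOR)
plus drmModeGetProperty() for each property id, which works without
being DRM master.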
But I could imagine regular desktop use cases, where a Wayland compositor
can somewhat know what good minimum values for 'link bpc' would be, and
maybe adapt, or give the user a hint about potentially degraded quality,
and what to do about it ("Check your cables", "Reduce video resolution
or refresh rate", "Run with fewer displays", ...).
Similar to Nicolas' RGB 10 bpc vs. YUV 10 bpc example for video
playback: while all of these are critical for apps like mine, or other
pro apps depending on color quality, I'd assume a Wayland compositor
could apply the same constraints, even if the worst-case desktop
scenario is only an underwhelmed user whose HDR videos don't look as
spiffy as they hoped.
- An HDR10 display mode on a true HDR sink implies one really wants a
'link bpc' of at least 10, especially given the strong nonlinearity of
EOTFs like the Perceptual Quantizer, or things will look poor. In a
scientific research setting that would not just be a bummer; such
degradation would be an absolute show stopper, something one wants to
fix, be it by checking/swapping cables, or maybe by selecting a video
mode with lower bandwidth requirements, etc.
- The same is true for wide color gamut (WCG) color spaces, where one
wants more than 8 bits to resolve the larger color volume finely enough
for good results.
- I'd also assume, or hope, that a Wayland client asking for a
fullscreen (= possibly direct-scanout-capable) RGB10, fp16, or even
RGBA16 framebuffer implies to the compositor that the client really
wants to get 10 bpc or even 12+ bpc out of the display connector. So a
too-low link bpc would be a reason to notify the user.
Excuse the verbose reply, but at least from my corner of applications this
would have a big thumbs up.
Thanks,
-mario
On Fri, Mar 20, 2026 at 7:09 PM Nicolas Frattaroli
<[email protected]> wrote:
> On Friday, 20 March 2026 15:32:37 Central European Standard Time Michel
> Dänzer wrote:
> > On 3/19/26 13:28, Nicolas Frattaroli wrote:
> > > This series adds a new "link bpc" DRM property. It reflects the display
> > > link's actual achieved output bits per component, considering any
> > > degradation of the bit depth done by drivers for bandwidth or other
> > > reasons. The property's value is updated during an atomic commit, which
> > > is also when it fires an uevent if it changed to let userspace know.
> > >
> > > There's a weston implementation at [1] which makes use of this new
> > > property to warn when a user's requested bpc could not be reached.
> > >
> > > [1]: https://gitlab.freedesktop.org/wayland/weston/-/merge_requests/1850
> >
> > I see no description of a real-world use case, either in this series
> > or in the weston MR, beyond logging a message when the "link bpc" &
> > "max bpc" property values don't match. They are not expected to match
> > in general, so I have a hard time seeing the usefulness of that.
>
> Hello,
>
> these are valid concerns. The problem being addressed is related to
> userspace being able to detect whether the link has degraded due to,
> say, a sketchy cable.
>
> This patch started out as a method of forcing the output link's BPC
> value to a certain value, but this is not desirable. The max bpc
> property is already used to restrict the link's bpc due to sketchy
> hardware that advertises a higher max bpc than it can actually
> achieve.
>
> This adds the other side of the equation, where userspace isn't
> necessarily keen on blindly accepting the combination of output
> link parameters the kernel degraded to. This allows userspace to
> detect that an explicitly chosen value it tried did not work, and
> try again with a different color format/VRR/bpc/etc.
>
> A particular real-world use case is for playback of video content.
> When playing back YUV 4:2:0 10-bit video content in a full-screen
> setting, having RGB 10-bit degrade to YUV 4:2:0 10-bit rather than
> RGB 8-bit is more desirable. However, this is a tradeoff only
> userspace knows to make; the kernel doesn't necessarily know that
> the framebuffer it has been handed as RGB 10-bit is secretly just
> a video player's playback of YUV 4:2:0 10-bit content. As for
> the property that lets userspace actually set the output color
> format, that's a separate series of mine.
>
> I agree that the weston implementation isn't a great showcase,
> but it's actually supposed to compare link bpc with an explicitly
> set max bpc config value, not the property value. The config value
> exists to request a certain bpc.
>
> > Moreover, there's no description of what exactly the "link bpc" property
> > value means, e.g. vs things like DSC or dithering, or how a compositor /
> > user would determine which value they need / want under given
> > circumstances.
>
> I agree that I should've expanded on this after splitting it out of the
> HDMI patch. It's the output BPC as HDMI understands it. That means DSC
> is not a factor. I don't know if any display protocols do dithering at
> the protocol level; I only know some monitors dither internally, which
> isn't something that can be detected.
>
> > In summary, I'm skeptical that this will be useful in practice in the
> > current form. I do see potential for spurious bug reports based on the
> > "link bpc" property having the "wrong" value though.
>
> Kind regards,
> Nicolas Frattaroli
>