On 03/02/2018 11:29 PM, Ilia Mirkin wrote:
On Fri, Mar 2, 2018 at 5:16 PM, Mario Kleiner
<mario.kleiner...@gmail.com> wrote:
On 03/01/2018 07:21 PM, nouveau-requ...@lists.freedesktop.org wrote:


Date: Thu, 1 Mar 2018 08:15:55 -0500
From: Ilia Mirkin <imir...@alum.mit.edu>
To: Mario Kleiner <mario.kleiner...@gmail.com>
Cc: nouveau <nouveau@lists.freedesktop.org>
Subject: Re: [Nouveau] [PATCH] Fix colormap handling at screen depth 30.

NVLoadPalette is pretty hard-coded to 256. I haven't looked at what
all xf86HandleColormaps does, but it seems pretty suspicious. Also


It's also pretty dead :). NVLoadPalette is never used, because nouveau
hooks up the .gamma_set function in xf86CrtcFuncsRec, so xf86HandleColormaps
ignores the NVLoadPalette pointer. IOW, dead code that can be removed. I'll
send a follow-up patch once this one is in. We have similar dead code in
intel-ddx and modesetting-ddx, where it only serves to confuse the reader.
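
Roughly, the shape of the thing is like this (just a sketch I typed up to
illustrate the point, with made-up names, not the actual nouveau code):

#include "xf86.h"
#include "xf86Crtc.h"
#include "xf86cmap.h"

/* Dead in practice: with .gamma_set hooked up below, the server never
 * calls this. */
static void
ExampleLoadPalette(ScrnInfoPtr pScrn, int numColors, int *indices,
                   LOCO *colors, VisualPtr pVisual)
{
}

/* This is what the server actually calls, with the merged ramp. */
static void
ExampleGammaSet(xf86CrtcPtr crtc, CARD16 *red, CARD16 *green,
                CARD16 *blue, int size)
{
    /* Push the 'size'-entry ramp to the kernel, e.g. via
     * drmModeCrtcSetGamma(). */
}

static const xf86CrtcFuncsRec example_crtc_funcs = {
    /* ... other hooks elided ... */
    .gamma_set = ExampleGammaSet,
};

static Bool
ExampleInitColormap(ScreenPtr pScreen)
{
    /* Because .gamma_set is wired up above, the loadPalette argument
     * here never gets invoked. */
    return xf86HandleColormaps(pScreen, 256, 8, ExampleLoadPalette, NULL,
                               CMAP_PALETTED_TRUECOLOR |
                               CMAP_RELOAD_ON_MODE_SWITCH);
}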

note that the kernel currently only exposes a 256-sized LUT to
userspace, even for 10bpc modes.


Yes, but that doesn't matter. In xbgr2101010 mode, the GPU seems to properly
interpolate between the 256 (or 257) hw LUT slots, as far as my measurements
go. The X server maintains separate color palettes, per-X-screen xf86vidmode
gamma LUTs and per-CRTC RandR gamma LUTs, and merges them together to
produce the final 256-slot hw LUT for the kernel, up/downsampling as needed.
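
For what it's worth, my mental model of what the hw does with a 10 bit
framebuffer value and a 256/257 slot LUT is simple linear interpolation
between neighbouring slots, something like the following sketch (pure
guesswork about the exact hw math, only meant to illustrate why 256 slots
don't visibly quantize a 10 bpc ramp):

#include <stdint.h>

/* 'lut' has 257 entries so slot 256 can serve as the upper
 * interpolation endpoint for the topmost input values. */
static uint16_t
lut_lookup_10bit(const uint16_t lut[257], uint16_t val10 /* 0..1023 */)
{
    unsigned idx  = val10 >> 2; /* upper 8 bits select the slot, 0..255 */
    unsigned frac = val10 & 3;  /* lower 2 bits blend towards the next slot */

    return (uint16_t)((lut[idx] * (4 - frac) + lut[idx + 1] * frac) / 4);
}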

OK, so even if you're passing 1024 to xf86HandleColormaps, gamma_set
still only gets called with a 256-entry LUT? If so, that works nicely
here, but is not intuitive :)

Yes. Lots of remapping in the server; I get dizzy every time I look at it, and forget almost immediately how it all fits together when I don't. Anyway, the final downsampling from the 1024-slot to the 256-slot hw LUT happens in xf86RandR12CrtcComputeGamma(), see

https://cgit.freedesktop.org/xorg/xserver/commit/?id=b5f9fcd50a999a00128c0cc3f6e7d1f66182c9d5

for the latest. I'll propose that one for cherry-picking into the server-1.19 branch as well.
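
The core of that downsampling is nothing fancy btw., conceptually just
something like this (illustration only, not the actual server code):

#include <stdint.h>

/* Fold a larger ramp (e.g. the 1024 entries coming out of the server's
 * merged palette/gamma tables at depth 30) into a smaller hw LUT (e.g.
 * the 256 entries the kernel exposes) by taking every
 * (in_size / out_size)'th entry. Assumes in_size is an integer multiple
 * of out_size. */
static void
downsample_ramp(const uint16_t *in, int in_size,
                uint16_t *out, int out_size)
{
    int step = in_size / out_size;
    int i;

    for (i = 0; i < out_size; i++)
        out[i] = in[i * step];
}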


So adapting the values for xf86HandleColormaps() is about properly sizing
those internal palettes and LUTs, to avoid out-of-bounds segfaults or loss
of precision somewhere in the whole multi-step remapping procedure because
one of the server-internal tables is a bottleneck with too few slots.

This variant is the one that avoids the crashes, and also the visual
artifacts I observed at depth 30, at least on Tesla GPUs.
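
The gist of the change is just to stop hard-coding 256 slots / 8 bits and
size the call after the actual per-channel depth instead, roughly along
these lines (a sketch of the idea, not the literal diff):

/* 256 palette slots with 8 significant bits at depth 24,
 * 1024 slots with 10 significant bits at depth 30. */
int colors  = 1 << pScrn->rgbBits;
int sigbits = pScrn->rgbBits;

if (!xf86HandleColormaps(pScreen, colors, sigbits, NVLoadPalette, NULL,
                         CMAP_PALETTED_TRUECOLOR |
                         CMAP_RELOAD_ON_MODE_SWITCH))
    return FALSE;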

One weird thing I still observed, though, is that in depth 30 xbgr2101010
scanout mode nouveau used dithering when I tried to output a linear
intensity ramp, even though I had disabled dithering via the xrandr
property. But that is an unrelated problem.

It's sending 8 bpc data out to the screen, unless you're using a DP
monitor (and you'd probably need a Kepler GPU for that anyway).

I think a long time ago I tested 10 bpc output over VGA with the proprietary driver on a GeForce 8800, and the current README for the NVIDIA blob says it can do 10 bpc over VGA and DisplayPort, but only dithering over DVI and HDMI. I think I read somewhere that at least Pascal can do some deep color output over HDMI as well, which makes sense for HDMI-based HDR-10 support.

Although setting dither to off should still kill the dithering...
probably some experimentation required.

Yes. I can only test the 10 bpc -> 8 bpc dithered output here, as I don't have a native 10 bpc panel available at the moment. But measurements with the photometer confirmed 10 bpc, i.e. the photometer is fooled by the dithering just as well as the human eye, as it integrates over an area of pixels. The dithering is good enough to at least test the whole stack.

I also have a "Datapixx" device which can capture the 8 bpc DVI-D signal and send it back to the host, to find out what actually goes out of the DVI connector, so I can check things like whether the gamma ramps do what they should, or whether dithering is active. However, the device is restricted to 8 bpc, so I wouldn't be able to look at a true 10 bpc stream (DP/HDMI deep color etc.).

-mario


I'm pretty sure I could tell that it was dithering for me on Kepler.
When I added support for 10bpc dither, the dither effect went away
(and it looked no different than the 8bpc gradient). I didn't try
explicitly disabling dithering -- I'll try that tonight and see what
happens (except I've got a Fermi plugged in now).

   -ilia

