Re: [Intel-gfx] Fwd: State of 10 bits/channel?

2010-07-23 Thread Chris Wilson
On Thu, 22 Jul 2010 17:34:09 -0400, Andrew Lutomirski <l...@mit.edu> wrote:
> [resend b/c I used the wrong from address last time]
>
> I have a 10 bit/channel monitor (DisplayPort) which works quite nicely
> in 8 bit mode.  I saw some patches from Peter Clifton related to 10
> bit support go in a while ago.

[snip]
>
> What is the hardware capable of, and what is the state of affairs
> right now?

AIUI, all the intel chips are capable of driving 30-bit displays. The
later generations are more flexible and faster with greater bit depth for
LUTs etc. The output driver support should (in theory) be complete, but it
hasn't been well tested due to scarcity of suitable displays.

> I'm running 2.6.35-rc4+ with a hacked up xf86-video-intel
> with this patch:

Please submit this as a format-patch to the list for inclusion.

> With that patch and DefaultDepth 30, I get a mostly working system,
> but there's no direct rendering (it seems to be disabled because DRI
> runs only at depths 16 and 24), and title bars on gnome-terminal
> draw incorrectly.

Hmm, the render paths should work on 10bpc surfaces. Please do file bugs
for the failures.

-- 
Chris Wilson, Intel Open Source Technology Centre


Re: [Intel-gfx] Fwd: State of 10 bits/channel?

2010-07-23 Thread Andrew Lutomirski
On Fri, Jul 23, 2010 at 8:26 AM, Chris Wilson <ch...@chris-wilson.co.uk> wrote:
> On Thu, 22 Jul 2010 17:34:09 -0400, Andrew Lutomirski <l...@mit.edu> wrote:
>> [resend b/c I used the wrong from address last time]
>>
>> I have a 10 bit/channel monitor (DisplayPort) which works quite nicely
>> in 8 bit mode.  I saw some patches from Peter Clifton related to 10
>> bit support go in a while ago.

> [snip]
>>
>> What is the hardware capable of, and what is the state of affairs
>> right now?
>
> AIUI, all the intel chips are capable of driving 30-bit displays. The
> later generations are more flexible and faster with greater bit depth for
> LUTs etc. The output driver support should (in theory) be complete, but it
> hasn't been well tested due to scarcity of suitable displays.

OK.  I'll try to figure out how to program the output driver.  Do we
want to drive outputs at 30 bits even when the primary surface is 24
bits?  (Benefit: 10-bit LUTs.  Disadvantage: Could break existing
setups.)
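
(To make the LUT benefit concrete, here's a toy calculation, nothing
driver-specific: push an 8-bit ramp through a made-up calibration curve
and count how many distinct levels survive when the LUT output is
quantized to 8 vs. 10 bits.  The 2.2/2.4 exponent is just a stand-in for
a real ICC curve.)

#include <math.h>
#include <stdio.h>

/* Count distinct output levels after quantizing a calibration curve
 * to out_bits of LUT output precision. */
static int distinct_levels(int out_bits)
{
	int max_out = (1 << out_bits) - 1;
	int count = 0, prev = -1;

	for (int in = 0; in < 256; in++) {
		double v = pow(in / 255.0, 2.2 / 2.4);	/* stand-in calibration curve */
		int q = (int)(v * max_out + 0.5);	/* quantize to LUT output depth */
		if (q != prev)
			count++;
		prev = q;
	}
	return count;
}

int main(void)
{
	/* link with -lm */
	printf("8-bit LUT output:  %d distinct levels\n", distinct_levels(8));
	printf("10-bit LUT output: %d distinct levels\n", distinct_levels(10));
	return 0;
}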

Expect lots of questions from me when I run into things that are
unclear or wrong in the docs.


>> I'm running 2.6.35-rc4+ with a hacked up xf86-video-intel
>> with this patch:
>
> Please submit this as a format-patch to the list for inclusion.

You only say that because there's one line too few of context to see
how dumb the patch is.  I'll fix and submit.


>> With that patch and DefaultDepth 30, I get a mostly working system,
>> but there's no direct rendering (it seems to be disabled because DRI
>> runs only at depths 16 and 24), and title bars on gnome-terminal
>> draw incorrectly.
>
> Hmm, the render paths should work on 10bpc surfaces. Please do file bugs
> for the failures.

Will do.  I'll also see how close to working I can get DRI in 30-bit
mode.  (Enabling it right now exposes a silly bug in the compiz
startup scripts which results in having no window manager at all on
F13.  I'll figure out whose bug it is and file that one as well.)


> --
> Chris Wilson, Intel Open Source Technology Centre



[Intel-gfx] Fwd: State of 10 bits/channel?

2010-07-22 Thread Andrew Lutomirski
[resend b/c I used the wrong from address last time]

I have a 10 bit/channel monitor (DisplayPort) which works quite nicely
in 8 bit mode.  I saw some patches from Peter Clifton related to 10
bit support go in a while ago.

There are (at least) three modes that would be nice:
 (1) 8bpp framebuffer, 8 bit outputs, but 10-bit LUT with dithering.
 (2) 8bpp framebuffer, 10 bit outputs and LUT.
 (3) 10 bpp framebuffer, outputs, and LUT.

(1) would be nice with any hardware -- color calibration would look
better.  (2) would be a good start for 10 bit displays -- I could
calibrate without banding and userspace would be none the wiser
(except for a different-looking gamma ramp).  (3) would be really cool
and would differentiate us nicely from Windows, which AFAICT doesn't
really support 10 bit outputs on most (all?) hardware.
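
For what it's worth, the userspace side of (1) and (2) is just the
existing XRandR gamma interface -- a calibration tool would do roughly
the following (real libXrandr calls, but the linear ramp below is only a
placeholder for a measured curve; how many of the 16 bits per entry
actually reach the panel is exactly what depends on the LUT and output
depth):

#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main(void)
{
	Display *dpy = XOpenDisplay(NULL);
	if (!dpy)
		return 1;

	XRRScreenResources *res =
		XRRGetScreenResources(dpy, DefaultRootWindow(dpy));

	for (int i = 0; i < res->ncrtc; i++) {
		int size = XRRGetCrtcGammaSize(dpy, res->crtcs[i]);
		if (size < 2)
			continue;

		/* Linear placeholder ramp; a real tool would load its
		 * calibration curve here.  Entries are 16-bit; the driver
		 * truncates to whatever the hardware LUT provides. */
		XRRCrtcGamma *gamma = XRRAllocGamma(size);
		for (int j = 0; j < size; j++) {
			unsigned short v = (unsigned short)(j * 65535L / (size - 1));
			gamma->red[j] = gamma->green[j] = gamma->blue[j] = v;
		}
		XRRSetCrtcGamma(dpy, res->crtcs[i], gamma);
		XRRFreeGamma(gamma);

		printf("CRTC %d: loaded %d-entry ramp\n", i, size);
	}

	XRRFreeScreenResources(res);
	XCloseDisplay(dpy);
	return 0;
}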

What is the hardware capable of, and what is the state of affairs
right now?  I'm running 2.6.35-rc4+ with a hacked up xf86-video-intel
with this patch:

diff --git a/src/intel_driver.c b/src/intel_driver.c
index 7761ccf..d0d1a37 100644
--- a/src/intel_driver.c
+++ b/src/intel_driver.c
@@ -570,6 +570,7 @@ static Bool I830PreInit(ScrnInfoPtr scrn, int flags)
       case 15:
       case 16:
       case 24:
+       case 30:
               break;
       default:
                xf86DrvMsg(scrn->scrnIndex, X_ERROR,

(Otherwise, DefaultDepth 30 won't start at all.)

With that patch and DefaultDepth 30, I get a mostly working system,
but there's no direct rendering (it seems to be disabled because DRI
runs only at depths 16 and 24), and title bars on gnome-terminal draw
incorrectly.

Do any of you know how to ask the system what depth the output is
configured at and what depth the framebuffer is configured at?
Currently, XRRGetCrtcGammaSize returns 256, which IIRC should be 129 if
10 bit gamma ramps are being used.  (That's on both CRTCs, one of
which is DP connected to the 10 bit device.)
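
(For reference, I'm querying that with plain Xlib/XRandR, roughly as
below; as far as I can tell there's no standard call that reports the
actual link/output depth, which is part of what I'm asking.)

#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main(void)
{
	/* build with -lX11 -lXrandr */
	Display *dpy = XOpenDisplay(NULL);
	if (!dpy)
		return 1;

	int screen = DefaultScreen(dpy);
	printf("framebuffer depth: %d\n", DefaultDepth(dpy, screen));

	XRRScreenResources *res =
		XRRGetScreenResources(dpy, RootWindow(dpy, screen));
	for (int i = 0; i < res->ncrtc; i++)
		printf("CRTC %d: gamma ramp size %d\n", i,
		       XRRGetCrtcGammaSize(dpy, res->crtcs[i]));

	XRRFreeScreenResources(res);
	XCloseDisplay(dpy);
	return 0;
}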

--Andy