RE: Do not display the screensaver

2010-05-21 Thread Cui, Hunk
Hi, All,
Through communication with you, I now have some understanding of how this 
Gamma correction RAM (PAR & PDR registers) works.
Now I am using the ddd tool to debug xscreensaver and the X server. When I 
debug the server (around Get Gamma Ramp), the call chain is: xf86GetGammaRamp -> 
RRCrtcGammaGet -> xf86RandR12CrtcGetGamma; please see below:

The first line:
xf86CrtcPtr crtc = randr_crtc->devPrivate;

After running the line above, the crtc->gamma_red, crtc->gamma_green, and 
crtc->gamma_blue tables have been loaded with the Gamma correction values.

Now I want to ask everyone: where is randr_crtc->devPrivate defined, and 
where do its values come from?

Thanks,
Hunk Cui

-Original Message-
From: yang...@gmail.com [mailto:yang...@gmail.com] On Behalf Of Yang Zhao
Sent: Thursday, May 20, 2010 12:39 PM
To: Cui, Hunk
Cc: xorg-devel@lists.x.org
Subject: Re: Do not display the screensaver

On 19 May 2010 21:03, Cui, Hunk <hunk@amd.com> wrote:
        What is meant by "the server's internal representation is abstracted 
 away from the driver's representation"? I don't understand it; can you 
 explain it in more detail? I know the original R, G, B values are 16 bits per 
 channel. When the values are transferred to the driver layer, they are 
 processed with val = (*red << 8) | *green | (*blue >> 8); because the 
 val will be written into the Gamma correction RAM register (the type of hardware 
 register: each of the entries is made up of corrections for R/G/B. Within 
 the DWORD, the red correction is in b[23:16], green in b[15:8], and blue in 
 b[7:0]).
        Why is the driver allowed to truncate? And why are the original 
 R/G/B values not transferred unchanged? Do you know?

 BTW: Behrmann, you said "Three one dimensional look up tables (LUT) each of 
 8-bit resolution would not make much sense in this scenario." Please explain 
 that in more detail. Thank you for your earnest reply.

The term "gamma" in this discussion is actually misleading: all the
gamma-related calls, as they are currently implemented, eventually
result in writes to the hardware LUT, which translates pixel values into
actual electrical output values. Gamma correction is just one of the
things you can do by modifying the values in the table.

The precision of the LUT depends on the hardware. Radeons, for
example, have 10 bits of precision per channel. CARD16 is an
appropriate upper bound for the range of precisions that will
realistically be in use. Also keep in mind that, not too long ago, these
LUTs were used primarily to drive analog outputs, which have much,
much higher precision than their digital counterparts.

A client makes a gamma correction call, the server generates a new LUT
with 16 bits of precision per channel, and then the driver takes this and
truncates it to whatever precision the hardware can actually handle.


-- 
Yang Zhao
http://yangman.ca

___
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: http://lists.x.org/mailman/listinfo/xorg-devel

RE: Do not display the screensaver

2010-05-21 Thread Cui, Hunk
Hi, All,
Through communication with you, I now have some understanding of how this 
Gamma correction RAM (PAR & PDR registers) works.
Now I am using the ddd tool to debug xscreensaver and the X server. When I 
debug the server (around Get Gamma Ramp), the call chain is: xf86GetGammaRamp -> 
RRCrtcGammaGet -> xf86RandR12CrtcGetGamma; please see below:

The first line:
xf86CrtcPtr crtc = randr_crtc->devPrivate;

After running the line above, the crtc->gamma_red, crtc->gamma_green, and 
crtc->gamma_blue tables have been loaded with the Gamma correction values.

Now I want to ask everyone: where is randr_crtc->devPrivate defined, and 
where do its values come from?

[Cui, Hunk] Are randr_crtc->devPrivate->gamma_red / 
randr_crtc->devPrivate->gamma_green / randr_crtc->devPrivate->gamma_blue 
initialized in the xf86InitialConfiguration -> xf86CrtcSetInitialGamma 
functions, is that right?

Thanks,
Hunk Cui

-Original Message-
From: yang...@gmail.com [mailto:yang...@gmail.com] On Behalf Of Yang Zhao
Sent: Thursday, May 20, 2010 12:39 PM
To: Cui, Hunk
Cc: xorg-devel@lists.x.org
Subject: Re: Do not display the screensaver

On 19 May 2010 21:03, Cui, Hunk <hunk@amd.com> wrote:
        What is meant by "the server's internal representation is abstracted 
 away from the driver's representation"? I don't understand it; can you 
 explain it in more detail? I know the original R, G, B values are 16 bits per 
 channel. When the values are transferred to the driver layer, they are 
 processed with val = (*red << 8) | *green | (*blue >> 8); because the 
 val will be written into the Gamma correction RAM register (the type of hardware 
 register: each of the entries is made up of corrections for R/G/B. Within 
 the DWORD, the red correction is in b[23:16], green in b[15:8], and blue in 
 b[7:0]).
        Why is the driver allowed to truncate? And why are the original 
 R/G/B values not transferred unchanged? Do you know?

 BTW: Behrmann, you said "Three one dimensional look up tables (LUT) each of 
 8-bit resolution would not make much sense in this scenario." Please explain 
 that in more detail. Thank you for your earnest reply.

The term "gamma" in this discussion is actually misleading: all the
gamma-related calls, as they are currently implemented, eventually
result in writes to the hardware LUT, which translates pixel values into
actual electrical output values. Gamma correction is just one of the
things you can do by modifying the values in the table.

The precision of the LUT depends on the hardware. Radeons, for
example, have 10 bits of precision per channel. CARD16 is an
appropriate upper bound for the range of precisions that will
realistically be in use. Also keep in mind that, not too long ago, these
LUTs were used primarily to drive analog outputs, which have much,
much higher precision than their digital counterparts.

A client makes a gamma correction call, the server generates a new LUT
with 16 bits of precision per channel, and then the driver takes this and
truncates it to whatever precision the hardware can actually handle.


-- 
Yang Zhao
http://yangman.ca


RE: Do not display the screensaver

2010-05-19 Thread Cui, Hunk
Hi, Jackson,
First, thanks for your explanation. Through debugging, I found that when 
I start the fade to black, the gamma values are set to the default 
value (1.0) and transferred to the X server. In the X server, the value 
is written via VidModeSetGamma -> xf86ChangeGamma -> 
xf86RandR12ChangeGamma -> gamma_to_ramp (calculates the RGB values) -> 
RRCrtcGammaSet. Now I have some difficulty: in the gamma_to_ramp step, I found 
the type of the ramp values is CARD16. Why is it not CARD8? The R, G, B 
tables only have 256 bytes of RAM each.
Can you tell me the reason?

Looking forward to your reply.

Thanks,
Hunk Cui

-Original Message-
From: Adam Jackson [mailto:a...@nwnk.net] 
Sent: Wednesday, May 19, 2010 12:00 AM
To: Cui, Hunk
Cc: xorg-devel@lists.x.org
Subject: Re: Do not display the screensaver

On Thu, 2010-05-13 at 11:26 +0800, Cui, Hunk wrote:
 Hi, all,
 
  The screensaver issue,
 
 About the xscreensaver question,
 
 1. In the fade_screens function (fade.c) it will call the
 xf86_gamma_fade function (also in fade.c). I found it will fade in (from
 black): first crank the gamma all the way down to 0, then take the
 windows off the screen. Why are the RGB values set to 0? And what
 does "fading in (from black)" mean?

gnome-screensaver (which I assume is what you're looking at) changes the
gamma ramp to achieve the fade-to-black effect, because that looks
smoother than adjusting backlight levels.

- ajax


RE: Do not display the screensaver

2010-05-19 Thread Kai-Uwe Behrmann
Gamma ramps vary from card to card and driver to driver: 8-, 10-, or 12-bit? 
Who knows? A correct implementation has to check the gamma ramp size. I think 
I read somewhere in the advertising material that ATI has more than 8-bit 
ramps.


kind regards
Kai-Uwe Behrmann
--
developing for colour management 
www.behrmann.name + www.oyranos.org



On 19.05.10 at 19:01 +0800, Cui, Hunk wrote:

Hi, Jackson,
First, thanks for your explanation. Through debugging, I found that when I start the fade 
to black, the gamma values are set to the default value (1.0) and transferred to 
the X server. In the X server, the value is written via VidModeSetGamma -> xf86ChangeGamma 
-> xf86RandR12ChangeGamma -> gamma_to_ramp (calculates the RGB values) -> RRCrtcGammaSet. Now 
I have some difficulty: in the gamma_to_ramp step, I found the type of the ramp values is CARD16. 
Why is it not CARD8? The R, G, B tables only have 256 bytes of RAM each.
Can you tell me the reason?



RE: Do not display the screensaver

2010-05-19 Thread Cui, Hunk
Behrmann,
You said the Gamma ramps can be 8-, 10-, or 12-bit, but each of the entries 
is made up of corrections for R/G/B. Within the DWORD, the red correction is 
in b[23:16], green in b[15:8], and blue in b[7:0]. For 24-bit graphics, each 
color (R, G, and B) is one byte. The Gamma Correction RAM has a 256-byte 
block for each color. When the Gamma Correction RAM is enabled for graphics 
use, the data byte of the original color is used as an address into the 
Gamma Correction RAM, which produces a new byte of data: a new color 
intensity. These values are then written to the hardware registers.

Thanks,
Hunk Cui

-Original Message-
From: Kai-Uwe Behrmann [mailto:k...@gmx.de] 
Sent: Wednesday, May 19, 2010 7:22 PM
To: Cui, Hunk
Cc: xorg-devel@lists.x.org
Subject: RE: Do not display the screensaver

Gamma ramps vary from card to card and driver to driver: 8-, 10-, or 12-bit? 
Who knows? A correct implementation has to check the gamma ramp size. I think 
I read somewhere in the advertising material that ATI has more than 8-bit 
ramps.

kind regards
Kai-Uwe Behrmann
-- 
developing for colour management 
www.behrmann.name + www.oyranos.org


On 19.05.10 at 19:01 +0800, Cui, Hunk wrote:
 Hi, Jackson,
   First, thanks for your explanation. Through debugging, I found that when 
 I start the fade to black, the gamma values are set to the default 
 value (1.0) and transferred to the X server. In the X server, the value 
 is written via VidModeSetGamma -> xf86ChangeGamma -> 
 xf86RandR12ChangeGamma -> gamma_to_ramp (calculates the RGB values) -> 
 RRCrtcGammaSet. Now I have some difficulty: in the gamma_to_ramp step, I found 
 the type of the ramp values is CARD16. Why is it not CARD8? The R, G, B 
 tables only have 256 bytes of RAM each.
   Can you tell me the reason?



RE: Do not display the screensaver

2010-05-19 Thread Kai-Uwe Behrmann

I know more about the client side of the API.
DisplayPort can serve up to 16 bits per channel.
Three one-dimensional look-up tables (LUTs), each of 8-bit resolution, would 
not make much sense in this scenario.


kind regards
Kai-Uwe Behrmann
--
developing for colour management 
www.behrmann.name + www.oyranos.org



On 19.05.10 at 19:48 +0800, Cui, Hunk wrote:

Behrmann,
You said the Gamma ramps can be 8-, 10-, or 12-bit, but each of the entries 
is made up of corrections for R/G/B. Within the DWORD, the red correction is 
in b[23:16], green in b[15:8], and blue in b[7:0]. For 24-bit graphics, each 
color (R, G, and B) is one byte. The Gamma Correction RAM has a 256-byte 
block for each color. When the Gamma Correction RAM is enabled for graphics 
use, the data byte of the original color is used as an address into the 
Gamma Correction RAM, which produces a new byte of data: a new color 
intensity. These values are then written to the hardware registers.



RE: Do not display the screensaver

2010-05-19 Thread Adam Jackson
On Wed, 2010-05-19 at 17:07 +0200, Kai-Uwe Behrmann wrote:
 I know more about the client side of the API.
 DisplayPort can serve up to 16 bits per channel.
 Three one-dimensional look-up tables (LUTs), each of 8-bit resolution, would 
 not make much sense in this scenario.

Strictly, so can DVI and HDMI.

- ajax



RE: Do not display the screensaver

2010-05-19 Thread Cui, Hunk
Hi, Jackson & Behrmann,

What is meant by "the server's internal representation is abstracted 
away from the driver's representation"? I don't understand it; can you 
explain it in more detail? I know the original R, G, B values are 16 bits per 
channel. When the values are transferred to the driver layer, they are 
processed with val = (*red << 8) | *green | (*blue >> 8); because the 
val will be written into the Gamma correction RAM register (the type of hardware 
register: each of the entries is made up of corrections for R/G/B. Within the 
DWORD, the red correction is in b[23:16], green in b[15:8], and blue in b[7:0]).
Why is the driver allowed to truncate? And why are the original R/G/B 
values not transferred unchanged? Do you know?
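A packing consistent with the register layout described above (red in b[23:16], green in b[15:8], blue in b[7:0], each channel truncated from 16 bits to its top byte) can be sketched as follows. This is a hypothetical reconstruction for illustration, not the driver's literal code:

```c
/* Pack three 16-bit channel values into the DWORD layout described in
 * the thread: the high byte of red goes to b[23:16], of green to
 * b[15:8], and of blue to b[7:0].  Illustrative sketch only. */
static unsigned int pack_gamma_dword(unsigned short red,
                                     unsigned short green,
                                     unsigned short blue)
{
    return ((unsigned int) (red >> 8) << 16) |
           ((unsigned int) (green >> 8) << 8) |
           (unsigned int) (blue >> 8);
}
```

Note that each `>> 8` is exactly the truncation being asked about: the low 8 bits of each 16-bit channel are discarded because the hardware table only stores one byte per channel.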

BTW: Behrmann, you said "Three one dimensional look up tables (LUT) each of 
8-bit resolution would not make much sense in this scenario." Please explain 
that in more detail. Thank you for your earnest reply.

Thanks,
Hunk Cui

-Original Message-
From: Adam Jackson [mailto:a...@nwnk.net] 
Sent: Wednesday, May 19, 2010 10:45 PM
To: Cui, Hunk
Cc: xorg-devel@lists.x.org
Subject: RE: Do not display the screensaver

On Wed, 2010-05-19 at 19:01 +0800, Cui, Hunk wrote:
 Hi, Jackson,
   First, thanks for your explanation. Through debugging, I found
 that when I start the fade to black, the gamma values are set to
 the default value (1.0) and transferred to the X server. In the
 X server, the value is written via VidModeSetGamma ->
 xf86ChangeGamma -> xf86RandR12ChangeGamma -> gamma_to_ramp (calculates
 the RGB values) -> RRCrtcGammaSet. Now I have some difficulty: in the
 gamma_to_ramp step, I found the type of the ramp values is CARD16. Why
 is it not CARD8? The R, G, B tables only have 256 bytes of RAM each.

X color specifications are 16 bits per channel.  If your gamma ramp is
less precise than that, your driver is allowed to truncate, but the
server's internal representation is abstracted away from the driver's
representation.

- ajax


RE: Do not display the screensaver

2010-05-19 Thread Kai-Uwe Behrmann

On 20.05.10 at 12:03 +0800, Cui, Hunk wrote:

BTW: Behrmann, you said "Three one dimensional look up tables (LUT) each of 
8-bit resolution would not make much sense in this scenario." Please explain 
that in more detail. Thank you for your earnest reply.


A LUT looks up a resulting value for each input value. If the hardware wants 
to output 10 bits, it cannot use an 8-bit table, as only 256 output values 
would be available; 10-bit input and output requires at least 1024 table 
entries.
As a typical use case, the LUTs should be able to pass each value through 
unaltered. If that is not possible, the LUT is a limiting factor.
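The point can be made concrete with a small sketch (a hypothetical helper, purely for illustration): a table with 2^N entries can emit at most 2^N distinct codes, so an 8-bit (256-entry) table can never cover all 1024 codes of a 10-bit output.

```c
/* How many distinct output codes can reach the display when a LUT with
 * 2^lut_bits entries feeds an output path that is out_bits wide?
 * At most 2^lut_bits: a wider output path cannot add codes the table
 * does not contain.  Illustrative helper only. */
static int distinct_output_codes(int lut_bits, int out_bits)
{
    int lut_entries = 1 << lut_bits;
    int out_codes = 1 << out_bits;
    return lut_entries < out_codes ? lut_entries : out_codes;
}
```

So an 8-bit table in front of a 10-bit output reaches only 256 of the 1024 possible levels; a pass-through (identity) mapping is impossible, which is exactly the limiting-factor case described above.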


Btw. this is very basic stuff and should be covered in standard IT 
learning material.


kind regards
Kai-Uwe Behrmann
--
developing for colour management 
www.behrmann.name + www.oyranos.org




Re: Do not display the screensaver

2010-05-19 Thread Alex Deucher
On Thu, May 20, 2010 at 12:03 AM, Cui, Hunk <hunk@amd.com> wrote:
 Hi, Jackson  Behrmann,

        What is meant by "the server's internal representation is abstracted 
 away from the driver's representation"? I don't understand it; can you 
 explain it in more detail? I know the original R, G, B values are 16 bits per 
 channel. When the values are transferred to the driver layer, they are 
 processed with val = (*red << 8) | *green | (*blue >> 8); because the 
 val will be written into the Gamma correction RAM register (the type of hardware 
 register: each of the entries is made up of corrections for R/G/B. Within 
 the DWORD, the red correction is in b[23:16], green in b[15:8], and blue in 
 b[7:0]).
        Why is the driver allowed to truncate? And why are the original 
 R/G/B values not transferred unchanged? Do you know?

 BTW: Behrmann, you said "Three one dimensional look up tables (LUT) each of 
 8-bit resolution would not make much sense in this scenario." Please explain 
 that in more detail. Thank you for your earnest reply.


Different hardware has different-sized CLUTs (Color LookUp Tables).
Some have 8-bit CLUTs, others may have 10-bit, and others may have 16-bit.
So we store it as a CARD16 and let the driver decide how many bits it
supports.

Alex

 Thanks,
 Hunk Cui

 -Original Message-
 From: Adam Jackson [mailto:a...@nwnk.net]
 Sent: Wednesday, May 19, 2010 10:45 PM
 To: Cui, Hunk
 Cc: xorg-devel@lists.x.org
 Subject: RE: Do not display the screensaver

 On Wed, 2010-05-19 at 19:01 +0800, Cui, Hunk wrote:
 Hi, Jackson,
       First, thanks for your explanation. Through debugging, I found
 that when I start the fade to black, the gamma values are set to
 the default value (1.0) and transferred to the X server. In the
 X server, the value is written via VidModeSetGamma ->
 xf86ChangeGamma -> xf86RandR12ChangeGamma -> gamma_to_ramp (calculates
 the RGB values) -> RRCrtcGammaSet. Now I have some difficulty: in the
 gamma_to_ramp step, I found the type of the ramp values is CARD16. Why
 is it not CARD8? The R, G, B tables only have 256 bytes of RAM each.

 X color specifications are 16 bits per channel.  If your gamma ramp is
 less precise than that, your driver is allowed to truncate, but the
 server's internal representation is abstracted away from the driver's
 representation.

 - ajax



Re: Do not display the screensaver

2010-05-19 Thread Yang Zhao
On 19 May 2010 21:03, Cui, Hunk <hunk@amd.com> wrote:
        What is meant by "the server's internal representation is abstracted 
 away from the driver's representation"? I don't understand it; can you 
 explain it in more detail? I know the original R, G, B values are 16 bits per 
 channel. When the values are transferred to the driver layer, they are 
 processed with val = (*red << 8) | *green | (*blue >> 8); because the 
 val will be written into the Gamma correction RAM register (the type of hardware 
 register: each of the entries is made up of corrections for R/G/B. Within 
 the DWORD, the red correction is in b[23:16], green in b[15:8], and blue in 
 b[7:0]).
        Why is the driver allowed to truncate? And why are the original 
 R/G/B values not transferred unchanged? Do you know?

 BTW: Behrmann, you said "Three one dimensional look up tables (LUT) each of 
 8-bit resolution would not make much sense in this scenario." Please explain 
 that in more detail. Thank you for your earnest reply.

The term "gamma" in this discussion is actually misleading: all the
gamma-related calls, as they are currently implemented, eventually
result in writes to the hardware LUT, which translates pixel values into
actual electrical output values. Gamma correction is just one of the
things you can do by modifying the values in the table.

The precision of the LUT depends on the hardware. Radeons, for
example, have 10 bits of precision per channel. CARD16 is an
appropriate upper bound for the range of precisions that will
realistically be in use. Also keep in mind that, not too long ago, these
LUTs were used primarily to drive analog outputs, which have much,
much higher precision than their digital counterparts.

A client makes a gamma correction call, the server generates a new LUT
with 16 bits of precision per channel, and then the driver takes this and
truncates it to whatever precision the hardware can actually handle.
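That driver-side truncation step can be sketched like this (a hypothetical helper, not actual driver code): the driver simply keeps the top hw_bits of each 16-bit server value and drops the rest.

```c
/* Sketch of the driver-side truncation: the server hands over a ramp
 * with 16 bits per entry; hardware with an hw_bits-deep LUT keeps the
 * top hw_bits of each entry and discards the low bits.
 * Illustrative only. */
static unsigned int truncate_ramp_entry(unsigned short server_val, int hw_bits)
{
    return (unsigned int) server_val >> (16 - hw_bits);
}
```

For an 8-bit CLUT this reduces to keeping the high byte, which matches the one-byte-per-channel register layout discussed earlier in the thread; a 10-bit Radeon-style CLUT would keep the top 10 bits instead.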


-- 
Yang Zhao
http://yangman.ca