On 12 May, Keith Packard wrote:
> 
> Around 14 o'clock on May 12, [EMAIL PROTECTED] wrote:
> 
>> The following is an optical model that could be implemented in software
>> and that might be at least partially accelerable by present hardware. (I
>> wonder to what extent this is how the "gamma correction" is done in the
>> current hardware.)
> 
> I haven't examined current hardware, but my intent for Render (at some 
> point) is to add a "gamma corrected" rendering mode where values are 
> converted to "luminosity" before being added together, and then converted 
> back to values.  This conversion would use tables with the same content as 
> the ICCCM XDCCC table, essentially an arbitrary array for each primary.
> 

That could work.  I assume you mean to convert from RGB to CIE XYZ and
then add the (alpha-weighted) Y components.  How do you intend to
combine the X and Z components?  One simple approach is to compute the
chromaticity coordinates x and y and interpolate those with the same
alpha weighting.  In theory this should work.  You then recover X and Z
from x, y, and the blended Y, and convert back to RGB.
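For concreteness, the blend just described might look roughly like this
(a sketch only, in Python; the sRGB matrices are my assumption -- in
Render the per-primary data would presumably come from the XDCCC
tables, and none of these names exist in the Render code):

```python
# Sketch: alpha blending via CIE XYZ, assuming linear-light sRGB
# primaries with a D65 white point.  Illustrative only.

# Linear sRGB -> XYZ, and its (approximate) inverse.
RGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]
XYZ_TO_RGB = [
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

def mat_vec(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

def xyz_blend(src_rgb, dst_rgb, alpha):
    """Blend two linear-RGB colours: alpha-weight the Y (luminosity)
    components, interpolate the (x, y) chromaticities with the same
    weights, then rebuild X and Z and convert back to RGB.
    Degenerate pure-black inputs (X+Y+Z == 0) are not handled here."""
    Xs, Ys, Zs = mat_vec(RGB_TO_XYZ, src_rgb)
    Xd, Yd, Zd = mat_vec(RGB_TO_XYZ, dst_rgb)
    # Alpha-weighted sum of luminosities.
    Y = alpha * Ys + (1 - alpha) * Yd
    # Chromaticity coordinates: x = X/(X+Y+Z), y = Y/(X+Y+Z).
    ss, sd = Xs + Ys + Zs, Xd + Yd + Zd
    xs, ys = Xs / ss, Ys / ss
    xd, yd = Xd / sd, Yd / sd
    x = alpha * xs + (1 - alpha) * xd
    y = alpha * ys + (1 - alpha) * yd
    # Recover X and Z from the blended chromaticity and luminosity.
    X = x * Y / y
    Z = (1 - x - y) * Y / y
    return mat_vec(XYZ_TO_RGB, (X, Y, Z))
```

Blending a colour with itself returns that colour (up to the small
error in the inverse matrix), which is a useful sanity check.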

The conversion back and forth between RGB pixel values and XYZ values
still suffers from rounding errors, but these will be smaller than the
current errors.
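To illustrate where the rounding enters (a toy sketch: a single power
function with an assumed gamma of 2.2 stands in for the per-primary
tables, and 8-bit storage stands in for the pixel format):

```python
# Toy round trip between 8-bit pixel values and linear luminosity.
GAMMA = 2.2  # assumed; real tables would be arbitrary per-primary data

def to_linear(v8):
    """8-bit pixel value -> linear luminosity in [0, 1]."""
    return (v8 / 255.0) ** GAMMA

def from_linear(lum):
    """Linear luminosity -> nearest 8-bit pixel value."""
    return round(255.0 * lum ** (1.0 / GAMMA))

# A bare round trip is lossless for every 8-bit value...
assert all(from_linear(to_linear(v)) == v for v in range(256))

# ...the rounding error enters when a *blended* linear result is
# re-quantised to 8 bits.  Averaging black and white in linear light
# gives 186, not the 128 a naive byte average would produce, and the
# true linear midpoint falls between two representable byte values.
half = from_linear((to_linear(0) + to_linear(255)) / 2.0)
```

The error per operation is at most half a quantisation step, which is
indeed smaller than the error from blending nonlinear byte values
directly.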

This is a lot more computation than the subtractive space, but it
avoids the confusion of presenting an additive model externally while
computing internally with a subtractive one.

R Horn


_______________________________________________
Render mailing list
[EMAIL PROTECTED]
http://XFree86.Org/mailman/listinfo/render
