On Thu, 23 May 2002, Eric Faurot wrote:
> I know it doesn't solve the problem of the upcoming 32-bits-per-channel
> graphics adaptors, but... for now, even with 10 bits per channel
> we have another 2 bits left.

Phew. :-)

> If we want all channels together (why rgba more than rgbZ?) then we
> should think about defining an extended_pixel structure in libbuf, (or
> maybe galloc?), but not in ggi.

I think that if and when we get to 64-bit pixels, they should go into the
main API.  Only a limited number of API functions take ggi_pixel as an
argument.  Changing the size of this type in the function prototypes
really only gets hairy at the API level where ggi_pixels are passed by
pointer, and the most important, most commonly used functions that deal
in ggi_pixels (PutPixel, SetGCFGColor, SetGCBGColor and ggiMapColor)
do not do this.  The hairiness will be confined to the more rarely used
GetGCFGColor, GetGCBGColor, and GetPixel.  In any case, we're looking
at a major version bump when this happens.

We're really in trouble if they ever decide that 20:20:20:4 is the way 
to go, because at that point we're out of space in ggi_color.

> Anyway, 64 bits per pixel would be a waste of time and space on 99.9%
> of the targets/platforms supported right now.

The informational structures will grow, but they are already large
enough that the relative increase won't be huge.  Since the get/put
values can be packed, space won't be wasted where it matters most.

I agree that it would take a lot of work... all the more reason
to put it off until we really need it.

P.S. What *are* they thinking, btw?  More color depth than 8:8:8 is
pretty useless.  I can believe 10:10:10:2, but if they add more than
that, I would suspect what they really mean is that each channel will
have its own alpha bits.  In that case we have other ways to deal with it.

--
Brian
