Y and UV are separated. Each 'pixel' is rendered (by a TV) using a Y signal
in combination with a UV signal. A loose parallel would be a framebuffer of
red values followed by a framebuffer of GB values, with the video hardware
combining the two to render each RGB pixel.
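
To make the layout concrete, here is a rough sketch in C of how I picture
addressing it (the struct and helper names are mine, and I am assuming one
8-bit UV byte per pixel, matching the sizes quoted below):

    #include <stdint.h>
    #include <stddef.h>

    /* Hypothetical sketch of our layout: a 720x576 plane of 8-bit Y
     * values followed immediately by a 720x576 plane of 8-bit UV
     * values.  Names are invented for illustration only. */
    #define FB_WIDTH   720
    #define FB_HEIGHT  576
    #define PLANE_SIZE ((size_t)FB_WIDTH * FB_HEIGHT)

    struct biplanar_fb {
        uint8_t *base;          /* start of the mapped framebuffer */
    };

    /* The two bytes that together form the on-screen pixel at (x, y). */
    static inline uint8_t *y_byte(struct biplanar_fb *fb, int x, int y)
    {
        return fb->base + (size_t)y * FB_WIDTH + x;
    }

    static inline uint8_t *uv_byte(struct biplanar_fb *fb, int x, int y)
    {
        return fb->base + PLANE_SIZE + (size_t)y * FB_WIDTH + x;
    }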

Does this mean the planar sub-lib will be of no use to us, since it is
presumably intended for interleaved framebuffer pixels?

Regards,

David 

 -----Original Message-----
From:   Andrew Apted [mailto:[EMAIL PROTECTED]] 
Sent:   Wednesday, 17 May 2000 4:17
To:     [EMAIL PROTECTED]
Subject:        Re: GGI support for non standard FB devices

David Craig writes:

>  Just to elaborate a bit more: we will have to convert RGB values to Y
>  (luminance) and Cr + Cb -> UV values (chrominance), which together make
>  up the YUV display framebuffer (also called CIE Luv).
>  
>  The trickiest bit is that the Y and UV values which make up each
>  on-screen pixel are stored in two separate (but contiguous) planar
>  framebuffers, i.e. 720*576*8bit pixels of Y values, followed by
>  720*576*8bit UV values. We will need /dev/fb to be in this bi-planar
>  format, which means GGI framebuffer/pixel operations will need to
>  operate on both Y & UV planes simultaneously, as well as converting
>  between the (RGB <-> YUV) formats. Does/could GGI support any such
>  multi-plane operations through one single config point?
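
For the RGB -> YUV mapping itself, the usual ITU-R BT.601 equations are
probably what you want.  A rough full-range sketch (standard coefficients
in 8.8 fixed point; your hardware may expect studio-swing ranges instead,
so double-check):

    #include <stdint.h>

    static inline uint8_t clamp_u8(int v)
    {
        return (uint8_t)(v < 0 ? 0 : v > 255 ? 255 : v);
    }

    /* Full-range RGB -> YCbCr, BT.601 weights scaled by 256.
     * Y  =  0.299 R + 0.587 G + 0.114 B
     * Cb = -0.169 R - 0.331 G + 0.500 B + 128
     * Cr =  0.500 R - 0.419 G - 0.081 B + 128
     * (Assumes arithmetic right shift on negative ints.) */
    static void rgb_to_ycbcr(uint8_t r, uint8_t g, uint8_t b,
                             uint8_t *yy, uint8_t *cb, uint8_t *cr)
    {
        *yy = clamp_u8(( 77 * r + 150 * g +  29 * b) >> 8);
        *cb = clamp_u8(128 + ((-43 * r -  85 * g + 128 * b) >> 8));
        *cr = clamp_u8(128 + ((128 * r - 107 * g -  21 * b) >> 8));
    }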

A special drawing sub-lib will need to be written.  Something like
libggi/default/planar, but for your YUV format.  What sort of planar is
it: bit-planar, interleaved-lines, or interleaved-words?  (Or did you
just mean that Y and UV are separated?)

>  I.e. will I need to implement custom fb read/write procedures for lots
>  of framebuffer access routines, or maybe just putpixel & getpixel?

Just having putpixel and getpixel will get it working, plus the color
mapping routines.  You'll get better performance by implementing the
other drawing routines too (drawhline, puthline, etc.), but that can
come later.
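
For illustration, the pair could look roughly like this for your layout
(struct and function names are made up, not the real LibGGI sub-lib hooks;
it just shows one logical pixel touching both planes):

    #include <stdint.h>
    #include <stddef.h>

    #define FB_WIDTH   720
    #define FB_HEIGHT  576
    #define PLANE_SIZE ((size_t)FB_WIDTH * FB_HEIGHT)

    struct yuv_fb {
        uint8_t *base;          /* Y plane first, UV plane right after */
    };

    static void yuv_putpixel(struct yuv_fb *fb, int x, int y,
                             uint8_t luma, uint8_t chroma)
    {
        size_t off = (size_t)y * FB_WIDTH + x;

        fb->base[off]              = luma;      /* Y plane  */
        fb->base[PLANE_SIZE + off] = chroma;    /* UV plane */
    }

    static void yuv_getpixel(const struct yuv_fb *fb, int x, int y,
                             uint8_t *luma, uint8_t *chroma)
    {
        size_t off = (size_t)y * FB_WIDTH + x;

        *luma   = fb->base[off];
        *chroma = fb->base[PLANE_SIZE + off];
    }

The color mapping routines would then supply the luma/chroma values from
an RGB conversion along the lines sketched above.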

Cheers,
__
\/   Andrew Apted  <[EMAIL PROTECTED]>
 
