On Wed, Jul 26, 2006 at 10:48:40AM +0200, ext Frantisek Dufka wrote:
> Daniel Stone wrote:
> >On Wed, Jul 26, 2006 at 09:34:03AM +0200, ext Frantisek Dufka wrote:
> >>I thought DSP only decodes video to shared framebuffer in main memory 
> >>and the actual transfer to epson chip is done by omapfb driver i.e. by 
> >>ARM CPU from gstreamer or directly from mediaserver code.
> >
> >Correct.
> 
> Sorry, but it is still not clear to me which parts of what you quoted 
> are correct. So currently the DSP code decodes MP4 into RGB and then 
> lets the framebuffer code blit it to the Epson chip in RGB? And the 
> OMAPFB_COLOR_YUV420 ioctl is not used because of unsolved problems 
> with multiple surfaces?

The paragraph I've left quoted is correct: the DSP decodes into a shared
framebuffer, which is then DMA'ed across.
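In case it helps make this concrete, below is roughly what the
OMAPFB_COLOR_YUV420 path Frantisek mentioned looks like from userspace.
It is only a sketch: the struct and constant names are taken from the
generic <linux/omapfb.h> header, and the 770 kernel's copy of it may be
older and slightly different.

/* Sketch only: names follow the mainline <linux/omapfb.h>; the 770
 * kernel's header may differ. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/omapfb.h>

static int push_yuv420_window(int fb, unsigned x, unsigned y,
                              unsigned w, unsigned h)
{
    struct omapfb_update_window uw;

    memset(&uw, 0, sizeof(uw));
    uw.x = x;                        /* source rectangle in fb memory */
    uw.y = y;
    uw.width = w;
    uw.height = h;
    uw.out_x = x;                    /* output rectangle on the panel */
    uw.out_y = y;
    uw.out_width = w;
    uw.out_height = h;
    uw.format = OMAPFB_COLOR_YUV420; /* window contains planar YUV420 */

    /* The ARM only issues the request; the driver DMAs the window
     * across to the display controller. */
    if (ioctl(fb, OMAPFB_UPDATE_WINDOW, &uw) < 0) {
        perror("OMAPFB_UPDATE_WINDOW");
        return -1;
    }
    return 0;
}

int main(void)
{
    int fb = open("/dev/fb0", O_RDWR);
    if (fb < 0) {
        perror("/dev/fb0");
        return 1;
    }
    /* Assume a decoder has already written a 320x240 YUV420 frame at
     * the top-left corner of the framebuffer memory. */
    push_yuv420_window(fb, 0, 0, 320, 240);
    close(fb);
    return 0;
}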

> >>Am I wrong about this? If not, would it be possible to add YUV 
> >>surfaces to the X server (via Xv or the Nokia Xsp extension) so they 
> >>could be used from X11 code in MPlayer or in (unpatched or patched) 
> >>SDL?
> >
> >Not at the moment: you need multiple planes so you can set up a 
> >separate YUV surface; you can't just blit into the existing plane 
> >with a different format.
> 
> And why is this a problem? Yes, I understand you need extra memory (a 
> surface) allocated for the YUV operations. Why is it a problem to have 
> 800x600 in RGB plus an additional 320x240 in YUV, both managed by the 
> X server? Isn't this a generic thing that is already solved in the X 
> server and used on other platforms? Sorry for being too ignorant about 
> X internals :-)

The video hardware does not support RGB565 (which is what basically
everything on the device uses) and YUV at the same time.  You can't have
part of the screen be RGB and another part be YUV; the problem is a
couple of layers below the X server.
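
For comparison, this is what a client such as MPlayer would do if the
server did export a YUV overlay through Xv: query the adaptors and look
for a port that accepts YUV images.  A sketch using only the standard
libXv calls (XvQueryExtension, XvQueryAdaptors, XvListImageFormats); on
this hardware it would simply find no suitable adaptor, for the reason
above.

/* Hypothetical probe for an Xv port that takes YUV images.
 * Build with: cc xvprobe.c -lXv -lX11 */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xvlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    unsigned int ver, rev, req, ev, err, nadaptors;
    XvAdaptorInfo *ai;

    if (!dpy)
        return 1;

    if (XvQueryExtension(dpy, &ver, &rev, &req, &ev, &err) != Success) {
        puts("no Xv extension");
        return 1;
    }
    if (XvQueryAdaptors(dpy, DefaultRootWindow(dpy), &nadaptors, &ai)
            != Success)
        return 1;

    for (unsigned int i = 0; i < nadaptors; i++) {
        if (!(ai[i].type & XvImageMask))
            continue;               /* only XvPutImage-capable ports */
        int nformats;
        XvImageFormatValues *fmt =
            XvListImageFormats(dpy, ai[i].base_id, &nformats);
        for (int j = 0; j < nformats; j++)
            printf("adaptor \"%s\": FOURCC 0x%08x\n",
                   ai[i].name, fmt[j].id);
        XFree(fmt);
    }
    XvFreeAdaptorInfo(ai);
    XCloseDisplay(dpy);
    return 0;
}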

Cheers,
Daniel
