Hi;

On Mon, 2008-04-14 at 10:13 -0400, Jason Tackaberry wrote:
> On Sun, 2008-04-13 at 23:08 +0100, Matthew Allum wrote:
> > You can query Clutter via the feature flags to see if YUV textures are
> > supported - but I am not sure if there is any way to query whether
> > Mesa is implementing this in software or on the GPU - are you saying
> > Evas is doing this?
> 
> Evas doesn't (to my knowledge) provide a way to ask it if the engine
> will do hardware or software conversion, but it does implement its own
> SIMD-accelerated implementation for the software (non-GL) engines.
> 
> I was actually under the impression that it had a whitelist of cards it
> would use fragment shaders for, and use the software fallback for other
> cards, but looking at the source, it appears as though it always uses
> the fragment program with the GL canvas.  Alas.

OK. I think the eventual (and hopefully not too far off) route for
Clutter is to go the multitexture/shader route for YUV and advertise it
only if card support is there (avoiding the currently used built-in YUV
texture format). I'd rather leave software conversion up to the decoding
lib.

> 
> > The conversion would happen before the pixels ended up on the drawable -
> > Clutter would get this data as RGB via composite. I'm not sure,
> > however, whether pushing YUV via XVideo to the drawable would work
> > with XComposite (i.e. overlays etc.).
> 
> It does work with some drivers, for example nvidia, where there are
> texture overlays.  I'm just not entirely clear how that works.  On one
> of my systems with an nvidia 8600 using XComposite and a compositing
> manager (metacity), playing 1080p content uses low enough CPU that I
> have to conclude the colorspace conversion is happening in hardware.

Yeah, that would make sense.

> 
> 
> > I think, like you say, sticking with this use of the custom VO could be
> > a safer route.
> 
> Safer inasmuch as performance will be predictable on more hardware, but
> it's a bit of a pain having to patch MPlayer.

Ah, right - I didn't realise you were having to do that.

> 
> > Out of interest/curiosity what does mplayer do that xine doesn't ?
> 
> Perhaps the most significant feature would be software scaling.  I have
> a fairly critical eye and I'm generally not happy with the hardware
> scaling I regularly see with Xv.  (Although with GL there is a bit more
> flexibility there.)
> 
> It also happens to play back h264 content without green smearing.  This
> is a bug that IIRC gstreamer has recently fixed, whereas, for reasons
> that confound me, it seems to have existed in Xine for well over a year.
> 
> On the other hand, Xine does a lot of things well.  In particular its
> deinterlacing is very good.  Does GStreamer have a deinterlacer that can
> dynamically detect telecined content and apply ivtc, or for truly
> interlaced material perform full-framerate motion-adaptive
> deinterlacing?
> 
> In a perfect world, gst would do all this stuff. :)
> 

You could port the missing bits to gst ;-)? Also note we didn't make
gst a hard dependency in Clutter (and we have the 'media' interface to
make switching easy), so other playback engines can be used.

Many thanks;

  == Matthew

-- 
To unsubscribe send a mail to [EMAIL PROTECTED]
