Hi Matthew,

(I work with dischi on Freevo, and have in the past done most of the work on the GUI core, in particular video player integration.)
On Sun, 2008-04-13 at 11:59 +0100, Matthew Allum wrote:

> There is some very rough support for uploading YUV - though I suspect it
> could just push the conversion down into Mesa (still in software) on
> most cards.
>
> For really fast video playback (though only on heavily featured cards)
> see http://yuvtools.wiki.sourceforge.net/ - which uses multitexturing +
> shaders etc. to process and display YUV data. The techniques there could
> be integrated into Clutter without too much pain, assuming you only
> intended it to work on GL (and not GL ES, i.e. sidestepping COGL).

This would definitely be ideal. As a user of Clutter, I would expect all the complexity of hardware YUV conversion to be abstracted. Ideally I would just ask Clutter whether hardware colorspace conversion is available, and if it's not, I would be left to my own devices to find a way to feed it RGB32. (We would use swscaler in MPlayer, for example, which is substantially faster than Mesa at this.) Alternatively, a fast software fallback could be provided; this is what Evas does.

Ultimately, hardware conversion is a hard requirement for us. 1080p H.264 is demanding enough; adding software colorspace conversion to the mix is the difference between being able to play 1080p content or not.

> Assuming you can get some kind of X Drawable ID out of mplayer before it
> maps, then it's likely possible to redirect its output via
> XComposite/Damage etc. and then to a texture. clutter_glx_texture_pixmap
> (in trunk) should soon make this kind of thing very easy to do.
> See http://bugzilla.openedhand.com/show_bug.cgi?id=873

Ah, this API would be extremely convenient. We can certainly redirect MPlayer. I'm not sure whether this would be faster than our current approach, which is a custom VO driver in MPlayer that writes YV12 frames to shared memory, from which Freevo then pushes the contents (via Evas) to a texture, using a fragment program inside Evas.
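For anyone following along, the per-pixel work being discussed is just the fixed-point BT.601 matrix; a purely illustrative Python sketch (the function name is mine), showing the same arithmetic a fragment program evaluates per pixel when the conversion runs on the GPU:

```python
def yv12_to_rgb(y, u, v):
    """Convert one video-range (Y 16-235) YCbCr sample to 8-bit RGB
    using the fixed-point BT.601 coefficients.  This is the math a
    shader-based path performs in hardware, once per pixel."""
    c, d, e = y - 16, u - 128, v - 128

    def clamp(x):
        return max(0, min(255, x))

    r = clamp((298 * c + 409 * e + 128) >> 8)
    g = clamp((298 * c - 100 * d - 208 * e + 128) >> 8)
    b = clamp((298 * c + 516 * d + 128) >> 8)
    return r, g, b

# Video black and video white map to full-range RGB black and white:
# yv12_to_rgb(16, 128, 128)  -> (0, 0, 0)
# yv12_to_rgb(235, 128, 128) -> (255, 255, 255)
```

Doing this per pixel on the CPU is of course hopeless at 1080p; the point is only that the math is trivial for a shader, which is why pushing it onto the card matters so much.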
Conceivably, the XComposite path could save us at least one memcpy, which adds up fast when you're dealing with 1080p (a single 1080p YV12 frame is about 3 MB, so roughly 75 MB/s at 24 fps).

In the case of a redirected window where the process calls XvShmPutImage (assuming client and server are on the same machine), where does the colorspace conversion actually happen? If the compositing manager sees a Pixmap (as I believe it does), then this should already be RGB32. Ideally, the X server would push the YUV image, which it received via shared memory from the client, off to video memory, where the video card takes care of any colorspace conversion. Converting that pixmap (which lives in video memory) to a GL texture ought then to be quite fast, as everything happens on the card. Am I even remotely close?

> and if you want to still handle things like key events in mplayer you'd
> likely have to proxy them through Clutter. AFAIK mplayer does not offer
> up a lib for embedding playback?

It does not, but we've managed to work fairly well around its limitations. We handle all key presses ourselves, and merely control MPlayer through its slave interface.

Our medium-term plan is actually to drop MPlayer, because integrating with it requires several kinds of unsightly kludges. GStreamer is maturing and will eventually be in a position to replace MPlayer functionally; we'd ideally like to support just GStreamer. Once it supports DVD menus and AC3/DTS passthrough, we may also be able to drop Xine.

Now quoting Emmanuele Bassi:

> you're using pyclutter - aren't you also using an nvidia card,
> perchance? there are some problems while using python, clutter and
> drivers/libraries that require dlopen()-ing.

dischi uses Intel, whereas I use nvidia. Might this explain the segfault I'm seeing? Before it segfaults, it outputs:

(clutter-test.py:2033): ClutterGLX-WARNING **: failed to bind GLXGetProcAddress or GLXGetProcAddressARB

The C examples all work fine.

> > Is there anything else I have to do? What format does a Texture
> > support? Only ARGB?
> > also YUV2, if the driver supports it (there's a feature for that as
> > well).

Did you mean YUY2? What about YV12? What kind of support for these YUV colorspaces is there within Clutter? I was under the impression from Matthew that hardware YUV conversion was not (yet?) supported in Clutter.

Thanks for all your answers, everyone.

Cheers,
Jason.

-- 
To unsubscribe send a mail to [EMAIL PROTECTED]
