On Wed, Jan 9, 2013 at 5:05 PM, Gustavo Sverzut Barbieri
<barbi...@profusion.mobi> wrote:
> On Wed, Jan 9, 2013 at 11:49 AM, Arvind R <arvin...@gmail.com> wrote:
>> My understanding is that emotion gets the video backend to render RGBA
>> to the evas canvas that is then displayed by the ecore-evas backend.
>> Correct?
>
> Actually it can also output YUV, which is then converted to RGB by the
> CPU (MMX/SSE) or the GPU (OpenGL).
>
>
>> If so, would it be possible, for instance, using the xine backend to
>> render directly to the screen using whatever HW acceleration is
>> available to it, and have the evas canvas as an 'underlay' to the
>> video screen in order to trap events? This would mean modifying the
>> emotion-xine module to be an interceptor in the xine pipeline instead
>> of a video_output driver.
>>
>> Feasible?
>>
>
> Yes, Cedric did this for GStreamer. There is support for it in
> Evas_Object_Image via Evas_Video_Surface, which you can use to hook
> into and change the underlying backend. At the Evas level it punches
> an empty hole in the image region, leaving the HW plane above or below
> to draw the video.

Sadly it got broken in E17, and I have no time to go and fix it. I may
get back to it, but I am more interested in making this feature, and
some that come with it, work with Wayland than with X.
--
Cedric BAIL

_______________________________________________
enlightenment-users mailing list
enlightenment-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/enlightenment-users
