Hakan Lindestaf wrote:
Pancake,

Thanks for your help on this. I'll give these suggestions a run, but I have
a few comments first.

# Then run the video.py program and drag the video file over the window. Now use 'top' again.
python video.py

So you mean I should play two videos at the same time, on top of each other?

nope, serialize the playbacks, or you will not be able to measure the CPU usage properly.
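If you want numbers instead of eyeballing 'top', the serialized measurement can be sketched in Python by reading /proc (Linux only). The sampling interval and the PID lookup are assumptions; adapt them to your run:

```python
# Linux-only sketch: average %CPU of a process over an interval, read from
# /proc/<pid>/stat, so the two playbacks can be compared one after the other.
import os
import time

CLK_TCK = os.sysconf("SC_CLK_TCK")  # clock ticks per second

def cpu_seconds(pid):
    """Total user+system CPU time consumed by pid, in seconds."""
    with open("/proc/%d/stat" % pid) as f:
        # Split after the ')' so spaces in the process name don't shift fields;
        # utime and stime are then the 12th and 13th remaining fields.
        fields = f.read().rsplit(")", 1)[1].split()
    utime, stime = int(fields[11]), int(fields[12])
    return (utime + stime) / float(CLK_TCK)

def cpu_percent(pid, interval=1.0):
    """Average %CPU of pid over the given interval."""
    before = cpu_seconds(pid)
    time.sleep(interval)
    return 100.0 * (cpu_seconds(pid) - before) / interval
```

Start video.py, find its PID (e.g. with pgrep), and call cpu_percent(pid) while each clip plays, one clip at a time.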
On my box I get around 10% more CPU usage with pigment than with xvimagesink. It's mainly the cost of injecting every frame as an OpenGL texture. The main problem of this injection is the colorspace conversion: OpenGL only handles the RGBA format, while most videos are in YUV, NV12, ... formats, so the pipeline needs to convert the colorspace of every frame before pushing it to the sink element. To accelerate this, pigment uses pixel shaders to do this transformation in hardware, and for 1080p videos it is probably the most expensive step.
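The per-pixel cost being described can be sketched in Python. This assumes BT.601 limited-range coefficients; the actual matrix depends on the clip:

```python
# Sketch of the per-pixel YUV -> RGB conversion the pipeline has to perform
# before frames can be uploaded as RGBA textures. Assumes BT.601
# limited-range coefficients (an assumption; the real matrix varies by clip).
def yuv_to_rgb(y, u, v):
    c, d, e = y - 16, u - 128, v - 128
    clamp = lambda x: max(0, min(255, int(round(x))))
    r = clamp(1.164 * c + 1.596 * e)
    g = clamp(1.164 * c - 0.392 * d - 0.813 * e)
    b = clamp(1.164 * c + 2.017 * d)
    return r, g, b

# yuv_to_rgb(235, 128, 128) -> (255, 255, 255): reference white
# yuv_to_rgb(16, 128, 128)  -> (0, 0, 0): reference black
```

At 1080p that is about two million pixels per frame, every frame, which is why doing it on the CPU is costly and pigment pushes it into shaders.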

I'd be fine with 10%; my problem is that in Elisa it's on the order of double the CPU time versus gst-launch. That eats up all the margin.

but do you have this problem with video.py? Elisa should take the same CPU as video.py; if not, then maybe there's a problem somewhere else.
For 2D there are already hardware colorspace converters, but current hardware has no way to do it without using pixel shaders or special (not yet released) hardware.

But it works well in gst-launch, so what's the difference? Can I see the "chain" that Elisa is using versus gst-launch -v?


Try setting GST_DEBUG=*:3 in the environment before running Elisa, and redirect stderr to a file. Then you can inspect the pipeline construction and so on.
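That suggestion can be wrapped in a small Python helper: set GST_DEBUG and send stderr to a log file. The "elisa" command name in the usage comment is an assumption about your install:

```python
# Minimal sketch: run a GStreamer application with GST_DEBUG set and its
# stderr captured to a log file for later inspection of pipeline construction.
import os
import subprocess

def run_with_gst_debug(argv, logfile, level="*:3"):
    env = dict(os.environ, GST_DEBUG=level)
    with open(logfile, "wb") as log:
        return subprocess.call(argv, env=env, stderr=log)

# e.g. run_with_gst_debug(["elisa"], "elisa-gst.log")
```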
You can also try converting the clip, re-encoding it to RGBA, so that this work is done before playing the video, but you will have to wait a few hours before starting to watch the movie :P

Nah, that's not really an option. I'm looking for a convenient media center that my wife and son can use; I don't want to convert videos for hours before they are available. Also, it works great in gst-launch, so I don't understand why it would be so much slower in Elisa if it's using the same technique.

gst-launch will probably use xvimagesink as the video sink; you have to test whether the problem persists with pgmimagesink, which is only accessible from programs using pigment (try video.py).
Regards,
/Hakan
--pancake
