On Wednesday 16 February 2005 19:17, HiNet wrote:

>       And the X Window System uses shared memory to store the
> original unscaled raw image. This is another headache. As the frame
> rate gets higher, or when there is more than one video stream when
> using nvidia's Video Blitter function, the CPU usage of X gets
> higher. Why? Shared memory is not physically contiguous, and that is
> definitely one of the reasons. I've long dreamed that some day there
> would be no X while using Video Overlay/Blitter.

That day came some time ago...

http://www.mplayerhq.hu/homepage/images/shot-cvidix-01.jpg

MPlayer playing a movie on a text console using MPlayer's nvidia
vidix driver. Full hardware colourspace conversion and scaling, with the
colourkey set to black (which is why the text appears on top of the movie).

http://www.mplayerhq.hu/homepage/images/shot-cvidix-02.jpg

The same, except on a framebuffer console rather than a text console.

The vidix drivers are also usable under X, through xvidix. I believe
the vidix drivers were originally created to overcome the extra
memcpy() caused by X storing the local image in shared memory.

I doubt they can do more than one stream at once though...

_______________________________________________
Open-graphics mailing list
[email protected]
http://lists.duskglow.com/mailman/listinfo/open-graphics
List service provided by Duskglow Consulting, LLC (www.duskglow.com)
