I went on to experiment further with XvMC hardware decoding, and it seemed to introduce stutter. I also noticed that mythfrontend's CPU consumption dropped by about 50% when I enabled it. Based on that, I assume that when this box is not checked, Myth does the same decoding in software on the CPU instead? In other words, the same MPEG-2 decoding happens one way or the other, just in hardware rather than in software?
I did not try libmpeg2 or XV. I gather that XV and XvMC are mutually exclusive options; is that correct? And what about libmpeg2: is it meant to be used in combination with either XV or XvMC? I suppose at some point it will become increasingly difficult to discern any improvement as the picture quality gets closer to nirvana.
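In case it helps anyone compare these settings with actual numbers instead of eyeballing top, here is a minimal Python sketch for averaging mythfrontend's CPU usage over a few seconds. It assumes the psutil module is installed and that the process is named mythfrontend (both my assumptions, not anything from the Myth docs):

#!/usr/bin/env python
# Average mythfrontend's CPU usage so different decoder settings
# (XV, XvMC, libmpeg2) can be compared on the same recording.
# Assumes the psutil module is installed and the frontend process
# is named "mythfrontend".
import sys
import psutil

def frontend_cpu(samples=10, interval=1.0):
    """Return mythfrontend's average CPU percent over `samples` readings."""
    for proc in psutil.process_iter(['name']):
        if proc.info['name'] == 'mythfrontend':
            readings = [proc.cpu_percent(interval=interval)
                        for _ in range(samples)]
            return sum(readings) / len(readings)
    return None

if __name__ == '__main__':
    avg = frontend_cpu()
    if avg is None:
        sys.exit("mythfrontend is not running")
    print("average CPU over 10 seconds: %.1f%%" % avg)

Playing the same recording with each decoder option checked in turn should make differences like that 50% drop easy to quantify.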
On 4/14/05, Will Dormann <[EMAIL PROTECTED]> wrote:
> Larry K wrote:
> > I'll bet that's my problem, then. FWIW, my recordings are 2.2GB/hour,
> > which seems pretty reasonable.
> >
> > I'll try the Bob Deinterlacing when I get home. Thanks! This forum rocks!
>
> For the best quality TV-out:
>
> - Make sure MythTV is compiled with OpenGL Vsync support (otherwise Bob
>   won't look smooth)
> - Use a resolution of 800x600
> - Set the playback option to "Use Video as Timebase"
> - Use Bob Deinterlacing
>
> An earlier version of my tips is here:
> http://mythtv.info/moin.cgi/NVidiaMX4000HowTo
>
> I'm planning on updating them to work with the recent builds, probably
> once 0.18 is released. The main difference is that it should now be set
> to use XV instead of XvMC (XV has higher CPU usage, but is less
> "quirky") and also use libmpeg2 for decoding.
>
> -WD
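Regarding the first tip above: a quick sanity check for OpenGL Vsync support, without recompiling, is to scan the compile-time options the frontend reports. Here is a rough Python sketch; I'm assuming "mythfrontend --version" prints its compiled-in options and that the relevant token contains "opengl_vsync", which may differ between builds:

#!/usr/bin/env python
# Check whether the installed mythfrontend was built with OpenGL
# vsync support by scanning its reported compile-time options.
# ASSUMPTION: "mythfrontend --version" prints a list of compiled-in
# options containing a token like "opengl_vsync"; the exact wording
# may vary between versions.
import subprocess

def has_opengl_vsync():
    result = subprocess.run(['mythfrontend', '--version'],
                            capture_output=True, text=True)
    return 'opengl_vsync' in result.stdout.lower()

if __name__ == '__main__':
    print('OpenGL vsync compiled in:', has_opengl_vsync())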
