Matt Sullivan wrote:
Hey,
On a different note, I played around with trying to fix the problem by
recompiling the kernel, and after much mucking around I finally got it
sorted. I believe it was caused by having ACPI compiled in; removing it
from my kernel stopped the jerkiness. The only problem I have now is
that at the very end of a recording the last 5 seconds or so freeze for
about 10 seconds before playback stops. I can easily live with that,
and it probably has something to do with the hardware XvMC decoding.
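For anyone who wants to try the same fix, turning ACPI off in a 2.6-series
kernel goes roughly like this; a rough sketch only, assuming a standard
source tree under /usr/src/linux (your distro's paths and boot setup may
differ):

    cd /usr/src/linux
    make menuconfig     # Power management options -> ACPI Support -> disable
                        # (leaves .config with "# CONFIG_ACPI is not set")
    make && make modules_install install
    # update your bootloader entry if "make install" doesn't, then reboot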
[EMAIL PROTECTED] wrote:
I have a similar problem, which I was just able to fix. I got some
clues from an earlier thread about Time Stretching: Balaji Ramani told
me to recompile (or download his RPMs) WITHOUT opengl_vsync. I did this
and it works perfectly. I'm not sure what is broken in .17, but it has
something to do with opengl_vsync. I suppose it could be the new NVidia
drivers too, but I thought I installed those before I upgraded to .17.
This is definitely worth a try.
Thanks Matt,
Can you explain how to compile without opengl_vsync?
Thanks
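On the 0.17-era source I've seen, the vsync support is switched on in
settings.pro, so "compiling without opengl_vsync" just means commenting
those lines out and rebuilding. A rough sketch only; the exact option
names are from memory and the source path is an assumption, so check
the comments in your own settings.pro:

    # in settings.pro, comment out the OpenGL vsync options:
    #CONFIG += using_opengl_vsync
    #DEFINES += USING_OPENGL_VSYNC

    cd ~/mythtv       # source tree location is an assumption
    make distclean    # clear the old build so the settings.pro change takes effect
    qmake mythtv.pro
    make
    make install      # as root, then restart mythfrontend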
_______________________________________________
If this is indeed the problem (ACPI), why didn't it affect .16? I would
be willing to take it out of my kernel, except I really don't want to
recompile. Anyone know if just setting acpi=off in lilo.conf will do
the same thing? I will try it, but I don't really want to recompile
with opengl enabled again unless this is sure to work.
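For what it's worth, the no-recompile route is an append line in the
kernel stanza of /etc/lilo.conf; the image path, label, and root device
below are placeholders, not your actual values:

    image=/boot/vmlinuz        # placeholder kernel image
        label=linux
        root=/dev/hda1         # placeholder root device
        read-only
        append="acpi=off"      # tell the kernel to disable ACPI at boot

    # then re-run /sbin/lilo and reboot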
Matt
_______________________________________________
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users