> On Fri, Mar 04, 2005 at 06:27:20AM -0600, Neil wrote:
>> Hey guys, I've done some testing regarding the mystery I'm experiencing
>> with MythTV, NVIDIA, and DVI. Check it out at
>> http://restricted.dyndns.org/mystery.html
>> Comments are greatly appreciated...
> So I guess what we're getting at is that on the newer 6600-class cards,
> DVI at 1080i works without the interlace bug. I think we heard that
> from somebody else as well.
>
> By "tearing" do you mean interlace artifacts? The whole point of running
> at 1080i on a TV whose native resolution is 1280x720 is to get rid of
> those. Perhaps overscan is doing you in?
It's totally different from interlace artifacts. Here is an example of what I understand you to mean by interlace artifacts: http://neuron2.net/LVG/inthead.jpg (please correct me if I am wrong).
If so, then what I'm seeing is different. I only see it on a single line at a time, at random Y coordinates, and only on very fast-moving objects. Here is an example: in a program like American Idol (I don't know if you get that show), a performer sings in front of a large flat screen showing fast-moving random graphics. That's when I see the tearing, yet the performer herself is not affected. With interlace artifacts, everything on screen is affected.

So, as I read in the other thread, the TwinView configuration really does affect playback performance. I watched the same recording using a single-head xorg.conf on my computer monitor, and the tearing went away; with the TwinView configuration, the same tearing was visible even on the computer monitor. That points at TwinView itself rather than the TV. See the sketch below for what I mean by the two configurations.
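In case anyone wants to try the same comparison, here is a minimal sketch of the xorg.conf difference, assuming the proprietary nvidia driver; the identifier, sync ranges, and mode names below are illustrative, not copied from my real config. The single-head version is just the same Device section with the TwinView options dropped:

    # TwinView version (tearing for me):
    Section "Device"
        Identifier "nvidia0"
        Driver     "nvidia"
        Option     "TwinView"
        Option     "SecondMonitorHorizSync"   "30-50"
        Option     "SecondMonitorVertRefresh" "60"
        Option     "MetaModes" "1280x1024,1920x1080; 1280x1024,NULL"
    EndSection

    # Single-head version (no tearing for me):
    Section "Device"
        Identifier "nvidia0"
        Driver     "nvidia"
    EndSection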
Overscan? As far as I understand it, overscan just means the picture is larger than my HDTV's visible area. Fortunately, mythfrontend comes with X and Y offset and size settings that we can tweak, and with those I am able to fit the picture to almost 99% of my HDTV screen. (See the second sketch below for where those settings live.)
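For reference, this is roughly what I mean (a sketch from memory; the exact labels under Setup -> Appearance may vary with your MythTV version, and the numbers are only examples for a 1280x720 panel):

    GUI width  (px): 1264    (shrink a little so the edges stay on screen)
    GUI height (px): 708
    GUI X offset:    8       (nudge the picture to recenter it)
    GUI Y offset:    6

If I remember right, these correspond to the GuiWidth/GuiHeight/GuiOffsetX/GuiOffsetY rows in the settings table, in case you'd rather poke them directly.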
