I thought I understood this interlace stuff but Myth is challenging my concept of interlace vs. non-interlace video.
I have a TV capable of displaying 1920x1080 interlaced (a CRT-based RPTV from Pioneer, if you care). I have watched terrestrial HDTV broadcasts on it and they look breathtaking. The image consists of two alternating 540-line fields, each lasting 1/60 second, drawn on the even and odd scan lines for 1080 unique lines of information.
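To make the even/odd structure concrete, here's a toy sketch (illustrative only, not MythTV code) of how two 540-line fields "weave" together into one 1080-line frame:

```python
# Two 540-line fields interleave into a single 1080-line frame:
# the even field lands on lines 0, 2, 4, ... and the odd field
# on lines 1, 3, 5, ...

LINES_PER_FIELD = 540

def weave(even_field, odd_field):
    """Interleave an even and an odd field into a full frame."""
    frame = [None] * (2 * LINES_PER_FIELD)
    frame[0::2] = even_field   # even scan lines
    frame[1::2] = odd_field    # odd scan lines
    return frame

even = [f"even{i}" for i in range(LINES_PER_FIELD)]
odd = [f"odd{i}" for i in range(LINES_PER_FIELD)]
frame = weave(even, odd)
print(len(frame))   # 1080 unique lines
print(frame[:4])    # ['even0', 'odd0', 'even1', 'odd1']
```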
Now I get my Myth box running and it looks very, very good -- but not quite as good as my TV with a direct OTA receiver. Why not? I've set everything up as carefully as possible. I'm on stock Myth 0.16, with a 3GHz Intel processor and an nVidia 5500 card using Xv. The modeline is correct and the card is putting out interlaced video, as evidenced by the flicker on one-pixel lines when an xterm is on screen.
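For anyone chasing the same setup: the 1080i modeline usually circulated for this is the standard SMPTE 274M timing. This is the commonly quoted form -- double-check the numbers against your set's documentation before using it:

```
# 1920x1080 interlaced, 60 fields/sec (74.25 MHz pixel clock,
# 2200x1125 total). Standard 1080i timing; verify against your display.
ModeLine "1920x1080_60i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125  Interlace +HSync +VSync
```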
Some people on the list whose advice I trust very much have suggested that 540p and 1080i output are _identical_ as played by Myth. This certainly seems to be true. I have played with the "deinterlace" setting, and the "bob and weave" deinterlace (which I understand duplicates one of the fields and tosses the other) looks the same as interlaced on my set.
There is no "bob and weave" deinterlace in MythTV. The relevant algorithms are "onefield," which behaves as you say -- tosses half the fields; and "bob," which shows each field sequentially (one field each 1/60 sec.).
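As a rough illustration of the difference between the two (this is a sketch, not MythTV source -- a frame here is just a list of scan lines):

```python
# "onefield": keep only one field, tossing the other -- half the
# information is simply discarded.
def onefield(frame):
    return frame[0::2]

# "bob": show each field sequentially as its own 1/60-second image,
# line-doubling it back to full height (which is where the 540p-style
# output comes from).
def bob(frame):
    fields = [frame[0::2], frame[1::2]]
    return [[line for line in field for _ in range(2)] for field in fields]

frame = ["line%d" % i for i in range(8)]   # toy 8-line interlaced frame
print(onefield(frame))   # ['line0', 'line2', 'line4', 'line6']
print(bob(frame)[1])     # odd field, each line doubled
```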
The other two algorithms are "linearblend," which retrieves resolution from low-motion scenes at the expense of ghosting during horizontal motion; and "kerneldeint," which does better but is too resource-intensive for HDTV on today's hardware.
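To show why linearblend trades ghosting for detail, here is one common formulation of a linear blend (a sketch, not the exact MythTV kernel): each line is averaged with its vertical neighbors, which smooths interlace flicker in static scenes but smears anything that moves between fields.

```python
# Linear-blend style deinterlacer: weight the current line at 1/2 and
# each vertical neighbour at 1/4. Lines are lists of pixel values.
def linearblend(frame):
    out = []
    for i, line in enumerate(frame):
        above = frame[i - 1] if i > 0 else line
        below = frame[i + 1] if i < len(frame) - 1 else line
        out.append([0.5 * c + 0.25 * a + 0.25 * b
                    for c, a, b in zip(line, above, below)])
    return out

# A 1-pixel-wide column that alternates between fields -- exactly the
# kind of content that flickers at 1080i.
frame = [[0.0], [1.0], [0.0], [1.0]]
print(linearblend(frame))   # [[0.25], [0.5], [0.5], [0.75]] -- flicker smoothed
```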
It's worth noting once again that if you're using XvMC, none of these algorithms are used. Whichever one you choose, playback falls back to the only method XvMC is capable of: bob, implemented in hardware. That's a limitation of XvMC itself, not of Myth's software.
There are more-advanced algorithms out there, notably from the open-source (but Windows-only <boggle>) DScaler, and from TVTime. Unfortunately, Myth's video-display architecture doesn't lend itself to easily adapting those filters, which is a shame, because the TVTime ones seem to be developing into something of a standard.
When displayed in a 540p frame, bob deinterlacing should look identical to native 1080i, at least on a CRT with relatively long-persistence phosphor. Make sure you're using a 1920x540 video mode (not 960x540), or you *are* tossing out half your horizontal resolution.
I believe that on some sets, native 1080i output would look better than the fake 540p that we use. However, there appears to be a bug in nVidia's drivers, where *video* is not displayed at the full resolution, but instead undergoes some sort of deinterlacing. Pause some horizontally-panning scene when in 1080i and you should see flicker, but you don't. Normal GUI stuff is fine; it's the Xv overlay that's screwed up.
In conclusion, I think that for a 1080i-native set, 1920x540 with bob deinterlacing is the best you'll get out of Myth right now. If your set does 1080*p*, it should be even better, though some of those more-advanced algorithms would help.
-Doug
_______________________________________________ mythtv-users mailing list [EMAIL PROTECTED] http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users
