Jarod Wilson wrote:
> My TV does everything for me automagically in hardware. :-)
> Deinterlacing a 1080i signal for display at 1080p is easier than
> doing it for a 720p display, since you don't have to scale the video
> at all. I dunno the specifics of exactly what's happening under the
> hood on my TV, but I get absolutely zero interlace artifacts on 1080i
> content. I do see some minor interlace artifacts on 480i stuff, but
> only upon very close inspection (nose near the screen), and even
> then, they aren't bad. My ASSumption is that the TV just has a really
> good deinterlace filter in it. Maybe I should read my TV's manual one
> of these days to figure out exactly what's going on... ;-)
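
For what it's worth, the "no scaling" part is easy to picture: a plain
weave deinterlace just interleaves the two fields back into a full
frame. Something like this (Python/NumPy, purely illustrative, all
names made up--not claiming this is what any TV actually runs) is the
whole trick for a 1080p panel, while a 720p panel needs an extra
resize on top:

import numpy as np

def weave_1080i_fields(top_field, bottom_field):
    """Interleave two 540-line 1080i fields into one 1080p frame.

    No scaling required: 2 x 540 field lines == 1080 panel lines.
    Plain weave only looks clean when both fields come from the
    same instant; real interlaced motion needs a motion-adaptive
    filter, which is presumably what a good TV has.
    """
    frame = np.empty((1080, top_field.shape[1], 3), dtype=top_field.dtype)
    frame[0::2] = top_field      # even panel lines from the top field
    frame[1::2] = bottom_field   # odd panel lines from the bottom field
    return frame

def weave_for_720p_panel(top_field, bottom_field):
    """A 720p display has to deinterlace *and* rescale."""
    frame = weave_1080i_fields(top_field, bottom_field)
    # Crude nearest-neighbor 1080 -> 720 resample; a real scaler
    # would use bicubic/lanczos, but the point is the extra step.
    rows = np.linspace(0, frame.shape[0] - 1, 720).round().astype(int)
    cols = np.linspace(0, frame.shape[1] - 1, 1280).round().astype(int)
    return frame[rows][:, cols]
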
So, why a 1080i signal? Have you considered sending a 1080p signal?
Would that make 720p look better (since it's only scaled instead of
scaled/interlaced/deinterlaced)? It should make 1080p30/1080p24 look
better (if broadcasters are even using them--I know you can get movie
trailers in 1080p24)... (Gotta admit I haven't done any HDTV stuff
yet--I'll get to it next month, once I'm done with some travel. :)
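
To put a finer point on the round trip: sending progressive content
out over a 1080i link means the output stage throws away half of each
frame's lines, which the TV's deinterlacer then has to guess back.
Rough sketch (same illustrative Python as above, not real frontend
code):

import numpy as np

def interlace_1080p60(frames):
    """Turn 1080p60 frames into 1080i60 fields (alternating top/bottom).

    Each progressive frame contributes only half its lines; a 1080p
    link would carry the frames untouched and skip this step.
    """
    fields = []
    for i, frame in enumerate(frames):
        start = i % 2                   # 0 = top field, 1 = bottom field
        fields.append(frame[start::2])  # keep every other line
    return fields

# 60 full frames in, 60 half-height fields out: same line rate,
# half the information per instant.
frames = [np.zeros((1080, 1920, 3), dtype=np.uint8) for _ in range(60)]
fields = interlace_1080p60(frames)
assert len(fields) == 60 and fields[0].shape == (540, 1920, 3)
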
Mike