On 1/18/06, Dan McCarthy <[EMAIL PROTECTED]> wrote:
> Hi,
>
> I have a couple of questions about interlaced output with an ATI card.
> I am using an ATI 9200 with TV out, the ATI fglrx binary driver,
> and svn mythtv. I capture from a DVB-T card.
> I am getting interlacing artifacts on the displayed video. These are
> chunky lines to the left or right of the image during motion.
>
> I understand that the SDTV DVB signal is interlaced. Does this mean
> that I would need to use software deinterlacing when using the TV out,
> or is this handled automatically by the card? Is the TV out just some
> kind of mirror output of what is shown on a monitor, or is it treated
> completely differently by the card?
>
> Also, I have the screen resolution set to 800x600. Should I be setting
> this to 720x576 for SDTV PAL with a modeline (if possible) to avoid
> any issues with image scaling before it goes out the TV out?
>
> Thanks,
> Dan
Do you want interlaced output or deinterlaced output? If the former, you can't get it with an ATI card; even the newer nvidia drivers are broken in that respect, from what I understand. I think what you want is in the settings under Playback, on the first page at the top: there is a box to enable deinterlacing. The different deinterlacers vary in how much CPU they use and how well they work.

_______________________________________________
mythtv-users mailing list
[email protected]
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users
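On the modeline question: running the X screen at the native PAL frame size does avoid one scaling pass before the TV encoder. A sketch of what such a modeline might look like in xorg.conf follows -- the timing numbers are assumptions derived from standard PAL timing (13.5 MHz BT.601 dot clock, 864 total pixels per line, 625 total lines, 25 frames/s interlaced = 50 fields/s), not values tested on an fglrx setup:

```
Section "Monitor"
    Identifier "TV"
    # 720x576 @ 50 Hz interlaced (25 frames/s). Clock and totals follow
    # standard PAL timing; the sync start/end positions are illustrative
    # assumptions and may need adjusting for your TV.
    ModeLine "720x576PAL" 13.500  720 732 796 864  576 581 586 625  interlace -hsync -vsync
EndSection
```

Whether fglrx actually honours a custom interlaced modeline on the TV out is a separate question -- on many cards the TV encoder does its own scaling regardless of the mode X is running.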
