That cannot be correct, since TV output is defined to be interlaced. Now, if you meant to say that it cannot play both fields of interlaced content, I may believe that. I've got an NVidia card (MX-400) with such a horrendously crappy TV-out chip on it that the most I ever see is 240 lines. The chip just plain blows chunks.

I must disappoint you here. First of all, I have massive respect for Thomas Winischhofer, the developer/maintainer of the X driver for SiS chipsets; he has a well-documented website on which he elaborately describes every driver option that he squeezed out of the SiS chips. In the FAQ, however, I only recently stumbled upon this:


  * Q: Why does output of interlaced video via TV show a "comb-like"
    "interlace"-effect? If the source material is interlaced and the
    TV output is interlaced, shouldn't this match?
  * A: CRT2 does not support interlace. Therefore, the driver can't
    feed interlaced output into the video bridge (which handles TV
    output, be it a SiS video bridge, be it a Chrontel TV encoder).
    The video bridge can only convert a progressive scan
    (=non-interlaced) input into TV-suitable interlaced output. The
    driver can neither change this nor control which of the frames
    sent to the bridge is the even/odd field. Long story short: If you
    want to output interlaced material on your TV without using a
    software de-interlacer, you need to add a proper Modeline for
    interlaced PAL/NTSC timing (easily found on the internet) and an
    external VGA-to-TV converter connected to CRT1. Otherwise you have
    to use a software de-interlacer.
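
For reference, a typical interlaced PAL modeline of the kind the FAQ mentions looks like this (exact numbers vary between HOWTOs; treat these timings as an example, not gospel):

```
# 720x576 interlaced PAL: 15.625 kHz horizontal, 50 Hz field rate
#                  pixclk  hdisp hss hse htot  vdisp vss vse vtot
ModeLine "720x576i" 13.875  720 744 808 888   576 580 585 625  -HSync -VSync Interlace
```

Sanity check: 13.875 MHz / 888 = 15.625 kHz horizontal, and 625 lines per frame gives 25 frames (50 fields) per second.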

You have to acknowledge that this is a Very Bad Thing®. What I said about the interlaced material is not entirely true, but the result is still not quite useful. Damn bastards @ SiS. Oh well.

We said the same thing, only in a slightly different way. It sounds like the SiS chipset is indeed broken in that respect... it cannot even send the TV-out chip an interlaced feed. Thus, he's saying you need to deinterlace the material to put it into the framebuffer, only to have the TV-out chip use every other line anyway... so it's a 30->60->30 Hz conversion (or for you apparently 25->50->25, if you're doing SCART=>PAL?)
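
That round trip can be sketched in a few lines of illustrative (purely hypothetical, not MythTV) Python: split an interlaced frame into fields, bob each field up to a progressive frame, and let the TV encoder take every other line again. Note how the encoder's output is the same whichever field parity it samples, which matches the FAQ's point that the driver can't control which frame becomes the even/odd field:

```python
# Toy model of the 25->50->25 Hz conversion described above.

def split_fields(frame):
    """Split a frame (list of scanlines) into top/bottom fields."""
    return frame[0::2], frame[1::2]

def bob(field):
    """Bob deinterlace: line-double one field into a progressive frame."""
    out = []
    for line in field:
        out.extend([line, line])
    return out

def encode_field(progressive, parity):
    """Model the TV encoder: emit every other line as one field."""
    return progressive[parity::2]

frame = [f"line{i}" for i in range(8)]    # one interlaced frame
top, bottom = split_fields(frame)         # two 25 Hz fields
prog_top = bob(top)                       # part of the 50 Hz progressive feed

# Both parities of the encoder output carry the same (line-doubled)
# field content -- the bridge has no notion of even vs. odd fields:
assert encode_field(prog_top, 0) == encode_field(prog_top, 1)
print(encode_field(prog_top, 0))  # -> ['line0', 'line2', 'line4', 'line6']
```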


You are of course right; what I meant was: I made a VGA (D-sub 15) to RGB (Euro-AV/SCART) converter cable so I can plug the card directly into the TV using separate R, G, B and composite sync signals, as opposed to the CVBS or S-Video input from the TV-out. I can produce a nice list of HOWTO links if you are interested.
(Side note: ATI and Matrox cards output composite sync already; all other cards have separate HSync and VSync and need this converter.)
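
The core of such a cable is just wiring plus one gate (pin numbers here are from memory; double-check against a proper HOWTO before soldering):

```
VGA (D-sub 15)             SCART (Euro-AV)
pin  1  Red      ------>   pin 15  Red in
pin  2  Green    ------>   pin 11  Green in
pin  3  Blue     ------>   pin  7  Blue in
pin 13  HSync ---\
                  XOR -->  pin 20  CVBS in (carries composite sync)
pin 14  VSync ---/
                           pin 16  fast blanking (~1-3 V to select RGB)
                           pin  8  function switching (~9.5-12 V for 4:3)
```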



OK... that makes sense. So your "converter" basically consists of routing R, G and B where they belong, and combining the H/V sync as necessary? I have messed with composite sync, sync-on-green, interlacing, funky dotclocks, etc. a fair bit in the past. My previous experience (Millennium and G-100) was that Matrox was second to none as far as weird options supported. Not sure if that's true anymore.
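
Pretty much: on cards that only provide separate syncs, the usual trick is a single XOR gate (e.g. a 74HC86) combining HSync and VSync into composite sync. A toy truth-table model:

```python
# Toy model of combining separate syncs into composite sync, as done
# with a single XOR gate in DIY VGA-to-TV cables. During the vertical
# sync interval the XOR inverts the horizontal pulses, roughly giving
# the serrated composite sync waveform a TV expects.
def csync(hsync: int, vsync: int) -> int:
    return hsync ^ vsync

assert csync(0, 0) == 0
assert csync(1, 0) == 1
assert csync(0, 1) == 1
assert csync(1, 1) == 0
```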


-Cory

*************************************************************************
* Cory Papenfuss                                                        *
* Electrical Engineering candidate Ph.D. graduate student               *
* Virginia Polytechnic Institute and State University                   *
*************************************************************************
_______________________________________________
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users
