Cory Papenfuss wrote:
I just recently found out that the TV-out of SiS chipsets (many, many HTPC barebones use them) is not capable of outputting interlaced material. Quite an important thing if you want a picture-perfect picture, I'd say.

    That cannot be correct, since TVOUT is defined to be interlaced. Now, if you meant to say that it cannot play both fields of interlaced content, I may believe that.  I've got an NVidia card (MX-400) with such a horrendously crappy tvout chip on it that the most I ever see is 240 lines. The chip just plain blows chunks.
I must disappoint you here. First of all, I have massive respect for Thomas Winischhofer, the developer/maintainer of the X driver for SiS chipsets; he has a well-documented website on which he elaborately describes every driver option that he squeezed out of the SiS chips. In the FAQ, however, I only recently stumbled upon this:

  • Q: Why does output of interlaced video via TV show a "comb-like" "interlace"-effect? If the source material is interlaced and the TV output is interlaced, shouldn't this match?
  • A: CRT2 does not support interlace. Therefore, the driver can't feed interlaced output into the video bridge (which handles TV output, be it a SiS video bridge, be it a Chrontel TV encoder). The video bridge can only convert a progressive scan (=non-interlaced) input into TV-suitable interlaced output. The driver can neither change this nor control which of the frames sent to the bridge is the even/odd field. Long story short: If you want to output interlaced material on your TV without using a software de-interlacer, you need to add a proper Modeline for interlaced PAL/NTSC timing (easily found on the internet) and an external VGA-to-TV converter connected to CRT1. Otherwise you have to use a software de-interlacer.
You have to acknowledge that this is a Very Bad Thing®. What I said about the interlaced material is not entirely true, but the result is still not quite useful. Damn bastards @ SiS. Oh well.
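
For what it's worth, the Modeline the FAQ refers to looks roughly like the following. These are the commonly circulated timings for 50 Hz interlaced PAL (720x576, 13.875 MHz dot clock, 15.625 kHz line rate, 625 total lines, i.e. 25 frames / 50 fields per second); the mode name is arbitrary, and the exact sync/porch numbers are a starting point to tweak for your own set, not gospel:

  # 720x576 @ 50 Hz interlaced (PAL)
  ModeLine "720x576i" 13.875  720 744 808 888  576 580 585 625  -hsync -vsync Interlace

The line goes into the Monitor section of XF86Config, and the mode name is then listed on the Modes line of the Screen section as usual.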


    For future reference, the following issues need to be taken into account (almost without exception):
- The TVOut of regular VGA cards uses a separate chip (or a block integrated into the GPU) to do the same thing as a scanline converter.  Thus, they generally do temporal and spatial resampling to make the "square peg" of the computer-generated video fit into the "round hole" of an NTSC/PAL-compliant video signal.
- Modelines on such cards are only loosely related to the TVout video output.  Tweaking things like over/underscan, interlacing, refresh rates, resolutions, etc. is all filtered through the TV-out chip mentioned in the point above.

I am trying to get the modeline right for my TV set so that I can start using my homebrew VGA-to-RGB converter. I made it because the CRT1 output (i.e. the VGA connector) IS capable of outputting interlaced material. It seems that all you nVidia owners forget that your vsync is controlled by OpenGL, which in fact is not supported on every video card in Linux.

    I'm now finally curious as to what VGA->RGB converter you are building.  VGA *is* RGB... unless it's just a cable to make it DB-15 (old-school 15-pin Macintosh video connector) as opposed to H-DB-15 (15-pin VGA connector).  What are you driving?

-Cory
You are of course right; what I meant was: I made a VGA (D-sub 15) to RGB (Euro-AV/SCART) converter cable so I can plug it directly into the TV using separate R, G, B and composite sync signals, as opposed to CVBS or S-Video input from the TV-out. I can produce a nice list of HOWTO links if you are interested.
(Side note: ATI and Matrox cards output composite sync already; all other cards have separate Hsync and Vsync and need this converter.)
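
As an aside: the XFree86 ModeLine syntax also knows a "Composite" (and "+CSync"/"-CSync") flag, so a card/driver combination that can put composite sync on the VGA connector could in theory be told to do so directly, e.g.:

  ModeLine "720x576i" 13.875  720 744 808 888  576 580 585 625  Composite Interlace

Whether the hardware actually honours that flag varies from card to card, so I wouldn't count on it sparing you the sync combiner in the cable.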
