Also note that some of the nVidia cards had external DVI transmitter chips which, while not as good as ATI's, were still able to do the full 162 MHz.
-Nate

On Tue, 7 Dec 2004 23:28:52 -0800, Brad Templeton <[EMAIL PROTECTED]> wrote:
> On Tue, Dec 07, 2004 at 03:41:30PM -0800, Joe Barnhart wrote:
> > The nVidia 5200 card is a fine one for software decoding, and I
> > recommend it. There was a recent article in Tom's Hardware, though,
> > which trashed it for DVI compatibility. I use VGA myself, so I
> > couldn't comment. The "winner" for DVI compatibility was ATI, but
> > no one has much good to say about Radeon cards for Myth.
>
> Strictly, the Tom's article approved one Nvidia card and deprecated two
> others, but they were all much higher-end cards ($300 range; you can
> get the 5200 for $50), so we don't know what they saw on a 5200. They
> said they would test that at some point.
>
> I will note I have been unable to get the 5200 to work under Linux to
> drive my HDTV via DVI, but others have been able to do that.
>
> Note that the "trashed" cards were able to do 140 MHz, not the full
> 162 MHz needed to do 1600x1200 at 60 Hz. HDTV needs only 1920x1080 at
> 30 Hz or 1280x720 at 60 Hz, which are both about half of the full
> bandwidth.
>
> _______________________________________________
> mythtv-users mailing list
> [EMAIL PROTECTED]
> http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users
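As a quick sanity check on the "about half the bandwidth" figure above, here is a rough Python sketch comparing raw active-pixel rates. It deliberately ignores blanking intervals, so real pixel clocks (like the 162 MHz quoted for 1600x1200 at 60 Hz) are higher than these numbers, but the ratios between modes come out about the same:

```python
# Compare raw active-pixel rates (width x height x refresh) for the
# display modes mentioned above. Blanking intervals are ignored, so
# these are lower than the actual pixel clocks, but the relative
# bandwidth of each mode is roughly preserved.
def active_pixel_rate_mhz(width, height, refresh_hz):
    return width * height * refresh_hz / 1e6

full = active_pixel_rate_mhz(1600, 1200, 60)    # 115.2 Mpixels/s
hd1080 = active_pixel_rate_mhz(1920, 1080, 30)  # ~62.2 Mpixels/s
hd720 = active_pixel_rate_mhz(1280, 720, 60)    # ~55.3 Mpixels/s

print(f"1600x1200@60: {full:.1f} Mpixels/s")
print(f"1920x1080@30: {hd1080:.1f} Mpixels/s ({hd1080/full:.0%} of full)")
print(f"1280x720@60:  {hd720:.1f} Mpixels/s ({hd720/full:.0%} of full)")
```

Both HDTV modes land in the 48-54% range of the 1600x1200 mode, which squares with the "about half" claim.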
