On Fri, 2005-12-16 at 10:20 +0100, Marius Schrecker wrote:

I'm using a Radeon X300 with DVI out. It works fine with the latest X.Org and
the ATI proprietary drivers. The only caveats I can think of concern direct
rendering (so far I've only got it working with Xv).

Others have had problems with overscan, but with my LCD TV this hasn't been
a problem. It copes fine with a custom modeline of 1368x768.
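For anyone wanting to try the same, a 1368x768 modeline goes in the
"Monitor" section of xorg.conf. The timings below are illustrative CVT-style
values for roughly 60 Hz, not the exact ones I use; generate numbers for your
own panel with `cvt 1368 768 60` and check your TV's manual for supported
refresh rates:

```
Section "Monitor"
    Identifier "LCDTV"
    # Illustrative ~60 Hz CVT timings; regenerate with: cvt 1368 768 60
    Modeline "1368x768" 85.25  1368 1440 1576 1784  768 771 781 798 -hsync +vsync
EndSection
```

Then select it with `Option "PreferredMode" "1368x768"` (or add it to the
Modes line in the Screen section, depending on your server version).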

The only other problems I've had are the TV not supporting a 50Hz
refresh rate (so I can't use bob deinterlacing), and a module build
problem with the latest 2.6.14 kernel (it compiled fine on 2.6.13).

There may also be issues on x86_64 systems, as the proprietary driver has
32-bit dependencies.

Which distro are you running?


Marius

Marius,
    Thanks for the feedback! In answer to your question, I am using a diskless Ubuntu frontend in this case. Since my TV only supports 1080i, deinterlacing isn't much of an issue for me (I tried using 540p, and the results don't appear as good as the 1080i, for whatever reason). And since Ubuntu is still using 2.6.12, I shouldn't have problems with kernel compilation.  So it seems that if I do my homework, and get a card that has HDCP support, I might be better off.

    Thanks for the help!

        --Matt


_______________________________________________
mythtv-users mailing list
[email protected]
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users
