Michael T. Dean wrote:
Chad wrote:
On 4/28/05, Chad <[EMAIL PROTECTED]> wrote:
What resolution should I have in my xorg.conf?
Ideally, since your TV has a native resolution of 1280x720, you would
use that resolution. Then, your TV doesn't have to scale the video
and you get a pixel-for-pixel representation of the output, giving you
the highest quality possible. (Note that although some
(older/cheaper) TVs have a native resolution of 1280x720, they may
not be able to accept input at that resolution from a computer. If
that's the case for yours, you would probably get the best results
using the maximum resolution the TV will accept. However, this is
most likely not the case with your Samsung.)
I knew this section of my reply was too small. (Yeah, I'm the first to
admit my replies can be^H^H^H^H^Hare usually a bit wordy. :)
I forgot to mention that the above holds true when talking about the
quality of the GUI. If you render the GUI at the resolution at which it
is displayed, you'll get the best picture quality.
However, when it comes to TV, you might actually get better results
sending a different signal to the TV and allowing its built-in scaler to
do the scaling for you. This may be the case if all you have is SDTV.
Also, if you have HDTV, you might get better quality for your
1080i broadcasts by outputting 1080i directly. Probably the only way
to know for sure is to try it out.
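The "match the output to the source" idea can be sketched as a tiny mode-picker. The function name and mode list below are made up for illustration; MythTV's actual XRandR logic is more involved, but the principle is the same: switch to the mode whose resolution matches the recording, and fall back to the TV's preferred mode otherwise.

```python
# Hypothetical sketch of resolution-matched output (not MythTV's real code).
TV_MODES = ["1920x1080", "1280x720", "720x480"]  # modes the TV accepts, best first

def pick_mode(source_height, modes=TV_MODES):
    """Return the first mode whose vertical resolution equals source_height,
    or the first (preferred) mode if nothing matches."""
    for mode in modes:
        if int(mode.split("x")[1]) == source_height:
            return mode
    return modes[0]

print(pick_mode(1080))  # 1080i broadcast -> "1920x1080"
print(pick_mode(480))   # SD material     -> "720x480"
```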
IIRC, Myth can be set up to use XRandR to dynamically change the output
resolution based on the source resolution. To get the best of all
possible worlds (assuming the TV's scaler works better than the
computer-based scaling), you would want to set up your xorg.conf to
allow 1080i, 720p, 480p and 480i resolutions (assuming your TV accepts
all of these as input) and set up Myth to use XRandR.
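As a rough sketch only (I haven't tested this against your Samsung, and the identifiers are made up), the relevant xorg.conf pieces might look like the following. The Modelines are the standard CEA-861 timings for 720p, 1080i and 480p; verify them against your set's manual, and a 480i line could be added the same way if your TV takes it:

```
Section "Monitor"
    Identifier "Samsung-HDTV"
    # CEA-861 timings -- illustrative; your driver/TV may need tweaks
    Modeline "1280x720"   74.25 1280 1390 1430 1650  720  725  730  750 +HSync +VSync
    Modeline "1920x1080i" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 Interlace +HSync +VSync
    Modeline "720x480"    27.00  720  736  798  858  480  489  495  525 -HSync -VSync
EndSection

Section "Screen"
    Identifier "Screen0"
    Monitor    "Samsung-HDTV"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        # First mode is the default (your GUI resolution); with XRandR
        # enabled, Myth can switch among the rest per-recording.
        Modes "1280x720" "1920x1080i" "720x480"
    EndSubSection
EndSection
```

IIRC the Myth side of it lives in the frontend's appearance settings, where you can tell it to use a separate video mode for playback.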
Mike
_______________________________________________
mythtv-users mailing list
[email protected]
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users