On Nov 28, 06 23:21:12 +0200, Andrei Verovski (aka MacGuru) wrote:
> My video card GeForce 5500 with 128 MB is capable of 1680x1050, but one of my 
> friends, who is very fluent in electronics and PC hardware, told me that this 
> card is quite old and therefore capable of HD 16:10 resolution ONLY in VGA 
> mode, and NOT over DVI. Additionally, he said, it may suffer from glitches 
> and artifacts anyway.

As DVI carries no notion of aspect ratio, this is certainly not true!
Again, this is about single-link interfaces and their maximum pixel
clock of 135 MHz. A non-reduced mode (one that can be used for CRTs as
well) needs 147 MHz, so it does not work. A reduced-blanking mode needs
only 119 MHz.
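
For reference, the standard CVT reduced-blanking timing for 1680x1050
at 60 Hz looks like this - note the 119 MHz pixel clock, which is what
stays below the limit:

  # 1680x1050 59.88 Hz (CVT-RB) hsync: 64.67 kHz; pclk: 119.00 MHz
  Modeline "1680x1050R"  119.00  1680 1728 1760 1840  1050 1053 1059 1080  +hsync -vsync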

Also, 1680x1050 is *not* an HD mode - only 1280x720 and 1920x1080 are.
And those are 16:9, not 16:10 like 1680x1050.

> I have used the xorg.conf supplied by Basil, and yes, the monitor worked in 
> 1680x1050 ONLY in VGA mode. Reconnecting it to DVI dropped it back to 1280x1024.

Add the reduced-blanking mode line and the options I've written, and it
*should* work with DVI - it did with a 5300. I just reviewed the bug;
you might need to add "NoMaxPClkCheck" as well. Mode validation has
changed quite a bit in the newer NVIDIA drivers.
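
Roughly, the relevant xorg.conf sections would look like the sketch
below. This is only an illustration - the Identifier names are
placeholders, and whether "NoMaxPClkCheck" is actually needed depends
on your driver version:

  Section "Monitor"
      Identifier  "Monitor0"              # placeholder name
      # Reduced-blanking mode from above (119 MHz pixel clock)
      Modeline "1680x1050R"  119.00  1680 1728 1760 1840  1050 1053 1059 1080  +hsync -vsync
  EndSection

  Section "Device"
      Identifier  "Card0"                 # placeholder name
      Driver      "nvidia"
      # Tell the driver's mode validation to skip its maximum
      # pixel clock check
      Option      "ModeValidation" "NoMaxPClkCheck"
  EndSection

  Section "Screen"
      Identifier  "Screen0"
      Device      "Card0"
      Monitor     "Monitor0"
      SubSection "Display"
          Modes   "1680x1050R"
      EndSubSection
  EndSection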

See Bug-Id #219916

https://bugzilla.novell.com/show_bug.cgi?id=219916

CU

Matthias

-- 
Matthias Hopf <[EMAIL PROTECTED]>       __        __   __
Maxfeldstr. 5 / 90409 Nuernberg    (_   | |  (_   |__         [EMAIL PROTECTED]
Phone +49-911-74053-715            __)  |_|  __)  |__  labs   www.mshopf.de