Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-10-07 Thread Grzesiek Sójka

On 10/05/10 14:55, Francisco Jerez wrote:

Grzesiek Sójka <p...@pfu.pl> writes:


On 10/02/10 15:31, Francisco Jerez wrote:

Ah, I think you're hitting the bandwidth limitation of the nv34
integrated TMDS transmitter. The attached patch should help with the
console modesetting problem, but you'll still need to set the modelines
manually (and force panel rescaling) if you want to go up to 1600x1200,
because your GPU *cannot* handle the video mode your monitor is asking
for.


Your patch works fine. Now I have a clear image on both displays. The only
disadvantage is that the resolution is 1280x1024 (PixClk 135MHz). So I
was wondering if it is possible to force a particular modeline (by
editing the kernel source tree??). The mode:

Modeline "1600x1200_def" 144  1600 1628 1788 1920  1200 1201 1204 1250

works fine with the XServer. Is it possible to force it at the console??


You could try to force a reduced blanking mode in the kernel command
line, like video=DVI-I-1:1600x1200RM. But it isn't going to work with
GPU rescaling; the attached patch (on top of the previous one) will make
the kernel detect that and automatically fall back to panel rescaling.

Your patch works great, again. Thanks


Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-10-07 Thread Grzesiek Sójka

On 10/07/10 04:53, Francisco Jerez wrote:

Grzesiek Sójka <p...@pfu.pl> writes:


On 10/05/10 14:55, Francisco Jerez wrote:

PS. I'm afraid that my system is not very stable when AGP support
is turned on, both with the nouveau kernel source tree and with the
PLD-patched 2.6.35-5 version plus an extra amd-k7-agp patch. The
Xserver uses the driver:

Unstable? How? What's the problem?


Here are the logs:
http://yen.ipipan.waw.pl/~gs159090/tmp/log.tgz

BTW: Sometimes the Xserver freezes during normal work. Unfortunately I
was not able to reproduce such a crash just now. I'll send you logs if it
happens again.

Regards.

[...]
[  197.374498] kernel BUG at drivers/gpu/drm/ttm/ttm_tt.c:420!


Oops, I overlooked that, updated patch attached.



The new patch works fine so far. The system seems to be stable.

cheers


greg


Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-10-06 Thread Francisco Jerez
Grzesiek Sójka <p...@pfu.pl> writes:

 On 10/05/10 14:55, Francisco Jerez wrote:
 PS. I'm afraid that my system is not very stable when AGP support
 is turned on, both with the nouveau kernel source tree and with the
 PLD-patched 2.6.35-5 version plus an extra amd-k7-agp patch. The
 Xserver uses the driver:
 Unstable? How? What's the problem?

 Here are the logs:
 http://yen.ipipan.waw.pl/~gs159090/tmp/log.tgz

 BTW: Sometimes the Xserver freezes during normal work. Unfortunately I
 was not able to reproduce such a crash just now. I'll send you logs if it
 happens again.

 Regards.

 [...]
 [  197.374498] kernel BUG at drivers/gpu/drm/ttm/ttm_tt.c:420!

Oops, I overlooked that, updated patch attached.

diff --git a/drivers/char/agp/amd-k7-agp.c b/drivers/char/agp/amd-k7-agp.c
index b6b1568..b1b4362 100644
--- a/drivers/char/agp/amd-k7-agp.c
+++ b/drivers/char/agp/amd-k7-agp.c
@@ -309,7 +309,8 @@ static int amd_insert_memory(struct agp_memory *mem, off_t pg_start, int type)
 
 	num_entries = A_SIZE_LVL2(agp_bridge->current_size)->num_entries;
 
-	if (type != 0 || mem->type != 0)
+	if (type != mem->type ||
+	    agp_bridge->driver->agp_type_to_mask_type(agp_bridge, type))
 		return -EINVAL;
 
 	if ((pg_start + mem->page_count) > num_entries)
@@ -348,7 +349,8 @@ static int amd_remove_memory(struct agp_memory *mem, off_t pg_start, int type)
 	unsigned long __iomem *cur_gatt;
 	unsigned long addr;
 
-	if (type != 0 || mem->type != 0)
+	if (type != mem->type ||
+	    agp_bridge->driver->agp_type_to_mask_type(agp_bridge, type))
 		return -EINVAL;
 
 	for (i = pg_start; i < (mem->page_count + pg_start); i++) {




Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-10-05 Thread Grzesiek Sójka

On 10/02/10 15:31, Francisco Jerez wrote:

Ah, I think you're hitting the bandwidth limitation of the nv34
integrated TMDS transmitter. The attached patch should help with the
console modesetting problem, but you'll still need to set the modelines
manually (and force panel rescaling) if you want to go up to 1600x1200,
because your GPU *cannot* handle the video mode your monitor is asking
for.


Your patch works fine. Now I have a clear image on both displays. The only 
disadvantage is that the resolution is 1280x1024 (PixClk 135MHz). So I 
was wondering if it is possible to force a particular modeline (by editing 
the kernel source tree??). The mode:


Modeline "1600x1200_def" 144  1600 1628 1788 1920  1200 1201 1204 1250

works fine with the XServer. Is it possible to force it at the console??
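
(For reference, one common way of feeding such a modeline to a running X
server is xrandr; a minimal sketch using the timings quoted above (the
+hsync +vsync polarity flags are an assumption, they are not stated in
this thread):

$ xrandr --newmode "1600x1200_def" 144 1600 1628 1788 1920 1200 1201 1204 1250 +hsync +vsync
$ xrandr --addmode DVI-I-1 1600x1200_def
$ xrandr --output DVI-I-1 --mode 1600x1200_def --set "scaling mode" None

The same commands can simply go into xinitrc.)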

Thanks again!

PS. I'm afraid that my system is not very stable when AGP support
is turned on, both with the nouveau kernel source tree and with the
PLD-patched 2.6.35-5 version plus an extra amd-k7-agp patch. The
Xserver uses the driver:

[38.842] (II) Module nouveau: vendor="X.Org Foundation"
[38.842]compiled for 1.9.0, module version = 0.0.16
[38.842]Module class: X.Org Video Driver
[38.842]ABI class: X.Org Video Driver, version 8.0


Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-10-05 Thread Francisco Jerez
Grzesiek Sójka <p...@pfu.pl> writes:

 On 10/02/10 15:31, Francisco Jerez wrote:
 Ah, I think you're hitting the bandwidth limitation of the nv34
 integrated TMDS transmitter. The attached patch should help with the
 console modesetting problem, but you'll still need to set the modelines
 manually (and force panel rescaling) if you want to go up to 1600x1200,
 because your GPU *cannot* handle the video mode your monitor is asking
 for.

 Your patch works fine. Now I have a clear image on both displays. The only
 disadvantage is that the resolution is 1280x1024 (PixClk 135MHz). So I
 was wondering if it is possible to force a particular modeline (by
 editing the kernel source tree??). The mode:

 Modeline "1600x1200_def" 144  1600 1628 1788 1920  1200 1201 1204 1250

 works fine with the XServer. Is it possible to force it at the console??

You could try to force a reduced blanking mode in the kernel command
line, like video=DVI-I-1:1600x1200RM. But it isn't going to work with
GPU rescaling; the attached patch (on top of the previous one) will make
the kernel detect that and automatically fall back to panel rescaling.
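
(Going from memory of the kernel's modedb documentation, the generic format
accepted there is roughly:

    video=<connector>:<xres>x<yres>[M][R][-<bpp>][@<refresh>]

where M asks for CVT timings and R for reduced blanking; double-check
Documentation/fb/modedb.txt in your tree before relying on the details.)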

 Thanks again!

 PS. I'm afraid that my system is not very stable when AGP support
 is turned on, both with the nouveau kernel source tree and with the
 PLD-patched 2.6.35-5 version plus an extra amd-k7-agp patch. The
 Xserver uses the driver:
Unstable? How? What's the problem?

 [38.842] (II) Module nouveau: vendor="X.Org Foundation"
 [38.842]compiled for 1.9.0, module version = 0.0.16
 [38.842]Module class: X.Org Video Driver
 [38.842]ABI class: X.Org Video Driver, version 8.0

diff --git a/drivers/gpu/drm/nouveau/nv04_dfp.c b/drivers/gpu/drm/nouveau/nv04_dfp.c
index c936403..0d6ee18 100644
--- a/drivers/gpu/drm/nouveau/nv04_dfp.c
+++ b/drivers/gpu/drm/nouveau/nv04_dfp.c
@@ -185,14 +185,19 @@ static bool nv04_dfp_mode_fixup(struct drm_encoder *encoder,
 	struct nouveau_encoder *nv_encoder = nouveau_encoder(encoder);
 	struct nouveau_connector *nv_connector = nouveau_encoder_connector_get(nv_encoder);
 
-	/* For internal panels and gpu scaling on DVI we need the native mode */
-	if (nv_connector->scaling_mode != DRM_MODE_SCALE_NONE) {
-		if (!nv_connector->native_mode)
-			return false;
+	if (!nv_connector->native_mode ||
+	    mode->hdisplay > nv_connector->native_mode->hdisplay ||
+	    mode->vdisplay > nv_connector->native_mode->vdisplay) {
+		NV_DEBUG_KMS(encoder->dev, "Mode larger than native mode, "
+			     "forcing panel rescaling.\n");
+		nv_encoder->mode = *adjusted_mode;
+
+	} else if (nv_connector->scaling_mode == DRM_MODE_SCALE_NONE) {
+		nv_encoder->mode = *adjusted_mode;
+
+	} else {
 		nv_encoder->mode = *nv_connector->native_mode;
 		adjusted_mode->clock = nv_connector->native_mode->clock;
-	} else {
-		nv_encoder->mode = *adjusted_mode;
 	}
 
 	return true;




Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-10-05 Thread Grzesiek Sójka

On 10/05/10 14:55, Francisco Jerez wrote:

PS. I'm afraid that my system is not very stable when AGP support
is turned on, both with the nouveau kernel source tree and with the
PLD-patched 2.6.35-5 version plus an extra amd-k7-agp patch. The
Xserver uses the driver:


The Xserver crashes every time I try to shut it down. To send you 
the dmesg and X log I need some more time; I messed something up in the 
kernel configuration and I'm only able to use single-user mode.


Regards


Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-10-05 Thread Grzesiek Sójka

On 10/05/10 14:55, Francisco Jerez wrote:

PS. I'm afraid that my system is not very stable when AGP support
is turned on, both with the nouveau kernel source tree and with the
PLD-patched 2.6.35-5 version plus an extra amd-k7-agp patch. The
Xserver uses the driver:

Unstable? How? What's the problem?


Here are the logs:
http://yen.ipipan.waw.pl/~gs159090/tmp/log.tgz

BTW: Sometimes the Xserver freezes during normal work. Unfortunately I 
was not able to reproduce such a crash just now. I'll send you logs if it happens 
again.


Regards.


Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-10-05 Thread Pekka Paalanen
On Tue, 05 Oct 2010 16:53:16 +0200
Grzesiek Sójka <p...@pfu.pl> wrote:

 On 10/05/10 14:55, Francisco Jerez wrote:
  PS. I'm afraid that my system is not very stable when AGP
  support is turned on, both with the nouveau kernel source tree
  and with the PLD-patched 2.6.35-5 version plus an extra amd-k7-agp
  patch. The Xserver uses the driver:
  Unstable? How? What's the problem?
 
 Here are the logs:
 http://yen.ipipan.waw.pl/~gs159090/tmp/log.tgz
 
 BTW: Sometimes the Xserver freezes during normal work.
 Unfortunately I was not able to reproduce such a crash just now. I'll send
 you logs if it happens again.

Whoa, you have three graphics cards/chips?
X log lists three devices, maybe the third one is not
a graphics card?

Anyway, the Matrox card is an important detail.

X drivers in use:
- nouveau
- mga

kernel drivers in use:
- matroxfb (fb0)
- nouveau (fb1)

Your kernel log is missing the part from boot to 12
seconds. Might not contain anything important, but
would be nice to see it.

You are missing VGA arbiter support in kernel.

I'm not sure how many conflicts all those cards and drivers
create in theory, but missing VGA arbiter is not good in
a multi-card machine.

I hope someone can tell whether the following are conflicts, and
if so, whether they are fixable:
- mga vs. nouveau DDX (XAA vs. EXA, pre-Randr vs. Randr 1.2)
- mga vs. matroxfb (kernel driver vs. X driver)
- matroxfb vs. nouveau (legacy kernel fb driver vs. KMS driver,
multiple fb devices)

Finally, there is a kernel BUG at the end of the log,
from TTM. I can't tell if it is something already
fixed.

One option is to remove all but the nvidia graphics card;
that would hopefully stabilise your system immediately.
Apparently you are not really using the Matrox card, yes?
If you are not touching the hardware, at least disable matroxfb and mga,
and enable the VGA arbiter.
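
A rough sketch of the software-only route (this assumes matroxfb is built
as a module and a stock modprobe.d layout; adjust for your distribution):

$ grep -E 'CONFIG_VGA_ARB|CONFIG_FB_MATROX' /boot/config-$(uname -r)
$ echo "blacklist matroxfb" >> /etc/modprobe.d/blacklist.conf    # as root
# then remove or comment out any Device section using Driver "mga" from
# xorg.conf, and rebuild with CONFIG_VGA_ARB=y if the grep above shows
# the VGA arbiter disabled.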


Cheers.

-- 
Pekka Paalanen
http://www.iki.fi/pq/


Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-10-05 Thread Marcin Slusarz
On Wed, Oct 06, 2010 at 12:33:25AM +0300, Pekka Paalanen wrote:
 On Tue, 05 Oct 2010 16:53:16 +0200
 Grzesiek Sójka <p...@pfu.pl> wrote:
 
  On 10/05/10 14:55, Francisco Jerez wrote:
   PS. I'm afraid that my system is not very stable when AGP
   support is turned on, both with the nouveau kernel source tree
   and with the PLD-patched 2.6.35-5 version plus an extra amd-k7-agp
   patch. The Xserver uses the driver:
   Unstable? How? What's the problem?
  
  Here are the logs:
  http://yen.ipipan.waw.pl/~gs159090/tmp/log.tgz
  
  BTW: Sometimes the Xserver freezes during normal work.
  Unfortunately I was not able to reproduce such a crash just now. I'll send
  you logs if it happens again.
 
 Whoa, you have three graphics cards/chips?
 X log lists three devices, maybe the third one is not
 a graphics card?
 
 Anyway, the Matrox card is an important detail.
 
 X drivers in use:
 - nouveau
 - mga
 
 kernel drivers in use:
 - matroxfb (fb0)
 - nouveau (fb1)
 
 Your kernel log is missing the part from boot to 12
 seconds. Might not contain anything important, but
 would be nice to see it.
 
 You are missing VGA arbiter support in kernel.
 
 I'm not sure how many conflicts all those cards and drivers
 create in theory, but missing VGA arbiter is not good in
 a multi-card machine.
 
 I hope someone can tell whether the following are conflicts, and
 if so, whether they are fixable:
 - mga vs. nouveau DDX (XAA vs. EXA, pre-Randr vs. Randr 1.2)
 - mga vs. matroxfb (kernel driver vs. X driver)
 - matroxfb vs. nouveau (legacy kernel fb driver vs. KMS driver,
 multiple fb devices)
 
 Finally, there is a kernel BUG at the end of the log,
 from TTM. I can't tell if it is something already
 fixed.

Just a quick note: this BUG should be easily fixable by applying
the same fix Francisco did for amd-k7-agp.c/amd_insert_memory to
amd_remove_memory.

 One option is to remove all but the nvidia graphics card;
 that would hopefully stabilise your system immediately.
 Apparently you are not really using the Matrox card, yes?
 If you are not touching the hardware, at least disable matroxfb and mga,
 and enable the VGA arbiter.


Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-10-05 Thread Grzesiek Sójka

On 10/05/10 23:45, Marcin Slusarz wrote:

Just a quick note: this BUG should be easily fixable by applying
the same fix Francisco did for amd-k7-agp.c/amd_insert_memory to
amd_remove_memory.
I was the one that reported the problem leading to this patch and at the 
moment all my kernels are patched with it.


PS: It's funny that we're writing to each other in English :)


Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-10-05 Thread Grzesiek Sójka

On 10/05/10 23:53, Grzesiek Sójka wrote:

On 10/05/10 23:45, Marcin Slusarz wrote:

Just a quick note: this BUG should be easily fixable by applying
the same fix Francisco did for amd-k7-agp.c/amd_insert_memory to
amd_remove_memory.

I was the one that reported the problem leading to this patch and at the
moment all my kernels are patched with it.

Sorry, I misunderstood your comment. Just ignore my previous answer.


Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-10-02 Thread Francisco Jerez
Grzesiek Sójka <p...@pfu.pl> writes:

 On 10/01/10 01:29, Francisco Jerez wrote:
 Grzesiek Sójka <p...@pfu.pl> writes:

 On 09/30/10 18:05, Francisco Jerez wrote:
 Does the following command help?
 $ xrandr --output DVI-I-2 --set "scaling mode" None
 Yes! Now I have:
 xrandr --output DVI-I-1 --set "scaling mode" None
 xrandr --output DVI-I-2 --set "scaling mode" None
 in xinitrc. Both displays work fine when the Xserver is running. But after I
 go back to the console, display #1 switches off and #2 starts to
 blink again. Is it possible to fix the console??

 Do you need to tell the binary driver to do anything special to get it
 working on those monitors? (e.g. manually specified timings or EDID)
 No, the binary driver works fine without any extra settings. The version I
 used was 173.14.22. With the nouveau drivers I'm forced to use the
 following

 Modeline "1600x1200_def" 144  1600 1628 1788 1920  1200 1201 1204 1250

 Without it both displays are blinking. Remember that the binary
 nvidia driver claims that the MaxPixClk is 135MHz while nouveau says it is
 175MHz. According to the spec it should be 162MHz. I'm not sure, but I
 think that this high rate works only with D-SUB. Actually I have 3
 monitors; the third one is connected to the ancient Matrox
 Millennium II card using a D-SUB cable, and it uses a modeline with
 the 162MHz PixClk.

Ah, I think you're hitting the bandwidth limitation of the nv34
integrated TMDS transmitter. The attached patch should help with the
console modesetting problem, but you'll still need to set the modelines
manually (and force panel rescaling) if you want to go up to 1600x1200,
because your GPU *cannot* handle the video mode your monitor is asking
for.
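
To put rough numbers on it: a modeline's pixel clock is just
h_total x v_total x refresh rate. Using timings that appear elsewhere in
this thread,

  2160 x 1250 x 60 Hz = 162 MHz   (the 162 MHz 1600x1200 mode the monitor asks for)
  1920 x 1250 x 60 Hz = 144 MHz   (the hand-made "1600x1200_def" modeline)

both of which are above the ~135 MHz limit of the integrated single-link
TMDS on this chip, hence the need to force things by hand.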


 PS. By the way - I was forced to install Win7 and it works with my LCD
 without any drivers installed.

diff --git a/drivers/gpu/drm/nouveau/nouveau_connector.c b/drivers/gpu/drm/nouveau/nouveau_connector.c
index 0871495..6208eed 100644
--- a/drivers/gpu/drm/nouveau/nouveau_connector.c
+++ b/drivers/gpu/drm/nouveau/nouveau_connector.c
@@ -641,11 +641,28 @@ nouveau_connector_get_modes(struct drm_connector *connector)
 	return ret;
 }
 
+static unsigned
+get_tmds_link_bandwidth(struct drm_connector *connector)
+{
+	struct nouveau_connector *nv_connector = nouveau_connector(connector);
+	struct drm_nouveau_private *dev_priv = connector->dev->dev_private;
+	struct dcb_entry *dcb = nv_connector->detected_encoder->dcb;
+
+	if (dcb->location != DCB_LOC_ON_CHIP ||
+	    dev_priv->chipset >= 0x46)
+		return 165000;
+	else if (dev_priv->chipset >= 0x40)
+		return 155000;
+	else if (dev_priv->chipset >= 0x18)
+		return 135000;
+	else
+		return 112000;
+}
+
 static int
 nouveau_connector_mode_valid(struct drm_connector *connector,
 			     struct drm_display_mode *mode)
 {
-	struct drm_nouveau_private *dev_priv = connector->dev->dev_private;
 	struct nouveau_connector *nv_connector = nouveau_connector(connector);
 	struct nouveau_encoder *nv_encoder = nv_connector->detected_encoder;
 	struct drm_encoder *encoder = to_drm_encoder(nv_encoder);
@@ -663,11 +680,9 @@ nouveau_connector_mode_valid(struct drm_connector *connector,
 		max_clock = 400000;
 		break;
 	case OUTPUT_TMDS:
-		if ((dev_priv->card_type >= NV_50 && !nouveau_duallink) ||
-		    !nv_encoder->dcb->duallink_possible)
-			max_clock = 165000;
-		else
-			max_clock = 330000;
+		max_clock = get_tmds_link_bandwidth(connector);
+		if (nouveau_duallink && nv_encoder->dcb->duallink_possible)
+			max_clock *= 2;
 		break;
 	case OUTPUT_ANALOG:
 		max_clock = nv_encoder->dcb->crtconf.maxfreq;




Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-09-30 Thread Francisco Jerez
Pekka Paalanen <p...@iki.fi> writes:

 On Tue, 28 Sep 2010 18:17:25 +0200
 Grzesiek Sójka <p...@pfu.pl> wrote:

 I'm completely lost at the moment so let me start from the
 beginning. First thing is that everything works fine with the Nvidia
 binary drivers, which means that the hardware is rather OK. After
 loading nouveau.ko the monitor connected to DVI-I-1 goes
 into standby mode. I do not know if the head is turned off or
 maybe some kind of rates are too high/low. I was trying to play
 with video=DVI-I-1: with different modes and switches (M/R/D/e)
 and nothing. At the same time (after loading nouveau.ko) the
 monitor connected to DVI-I-2 starts to blink. And again I
 was playing with video=DVI-I-2:??? and nothing. I thought
 that maybe after starting xorg I would get a proper image on
 both monitors. Unfortunately - no success. According to the Xorg
 log file (Option "ModeDebug" "true") all the refresh rates are
 ok. Moreover I was trying to use different ModeLines and nothing.
 The X server claims that there are two monitors connected and
 working, but the first one is in standby mode and the second is
 blinking. Any ideas?? If you need some more information please
 let me know.

 Yes, the complete kernel and X logs, like I have requested twice before.
 That is the minimum before anyone will seriously start looking into it.

Before doing that, can you try latest git again? I pushed a patch
yesterday that may help with it.

 Preferably, use the simplest setup that fails:
 - only one monitor physically connected
 - the following xorg.conf:
And use drm.debug=4 in your kernel command line; that should tell you
exactly what timings are being used.


 Section "Device"
 Identifier "n"
 Driver "nouveau"
 Option "ModeDebug" "true"
 EndSection


 Thanks.




Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-09-30 Thread Grzesiek Sójka
I did download the new sources and applied the amd-agp patch, but no change;
I get the same problems. Here is the dmesg without modeset:
http://yen.ipipan.waw.pl/~gs159090/tmp/txt.gz
Then I added the following modeline to the file drm_edid_modes.h

/* 1600x1200@50Hz */
{ DRM_MODE("1600x1200", DRM_MODE_TYPE_DRIVER, 135000, 1600, 1664,
	   1856, 2160, 0, 1200, 1201, 1204, 1250, 0,
	   DRM_MODE_FLAG_PHSYNC | DRM_MODE_FLAG_PVSYNC) },

and tried to force the mode by setting video=DVI-I-[12]:1600x1200...@50.
No luck; here you have the dmesg:
http://yen.ipipan.waw.pl/~gs159090/tmp/txt2.gz

I also did some tests with xorg:
http://yen.ipipan.waw.pl/~gs159090/tmp/Xorg.0.log.gz
http://yen.ipipan.waw.pl/~gs159090/tmp/xorg.conf.gz

The interesting thing is that, according to the dmesg, xorg tries to
set the mode:
Modeline 58:"1600x1200_def" 0 144000 1600 1628 1788 1920 1200 1201 1204
1250 0x0 0x0
but just afterwards this mode is used:
Modeline 75:"1600x1200" 0 162000 1600 1664 1856 2160 1200 1201 1204 1250
0x48 0x5
Here is the xrandr --verbose output:
http://yen.ipipan.waw.pl/~gs159090/tmp/xrandr.nouveau.gz

Just in case, I also recorded the xrandr output for the Nvidia
binary driver:
http://yen.ipipan.waw.pl/~gs159090/tmp/xrandr.nvidia.gz
I was really surprised because xrandr claims that the refresh rate
is 50Hz, but the OSD says that the refresh rate is
60Hz. What to think?? The next thing is that a refresh rate of 50Hz is out
of the monitor range (VertRefresh 56 - 75), so the Xserver should not
set it to 50.

All the time (when using nouveau.ko) monitor #1 is switched off and
#2 is blinking. It does not matter whether the Xserver is running or not.

Please let me know if you need any more data.

Thanks for any help in advance.

Regards


Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-09-30 Thread Francisco Jerez
Grzesiek Sójka <p...@pfu.pl> writes:

 I did download the new sources and applied the amd-agp patch, but no change;
 I get the same problems. Here is the dmesg without modeset:
 http://yen.ipipan.waw.pl/~gs159090/tmp/txt.gz
 Then I added the following modeline to the file drm_edid_modes.h

 /* 1600x1200@50Hz */
 { DRM_MODE("1600x1200", DRM_MODE_TYPE_DRIVER, 135000, 1600, 1664,
 	   1856, 2160, 0, 1200, 1201, 1204, 1250, 0,
 	   DRM_MODE_FLAG_PHSYNC | DRM_MODE_FLAG_PVSYNC) },

 and tried to force the mode by setting video=DVI-I-[12]:1600x1200...@50.
 No luck; here you have the dmesg:
 http://yen.ipipan.waw.pl/~gs159090/tmp/txt2.gz

 I also did some tests with xorg:
 http://yen.ipipan.waw.pl/~gs159090/tmp/Xorg.0.log.gz
 http://yen.ipipan.waw.pl/~gs159090/tmp/xorg.conf.gz

 The interesting thing is that, according to the dmesg, xorg tries to
 set the mode:
 Modeline 58:"1600x1200_def" 0 144000 1600 1628 1788 1920 1200 1201 1204
 1250 0x0 0x0
 but just afterwards this mode is used:
 Modeline 75:"1600x1200" 0 162000 1600 1664 1856 2160 1200 1201 1204 1250
 0x48 0x5
 Here is the xrandr --verbose output:
 http://yen.ipipan.waw.pl/~gs159090/tmp/xrandr.nouveau.gz

Does the following command help?
$ xrandr --output DVI-I-2 --set "scaling mode" None

 Just in case, I also recorded the xrandr output for the Nvidia
 binary driver:
 http://yen.ipipan.waw.pl/~gs159090/tmp/xrandr.nvidia.gz
 I was really surprised because xrandr claims that the refresh rate
 is 50Hz, but the OSD says that the refresh rate is
 60Hz. What to think?? The next thing is that a refresh rate of 50Hz is out
 of the monitor range (VertRefresh 56 - 75), so the Xserver should not
 set it to 50.

Don't worry about that, IIRC it's just a hack nvidia did to
differentiate between meta-modes with the same resolution but a
different set of enabled outputs, instead of switching to RandR12.


 All the time (when using nouveau.ko) monitor #1 is switched off and
 #2 is blinking. It does not matter whether the Xserver is running or not.

 Please let me know if you need any more data.

 Thanks for any help in advance.

 Regards




Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-09-30 Thread Francisco Jerez
Grzesiek Sójka <p...@pfu.pl> writes:

 On 09/30/10 18:05, Francisco Jerez wrote:
 Does the following command help?
 $ xrandr --output DVI-I-2 --set "scaling mode" None
 Yes! Now I have:
 xrandr --output DVI-I-1 --set "scaling mode" None
 xrandr --output DVI-I-2 --set "scaling mode" None
 in xinitrc. Both displays work fine when the Xserver is running. But after I
 go back to the console, display #1 switches off and #2 starts to
 blink again. Is it possible to fix the console??

Do you need to tell the binary driver to do anything special to get it
working on those monitors? (e.g. manually specified timings or EDID)

 Regards.




Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-09-29 Thread Pekka Paalanen
On Tue, 28 Sep 2010 18:17:25 +0200
Grzesiek Sójka <p...@pfu.pl> wrote:

 I'm completely lost at the moment so let me start from the
 beginning. First thing is that everything works fine with the Nvidia
 binary drivers, which means that the hardware is rather OK. After
 loading nouveau.ko the monitor connected to DVI-I-1 goes
 into standby mode. I do not know if the head is turned off or
 maybe some kind of rates are too high/low. I was trying to play
 with video=DVI-I-1: with different modes and switches (M/R/D/e)
 and nothing. At the same time (after loading nouveau.ko) the
 monitor connected to DVI-I-2 starts to blink. And again I
 was playing with video=DVI-I-2:??? and nothing. I thought
 that maybe after starting xorg I would get a proper image on
 both monitors. Unfortunately - no success. According to the Xorg
 log file (Option "ModeDebug" "true") all the refresh rates are
 ok. Moreover I was trying to use different ModeLines and nothing.
 The X server claims that there are two monitors connected and
 working, but the first one is in standby mode and the second is
 blinking. Any ideas?? If you need some more information please
 let me know.

Yes, the complete kernel and X logs, like I have requested twice before.
That is the minimum before anyone will seriously start looking into it.

Preferably, use the simplest setup that fails:
- only one monitor physically connected
- the following xorg.conf:

Section "Device"
Identifier "n"
Driver "nouveau"
Option "ModeDebug" "true"
EndSection


Thanks.

-- 
Pekka Paalanen
http://www.iki.fi/pq/


Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-09-28 Thread Grzesiek Sójka
By the way: there is no direct implication between the refresh rate and
the PixelClock. In theory you can drive an arbitrarily low resolution/refresh
rate with an arbitrarily high PixelClock; the only thing is that setting the
PixelClock too high does not make any sense.
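
(Spelled out: pixel clock = h_total x v_total x refresh rate, and the
totals include blanking, so for a fixed active resolution and refresh you
can always inflate the clock just by padding the blanking intervals.)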


Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-09-28 Thread Pekka Paalanen
On Tue, 28 Sep 2010 12:43:12 +0200
Grzesiek Sójka <p...@pfu.pl> wrote:

 On 09/27/10 23:41, Pekka Paalanen wrote:
  Are you saying that forcing a low-refresh-rate mode via the video=
  kernel argument does not help?
  
  But you still do get a good picture on the framebuffer console,
  do you not? In that case, X is overriding what the kernel thinks is
  the best mode, so you should clean up the X config.
  
  If wiping the X config does not help, post kernel and X logs.
 
 1. At the moment I am trying to get a proper image on the console. I
 put the comment about the X log only to explain why I think that
 there is something wrong with the PixClk. Actually the screen blinks
 the same way on the console and when I run the X server. I was
 trying to play with the ModeLine but it seems to be ignored. For
 example, whatever I put, +hsync +vsync or -hsync -vsync, I always
 get OSD info that both polarities are positive. So it
 looks like all the timings etc. are set by nouveau.ko, not
 by the xorg nouveau driver.

Are you sure X even runs and doesn't die too soon?
Anyway, one thing at a time; better to disable X for now to debug this.

You said you have dual monitors, does the problem occur with only one?

 2. Whatever I put into video= I always get the same problem. It
 looks a little bit like only the resolution is taken and all the
 rest is ignored. I do not have a way to check it, but my LCD
 always claims that the mode is 1600x1200@60, horizontal 74.9kHz,
 and both polarities positive. It does not display the PixClk.
I forgot about the dual monitors. When you have more than one
output active, it is best to be explicit about which output is being
set via the video= argument; see Forcing Modes at
http://nouveau.freedesktop.org/wiki/KernelModeSetting

My idea with video= was that the timings are (sometimes?) computed
with a standard formula, CVT or GTF.
I would expect a lower resolution or refresh rate to produce a lower
pixel clock. Experimenting with non-standard modes might be
interesting.
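
For experimenting, the standard timings can be generated offline with the
cvt and gtf tools that ship with X (output format varies a bit between
versions):

$ cvt 1600 1200 60      # CVT modeline for 1600x1200 at 60 Hz
$ cvt -r 1600 1200 60   # same, with reduced blanking (lower pixel clock)
$ gtf 1600 1200 60      # GTF modeline, for comparison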

 So the situation is very strange. I think that a good start
 would be checking what the current vertical, horizontal and pixclk
 rates are. Is there a way to force nouveau.ko to put them
 into the dmesg?? If not, then maybe I could just hard-code
 something like: printk("h=%d,V=%d,P=%d", ???
 into the source tree. The question is where to put it and what
 the names of the appropriate variables are.

The nouveau kernel module has a reg_debug option which would show
the raw values being programmed into the hardware. drm.ko has the
debug parameter, for which I don't have the documentation at
hand here; it is a bitmask, too. Someone else should guide
you through the source tree.
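
A sketch of how those would be passed; the values below are only examples,
so check the parameters' own descriptions for the exact bits:

$ modprobe drm debug=4          # same bitmask as drm.debug=4 on the command line
$ modprobe nouveau reg_debug=1  # dump raw register accesses (very noisy)
# or on the kernel command line: drm.debug=4 nouveau.reg_debug=1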

I think it would be best to make a report in bugzilla,
where you can attach kernel and X logs. X is by default
more verbose, like you said. In X, Option "ModeDebug" "true"
in the "Device" section will give even more information. This
should be done with an otherwise minimal xorg.conf.

Did you ever need anything special with the proprietary
driver, like ModeLines or EDID data from a file in xorg.conf?


Thanks.

PS. Samsung SyncMaster 204b, eh? I got one today; I just need
to replace all the capacitors to get it working. Cheap-caps
syndrome, I assume.

-- 
Pekka Paalanen
http://www.iki.fi/pq/


Re: [Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-09-27 Thread Pekka Paalanen
On Sat, 25 Sep 2010 16:04:23 +0200
Grzesiek Sójka <p...@pfu.pl> wrote:

 I'm using a GeForce FX5200 dual DVI graphics adapter connected to
 two Samsung 204b monitors. At the moment I'm using the binary nvidia
 drivers and everything works fine, but I'm forced to switch to
 nouveau. Unfortunately there are problems. It seems that
 nouveau.ko wrongly detects the PixelClock of my monitors. More
 precisely, after loading nouveau.ko the screen starts to blink.
 In my experience the problem is too high a PixelClock.
 Moreover, in /var/log/Xorg.0.log you can find the line:
 
 NOUVEAU(0): Ranges: V min: 56 V max: 75 Hz, H min: 30 H max: 81
 kHz, PixClock max 175 MHz
 
 which is false. The maximum PixClk for the Samsung 204b is 162MHz. I
 was trying to play with the video= kernel parameter, but there is
 no way to force the PixClk.

Are you saying that forcing a low-refresh-rate mode via the video=
kernel argument does not help?

But you still do get a good picture on the framebuffer console,
do you not? In that case, X is overriding what the kernel thinks is
the best mode, so you should clean up the X config.

If wiping the X config does not help, post kernel and X logs.


Cheers.

-- 
Pekka Paalanen
http://www.iki.fi/pq/


[Nouveau] GeForce FX5200 dual DVI Samsung 204b

2010-09-25 Thread Grzesiek Sójka
Hi there,

I'm using a GeForce FX5200 dual DVI graphics adapter connected to two
Samsung 204b monitors. At the moment I'm using the binary nvidia drivers and
everything works fine, but I'm forced to switch to nouveau. Unfortunately
there are problems. It seems that nouveau.ko wrongly detects the
PixelClock of my monitors. More precisely, after loading nouveau.ko
the screen starts to blink. In my experience the problem is
too high a PixelClock. Moreover, in /var/log/Xorg.0.log you can find
the line:

NOUVEAU(0): Ranges: V min: 56 V max: 75 Hz, H min: 30 H max: 81 kHz,
PixClock max 175 MHz

which is false. The maximum PixClk for the Samsung 204b is 162MHz. I was
trying to play with the video= kernel parameter, but there is no way to
force the PixClk. I was also trying to edit the suitable ModeLine in
drivers/gpu/drm/drm_edid_modes.h, but I think nouveau.ko ignores it. So
my question is what to change (in the kernel source) to override the
detection and force the MaxPixelClock to 162MHz, just to make sure that the
problem is related to the PixelClock. To build the kernel I used the kernel
tree downloaded from nouveau about one week ago.

Thanks in advance for any help.