Re: [gentoo-user] Re: Problem with xf86-video-ati nvidia-drivers

2011-07-20 Thread Grant
 ...
  I was thinking about this.  The digital HDMI signal must be converted
  into an analog signal at some point if it's being represented as light
  on a TV screen.  Electrical interference generated by the computer and
  traveling up the HDMI wire should have its chance to affect things
  (i.e. create weird shadows) at that point, right?
 
  Not with DFPs.  Those work digital even internally.  I assume of course
  that his HDMI TV *is* a DFP.

 But at some point the 1s and 0s must be converted to some sort of an
 analog signal if only right behind the diode.  A diode must be
 presented with a signal in some sort of analog form in order to
 illuminate, right?

 no.

 If your tv is a standard flat panel, the sub pixels only go from on to
 off and back. Nothing else. There is no analog signal, no transformation
 nothing. And off means 'let light through' and on 'black'

 Every digital signal is encoded into an analog signal.  I think it
 would take some serious EMI to sufficiently change the characteristics
 of an analog signal so as to create an error in the overlying digital
 signal if that signal is traveling along a wire.  I can imagine it
 happens but I would think it's rare.  Even if that signal were
 altered, I would think it just about impossible that anything but an
 error could be produced.

 Whether an LED is on or off is determined by whether or not it is
 forward biased.  Biasing is established by analog voltages and/or
 currents, and those can be altered by EMI.  Again, I would think it's
 very rare that EMI could affect an LED's forward biasing and change
 its state from on to off or off to on.

 However, what color an LED emits is determined by the energy gap of
 the semiconductor which is very much an analog process.  How could it
 be anything else?  How do you tell a photon to emit a certain color by
 feeding it 1's and 0's?  There has to be at least one D/A conversion
 somewhere between the digital signal and the emittance of the LED, and
 that is the most likely point for EMI to affect the final output.

 If you have an led display it is pretty much the same. All the levels
 you see are achieved with fast switching. There are no analog levels.

 Stroller is probably correct with overscan/underscan.

 But that has nothing to do with digital/analog conversion.


 Digital is just a figment of our imagination after
 all.

 emm, no, seriously not.

 It is though.  It only exists in the conceptual world, not the
 physical world.  If you want to do anything with your digital signal
 besides change it, store it, or transfer it, there must be a D/A
 conversion.

 You're thinking of PCM. (And that's what I was thinking of, earlier,
 too). I assume Stroller and Volker are talking about PWM, where a
 perceived analog value is achieved by rapidly turning a signal from
 full-on to full-off.

 (Yes, there's no such thing as pure-digital in the physical world. The
 confusion here appears to be in PWM vs PCM.)
 --
 :wq

Everything I said above applies to both PCM and PWM.  They are only
conceptual layers built on top of a physical/analog base.  PWM
switching from full-on to full-off and back is an analog process
representing digital data, which in turn represents an analog signal.

- Grant



Re: [gentoo-user] Re: Problem with xf86-video-ati nvidia-drivers

2011-07-19 Thread Grant
...
  I still think it's a driver problem.  Again: it's *physically*
  impossible to have these problems with the HDMI signal.  At most you
  get digital noise, which means some pixels get stuck or are missing.
  But not what you get; that's just something that can't be explained.
 
  I was thinking about this.  The digital HDMI signal must be converted
  into an analog signal at some point if it's being represented as light
  on a TV screen.  Electrical interference generated by the computer and
  traveling up the HDMI wire should have its chance to affect things
  (i.e. create weird shadows) at that point, right?
 
  Not with DFPs.  Those work digital even internally.  I assume of course
  that his HDMI TV *is* a DFP.

 But at some point the 1s and 0s must be converted to some sort of an
 analog signal if only right behind the diode.  A diode must be
 presented with a signal in some sort of analog form in order to
 illuminate, right?

 no.

 If your tv is a standard flat panel, the sub pixels only go from on to off and
 back. Nothing else. There is no analog signal, no transformation nothing. And
 off means 'let light through' and on 'black'

Every digital signal is encoded into an analog signal.  I think it
would take some serious EMI to sufficiently change the characteristics
of an analog signal so as to create an error in the overlying digital
signal if that signal is traveling along a wire.  I can imagine it
happens but I would think it's rare.  Even if that signal were
altered, I would think it just about impossible that anything but an
error could be produced.

Whether an LED is on or off is determined by whether or not it is
forward biased.  Biasing is established by analog voltages and/or
currents, and those can be altered by EMI.  Again, I would think it's
very rare that EMI could affect an LED's forward biasing and change
its state from on to off or off to on.

However, what color an LED emits is determined by the energy gap of
the semiconductor which is very much an analog process.  How could it
be anything else?  How do you tell a photon to emit a certain color by
feeding it 1's and 0's?  There has to be at least one D/A conversion
somewhere between the digital signal and the emittance of the LED, and
that is the most likely point for EMI to affect the final output.

 If you have an led display it is pretty much the same. All the levels you see
 are achieved with fast switching. There are no analog levels.

 Stroller is probably correct with overscan/underscan.

 But that has nothing to do with digital/analog conversion.


 Digital is just a figment of our imagination after
 all.

 emm, no, seriously not.

It is though.  It only exists in the conceptual world, not the
physical world.  If you want to do anything with your digital signal
besides change it, store it, or transfer it, there must be a D/A
conversion.

- Grant



Re: [gentoo-user] Re: Problem with xf86-video-ati nvidia-drivers

2011-07-19 Thread Michael Mol
On Tue, Jul 19, 2011 at 4:35 PM, Grant <emailgr...@gmail.com> wrote:
 ...
  I still think it's a driver problem.  Again: it's *physically*
  impossible to have these problems with the HDMI signal.  At most you
  get digital noise, which means some pixels get stuck or are missing.
  But not what you get; that's just something that can't be explained.
 
  I was thinking about this.  The digital HDMI signal must be converted
  into an analog signal at some point if it's being represented as light
  on a TV screen.  Electrical interference generated by the computer and
  traveling up the HDMI wire should have its chance to affect things
  (i.e. create weird shadows) at that point, right?
 
  Not with DFPs.  Those work digital even internally.  I assume of course
  that his HDMI TV *is* a DFP.

 But at some point the 1s and 0s must be converted to some sort of an
 analog signal if only right behind the diode.  A diode must be
 presented with a signal in some sort of analog form in order to
 illuminate, right?

 no.

 If your tv is a standard flat panel, the sub pixels only go from on to
 off and back. Nothing else. There is no analog signal, no transformation
 nothing. And off means 'let light through' and on 'black'

 Every digital signal is encoded into an analog signal.  I think it
 would take some serious EMI to sufficiently change the characteristics
 of an analog signal so as to create an error in the overlying digital
 signal if that signal is traveling along a wire.  I can imagine it
 happens but I would think it's rare.  Even if that signal were
 altered, I would think it just about impossible that anything but an
 error could be produced.

 Whether an LED is on or off is determined by whether or not it is
 forward biased.  Biasing is established by analog voltages and/or
 currents, and those can be altered by EMI.  Again, I would think it's
 very rare that EMI could affect an LED's forward biasing and change
 its state from on to off or off to on.

 However, what color an LED emits is determined by the energy gap of
 the semiconductor which is very much an analog process.  How could it
 be anything else?  How do you tell a photon to emit a certain color by
 feeding it 1's and 0's?  There has to be at least one D/A conversion
 somewhere between the digital signal and the emittance of the LED, and
 that is the most likely point for EMI to affect the final output.

 If you have an led display it is pretty much the same. All the levels you see
 are achieved with fast switching. There are no analog levels.

 Stroller is probably correct with overscan/underscan.

 But that has nothing to do with digital/analog conversion.


 Digital is just a figment of our imagination after
 all.

 emm, no, seriously not.

 It is though.  It only exists in the conceptual world, not the
 physical world.  If you want to do anything with your digital signal
 besides change it, store it, or transfer it, there must be a D/A
 conversion.

You're thinking of PCM. (And that's what I was thinking of, earlier,
too). I assume Stroller and Volker are talking about PWM, where a
perceived analog value is achieved by rapidly turning a signal from
full-on to full-off.

(Yes, there's no such thing as pure-digital in the physical world. The
confusion here appears to be in PWM vs PCM.)
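
A toy sketch (my own, not from any driver or panel spec) of what PWM
means here: the output is only ever full-on ('#') or full-off ('.'),
and the perceived level comes entirely from the duty cycle.

/* pwm.c - illustrative only.  Four cycles of an 8-tick PWM period
 * with a 3-tick on-time; the drive signal is strictly binary, but
 * its average (what the eye integrates) is an in-between level. */
#include <stdio.h>

int main(void)
{
    const int period = 8;            /* ticks per PWM cycle           */
    const int duty   = 3;            /* ticks spent full-on per cycle */
    int on = 0;

    for (int t = 0; t < 4 * period; t++) {
        int out = (t % period) < duty;   /* 1 = full-on, 0 = full-off */
        on += out;
        putchar(out ? '#' : '.');
    }
    printf("\nperceived level: %d/%d ticks on = %.1f%%\n",
           on, 4 * period, 100.0 * on / (4 * period));
    return 0;
}

Switch fast enough and the eye sees a steady 37.5% grey, even though no
intermediate voltage is ever driven onto the pixel.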
-- 
:wq



Re: [gentoo-user] Re: Problem with xf86-video-ati nvidia-drivers

2011-07-18 Thread Volker Armin Hemmann
On Sunday 17 July 2011 09:54:33 Grant wrote:
  I gave it a try but there was no change.  I tried plugging the TV and
  computer into a power strip and also into an isolation transformer.
  Any other ideas?
  
  I still think it's a driver problem.  Again: it's *physically*
  impossible to have these problems with the HDMI signal.  At most you
  get digital noise, which means some pixels get stuck or are missing.
  But not what you get; that's just something that can't be explained.
  
  I was thinking about this.  The digital HDMI signal must be converted
  into an analog signal at some point if it's being represented as light
  on a TV screen.  Electrical interference generated by the computer and
  traveling up the HDMI wire should have its chance to affect things
  (i.e. create weird shadows) at that point, right?
  
  Not with DFPs.  Those work digital even internally.  I assume of course
  that his HDMI TV *is* a DFP.
 
 But at some point the 1s and 0s must be converted to some sort of an
 analog signal if only right behind the diode.  A diode must be
 presented with a signal in some sort of analog form in order to
 illuminate, right?

no.

If your tv is a standard flat panel, the sub pixels only go from on to off and 
back. Nothing else. There is no analog signal, no transformation nothing. And 
off means 'let light through' and on 'black'

If you have an led display it is pretty much the same. All the levels you see 
are achieved with fast switching. There are no analog levels.

Stroller is probably correct with overscan/underscan.

But that has nothing to do with digital/analog conversion.


 Digital is just a figment of our imagination after
 all.

emm, no, seriously not.

-- 
#163933



Re: [gentoo-user] Re: Problem with xf86-video-ati nvidia-drivers

2011-07-17 Thread Grant
 I gave it a try but there was no change.  I tried plugging the TV and
 computer into a power strip and also into an isolation transformer.
 Any other ideas?

 I still think it's a driver problem.  Again: it's *physically* impossible to
 have these problems with the HDMI signal.  At most you get digital noise,
 which means some pixels get stuck or are missing.  But not what you get;
 that's just something that can't be explained.

I was thinking about this.  The digital HDMI signal must be converted
into an analog signal at some point if it's being represented as light
on a TV screen.  Electrical interference generated by the computer and
traveling up the HDMI wire should have its chance to affect things
(i.e. create weird shadows) at that point, right?

- Grant



Re: [gentoo-user] Re: Problem with xf86-video-ati nvidia-drivers

2011-07-17 Thread Grant
 I gave it a try but there was no change.  I tried plugging the TV and
 computer into a power strip and also into an isolation transformer.
 Any other ideas?

 I still think it's a driver problem.  Again: it's *physically*
 impossible to have these problems with the HDMI signal.  At most you
 get digital noise, which means some pixels get stuck or are missing.
 But not what you get; that's just something that can't be explained.

 I was thinking about this.  The digital HDMI signal must be converted
 into an analog signal at some point if it's being represented as light
 on a TV screen.  Electrical interference generated by the computer and
 traveling up the HDMI wire should have its chance to affect things
 (i.e. create weird shadows) at that point, right?

 Not with DFPs.  Those work digital even internally.  I assume of course that
 his HDMI TV *is* a DFP.

But at some point the 1s and 0s must be converted to some sort of an
analog signal if only right behind the diode.  A diode must be
presented with a signal in some sort of analog form in order to
illuminate, right?  Digital is just a figment of our imagination after
all.

- Grant



Re: [gentoo-user] Re: Problem with xf86-video-ati nvidia-drivers

2011-07-17 Thread Michael Mol
On Sun, Jul 17, 2011 at 12:54 PM, Grant <emailgr...@gmail.com> wrote:
 I was thinking about this.  The digital HDMI signal must be converted
 into an analog signal at some point if it's being represented as light
 on a TV screen.  Electrical interference generated by the computer and
 traveling up the HDMI wire should have its chance to affect things
 (i.e. create weird shadows) at that point, right?

 Not with DFPs.  Those work digital even internally.  I assume of course that
 his HDMI TV *is* a DFP.

 But at some point the 1s and 0s must be converted to some sort of an
 analog signal if only right behind the diode.  A diode must be
 presented with a signal in some sort of analog form in order to
 illuminate, right?  Digital is just a figment of our imagination after
 all.

Sure, but that couldn't introduce ghosting as shown in the picture.
Ghosting represents the image being offset from its intended raster
coordinates. By the time a diode is turned on or off, the decision of
which diode a signal goes to has already been made.

-- 
:wq



Re: [gentoo-user] Re: Problem with xf86-video-ati nvidia-drivers

2011-07-17 Thread Grant
 I was thinking about this.  The digital HDMI signal must be converted
 into an analog signal at some point if it's being represented as light
 on a TV screen.  Electrical interference generated by the computer and
 traveling up the HDMI wire should have its chance to affect things
 (i.e. create weird shadows) at that point, right?

 Not with DFPs.  Those work digital even internally.  I assume of course that
 his HDMI TV *is* a DFP.

 But at some point the 1s and 0s must be converted to some sort of an
 analog signal if only right behind the diode.  A diode must be
 presented with a signal in some sort of analog form in order to
 illuminate, right?  Digital is just a figment of our imagination after
 all.

 Sure, but that couldn't introduce ghosting as shown in the picture.
 Ghosting represents the image being offset from its intended raster
 coordinates. By the time a diode is turned on or off, the decision of
 which diode a signal goes to has already been made.

True, but *is* that D/A conversion made right behind each diode?

- Grant



Re: [gentoo-user] Re: Problem with xf86-video-ati nvidia-drivers

2011-07-16 Thread Mick
On Wednesday 13 Jul 2011 15:42:02 Nikos Chantziaras wrote:
 On 07/13/2011 03:25 PM, Mick wrote:
  [...]
  Is the [r600] gallium stable now?  I found it was locking up a kde
  desktop with effects enabled and set it back to classic.
 
 It's been made the default driver in Mesa now.  So I guess that means
 it's considered stable.  But for me, both classic and gallium can hang
 the machine.  At least with Gallium I know how to fix it though:
 
Section "Device"
  Identifier "HD4870"
  Driver "radeon"
  Option "EnablePageFlip" "FALSE"
EndSection
 
 in an xorg.conf.d file.
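 (For example /etc/X11/xorg.conf.d/20-radeon.conf; the file name is
 arbitrary as long as it ends in .conf.)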

Unfortunately it doesn't help.  If I enable compositing in KDE the screen 
locks up (with horizontal artifacts/tearing) and I get this in dmesg:

[drm:r100_cs_track_check] *ERROR* [drm] No buffer for z buffer !
[drm:radeon_cs_ioctl] *ERROR* Invalid command stream !
[drm:r100_cs_track_check] *ERROR* [drm] No buffer for z buffer !
[drm:radeon_cs_ioctl] *ERROR* Invalid command stream !
[drm:r100_cs_track_check] *ERROR* [drm] No buffer for z buffer !
[drm:radeon_cs_ioctl] *ERROR* Invalid command stream !
[drm:r100_cs_track_check] *ERROR* [drm] No buffer for z buffer !
[drm:radeon_cs_ioctl] *ERROR* Invalid command stream !

Googling this turns up a bug in Mesa.
-- 
Regards,
Mick




Re: [gentoo-user] Re: Problem with xf86-video-ati nvidia-drivers

2011-07-14 Thread Grant
 I gave it a try but there was no change.  I tried plugging the TV and
 computer into a power strip and also into an isolation transformer.
 Any other ideas?

 I still think it's a driver problem.  Again: it's *physically* impossible to
 have these problems with the HDMI signal.  At most you get digital noise,
 which means some pixels get stuck or are missing.  But not what you get;
 that's just something that can't be explained.

 I think it's worth reporting this as a bug upstream
 (http://bugs.freedesktop.org).

I've been working with a couple of devs:

https://bugs.freedesktop.org/show_bug.cgi?id=39120

- Grant



Re: [gentoo-user] Re: Problem with xf86-video-ati nvidia-drivers

2011-07-13 Thread Mick
On Wednesday 13 Jul 2011 08:13:27 Nikos Chantziaras wrote:
 On 07/10/2011 02:21 AM, Grant wrote:
  When I was using an Nvidia video card, I noticed a strange sort of
  fuzzy edge effect if I used nvidia-drivers.  xf86-video-nouveau didn't
  have the same problem.  Now I've switched to an ATI video card and
  unfortunately I have the same problem with xf86-video-ati.  I tried to
  enable the new modesetting radeon driver in the kernel to see if that
  would help but it doesn't work with my HD4250 card yet.
 
 It should work.  But you need firmware that is not included in the
 kernel.  You need to install the x11-drivers/radeon-ucode package, and
 then build a kernel that includes the appropriate firmware.  Which
 firmware file (one of the *.bin files in /lib/firmware/radeon) is needed
 should be printed during boot; at the moment the kernel hangs, it should
 print which firmware file it was trying to load.
 
 On my HD4870, I configured it like so:
 
 In Device Drivers -> Generic Driver Options, I've set:
 
 (radeon/R700_rlc.bin) External firmware blobs to build into the kernel binary
 (/lib/firmware) Firmware blobs root directory
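 
 In .config terms those two entries should come out roughly like this
 (the firmware file is the one my card wants; yours may differ):
 
 CONFIG_EXTRA_FIRMWARE="radeon/R700_rlc.bin"
 CONFIG_EXTRA_FIRMWARE_DIR="/lib/firmware"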
 
 Then rebuild and install the kernel.  Before you reboot, make sure you
 have built media-libs/mesa with the gallium USE flag set, and do an
 "eselect mesa set r600 gallium".  Make sure you haven't disabled KMS in
 the kernel command line or module options (radeon.modeset=0 disables
 KMS).  After you reboot, you should have KMS + Gallium3D working.

I think the OP's card needs R600_rlc.bin, as I've suggested in a previous
message.

Is the gallium stable now?  I found it was locking up a kde desktop with 
effects enabled and set it back to classic.
-- 
Regards,
Mick




Re: [gentoo-user] Re: Problem with xf86-video-ati nvidia-drivers

2011-07-13 Thread Mick
On Wednesday 13 Jul 2011 15:42:02 Nikos Chantziaras wrote:
 On 07/13/2011 03:25 PM, Mick wrote:
  [...]
  Is the [r600] gallium stable now?  I found it was locking up a kde
  desktop with effects enabled and set it back to classic.
 
 It's been made the default driver in Mesa now.  So I guess that means
 it's considered stable.  But for me, both classic and gallium can hang
 the machine.  At least with Gallium I know how to fix it though:
 
Section "Device"
  Identifier "HD4870"
  Driver "radeon"
  Option "EnablePageFlip" "FALSE"
EndSection
 
 in an xorg.conf.d file.

Great!  Thanks for the hint, I'll try it out.
-- 
Regards,
Mick




Re: [gentoo-user] Re: Problem with xf86-video-ati nvidia-drivers

2011-07-13 Thread Grant
 When I was using an Nvidia video card, I noticed a strange sort of
 fuzzy edge effect if I used nvidia-drivers.  xf86-video-nouveau didn't
 have the same problem.  Now I've switched to an ATI video card and
 unfortunately I have the same problem with xf86-video-ati.  I tried to
 enable the new modesetting radeon driver in the kernel to see if that
 would help but it doesn't work with my HD4250 card yet.

 It should work.  But you need firmware that is not included in the kernel.
  You need to install the x11-drivers/radeon-ucode package, and then build a
 kernel that includes the appropriate firmware.  Which firmware file (one of
 the *.bin files in /lib/firmware/radeon) is needed should be printed during
 boot; at the moment the kernel hangs, it should print which firmware file it
 was trying to load.

 On my HD4870, I configured it like so:
 
 In Device Drivers -> Generic Driver Options, I've set:
 
 (radeon/R700_rlc.bin) External firmware blobs to build into the kernel binary
 (/lib/firmware) Firmware blobs root directory

You're right.  That fixed the stall during kernel load and now KMS works fine.

 Then rebuild and install the kernel.  Before you reboot, make sure you have
 built media-libs/mesa with the gallium USE flag set, and do an "eselect
 mesa set r600 gallium".  Make sure you haven't disabled KMS in the kernel
 command line or module options (radeon.modeset=0 disables KMS).  After you
 reboot, you should have KMS + Gallium3D working.

I've eselected to gallium but is there any benefit if I don't use 3D at all?

- Grant



Re: [gentoo-user] Re: Problem with xf86-video-ati nvidia-drivers

2011-07-13 Thread Grant
 When I was using an Nvidia video card, I noticed a strange sort of
 fuzzy edge effect if I used nvidia-drivers.  xf86-video-nouveau didn't
 have the same problem.  Now I've switched to an ATI video card and
 unfortunately I have the same problem with xf86-video-ati.  I tried to
 enable the new modesetting radeon driver in the kernel to see if that
 would help but it doesn't work with my HD4250 card yet.  Does anyone
 know how to fix this?  Here's a photo of the effect around the mouse
 cursor:

 http://imageshack.us/photo/my-images/804/cursor.jpg

 - Grant


 Hi Grant,

 just a shot in the dark:
 The image looks to me as though this is an analog rather than
 a digital problem.
 Maybe both proprietary drivers switch to the highest possible
 data transfer rate and this triggers the problem.
 To check whether this may be the problem:
 instruct the driver to use either low resolutions or low refresh
 rates.  Check both.
 If the problem changes significantly: change the cables.
 Maybe a plug is simply not inserted correctly.
 Additionally, you can move the cables around to see whether
 this changes the shadows around the cursor in any way...

 Good luck! :)
 Best regards
 mcc

 Thanks for that.  I'm still working on it but adding radeon.audio=0 to
 grub cleaned it up about 75%.
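 
 For anyone following along: that just means appending it to the kernel
 line in grub.conf, something like this (paths made up, only the
 radeon.audio=0 part matters):
 
 kernel /boot/vmlinuz root=/dev/sda1 radeon.audio=0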

 - Grant

 It turns out the radeon.audio=0 setting disables HDMI data packets and
 puts the HDMI port in DVI mode.  mcc, I'm starting to think you had it
 pretty much right.  I've tried two different cables with the same result
 but I'm thinking this may be some sort of electrical interference
 issue.

 HDMI is digital, so there can be no interference.  This looks more like a
 driver bug.

I tried the latest git-sources-3.0 kernel with the same results.

 Btw, why are you connecting to your monitor with HDMI?  For computer
 monitors, you use the DVI port, not HDMI.  HDMI is for TVs.  Unless of
 course your monitor lacks a digital DVI port (DVI-I or DVI-D).  If it only
 has a DVI-A port, only then is HDMI the better solution.

The monitor is actually a 47" LG HDTV.  This is an HTPC.

- Grant