On Wed, Feb 23, 2005 at 04:35:06PM -0500, Sean Cier wrote:
> Brad Templeton wrote:
> >Well, here's an odd failure story. I picked up a cable -- perhaps it
> >was too cheap, but the sense I get with HDMI and cheap cables is that as
> >a digital standard it either works or it doesn't.
>
> Which is a common myth or misunderstanding. Each bit being either 0 or 1
> doesn't mean that a billion bits either get there or don't -- and moreover,
> it doesn't mean the cabling is not a factor. It's still an analog signal
I agree, but the problems are going to be different. Randomly lost
bits, if they are occurring, are going to produce a very different kind
of degradation: sparkles of the wrong colour, and so on.

> >And the image sucked! It wasn't as overscanned, which is nice, but it
> >was noticeably worse in computer display mode (which shows resolution
> >the best).
>
> What sucked about it -- sparklies (often green), horizontal lines (i.e. a
> ground loop), colour, fuzziness, or something else? Colour is a common

Fuzziness! That surprised me the most, but it led me to think the cable
may not be to blame. Another person said that Mitsubishis may convert
their HDMI to analog first, which seems odd but is the leading theory.
And yes, I went into the settings and played with sharpness, turned off
edge detection, adjusted contrast, etc. The TV actually gives me more
parameters to play with on the HDMI input than on the VGA input. I also
played with settings in nvidia-settings.

> difference; it usually just means things have to be calibrated differently,
> and the digital version is generally more reliable and accurate even if it
> doesn't 'look right' at first. Sparklies (or just lack of signal) are the
> only common artifact I know of that might actually be due to the cable
> itself; otherwise it's probably either a setting somewhere (i.e. not
> actually using the same resolution contrary to appearances; or a display
> setting -- e.g., my projector has a whole set of settings that are
> independent based on input) or possibly the display itself actually has
> inferior electronics along the 'digital input' path.

I will keep hunting for other settings. I had not expected a giant
improvement with DVI, but neither had I expected it to be noticeably
inferior.

The TV itself has more analog-style noise than I would like (no TV I
shopped among was perfect), and it shows up on component, digital, and
its own ATSC-tuned inputs (but not on its own on-screen displays). That
was the one thing I was hoping to improve, but no luck.

The TV also has both VGA and, strangely, what it calls DTV -- RGBHV --
which I presume a VGA-to-5-RCA breakout cable will handle. I will try
one of those at some point, since the TV refuses 1080i on the VGA input
but will take it on the RGBHV one. I am finding less and less need for
1080i, though: I am now getting pretty decent deinterlacing in MythTV
at 720p, which is the native resolution anyway.
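For reference, here is roughly how I am driving the set at 720p -- a
sketch rather than my exact config. The ModeLine is the standard
CEA-861 720p60 timing; the Identifier names are made up, and the
nvidia-settings sharpening attribute is from memory, so check
'nvidia-settings -q all' for what your driver version actually exposes:

    # xorg.conf fragment: feed the TV its native 1280x720@60 over DVI.
    Section "Monitor"
        Identifier "TV"
        # CEA-861 720p60 timing: 74.25 MHz pixel clock, 1650x750 total.
        ModeLine "1280x720" 74.25  1280 1390 1430 1650  720 725 730 750 +HSync +VSync
    EndSection

    Section "Screen"
        Identifier "Screen0"
        Device     "Videocard0"   # whatever your existing Device section is called
        Monitor    "TV"
        DefaultDepth 24
        # May be needed so the nvidia driver honours the timing verbatim:
        # Option "ExactModeTimingsDVI" "True"
        SubSection "Display"
            Depth 24
            Modes "1280x720"
        EndSubSection
    EndSection

And while chasing fuzziness it is worth ruling out any sharpening the
driver itself applies:

    # Attribute name assumed -- verify with 'nvidia-settings -q all'.
    nvidia-settings -a ImageSharpening=0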
