Andrew Lentvorski wrote:
Michael J McCafferty wrote:

    I gotta go do some work, but the conversation eventually did cover TV
quality and how 1080p over Fiber is better than 1080p over Copper and
how 720p is also better over Fiber, especially if you don't have a HD
TV.

Surprisingly, this appears to be true. There is apparently some pretty significant angst going on right now because the cable companies are at their bandwidth limits on copper with the way things are configured. Consequently, they are compressing HD signals to the point that they are starting to suck.

IMEO, all TV pictures have sucked for many years. In fact, I've sometimes found analog to be better than digital. Now you may ask why I say that.

I used to work on the very first industry MPEG-2 video compressors ever made. Our company was five years ahead of the competition (the competition being Scientific Atlanta, which, from what I know, subsequently stole our designs, hardware, and software). My job was testing, troubleshooting, and repairing the $60K video compressor hardware that went into the racks used to uplink digital video to the satellites. I also built the production test stations, wrote the test documents, and wrote the test software for these units. We had a maximum compression ratio of 270:1. Part of my job when testing was to view the video (sometimes for hours on end - how many hundreds of times have I seen Jurassic Park!?) and make sure there were no "macroblocks". This is commonly known as pixelation. Obvious pixelation sets in right around a 2 MB/sec streaming video rate.
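For a rough sense of what a ratio like 270:1 means, here's a minimal back-of-the-envelope sketch. The input numbers (ITU-R BT.601-style SD video: 720x480 active picture, 4:2:2 8-bit sampling, 29.97 fps) are my illustrative assumptions, not figures from the post:

```python
# Back-of-the-envelope MPEG-2 compression-ratio arithmetic.
# Assumption: uncompressed SD video in 4:2:2, 8-bit (16 bits/pixel
# on average), 720x480 active picture at 29.97 fps.

def raw_bitrate_bps(width=720, height=480, bits_per_pixel=16, fps=29.97):
    """Bitrate of the uncompressed active picture, in bits per second."""
    return width * height * bits_per_pixel * fps

def compressed_bitrate_bps(raw_bps, ratio):
    """Output bitrate after compressing at the given ratio (e.g. 270 for 270:1)."""
    return raw_bps / ratio

raw = raw_bitrate_bps()  # roughly 166 Mbit/s of active picture
for ratio in (100, 270):
    print(f"{ratio}:1 -> {compressed_bitrate_bps(raw, ratio) / 1e6:.2f} Mbit/s")
```

Under these assumptions, 270:1 squeezes the active picture to well under 1 Mbit/s, while backing off to 100:1 leaves roughly 1.7 Mbit/s, which helps explain why the higher ratios sit so close to visible macroblocking.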

After two-plus years of this, I have a keen eye for pixelation; I can see it when no one else can. Every digital picture I've seen since the early days of my Dish Network service, whether from Time Warner, Cox, Dish, or DirecTV, suffers from the same low quality. This includes some HD service (Cox's, for example). There really is no difference from one provider to the next, and there can't be without some major changes.

This is because all the video feeds come from the same uplinks and use the same encoders. In an attempt to maximize bandwidth and cram more channels into the same space, the providers began raising the compression ratio. It didn't take long before they reached the fine line between obvious pixelation and barely discernible pixelation. Most people can't see it, but to me it's glaringly obvious.

The reason I have stayed with Dish for so many years is not solely picture quality, but that it's been the best bang for my buck. It rarely ever fails (only once has the signal failed, and that was because of high sunspot activity years back). Service - when I've needed it - has been great, including receiver replacement when a unit fails. I am legally allowed to copy recorded programs from my DVR to other media (my DVR manuals even tell me how to do it). So AT&T, or anyone else, would have to have their own uplinks, satellites, and downlinks, and run their compressors at somewhere around 100:1 compression, to get me to change.

Never gonna happen.

PGA
--
Paul G. Allen, BSIT/SE
Owner, Sr. Engineer
Random Logic Consulting Services
www.randomlogic.com


--
[email protected]
http://www.kernel-panic.org/cgi-bin/mailman/listinfo/kplug-list
