If you looked at satellite HD broadcasts, I suspect you would find even
worse bitrates on several of the HD stations.

It all comes down to bandwidth - the cable and satellite operators have
limited bandwidth and are trying to cram in as many stations as possible,
and in many cases this means heavily compressing some of them.  Usually they
try to make sure that the HD channels where compression is most likely to be
noticed (Discovery HD, PPV movies and live sports) get the best signal,
while other pseudo-HD channels like History or HGTV get re-compressed like
crazy.

---------------------------
Brian Weeden
Technical Consultant
Secure World Foundation <http://www.secureworldfoundation.org>
+1 (514) 466-2756 Canada
+1 (202) 683-8534 US


On Wed, Apr 8, 2009 at 1:35 AM, James Maki <[email protected]> wrote:

> I discovered something this week and am trying to understand its
> ramifications. I noticed lots of pixelation and motion blur in the last two
> weeks of Heroes. NBC broadcasts at 1080i for HDTV. I checked the statistics
> for the show I recorded via HDHomeRun tuners using Comcast cable, and NBC
> is averaging about 4.8 GB per hour for a 1080i show. I thought that was a
> bit low, but was even more surprised when I checked out shows on the other
> broadcast networks.
>
> Network   Format            GB per hour
> ABC       720p/60fps        6.3
> NBC       1080i/29.97fps    4.8
> CBS       1080i/29.97fps    5.6
> PBS       720p/60fps        5.4
> CW        1080i/29.97fps    7.9
> FOX       720p/60fps        7.3
>
> I find it strange that NBC has the lowest total file size but is
> broadcasting at 1080i, so I am assuming (and I know the drawback of that!)
> that it is compressed more than the other channels, and am again assuming
> that is why I am seeing the picture degradation. Calling Comcast is a joke,
> so I wanted to do the math and calculate the bits per second for each case,
> but am not exactly sure I am doing it correctly. It would seem that 4.8
> GB/hr would calculate as:
>
> 4.8 GB/hr * 1 hr/60 min * 1 min/60 sec * 1024 MB/GB * 8 Mb/MB = 10.9 Mbps.
>
> One online source indicated that for quality 1080i you should have at least
> 15 Mbps.
>
> For the FOX network, the calculation would give 16.6 Mbps, far better than
> the 12 Mbps my online source gave for quality 720p broadcasts.
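>
> The same conversion in a few lines of Python, as a sanity check (just a
> sketch -- it follows the same 1024 MB/GB and 8 Mb/MB convention as the
> math above, and the dictionary is simply the table above):
>
>     shows_gb_per_hr = {"ABC": 6.3, "NBC": 4.8, "CBS": 5.6,
>                        "PBS": 5.4, "CW": 7.9, "FOX": 7.3}
>     for network, gb in shows_gb_per_hr.items():
>         # GB/hr * 1024 MB/GB * 8 Mb/MB / 3600 s/hr = Mbps
>         print(network, round(gb * 1024 * 8 / 3600, 1), "Mbps")
>
> That gives roughly 14.3, 10.9, 12.7, 12.3, 18.0 and 16.6 Mbps
> respectively, with NBC clearly at the bottom.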
>
> I can't understand why the 720p broadcast is actually providing better
> throughput than the 1080i. It seems backwards (which is why I am wondering
> if my math is correct). I am not sure how to factor in the fps figures, if
> at all.
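>
> If it does matter, I suppose the crudest way to factor it in is to compare
> raw pixel rates (width x height x frame rate, ignoring interlacing and
> chroma details), e.g.:
>
>     formats = {"720p/60fps": (1280, 720, 60.0),
>                "1080i/29.97fps": (1920, 1080, 29.97)}
>     for name, (w, h, fps) in formats.items():
>         print(name, round(w * h * fps / 1e6, 1), "Mpixels/sec")
>
> That puts 720p/60 at about 55.3 Mpixels/sec and 1080i/29.97 at about 62.1
> Mpixels/sec, so the 1080i feed actually carries more picture data per
> second, and at a lower bitrate it must be compressed that much harder.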
>
> If you can add some insight, it would be appreciated.
>
> Thanks,
>
> Jim Maki
> [email protected]
>
>
