Nathan,

While you hit the point in your second paragraph, namely that Apple REQUIRES 
25Mbps (as do other major streaming services, including Netflix today), your 
first paragraph misses it. It doesn’t matter what is POSSIBLE (unless you have 
the ability to persuade all streaming services to implement those technologies 
and ensure they work for the lion’s share of installed end-user equipment and 
4K HDR streams, in which case, well done, and I would agree that a lower 
bitrate is sufficient). The ONLY factor that matters in terms of the bandwidth 
required to be considered a fully capable ISP service is what the market 
demands for mainstream Internet services. That is 25Mbps.

As the article you linked to points out, those lower bitrates are NOT for 4K 
HDR (10-bit color depth per pixel). For 4K HDR, even in the authors’ chosen 
examples, and possibly only at 8-bit color (it isn’t clear), the article claims 
to get down only to a low of about 17Mbps for the highest quality. I’ve seen 
other reports that say anything below 20Mbps will occasionally fail on 
particularly complex scenes that don’t compress well. Add a little overhead or 
assume some additional traffic (an important consideration, given the raison 
d’être of this group – reduce latency under load from multiple streams), and 
you’re back to 25Mbps of needed bandwidth to support multiple concurrent 
activities.
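
As a rough back-of-the-envelope check (the overhead and concurrent-traffic 
numbers here are my own illustrative assumptions, not figures from the 
article), here is a small Python sketch of that arithmetic:

    # Rough headroom arithmetic for a single 4K HDR stream.
    # All numbers below the first are illustrative assumptions, not measurements.
    peak_stream_mbps = 17.0      # worst-case high-quality 4K figure cited from the article
    transport_overhead = 0.10    # assume ~10% for TCP/IP, TLS, ABR switching, etc.
    concurrent_mbps = 5.0        # assume a video call or a second HD stream in the house

    required = peak_stream_mbps * (1 + transport_overhead) + concurrent_mbps
    print(f"Estimated need: {required:.1f} Mbps")   # ~23.7 Mbps, i.e. right back near 25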

While I concede there is no widely accepted standard for evaluating video 
quality (at least not one of which I’m aware), I dislike that the Y axis 
(Quality) on their graphs has no metric, especially with no explanation of how 
they define quality – is it based on lost data, the percentage of pixels 
showing compression artifacts, pixel color drift, or something else they 
created for the purpose of making their case? I would say that NONE of the 
photos shown constitutes a good or excellent quality level; all show 
significant compression artifacts at the high-contrast boundaries. These are 
distinct from the natural focal problems of analog systems, which are not 
contrast-dependent. Further, these all appear to be relatively static scenes 
with just a few small moving objects – the kinds of frames and scenes that 
compress extremely well. Again, this is why we must look to the market to 
determine what it needs, not individual proposals.
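
(As an aside, the metric Netflix typically publishes for this is VMAF – the 
same one Nathan’s link cites. For anyone who wants to score a clip themselves, 
here is a minimal Python sketch that shells out to FFmpeg’s libvmaf filter. 
The filenames are placeholders, you need an FFmpeg build with libvmaf enabled, 
and the expected input order has varied between libvmaf versions, so check 
your build’s documentation.)

    # Minimal sketch: score a distorted encode against its reference with
    # FFmpeg's libvmaf filter. Placeholder filenames; requires an FFmpeg
    # build compiled with libvmaf support.
    import subprocess

    cmd = [
        "ffmpeg",
        "-i", "distorted_4k.mp4",   # encode under test (input order may differ by libvmaf version)
        "-i", "reference_4k.mp4",   # pristine source
        "-lavfi", "libvmaf=log_path=vmaf.json:log_fmt=json",
        "-f", "null", "-",
    ]
    subprocess.run(cmd, check=True)  # per-frame and pooled scores are written to vmaf.json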

The article also acknowledges that the graph points represent averages, 
meaning some frames are better and some are worse. This is bad because with 
any lossy compression system there is a (subjective) “good enough” level: 
values above it don’t add much, but frames that fall below it stand out as 
bad. You can’t just watch the average – you’re forced to also watch the bad 
frames. In real-world usage, these will be the frames during high-speed image 
changes (explosions in action movies or a fast-panning scene), often the times 
when preserving fidelity is most important (e.g., you lose track of the 
football during the fast pan downfield, or you really want to see the detail 
in the X-wing fighters as the dogfight leads to explosions around them).
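
To make the average-vs-worst-frame point concrete, here is a toy Python sketch 
with made-up per-frame quality scores on a 0-100 scale (the scores and the 
“good enough” threshold of 70 are invented for illustration only):

    # Toy example: a healthy average can hide a handful of badly degraded frames.
    scores = [92, 94, 91, 95, 93, 90, 94, 55, 48, 93, 92, 60, 94, 95, 91]

    mean = sum(scores) / len(scores)
    worst = min(scores)
    below_threshold = sum(1 for s in scores if s < 70)   # 70 = assumed "good enough" level

    print(f"average = {mean:.1f}, worst frame = {worst}, "
          f"frames below threshold = {below_threshold}")
    # average is ~85, yet three frames fall well below "good enough" –
    # and those are exactly the frames viewers notice.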

Further, that article is really targeting mobile usage over cellular 
bandwidth, where many of these viewing issues are fundamentally different from 
those on the 65” living room TV. The mobile display may offer 120Hz, but 
showing a movie or show at 30Hz (except for some sports) is still the 
standard.

Now, to be fair, I have no doubt that superior compression technologies exist 
today, and that even better ones will exist in the future, that could lower 
the bitrate needed at any given resolution and quality level. The one 
described in the article could be an important step in that direction. No 
doubt Netflix already has multiple economic incentives to reduce required 
bandwidth: its own bandwidth costs, which are a substantial percentage of its 
total operating costs; access to customers who can’t get 25Mbps connections; 
competition from other streaming services if it can claim that its streams are 
less affected by what others in the house are doing, or are higher quality at 
any given bandwidth; etc. As noted above, however, that is all moot unless all 
of the major streamers adopt comparable bandwidth-reduction technologies AND 
all major existing home equipment can support them today (i.e., without 
requiring people to replace their TVs or STBs). Absent that, it’s just a 
technical novelty that may or may not take hold, like Betamax videotapes or 
HD-DVD.

What we see today, by contrast, is that the major streaming services REQUIRE 
users to have 25Mbps connections in order to offer their 4K HDR streams. Yes, 
users can lie and may find they can watch most of the 4K content they wish 
with only 20Mbps or, in some cases, 15Mbps connections, but that’s clearly not 
a reason for an ISP to say, “We don’t need to offer 25Mbps for our customers 
to be able to access any major streaming service.”

Cheers,
Colin

From: Nathan Owens <nat...@nathan.io>
Sent: Monday, May 6, 2024 9:44 AM
To: Alexandre Petrescu <alexandre.petre...@gmail.com>
Cc: Colin_Higbie <chigb...@higbie.name>; Frantisek Borsik 
<frantisek.bor...@gmail.com>; starlink@lists.bufferbloat.net
Subject: Re: [Starlink] It’s the Latency, FCC

You really don’t need 25Mbps for decent 4K quality - depends on the content. 
Netflix has some encodes that go down to 1.8Mbps with a very high VMAF:
https://netflixtechblog.com/optimized-shot-based-encodes-for-4k-now-streaming-47b516b10bbb

Apple TV has the highest bitrate encodes of any mainstream streaming service, 
and those do top out at ~25Mbps. Could they be more efficient? Probably…
