On Fri, 19 Feb 2010 22:31:30 +0800, Mikko Rantalainen <[email protected]> wrote:

> The actual average bitrate is simple: it's the size of the binary
> file in bits divided by the length of the movie in seconds. I don't
> care if this is Mbps or kBps as long as we have some clear definition.
> This is needed for the streaming case: a UA shouldn't automatically
> select a movie for streaming whose average bitrate is higher than the
> current data connection can transfer.
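A minimal sketch of that calculation, assuming the UA knows the encoded file size and the movie's duration (the function names here are illustrative, not from any spec):

```python
def average_bitrate_kbps(file_size_bytes, duration_seconds):
    """Average bitrate in kilobits per second: total bits / duration."""
    return (file_size_bytes * 8) / (duration_seconds * 1000)

def suitable_for_streaming(file_size_bytes, duration_seconds, link_kbps):
    """True if the movie's average bitrate fits within the link capacity."""
    return average_bitrate_kbps(file_size_bytes, duration_seconds) <= link_kbps

# A 700 MB, 90-minute movie averages roughly 1087 kbps, so it would
# stream over a 2 Mbps link but not over a 512 kbps one.
movie_size = 700 * 1024 * 1024   # bytes
movie_length = 90 * 60           # seconds
```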

Bandwidth isn't a property of the client's connection alone; it's a property of the path between the client and the server. For example, I may have high bandwidth to computers within my network/country/whatever but very poor bandwidth to computers halfway across the world. The browser would have to perform some kind of bandwidth test against the actual server serving the video to make the decision.
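One plausible shape for such a test, sketched here as an assumption rather than any browser's actual behavior: time the download of a small chunk from the media server itself (e.g. via an HTTP Range request) and derive an end-to-end throughput estimate. `fetch_chunk` is a hypothetical callback standing in for the real network fetch.

```python
import time

def estimate_kbps(fetch_chunk, chunk_bytes=128 * 1024):
    """Estimate end-to-end throughput to a server by timing one download.

    fetch_chunk(n) is assumed to download n bytes from the actual media
    server and return once the transfer completes.
    """
    start = time.monotonic()
    fetch_chunk(chunk_bytes)
    elapsed = time.monotonic() - start
    # bits transferred divided by elapsed time, in kilobits per second
    return (chunk_bytes * 8) / (elapsed * 1000)
```

A single probe like this is noisy (TCP slow start, caches, transient congestion), which is part of why doing the switching at the protocol level is attractive: the sender can adapt continuously instead of trusting one up-front measurement.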

In any event, for the record, I don't think it's realistic to change the resource selection algorithm to support dynamic switching between sources; this kind of thing belongs at the protocol level or in a resource file.

--
Philip Jägenstedt
Core Developer
Opera Software
