On 2/09/2010 12:20 AM, Oz Linden (Scott Lawrence) wrote:
>    On 2010-09-01 9:46, Francesco Rabbi wrote:
>> No, this is a client-side problem in file handling, not an HTTP
>> problem... You can parallelize billions of downloads,
> Whether or not you _can_, you _shouldn't_.  The HTTP spec is quite clear
> on this point.
RFC 2616 makes for great reading, and the *majority* of it is superbly 
thought through. I spent years with a copy close to hand at all times.
> We'd get much better performance than we're getting now if we fixed the
> servers to support persistent connections; there's a lot of overhead in
> setting up a new connection - extra round trips plus TCP slow-start.
Concur. However, persistent connections (and possibly pipelining) pretty 
much mean you'll need to maintain a priority queue of textures to fetch. 
Pipelined requests are answered in order, so a low-priority texture 
queued first blocks the high-priority one behind it; without 
prioritization the world will *feel* slower to the end-user even if 
total fetch-and-render time actually improves. You'd want something like 
the sketch below. I've been down this road before.
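
A minimal sketch of the queue side of that, assuming a priority scalar 
derived from something like projected screen area over camera distance. 
TextureRequest and the values here are illustrative, not actual viewer 
code:

#include <iostream>
#include <queue>
#include <string>
#include <vector>

struct TextureRequest {
    std::string uuid;   // texture asset id (illustrative)
    float priority;     // e.g. projected screen area / camera distance
};

// Order the queue so the highest-priority request is on top.
struct ByPriority {
    bool operator()(const TextureRequest& a,
                    const TextureRequest& b) const {
        return a.priority < b.priority;
    }
};

int main() {
    std::priority_queue<TextureRequest, std::vector<TextureRequest>,
                        ByPriority> pending;

    // Requests arrive in arbitrary order as objects come into view.
    pending.push({"aaaa (far wall)",    0.2f});
    pending.push({"bbbb (avatar skin)", 9.5f});
    pending.push({"cccc (nearby prim)", 4.1f});

    // Drain in priority order: each pop is the next request to put on
    // the persistent connection.
    while (!pending.empty()) {
        std::cout << "fetch " << pending.top().uuid
                  << " (priority " << pending.top().priority << ")\n";
        pending.pop();
    }
}

One caveat: std::priority_queue has no cheap way to re-prioritize 
entries already queued, and texture priorities change whenever the 
camera moves, so a real implementation needs a mutable structure or 
periodic rebuilds.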

-- 
Tateru Nino
http://dwellonit.taterunino.net/
