you solve this problem by checking how much data they are grabbing... or do
they all start at the top and whoever is fastest? I would think this
requires something more than a get/post command... pattern checking is always
hard to do. But if I was just a content site, this would piss me off.
   
  do no evil
  lemon

Travis Kalanick <[EMAIL PROTECTED]> wrote:
  There is a reason why in the states, Download Accelerator Plus and FlashGet,
and all the others got 100's of millions of installs.

It will speed up your downloads. As broadband deployment increased however,
these tools had less and less impact (and less and less installs). My guess
is that China is in the beginning of this cycle (hence the massive numbers
of installs).

Regarding duplicate content: The duplicate servers of content exist because
quite often with games or popular video, many different sites will offer up
the same content.

Many content providers, especially popular ones, will want to thwart Xunlei
because Xunlei will pull from the popular content provider's servers and use
their bandwidth, even if the user didn't go to the popular content
provider's site (and see the ads placed there). If the popular content
provider doesn't thwart Xunlei, then anybody can create a site with all of
the downloads that the popular content provider has, but with the users
saturating the bandwidth of the popular content provider.

Websites can prevent partial or full gets by Xunlei by identifying and
rejecting requests from the Xunlei client. This can be accomplished today
by having your webserver look at the HTTP User-Agent header, or at the
Referer field. If Xunlei begins to basically "forge" a User-Agent or
Referer so that it is not filtered, then the sites will have to get more
sophisticated and make assessments based on request patterns, IP
addresses, session variables, tokens, or other similar techniques.
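A minimal sketch of the header-based filtering described above. The
blocklist strings and header values here are illustrative assumptions, not
Xunlei's actual User-Agent, and a real deployment would do this in the
webserver config rather than application code:

```python
# Hedged sketch: reject a request when its User-Agent matches a known
# download-accelerator string, or when the Referer is missing (which often
# means the client never loaded the page that carries the ads).
# The agent names below are illustrative guesses, not verified signatures.
BLOCKED_AGENTS = ("xunlei", "thunder", "flashget", "dap")

def should_reject(headers):
    """Return True if the request should get a 403 based on its headers."""
    agent = headers.get("User-Agent", "").lower()
    referer = headers.get("Referer", "")
    if any(token in agent for token in BLOCKED_AGENTS):
        return True
    if referer == "":
        return True
    return False

print(should_reject({"User-Agent": "Thunder 5.0", "Referer": "http://example.com/game.html"}))
print(should_reject({"User-Agent": "Mozilla/5.0", "Referer": "http://example.com/game.html"}))
```

As the email notes, this only works until the client starts forging those
headers, at which point you fall back to rate limits, per-IP accounting, or
tokenized download URLs.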


-----Original Message-----
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Matthew Kaufman
Sent: Friday, December 29, 2006 7:31 AM
To: theory and practice of decentralized computer networks
Subject: Re: [p2p-hackers] Xunlei

David Barrett wrote:
> Wow, very interesting. I can see the value of downloading in parallel
> from multiple mirrors...
I can't. Assuming they're all servers with public IP addresses and 
reasonable outbound bandwidth, having each client load-balanced to a 
single specific server and downloading over a single TCP stream is more 
efficient for the network -- and provides better download performance 
for the user -- than having multiple TCP streams fight over the same 
congestion-limited download pipe.

The only reason to download from multiple sources simultaneously is the 
case where upstream capacity of serving nodes is a small fraction of 
downstream capacity (see: P2P filesharing network where the files are 
only present on ADSL or cable modem connected user machines), and thus 
there'd be no way to fill the download pipe otherwise.
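Matthew's point can be made concrete with some back-of-the-envelope
arithmetic. The bandwidth figures below are illustrative assumptions
(roughly 2006-era ADSL numbers), not measurements:

```python
import math

# Hedged sketch: multi-source downloading only pays off when each source's
# usable upstream is a small fraction of the client's downstream capacity.
download_mbps = 8.0        # assumed client downstream capacity
server_up_mbps = 100.0     # assumed well-connected mirror's upstream
peer_up_mbps = 0.5         # assumed ADSL/cable peer's usable upstream

# One decent server alone can saturate the client's pipe, so a single
# load-balanced TCP stream suffices:
print(math.ceil(download_mbps / server_up_mbps))   # sources needed from mirrors

# But if the content only lives on consumer uplinks, many parallel
# sources are required to fill the same pipe:
print(math.ceil(download_mbps / peer_up_mbps))     # sources needed from peers
```

Under these assumptions one mirror fills the pipe while sixteen ADSL peers
are needed to do the same, which is exactly the P2P-filesharing case where
parallel sources are the only way to get full download speed.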

Matthew Kaufman
[EMAIL PROTECTED]
_______________________________________________
p2p-hackers mailing list
p2p-hackers@lists.zooko.com
http://lists.zooko.com/mailman/listinfo/p2p-hackers




You don't get no juice unless you squeeze
Lemon Obrien, the Third.

http://www.tamago.us
