Funny that it evolved from FTP, which is notorious for being slow. Do they
realize the phone companies are going video, so everyone will have a fat pipe
in the future and it won't matter?
I find it funny that everyone is trying to make a central source appear fatter
than it is for downloading video... and splitting the "url" across sources is
really not hard to duplicate, 'cause all p2p systems do this. I do it with
home connections now; we all know home download speed is faster than upload
speed with ADSL.
So... I'm guessing you have to download software to use it, maybe a browser
plugin. Sorry, but I haven't and won't surf that site; I have enough on my
hands now.
But... didn't YouTube prove that if you just go huge, it feels like realtime
anyway? And I would think Google has solved almost any centralized speed
issue, given they already distribute search somewhat (I really don't care
about Google's technology, so I'm not an authority).
Curious as to the strategic investment rumors?
lemon
Alex Pankratov <[EMAIL PROTECTED]> wrote:
"Download managers" appeared in the mid-to-late '90s as FTP clients
that supported resuming prematurely aborted or explicitly paused
downloads. They used the FTP REST command for that, and shortly after
added support for a similar mechanism (byte-range requests) over HTTP.
If you look back, you'll probably remember that public FTP archives
were a very big thing back then.
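The resume trick is simple enough to sketch in a few lines of Python's
stdlib ftplib; the host, remote path, and anonymous login here are
placeholder assumptions, not anything from a particular client:

```python
import os
from ftplib import FTP

def resume_offset(local_path):
    """How many bytes of the file we already have on disk."""
    return os.path.getsize(local_path) if os.path.exists(local_path) else 0

def resume_download(host, remote_path, local_path):
    """Continue a partial FTP download using the REST command."""
    offset = resume_offset(local_path)
    with FTP(host) as ftp, open(local_path, "ab") as out:
        ftp.login()  # anonymous login, typical for public archives of the era
        # ftplib sends "REST <offset>" before RETR, so the server starts
        # the transfer at that byte instead of at zero.
        ftp.retrbinary(f"RETR {remote_path}", out.write, rest=offset)
```

Appending to the local file ("ab") plus REST is the whole mechanism; an
aborted transfer just picks up at whatever size the partial file reached.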
Then they added the ability to use multiple download streams in
parallel. Then they added pretty GUIs, download scheduling, file
management, and other fluff.
They were wildly popular, at least where I come from :) The reason
for the popularity was that a lot of FTP/HTTP servers did per-
connection throttling of downloads. Speed caps were easily in the
range of 5 KBps, so downloading with N streams did in fact make it
all go N times faster.
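The multi-stream trick is easy to reproduce with HTTP range requests; here
is a rough sketch, where the URL and file size are placeholders and the
server is assumed to honor the Range header:

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.request import Request, urlopen

def split_ranges(size, n):
    """Divide [0, size) into n contiguous (start, end) byte ranges."""
    chunk = size // n
    bounds = [i * chunk for i in range(n)] + [size]
    return [(bounds[i], bounds[i + 1] - 1) for i in range(n)]

def fetch_range(url, start, end):
    """Fetch one byte range over its own TCP connection."""
    req = Request(url, headers={"Range": f"bytes={start}-{end}"})
    with urlopen(req) as resp:
        return resp.read()

def parallel_download(url, size, n=5):
    """Download n ranges concurrently and stitch them back together.

    Each connection gets its own per-connection speed cap, which is why
    this beat throttling servers roughly n-fold.
    """
    with ThreadPoolExecutor(max_workers=n) as pool:
        parts = pool.map(lambda r: fetch_range(url, *r), split_ranges(size, n))
    return b"".join(parts)
```

Note this only wins against per-connection caps; on an uncapped server the
extra streams just compete with each other, as discussed below in the thread.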
It is only natural that these managers evolved with time and
steered toward p2p/collaborative/swarm-like behavior.
WRT Google's interest in Xunlei, I guess it's the standard "we
want your user base" case. Nothing more.
Alex
> -----Original Message-----
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED] On Behalf Of
> David Barrett
> Sent: Friday, December 29, 2006 1:43 PM
> To: [EMAIL PROTECTED]; 'theory and practice of decentralized
> computer networks'
> Subject: RE: [p2p-hackers] Xunlei
>
> If the webserver has adequate bandwidth I agree; I was assuming there
> were a bunch of underpowered or bandwidth-throttling webservers in use.
>
> Incidentally, does anyone use any of the download managers out there?
> Out of curiosity I installed wxDownloadFast and did some cursory
> testing, but it seemed to download files *50% slower* than Firefox.
>
> Basically, it took a single big file, split it up into 5 ranges, and
> then downloaded those in parallel. But presumably due to the TCP
> competition issues Matthew mentioned, it was far faster to just do it
> all with a single request.
>
> I see reports that these download managers get massive installs --
> FlashGot is like the #3 Firefox extension, and it only works if you
> already have *another* download manager installed; by itself it does
> nothing.
>
> Furthermore, most of the features are variants on "download all the
> links on this page", which I'm struggling to find a use for. Perhaps
> it's useful to download from the occasional open file share, but I
> don't see why this is so.
>
>
> Can anyone explain what the draw of these products/features is? I can
> only guess it somehow helps a lot of people get porn or warez, but I'm
> not seeing it.
>
> -david
>
> > -----Original Message-----
> > From: [EMAIL PROTECTED] [mailto:p2p-hackers-
> > [EMAIL PROTECTED] On Behalf Of Matthew Kaufman
> > Sent: Friday, December 29, 2006 7:31 AM
> > To: theory and practice of decentralized computer networks
> > Subject: Re: [p2p-hackers] Xunlei
> >
> > David Barrett wrote:
> > > Wow, very interesting. I can see the value of downloading in
> > > parallel from multiple mirrors...
> > I can't. Assuming they're all servers with public IP addresses and
> > reasonable outbound bandwidth, having each client load-balanced to a
> > single specific server and downloading over a single TCP stream is
> > more efficient for the network -- and provides better download
> > performance for the user -- than having multiple TCP streams fight
> > over the same congestion-limited download pipe.
> >
> > The only reason to download from multiple sources simultaneously is
> > the case where the upstream capacity of serving nodes is a small
> > fraction of downstream capacity (see: P2P filesharing networks where
> > the files are only present on ADSL or cable-modem-connected user
> > machines), and thus there'd be no way to fill the download pipe
> > otherwise.
> >
> > Matthew Kaufman
> > [EMAIL PROTECTED]
> > _______________________________________________
> > p2p-hackers mailing list
> > p2p-hackers@lists.zooko.com
> > http://lists.zooko.com/mailman/listinfo/p2p-hackers
>
You don't get no juice unless you squeeze
Lemon Obrien, the Third.
http://www.tamago.us