----- Original Message -----
From: "Tom Kaitchuck" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Sunday, August 03, 2003 9:32 PM
Subject: Re: [Tech] freenet not suited for sharing large data
> You don't really seem to have a very good understanding of how Freenet works.
> I'm going to put up a document in a few days that hopefully should make it
> more clear. Right now Freenet's documentation is very lacking.

More documentation is always better. :)

> First there is no distinction between searching for and requesting a file. So,
> if you make a request for it, you don't find out who has it and then connect
> to them through proxies, you just get the file. (You don't know or care where
> it's from.)

I'm sorry if it sounded like I thought there was a distinction. In Freenet
there is no such distinction, true. However, if you DO have such a distinction,
one mechanism to search and one to request, then it is much easier to set the
number of proxies.

> So having to upload on Freenet is a BIG BIG plus. It means that instead of
> people having to connect to your computer to get the file, they get it from
> the network. Anything bigger than a meg is broken up into 256k chunks, each of
> which is routed and stored separately. So when you download a large file, you
> can easily run 50 different threads, each downloading a segment from a
> different computer. So the total bandwidth for all users downloading the file
> is never limited by the number of users that already have the file. (Although
> more would result in multiple copies on the network.) So, Freenet would give
> much much better throughput than something like BitTorrent with anonymizing
> proxies. It is also not vulnerable to any one host being down, overloaded, etc.

Hmm... when you say "the network" and "your computer", the difference is only
that the network consists of a whole bunch of "your computers", so I don't see
it as a big plus that people don't have to connect to YOUR computer to get the
file. It doesn't really matter which computers they connect to. It is good that
the load of downloading a file is distributed as much as possible to avoid
bottlenecks, as both Freenet and BitTorrent do.

> So the total bandwidth for all users downloading the file
> is never limited by the number of users that already have the file

You mean because the file is split up into chunks and distributed across the
network? Yeah, that's true, and indeed very good (a rough sketch of that
chunked, parallel retrieval is below). However, it requires the share to be
uploaded... and as I said before, uploading 60GB of data is not so much fun :)

> However if you are talking about transmitting data point to point, rather than
> making it available to everyone, then Freenet may not be for you. If your
> download traffic is going to be less than the cost to upload all the data,
> then you might as well just run an ftp server or something. Alternately you
> don't always have to insert the data into Freenet. Frost employs a mechanism
> to only insert the metadata; then when a user wants it, it will be inserted
> on demand.

No, I'm not just talking point to point... then it wouldn't be a file sharing
network. It should be available to everyone, of course. About Frost: it seems
pretty slow... first proxy the data into the network, then let the requester
know that it's available (as soon as a chunk is uploaded), then proxy the data
back to the requester... lots of proxies there, right?
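To make the splitfile point above concrete, here is a rough sketch of how a
client could break a file into 256k chunks, key each chunk by its own content
hash, and fetch all of them in parallel. This is only an illustration under
assumptions of mine: the in-memory dict standing in for "the network", the
SHA-1 key derivation, and the insert_file/fetch_file helpers are invented
here, not Freenet's actual API.

    import concurrent.futures
    import hashlib
    import os

    CHUNK_SIZE = 256 * 1024  # 256k chunks, as described above

    # A dict standing in for "the network": key -> chunk. In a real network
    # the chunks would be routed to and stored on many different nodes.
    network_store: dict[str, bytes] = {}

    def insert_file(data: bytes) -> list[str]:
        """Split the file into 256k chunks and insert each one under its own
        content-derived key, so any node holding a chunk can serve it."""
        keys = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            key = hashlib.sha1(chunk).hexdigest()
            network_store[key] = chunk      # in reality: routed to some node
            keys.append(key)
        return keys

    def fetch_file(keys: list[str], threads: int = 50) -> bytes:
        """Fetch every chunk in parallel (the "50 different threads" above);
        each request could be answered by a different node."""
        with concurrent.futures.ThreadPoolExecutor(max_workers=threads) as pool:
            chunks = list(pool.map(lambda k: network_store[k], keys))
        return b"".join(chunks)

    # Round trip: insert a little over 1 MB, then reassemble it from chunks.
    data = os.urandom(4 * CHUNK_SIZE + 123)
    keys = insert_file(data)
    assert fetch_file(keys) == data

The point of the parallelism is the one quoted above: once the chunks are
spread out, the total download bandwidth is not tied to any single uploader.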
> To address your last point about reducing the number of proxies: This might be
> done soon. Someone just brought this up on [EMAIL PROTECTED]. It is not good to
> do as a general practice, because then the intermediate nodes can't improve
> their routing tables, and the network is more vulnerable to attack. However a
> good time for them to cut themselves out of the return path is when they are
> overloaded and would slow the transfer. So that will probably be done soon.

So will it be possible to have only ONE proxy, if the downloader so desires?
Let's say he thinks it's safe enough for him?

Btw, I read in some paper, I think it was ACHORD or CHORD, that a thing about
Freenet is that you aren't guaranteed to find the data you are looking for even
if it exists? Because you have to have a TTL on the request to avoid infinite
looping problems? True or false? (I read about Freenet some time ago...)
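If requests really do carry a TTL (hops-to-live) counter that is decremented
at each node, then yes, data can exist and still not be found, simply because
the search gives up before reaching the node that holds it. A toy model of
that idea follows; the node graph, data store, and depth-first routing below
are invented for illustration and are not Freenet's actual routing code:

    # Each node asks one neighbour at a time (depth-first) and the request
    # dies when hops_to_live reaches zero, so data that sits "too far away"
    # from the requester is never found even though it exists.

    neighbours = {                              # a small, hypothetical graph
        "A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"],
    }
    store = {"D": {"some-key": b"the data"}}    # only node D holds the key

    def route_request(node, key, hops_to_live, visited=None):
        visited = visited if visited is not None else set()
        visited.add(node)
        if key in store.get(node, {}):
            return store[node][key]     # found: pass it back along the path
        if hops_to_live <= 0:
            return None                 # TTL exhausted: give up here
        for nxt in neighbours[node]:
            if nxt in visited:
                continue                # loop detection
            found = route_request(nxt, key, hops_to_live - 1, visited)
            if found is not None:
                return found
        return None                     # dead end: backtrack

    print(route_request("A", "some-key", hops_to_live=2))  # None: D is 3 hops away
    print(route_request("A", "some-key", hops_to_live=3))  # b'the data'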
/Gabriel

_______________________________________________
Tech mailing list
[EMAIL PROTECTED]
http://hawk.freenetproject.org:8080/cgi-bin/mailman/listinfo/tech