On Monday 01 September 2003 07:17 pm, Zlatin Balevsky wrote:
> >A. If both are guaranteed to exist then A is better.
>
> at least 90% of the time neither file will exist.
If 90% of the data being requested does not exist, the network is in bad
shape.

> >If both max out without finding anything, then you ask the same number
> >of nodes either way.
>
> In theory yes. In practice, there will be plenty of QR-ing, restarting,
> etc. If one request RNFs, the other one may not.

Yeah, but that is no more likely for one key than for the other, so it
doesn't make much difference.

> >However, if the data is further away than maxHtl/2 then it is fairly
> >likely that the other file is too.
>
> This is not the case at all. The two files will be inserted along
> completely different paths and availability of one will not be related
> to the other.

Correct; however, assuming that they are both equally well distributed,
you are just as likely to encounter one as the other within any given
range, discounting of course any bias in your specialization towards one
or the other.

> >Consider that ideally the best HTL nodes out of m^HTL get asked, where
> >m is the average number of unique outgoing connections per hop.
>
> Again, in practice it is not the best node that will receive the
> request, but the one that doesn't QR. By doing two requests the
> probability of both best nodes QR-ing is decreased.

Why would two requests decrease the overall probability of a QR?

> Going back to theory, freenet finds data in a network of size N with
> log(N) hops. In that case, making two requests with MaxHtl/2 would be
> equivalent to making one request with 1+MaxHtl/2. But then again - the
> default htl of 15 should have been enough for a network of 32K nodes;
> we're much smaller than that, and requests for data that is known to be
> in the network rarely succeed at htl 15, notwithstanding the most
> popular keys. I'm not out to downplay the theory, but I do need to make
> a practical implementation that will work _now_, not a few years from
> now.

First, Freenet is bigger than 32K nodes. Second, the practical reach is
lower than the theory predicts because Freenet does not always route
optimally. That will never totally go away, because each node does not
and cannot know everything about each of its peers, so it universally
lowers the success rate.

Keep in mind the original goal was to minimize stress on the network. For
that, a single request at a higher HTL is better. It would be better yet
if we picked which file we wanted in advance.
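
To put rough numbers on the reach argument: the following is only a
back-of-the-envelope sketch (not Freenet code), and m = 3 and maxHtl = 14
are assumed values picked purely for illustration.

// Sketch: compare the candidate pool that one request at full HTL can
// ideally fan out over against two requests at half HTL.  m and maxHtl
// below are illustrative assumptions, not measured network figures.
public class HtlReach {
    public static void main(String[] args) {
        int m = 3;        // assumed average unique outgoing connections per hop
        int maxHtl = 14;  // assumed maximum hops-to-live

        double oneDeepRequest     = Math.pow(m, maxHtl);            // one request at maxHtl
        double twoShallowRequests = 2 * Math.pow(m, maxHtl / 2.0);  // two requests at maxHtl/2

        System.out.printf("one request at HTL %d:  ~%.0f candidate nodes%n",
                          maxHtl, oneDeepRequest);
        System.out.printf("two requests at HTL %d: ~%.0f candidate nodes%n",
                          maxHtl / 2, twoShallowRequests);
        // For any m > 2 the single deep request has a far larger candidate
        // pool; only when m == 2 does 2 * m^(maxHtl/2) collapse to
        // m^(1 + maxHtl/2), the "one request at 1+MaxHtl/2" case above.
    }
}

With m = 3 that works out to roughly 4.8 million candidates for the single
request against about 4,400 for the pair, while the total number of hops
spent (and hence the load on the network) is the same in both cases.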
