> The rationale was that if all the network is so busy that all the "close" 
> noderefs are busy, there's no point in further increasing traffic by 
> continuing to route.

Yeah, this is a little suspicious, and I am not sure I buy this
rationale. If they are busy, then routing around them will serve to
provide alternative sources for the data (once it is found and cached
along the reverse search path), which will help to alleviate the
problem, without causing unnecessary RNFs.

The number of RNFs I see (particularly when using a tool like FMB)
suggests that this is causing more harm than good either way.  A RNF is
a serious failure of a Freenet node to serve its user, and the fact that
the node might be RNFing unnecessarily adds insult to injury. I believe
this harm greatly outweighs the debatable benefit you outline.

> 2) The overload prefiltering code discriminates against requests which are 
> far from the node's estimated specialization when the node is overloaded, 
> making them more likely to be QR rejected.

Not to nitpick, but I have always been somewhat uncomfortable about
assuming a single area of specialization for a node.  In many
simulations I have seen, nodes will frequently specialize in two or more
areas.
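To make the concern concrete, here is a minimal sketch (hypothetical names
and a deliberately naive averaging scheme, not Freenet's actual code) of how
a single specialization estimate misbehaves for a node that really serves
two clusters in the keyspace:

```python
# Hypothetical sketch: a single specialization estimate for a bimodal node.
# Keys are modeled as locations in [0, 1); names are illustrative only.

def naive_mean(keys):
    """Naive average of key locations (ignores circular wrap-around)."""
    return sum(keys) / len(keys)

def distance(a, b):
    """Circular distance between two keyspace locations."""
    d = abs(a - b)
    return min(d, 1.0 - d)

# A node that actually specializes in two areas: around 0.1 and around 0.8.
served_keys = [0.08, 0.10, 0.12, 0.78, 0.80, 0.82]

single_estimate = naive_mean(served_keys)  # lands near 0.45

# Under distance-based overload prefiltering, a request at 0.10 -- squarely
# inside one of the node's real specializations -- looks far from the
# estimate and risks being QR rejected when the node is overloaded.
print(distance(0.10, single_estimate))  # large, despite being a good fit
print(distance(0.45, single_estimate))  # small, yet the node rarely serves 0.45
```

The single estimate falls in the gap between the two clusters, so the
prefilter would preferentially reject exactly the requests the node is best
placed to serve.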

Ian.

-- 
Ian Clarke                                        [EMAIL PROTECTED]
Founder & Coordinator, The Freenet Project    http://freenetproject.org/
Chief Technology Officer, Uprizer Inc.           http://www.uprizer.com/
Personal Homepage                                       http://locut.us/
