Currently, routing in Freenet is not symmetric. Query requests can only be sent to peers in a node's routing table (RT), and if node G is in node F's RT, node F is not necessarily in G's RT.
What benefits, drawbacks, security risks, performance metrics, protocol conformance metrics (if rules were imposed), etc. might be realized if symmetric routing were implemented? (Connection multiplexing is closely related to several of the points below.)
A node functions in several roles:
+ insertor (and those who forward an insertion)
+ requestor (and those who forward requests)
+ provider (and those who forward provided data)
+ consumer (one who downloads, regardless of destination)
Symmetric routing would allow one node to better 'profile' another node in each of the above roles. (Transient nodes are not addressed for now.) This profiling would measure another node's benefit to the Freenet network as a whole, so that 'model citizens' could be given preferential treatment. The vast majority of nodes would be expected to be equally good citizens (adjusted for bandwidth), but the idea is to identify those who aren't. This doesn't appear to be at odds with the goals of NGR. For a given path, a reverse path would be almost guaranteed - of interest to anyone pursuing forms of bidirectional unicast communication through Freenet.
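To make the profiling idea concrete, here is a minimal sketch of per-peer counters for the four roles listed above, with a crude 'citizenship' score comparing traffic a peer serves against traffic it demands. All names (RoleProfile, record, score) and the scoring formula are my own illustration, not anything in the Freenet codebase.

```python
from dataclasses import dataclass

# Hypothetical per-peer role counters; field names mirror the roles above.
@dataclass
class RoleProfile:
    inserts: int = 0    # insertions this peer originated or forwarded to us
    requests: int = 0   # queries this peer originated or forwarded to us
    provided: int = 0   # data transfers this peer served to us
    consumed: int = 0   # data transfers this peer downloaded from us

    def score(self) -> float:
        """Crude 'citizenship' ratio: traffic given vs. traffic taken."""
        given = self.provided + self.inserts
        taken = self.requests + self.consumed
        return given / max(taken, 1)

profiles: dict[str, RoleProfile] = {}

def record(peer: str, role: str) -> None:
    """Bump the counter for one observed event from a peer."""
    p = profiles.setdefault(peer, RoleProfile())
    setattr(p, role, getattr(p, role) + 1)

record("nodeG", "provided")
record("nodeG", "requests")
```

A real scheme would weight by bytes rather than event counts, but even counts like these give a node something to base preferential treatment on.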
It seems that, currently, a denial-of-service attack against a single node would be easy to accomplish: just use 'big' bandwidth (perhaps from multiple nodes) to assault the target with useless queries and inserts. And there is no penalty! One potential penalty would be to close the connection and ignore Mr. NastyNode for X hours/days/years, depending on the severity of the offense. At present, there is no mechanism for specifying the desired parameters for communication with another node; query and insertion rates (minimum and maximum) could be specified by count or in bytes.
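The "ignore for X hours/days/years" penalty could be as simple as a per-peer ban table keyed by offense severity. The severity tiers and durations below are placeholders of my own choosing, just to show the shape of the mechanism.

```python
# Hypothetical penalty box: ignore an abusive peer for a duration scaled
# by offense severity. The tier-to-duration mapping is an assumption.
BAN_SECONDS = {1: 3600, 2: 86400, 3: 86400 * 365}  # hours / days / years

banned_until: dict[str, float] = {}

def penalize(peer: str, severity: int, now: float) -> None:
    """Extend the peer's ban; repeat offenses never shorten an existing ban."""
    expiry = now + BAN_SECONDS[severity]
    banned_until[peer] = max(banned_until.get(peer, 0.0), expiry)

def is_banned(peer: str, now: float) -> bool:
    return banned_until.get(peer, 0.0) > now

penalize("MrNastyNode", severity=2, now=0.0)
```

Passing `now` explicitly keeps the sketch deterministic; a real node would use its clock and persist the table across restarts.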
Insertions between a pair of nodes could be rate-limited. The data may be "good" (popular), or it may be flooding garbage. Is there a difference between data inserted maliciously and data that simply isn't popular? So we can't reward a node solely for publishing... and we can only measure a publisher's popularity to a small degree, but there is -some- information available. We are supposed to favor the most popular content - so why not favor the most popular publishers (as defined above)? If everything is working as designed, all nodes would be expected to publish at the same popularity level. Maybe the "price" of using Freenet to publish could be equated to the time spent inserting. Should lower-bandwidth nodes "ask for" and prefer small inserts? Small requests?
Requests between a pair of nodes could be rate-limited. A node only has so much outbound bandwidth. Queries, whether forwarded or original, should not be allowed to occupy more than a certain amount of it. At least, not in a healthy network that is effective at delivering content.
Nodes should be able to negotiate an agreed-upon rate (or pair of rates) for sending traffic between them. A node can earn the resources of the other node, and vice versa. And there are many ways to profile a node: what is the average size of inserts from node X? Its average throughput? How about requests - at what rate are they received? What rate of success do requests from a particular node experience?
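The statistics asked about here (average insert size, request success rate) are cheap to maintain incrementally. A sketch, using an exponentially weighted moving average for insert size; the smoothing factor ALPHA and all names are my own assumptions.

```python
# Hypothetical per-peer statistics: EWMA of insert size plus a simple
# request success ratio. ALPHA controls how fast old samples fade.
ALPHA = 0.2

class PeerStats:
    def __init__(self):
        self.avg_insert_size = 0.0
        self.requests_seen = 0
        self.requests_succeeded = 0

    def on_insert(self, size_bytes: int) -> None:
        if self.avg_insert_size == 0.0:
            self.avg_insert_size = float(size_bytes)  # seed with first sample
        else:
            self.avg_insert_size = ((1 - ALPHA) * self.avg_insert_size
                                    + ALPHA * size_bytes)

    def on_request(self, succeeded: bool) -> None:
        self.requests_seen += 1
        if succeeded:
            self.requests_succeeded += 1

    def success_rate(self) -> float:
        return self.requests_succeeded / max(self.requests_seen, 1)

stats = PeerStats()
stats.on_insert(1000)
stats.on_insert(2000)   # EWMA: 0.8 * 1000 + 0.2 * 2000 = 1200
stats.on_request(True)
stats.on_request(False)
```

Numbers like these could feed directly into the rate negotiation: a peer with a high success rate and modest insert sizes "earns" a higher agreed rate.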
I confess this is all somewhat hand-wavy and vague on my part. But does it click with anyone? Could there be a net gain for Freenet via symmetric routing? Did I trigger any similar ideas?
Thanks for taking the time to read this.
_______________________________________________ Devl mailing list [EMAIL PROTECTED] http://dodo.freenetproject.org/cgi-bin/mailman/listinfo/devl
