Currently, the probability that a request being sent back through the
network will have its DataSource field reset on any given node is 1/30.
Assuming each hop resets independently, the chance of at least one reset
over two nodes is 1 - (29/30)^2, which is roughly 1/15 (about 2/30); over
three nodes it's roughly 1/10 (3/30), and so on. For small hop counts the
simple n/30 figure is a good approximation.
Right now, people are reporting HTL values of 100 being necessary (I'm
ignoring people who say they use an HTL of 500; Fred will just put that
down to 100 anyway). Over 100 hops the expected number of resets is
100/30, or about 3.3, and the probability of at least one reset is over
96%. In other words, it is almost assured that the DataSource field will
be reset about 3 times on its journey through the network. As simulations
have shown, having the DataSource field reset too often tends to lengthen
network connections.
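To make the arithmetic concrete, here is a small Python sketch (my own
illustration, not code from Fred) of the reset probabilities, assuming
each hop resets independently with the same per-hop probability:

```python
def p_reset_at_least_once(p, hops):
    # Probability that the DataSource field is reset at least once
    # over `hops` independent hops, each with per-hop probability p.
    return 1.0 - (1.0 - p) ** hops

def expected_resets(p, hops):
    # Expected number of resets over `hops` independent hops.
    return p * hops

if __name__ == "__main__":
    for p in (1 / 30, 1 / 60):
        print(f"p={p:.4f}: P(>=1 reset in 100 hops) = "
              f"{p_reset_at_least_once(p, 100):.3f}, "
              f"expected resets = {expected_resets(p, 100):.2f}")
```

At 100 hops, dropping the per-hop probability from 1/30 to 1/60 roughly
halves the expected number of resets, from about 3.3 down to about 1.7.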
My suggestion is to make resets much rarer, perhaps lowering the
probability to 1/60. I realize that this makes it more likely that the
node mentioned in the DataSource field really is the original source of
the data, but we may view it as a temporary measure and can raise the
reset probability again if and when we see network improvement (or if we
don't see any improvement at all over a given time). The value can
gradually be brought back toward its original setting as we observe
improvement, or the lack of it.
I also realize that this may only be treating the symptoms of the 0.3
network. Still, I think we should at least try.
_______________________________________________
Devl mailing list
[EMAIL PROTECTED]
http://lists.freenetproject.org/mailman/listinfo/devl