Matthew,

Thanks for your fast and precise answer. I have removed all text
that I think needs no further discussion. The following points
remain:

[snip]

> > I do not really understand why this value is introduced
> > and used. There is the risk that available information
> > cannot be found, right? And I understand that circular
> > requests are terminated anyhow, right?
> Because we want to be able to limit the length of time and the
> amount of network resources that a request takes. A request only
> goes to a certain number of nodes, the hops-to-live. This means
> that unlike Gnutella, Freenet should be able to scale as a single
> contiguous network, and an attacker who just sends zillions of
> requests only gets (max HTL) * (number of requests) load/bandwidth
> for his money, rather than (max HTL) * (number of nodes).
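
Just to check that I understand the mechanics, here is a toy model
of the forwarding rule as you describe it (a Python sketch of my
own; the class and names are invented, this is not Fred's actual
code):

    import random

    class Node:
        def __init__(self, name):
            self.name = name
            self.peers = []    # routing table: other nodes we know about
            self.store = {}    # key -> data held locally

        def request(self, key, htl, seen=None):
            """Try to fetch key; give up once hops-to-live is spent."""
            seen = set() if seen is None else seen
            if self.name in seen:
                return None    # circular request: terminated, as you say
            seen.add(self.name)
            if key in self.store:
                return self.store[key]
            if htl <= 0 or not self.peers:
                return None    # horizon reached: at most max-HTL nodes asked
            # Fred routes by key closeness; a random pick stands in here.
            return random.choice(self.peers).request(key, htl - 1, seen)

So an attacker issuing R requests with maximum HTL H costs the
network at most R * H forwarded requests, not R * (network size),
which I take to be your point.
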
Oh, agreed, protecting Freenet against attackers is definitely
worth those considerations. However, as the network grows, the
fraction of it that a single search can reach shrinks. A network
that is willing to give out only a fraction of the information it
holds to somebody searching for it is, of course, also of limited
value. The problem is particularly acute for information that is
new or rarely requested, since it propagates badly. From the view
of a node searching for such information, it may become reachable
only after somebody in between has also requested it. The only
other chance might be a shifted search horizon on the requester's
side. Is there a way for my node to learn about more nodes out
there in the Freenet universe and to ask them in an arbitrary
sequence? Or is there another way to find information that lies
beyond my search range (as seen at a single point in time)? That
would at least let me find the information, even if it costs me
more bandwidth. Repeatedly asking the nodes that did not know it
before, in the hope that they might know it in the future, would
also significantly increase network load, but with a lower
probability of success for the regular user.
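
To put numbers on the "repeatedly asking" option, a hypothetical
client-side retry loop (my own sketch again, reusing the toy Node
above) makes the cost obvious:

    def stubborn_fetch(me, key, max_htl, attempts):
        """Retry a failed request, hoping that caching by other
        requesters has meanwhile pulled the data inside my horizon."""
        for _ in range(attempts):
            data = me.request(key, max_htl)
            if data is not None:
                return data
        return None
    # Worst case: attempts * max_htl forwarded requests, with no
    # guarantee that the horizon actually shifts between tries.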

> > Please enlighten me as to whether my assumptions are completely
> > wrong, whether they are right but the default is enough to
> > search a whole universe full of Freenet nodes, or whether the
> > default should be changed. Are there other means to increase
> > the probability of finding information in Freenet?
> We may want to increase the default maximum HTL in 0.5.1; however,
> there are lots of nodes left over from 0.5.0 and its immediate
> successors, which would tend to prevent this, short of forking
> the network...
This would be welcome; please increase it to a value that leaves
some margin for future growth. The impact of information that is
never delivered may well be bigger than the impact of an overload
attack. The impact of an overload attack could also be limited by
capping the bandwidth a node is willing to provide to a single
requester; would that be reasonable?
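
By "capping the bandwidth to a single requester" I mean something
like a per-peer token bucket. A rough sketch (entirely
hypothetical; as far as I know nothing like this exists in Fred
today):

    import time

    class PerRequesterCap:
        def __init__(self, rate_bps, burst_bytes):
            self.rate = rate_bps      # sustained bytes/second per peer
            self.burst = burst_bytes  # short-term allowance
            self.buckets = {}         # requester -> (tokens, last refill)

        def allow(self, requester, nbytes):
            """Serve nbytes to requester only if its bucket has tokens."""
            tokens, last = self.buckets.get(requester,
                                            (self.burst, time.time()))
            now = time.time()
            tokens = min(self.burst, tokens + (now - last) * self.rate)
            if nbytes > tokens:
                return False          # over its share: queue or refuse
            self.buckets[requester] = (tokens - nbytes, now)
            return True

That way an attacker's zillions of requests mostly compete with
each other for bandwidth instead of with everybody else's.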

With regard to new versions, wouldn't it help if a certain
percentage of the nodes had higher limits? For example, if one node
in ten forwarded with twice the default HTL, these nodes would
provide paths of deeper information flow into the network, even if
other paths stay shallow.

And of course it could easily be improved without a version change
if people changed the limit manually. Is there a prominent place
where that could be published?
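
For the record, on my node this amounts to editing one line of the
configuration file and restarting. I am quoting the parameter name
and value from memory, so treat both as a guess:

    # in freenet.conf -- name and value from memory, may differ
    maxHopsToLive=25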

Best regards,


Thomas

