On Friday 14 June 2002 18:39, you wrote:
> >we try to make them work more gracefully - even under this bad condition.
> >we do not have enough permanent nodes, but it should not be possible to
> >stop freenet by overloading.

> I don't want to imply that requests should be blocked or stopped to fix
> the problem. But if requests were better routed (routed more directly),
> would the problem go away by itself? Or would we still need more permanent
> nodes? (Also, let's say requests went directly to the node that has the
> data. Wouldn't that be contrary to the original Freenet design? Or is that
> what specialization means?)

You cannot know exactly where the data you want is; you can only make a rough 
guess. The node you contact does the same thing. The goal is to make "good" 
guesses based on the keys you have already seen from the nodes in question.
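As a toy sketch of that idea (not Freenet's actual implementation - the key format, distance metric, and the `seen_keys_by_node` table are all illustrative assumptions), greedy routing by key closeness might look like this:

```python
import hashlib

def key_of(data_name: str) -> int:
    # Toy stand-in for a Freenet key: a SHA-1 hash read as an integer.
    return int(hashlib.sha1(data_name.encode()).hexdigest(), 16)

def choose_next_node(target_key: int, seen_keys_by_node: dict) -> str:
    # Forward to the node whose previously seen keys are closest to
    # the target key. This is only a rough guess, as described above:
    # no node actually knows where the data is stored.
    def closeness(node: str) -> int:
        return min(abs(target_key - k) for k in seen_keys_by_node[node])
    return min(seen_keys_by_node, key=closeness)
```

For example, if node "A" has previously served keys near 10-20 and node "B" keys near 1000-2000, a request for key 15 would be routed to "A". Each hop repeats the same guess with its own (different) table of seen keys.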

> When you say, "it should not be possible to stop Freenet by overloading,"
> do you mean that even if there are too few permanent nodes, the Freenet
> protocol should be designed so that the network isn't stopped? Or do you
> mean that we need more permanent nodes to make the network more stable?

Yes, Freenet may slow down, but it should not be possible to shut it down by 
any means. It has to return as many results as possible under any conditions.

Maybe the transient nodes could be engaged for as long as they are online.
That would increase the total data store size and help share the load.

Regards, The Bishop

_______________________________________________
devl mailing list
[EMAIL PROTECTED]
http://hawk.freenetproject.org/cgi-bin/mailman/listinfo/devl
