On Tue, 16 Apr 2002, Mark J Roberts wrote:

> Kevin Atkinson:
> > The probability of selecting three malicious, buggy hosts will be very low 
> > since part of the selection criteria will be on past performance.  
> > Furthermore it will likely select the same couple of hosts to download the 
> > blocks of the file.
> 
> Biasing node selection based on past performance is probably
> illogical. If a reliable node emerged on to the network, it would be
> recognized as reliable and flooded. Then it wouldn't be reliable any
> more.

I just have to see.  There are things I can do to avoid this problem.
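For instance, one way to bias toward past performance without flooding the single best node is to sample hosts in proportion to a performance score rather than always taking the top-ranked one, so load spreads across all decent nodes. This is only a sketch of that idea; the node names and scores are made up, not from any real implementation:

```python
import random

# Hypothetical per-node success rates learned from past transfers.
scores = {"nodeA": 0.95, "nodeB": 0.90, "nodeC": 0.40, "nodeD": 0.10}

def pick_nodes(scores, k, rng=random):
    """Pick k distinct nodes, weighted by score, so requests spread
    across all reasonably good nodes instead of flooding the best one."""
    pool = dict(scores)
    chosen = []
    for _ in range(min(k, len(pool))):
        nodes = list(pool)
        weights = [pool[n] for n in nodes]
        n = rng.choices(nodes, weights=weights, k=1)[0]
        chosen.append(n)
        del pool[n]
    return chosen

print(pick_nodes(scores, 3))
```

A reliable newcomer still gets more traffic than a flaky node, but not all of it, which is one way around the "recognized as reliable and flooded" failure mode.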

> I'm still unconvinced that block retrieval is as reliable as you say
> it is. Even if a transient error or malicious host can never cause a
> block failure, the network's algorithm itself may not be reliable.

It may be slightly less reliable than retrieving the data as a whole, but I don't 
think it will be as unreliable as you think.  I think we will just 
have to agree to disagree.
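To put a rough number on the "three malicious hosts" point upstream: if bad hosts make up a fraction p of the candidates and selection were uniform and independent, all three draws are bad with probability p^3, and performance bias should push that lower still. With per-block redundancy, a block is lost only when every replica lands on a bad host. The figures below (10% bad hosts, 3 replicas, 100 blocks) are purely illustrative assumptions, not measurements:

```python
p_bad = 0.10      # assumed fraction of malicious/buggy hosts (illustrative)
replicas = 3      # assumed copies of each block on independent hosts
blocks = 100      # assumed number of blocks in the file

# A block is lost only if every one of its replicas is on a bad host.
p_block_lost = p_bad ** replicas
# The whole file is retrieved only if no block is lost.
p_file_ok = (1 - p_block_lost) ** blocks

print(round(p_block_lost, 6), round(p_file_ok, 3))
```

So under these assumptions a given block fails about one time in a thousand, and even a 100-block file comes through whole roughly 90% of the time, before any retries.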

> Is it reliable? (where reliable approaches 1)

Is Freenet?  It sure doesn't seem that way.  I am convinced it can be made 
at least as unreliable as Freenet.

--- 
http://kevin.atkinson.dhs.org


_______________________________________________
freenet-tech mailing list
[EMAIL PROTECTED]
http://lists.freenetproject.org/mailman/listinfo/tech