> > > I haven't seen _any_ compelling argument why above-average nodes
> > > should attract more than their "fair share" of requests. What's
> > > yours?
> > 
> > Over time, the large node simply accumulates more data from Freenet.  
> > This means there should be more nodes which point to data on the large
> > node. Thus, there will be more requests routed to the large node.
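> > 
> > The effect above can be sketched with a toy simulation (this is an
> > illustration of the argument, not Freenet's actual routing code; node
> > names and capacities are made up): if keys accumulate on nodes in
> > proportion to datastore size, and requests go to whichever node holds
> > the key, the request share tracks the data share.

```python
import random

random.seed(0)

# Relative datastore sizes for three hypothetical nodes.
capacities = {"small": 10, "medium": 30, "large": 60}
nodes = list(capacities)
weights = [capacities[n] for n in nodes]

# Each of 1000 keys ends up on a node with probability proportional
# to that node's capacity (the "accumulates more data" assumption).
key_owner = {k: random.choices(nodes, weights)[0] for k in range(1000)}

# Route 10000 uniformly random requests to the key's holder.
hits = {n: 0 for n in nodes}
for _ in range(10000):
    hits[key_owner[random.randrange(1000)]] += 1

for n in nodes:
    # Request share roughly tracks datastore share: the large node
    # ends up answering the majority of requests.
    print(n, hits[n] / 10000)
```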
> 
> Does Freenet not take into account reliability, 

Not for routing an individual request, no.  However, nodes that don't respond when 
requested will be flushed from the datastore.
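A minimal sketch of that flushing behaviour (the names and data structure here are illustrative, not Freenet's actual API): when a node fails to answer, every reference to it is dropped.

```python
# Map from key to the nodes believed to hold it (hypothetical structure).
routing_table = {
    "key-a": ["node1", "node2"],
    "key-b": ["node2"],
}

def flush_unresponsive(table, node):
    """Remove every reference to a node that failed to respond."""
    for key in list(table):
        table[key] = [n for n in table[key] if n != node]
        if not table[key]:
            del table[key]  # no remaining known holders for this key
    return table

flush_unresponsive(routing_table, "node2")
print(routing_table)  # {'key-a': ['node1']}
```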

>proximity,

No.

>speed,

No.

> etc of
> nodes when fetching files? If it does I would think it could automatically
> sense if a node was becoming saturated and find a more satisfactory link
> and/or clone the resource?
> 
> Why is it a bad thing if machines with more resources do more work?

Centralization.  While it's not so bad if a better machine does a little more work, 
it gets bad when that same machine is getting requests from half the network.

_______________________________________________
Chat mailing list
[EMAIL PROTECTED]
http://lists.freenetproject.org/mailman/listinfo/chat
