On Fri, Jun 15, 2001 at 07:18:50PM +0100, toad wrote:
> On Fri, Jun 15, 2001 at 11:05:32AM -0700, [EMAIL PROTECTED] wrote:
> > If anything I think you should lower the maximum HTL allowed by the
> > nodes (not just clients, I mean network nodes) to the 20-30 range or
> > perhaps even lower. Theoretically this should make things work better.
> > My feeling is that all those inserts with HTL=100 are flattening the
> > search space and keeping searches from working.
> >
> > Like Oskar said, increasing HTL at this point means giving up on
> > the Freenet search model. If Freenet won't work with HTL=20 when the
> > network probably has no more than a few thousand nodes, then the whole
> > idea is flawed.
> Fetching at larger HTLs is often very useful. You can't impose this sort of
> change without a protocol revision, i.e. not before 0.4. And requests should
> have a higher maximum HTL than inserts, if that is possible.
WTF does "useful" mean here? Freenet's existence is only justified by
the fact that it, supposedly, can efficiently locate information - how
can it be "useful" to have to wait 10 minutes for a request on a network
of maybe a thousand peers?
To me this all seems like circular logic. "Let's use Freenet because it
finds stuff in a limited number of hops." And then "We have to keep
increasing the number of hops or we can't find anything."
Maybe we need to accept the fact that our model has, thus far, failed
completely and disastrously in practice. That's not to say we should give
up: there are still a lot of unexplored options, but the current network
is very obviously useless. All this beating of a dead horse is doing
nobody any good...
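
P.S. For what it's worth, the clamp proposed above (and toad's
request/insert split) would only be a few lines on the node side, not a
client change. A minimal sketch of the idea; the class and constant names
here are made up for illustration, not anything in the current codebase,
and the 30/20 caps just take the numbers from this thread:

    // Hypothetical node-side HTL clamping: cap whatever HTL a peer
    // sends before the message is routed on. Separate caps let
    // requests probe deeper than inserts, per toad's suggestion.
    public class HtlPolicy {
        static final int MAX_REQUEST_HTL = 30; // upper end of the 20-30 range
        static final int MAX_INSERT_HTL  = 20; // keep inserts shallower

        static int clampRequestHtl(int htl) {
            return Math.min(htl, MAX_REQUEST_HTL);
        }

        static int clampInsertHtl(int htl) {
            return Math.min(htl, MAX_INSERT_HTL);
        }

        public static void main(String[] args) {
            // An HTL=100 insert arriving from a peer is cut to 20, so it
            // stops walking the network (and flattening the search space)
            // after at most 20 hops, whatever the client asked for.
            System.out.println(clampInsertHtl(100)); // prints 20
            System.out.println(clampRequestHtl(25)); // prints 25
        }
    }

The point being that since every node clamps independently, a single
deployed version enforcing this bounds path lengths regardless of what
clients or older peers send, which is why it matters whether it waits
for a protocol revision.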
--
'DeCSS would be fine. Where is it?'
'Here,' Montag touched his head.
'Ah,' Granger smiled and nodded.
Oskar Sandberg
[EMAIL PROTECTED]
_______________________________________________
Devl mailing list
[EMAIL PROTECTED]
http://lists.freenetproject.org/mailman/listinfo/devl