I've been thinking about the caching problem recently, and it seems to me
that it is indeed disadvantageous to cache aggressively, because files
cached on the edge of the network have a decreased probability of being
found by an arbitrary node. The closer a file is to the epicenter, the
greater its global probability of discovery.

So rather than caching all the way back along the search path, it would
be most advantageous to cache in an expanding ring around the epicenter.
This can be done by giving each node a probability of caching that decays
rapidly with its distance (in hops) from the epicenter.
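
To make that concrete, here is a rough sketch of the kind of decay I have
in mind. None of these names exist in the codebase and the decay constant
is just a number pulled out of the air; the idea is simply that each node
on the return path flips a coin weighted by how many hops it sits from
the epicenter:

    import java.util.Random;

    public class RingCacheSketch {
        // Probability of caching at the node that served the file.
        static final double BASE_PROBABILITY = 1.0;
        // Decay applied per hop back along the search path; a small
        // value keeps the caching ring hugging the epicenter tightly.
        // (Invented constant, for illustration only.)
        static final double DECAY_PER_HOP = 0.25;

        static final Random rng = new Random();

        // Caching probability at a node hopsFromEpicenter hops back
        // along the return path.
        static double cacheProbability(int hopsFromEpicenter) {
            return BASE_PROBABILITY * Math.pow(DECAY_PER_HOP,
                    hopsFromEpicenter);
        }

        // Each node on the return path would call something like this
        // to decide whether to keep a copy.
        static boolean shouldCache(int hopsFromEpicenter) {
            return rng.nextDouble() < cacheProbability(hopsFromEpicenter);
        }

        public static void main(String[] args) {
            for (int hops = 0; hops <= 6; hops++) {
                System.out.printf("hops=%d  p=%.4f  cache=%b%n",
                        hops, cacheProbability(hops), shouldCache(hops));
            }
        }
    }

With a decay of 0.25 the probability is 1 at the epicenter, 0.25 one hop
out, about 0.06 two hops out, and effectively zero beyond that, so copies
pile up in a tight ring around the node that had the data.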

This is distinctly different from the previous idea about weakening
caching, which as I recall was to have each node decide whether or not to
cache a file based on some fixed internal per-node probability. That would
reduce how widely a file is cached overall, but it would not favor caching
near the epicenter.
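
For contrast, that earlier scheme would look more like the following
sketch (again with invented names), where every node flips the same coin
regardless of its position on the search path:

    import java.util.Random;

    public class UniformCacheSketch {
        // The same fixed probability at every node, independent of how
        // far it is from the epicenter. (Invented constant, for
        // illustration only.)
        static final double NODE_CACHE_PROBABILITY = 0.5;

        static final Random rng = new Random();

        static boolean shouldCache() {
            return rng.nextDouble() < NODE_CACHE_PROBABILITY;
        }

        public static void main(String[] args) {
            for (int hops = 0; hops <= 6; hops++) {
                System.out.printf("hops=%d  cache=%b%n", hops,
                        shouldCache());
            }
        }
    }

That thins out caching everywhere uniformly rather than concentrating
copies near the epicenter.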



