----- Original Message -----
From: "Matthew Toseland" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Thursday, June 27, 2002 4:05 PM
Subject: Re: [freenet-dev] Probabalistic caching with Ps(k)
> We don't want to use raw Ps(k). It's REALLY low (less than 0.1 except in
> cases of flukes due to not very much data), look at your probability
> histograms. Maybe Ps(k)/average Ps, capped at some fixed number - 1.0 or
> 0.9something, say.

That would work nicely. Every value above the average would be capped at 1.0 (or whatever cap is chosen), so it would be guaranteed to be cached. A low caching probability makes the network better at specialization, while a high probability makes it better at caching; together they make a healthier Freenet. Caching on nodes that are far away from the key can harm specialization, but otherwise caching improves the network's ability to find the data. Something like Min(2 * Ps(k)/(avg Ps), 1.0) should be used, so that a node only considers not caching keys whose Ps(k) is lower than (avg Ps)/2.

Scott Young

_______________________________________________
devl mailing list
[EMAIL PROTECTED]
http://hawk.freenetproject.org/cgi-bin/mailman/listinfo/devl
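As an illustration only, the rule proposed above could be sketched as follows. This is not actual Freenet code; the names `cache_probability` and `should_cache` are hypothetical, and `ps_k` / `avg_ps` stand in for a node's Ps(k) estimate for one key and its running average of Ps over keys:

```python
import random

def cache_probability(ps_k, avg_ps):
    # Hypothetical sketch of Min(2 * Ps(k) / (avg Ps), 1.0):
    # any key with Ps(k) >= (avg Ps)/2 is always cached; below
    # that threshold the caching probability falls off linearly.
    if avg_ps <= 0.0:
        return 1.0  # assumption: with no usable estimate, cache everything
    return min(2.0 * ps_k / avg_ps, 1.0)

def should_cache(ps_k, avg_ps, rng=random.random):
    # Flip a biased coin with the capped probability above.
    return rng() < cache_probability(ps_k, avg_ps)
```

For example, with avg Ps = 0.08, a key with Ps(k) = 0.01 would be cached with probability 0.25, while any key with Ps(k) >= 0.04 would always be cached.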
