If there is anyone out there thinking of doing some kind of research
project (for school/university) and who might be interested in doing
something Freenet related, I have a perfect suggestion.

One important issue is that of document longevity in Freenet.  The
current approach is essentially to evict the least recently accessed
documents first, although a bias against large files was introduced in
version 0.3.
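To make the policy concrete, here is a minimal sketch of a size-biased LRU datastore. The class name, the capacity parameter, and in particular the eviction score (staleness multiplied by size raised to a bias exponent) are my own assumptions for illustration; they are not Freenet's actual code or its actual bias function.

```python
class Datastore:
    """Toy model of a node's datastore: least-recently-accessed
    documents are evicted first, with a bias against large files.
    The score function below is an assumption, not Freenet's code."""

    def __init__(self, capacity_bytes, size_bias=1.0):
        self.capacity = capacity_bytes
        self.size_bias = size_bias   # 0 = pure LRU; >0 penalises big files
        self.docs = {}               # key -> (size, last_access_time)
        self.clock = 0               # logical clock for recency
        self.used = 0

    def access(self, key):
        """A request refreshes the document's recency."""
        if key in self.docs:
            size, _ = self.docs[key]
            self.clock += 1
            self.docs[key] = (size, self.clock)
            return True
        return False

    def insert(self, key, size):
        if key in self.docs:         # replacing: drop the old copy first
            self.used -= self.docs[key][0]
        self.clock += 1
        self.docs[key] = (size, self.clock)
        self.used += size
        while self.used > self.capacity:
            # Evict the doc with the highest staleness * size^bias,
            # so large, stale documents go first.
            victim = max(
                self.docs,
                key=lambda k: (self.clock - self.docs[k][1])
                              * self.docs[k][0] ** self.size_bias)
            self.used -= self.docs[victim][0]
            del self.docs[victim]
```

With size_bias set to 0 this degenerates to plain LRU, which is one easy way for a simulation to compare behaviour with and without the 0.3 size bias.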

So one very important question is: how long does a document last in
Freenet (assuming no requests for that document), and how is this
affected by the size of the document and the HTL (hops-to-live) with
which it was inserted?

The best way to approach this question would probably be to use one of
the simulations (although none that I am aware of currently
incorporates the size bias).  Experiments could be conducted with the
simulation, and then the same experiments could be repeated on the
actual network: insert files, wait a measured amount of time, attempt
to request them, and note the percentage of successes for different
file sizes and different HTLs.  This would allow the simulation's
accuracy to be measured.
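The experiment described above could be prototyped along these lines. Everything here is a deliberately crude assumption: an insert with HTL h is modelled as caching the document on h randomly chosen nodes (real routing is not random), each node's store is pure LRU with no size bias, and all parameter values are placeholders.

```python
import random

def run_experiment(num_nodes=20, store_capacity=50, background_docs=500,
                   probe_sizes=(1, 5, 20), probe_htls=(3, 10), seed=0):
    """Toy survival experiment: insert probe documents of varying size
    and HTL, apply background traffic that pushes them toward eviction,
    then check whether any copy of each probe survives."""
    rng = random.Random(seed)
    # Each node's store is a list of (key, size), oldest first.
    stores = [[] for _ in range(num_nodes)]

    def store(node, key, size):
        s = stores[node]
        s.append((key, size))
        while sum(sz for _, sz in s) > store_capacity:
            s.pop(0)  # evict the oldest entry (pure LRU, no size bias)

    # "Insert" each probe: HTL h caches it on h random nodes.
    probes = []
    for size in probe_sizes:
        for htl in probe_htls:
            key = "probe-%d-%d" % (size, htl)
            for node in rng.sample(range(num_nodes), min(htl, num_nodes)):
                store(node, key, size)
            probes.append((key, size, htl))

    # Background inserts displace the probes over time.
    for i in range(background_docs):
        store(rng.randrange(num_nodes), "bg-%d" % i, rng.choice((1, 5, 20)))

    # A probe "survives" if any node still holds a copy.
    return {(size, htl): any(any(k == key for k, _ in s) for s in stores)
            for key, size, htl in probes}
```

Tabulating the survival flag over many seeds would give the success percentage per (size, HTL) pair, which is exactly the quantity that could then be compared against measurements on the real network.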

Any takers?

Ian.
