----- Original Message -----
From: <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Saturday, August 16, 2003 7:51 AM
Subject: Re: [freenet-dev] Re: freenet (pre-)searchengine
> That is exactly what I meant. It would not query the encrypted files, but
> rather it would determine which directories Freenet is sharing, and then
> index those files.

Do you mean 'locally created indexes' when you say 'directories Freenet is sharing'? Otherwise, please explain. The only thing that a Freenet node is sharing is the data in its DS.

> I had not thought of anonymity when I proposed that. I'll have to think
> about that a little more...

> To clarify, I propose that once a node has the dbfs that it requests from
> its adjacent nodes, it then merges them into one index, containing the
> filename, filesize, key, and a "score" indicating how easy it is for that
> node to access the files (access speed and/or reliability).

How do you define 'neighbour'? When you say 'merge from your neighbours', do you mean that you want every node's indexes to slowly coalesce into larger indexes present on each and every node? How would you go about avoiding spam in such a system?

When I recommended that people could choose to trust one or more published indexes (and of course un-trust them if appropriate), I had spam control in mind (and other kinds of unwanted content).

/N
_______________________________________________
devl mailing list
[EMAIL PROTECTED]
http://hawk.freenetproject.org:8080/cgi-bin/mailman/listinfo/devl
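P.S. The quoted merge proposal (combine the dbfs received from adjacent nodes into one index of filename, filesize, key, and score) could be sketched roughly as below. This is only an illustration of the idea as described in the mail, not Freenet code: the entry layout and the "keep the best score per key" rule on duplicates are my assumptions.

```python
def merge_indexes(local_index, neighbour_indexes):
    """Merge dbfs entries from neighbours into one index, keyed by Freenet key.

    Each entry is assumed to look like:
        {"filename": ..., "filesize": ..., "key": ..., "score": ...}
    where score reflects how easily this node can reach the file
    (access speed and/or reliability). When the same key appears more
    than once, the entry with the higher score is kept.
    """
    merged = {entry["key"]: entry for entry in local_index}
    for index in neighbour_indexes:
        for entry in index:
            existing = merged.get(entry["key"])
            if existing is None or entry["score"] > existing["score"]:
                merged[entry["key"]] = entry
    return list(merged.values())
```

Note the spam problem raised above is untouched here: nothing stops a neighbour from flooding high-score junk entries, which is why trusting specific published indexes matters.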
