--- Toad <[EMAIL PROTECTED]> wrote: 
> On Fri, Nov 07, 2003 at 06:23:17PM +0100, Some Guy wrote:
> > As I said earlier this kind of delaying is unnecessary.  Private Cache +
> > Premix removes any need to do this.
> 
> Does it? 

Sure, what's in your public store will have nothing to do with what you were using 
freenet for. 
Isn't that good enough?

> I thought that the list was collectively very concerned about
> some widely ignored legal idea called "entrapment" ? :)
Entrapment is when law enforcement entices you to break the law when you normally
wouldn't.  For example, if the police offer to give you drugs and you take some, in
court you can argue it was entrapment: that you wouldn't normally go out looking for
dope, but since an undercover officer enticed you, you took the bait.  The idea is
that an undercover officer can't plan a bank robbery, recruit a couple of people, and
then bust them for going along with his plans.

I'm not 100% sure how it fits into freenet.  I guess if someone requests "evil"
material from you, and you find it, and then they bust you for having distributed
"evil" material, you could argue that's entrapment.  It's a good thing.  It's kind of
like plausible deniability, except you're arguing that the guys busting you are
actually responsible for you having done something illegal, not just anyone.
 
> However, it increasingly appears that it is grossly impractical for a
> node to try to conceal whether it has a file in its cache... even the
> try to fake the time it would take to get it from the next node idea has
> problems... however, the new-nodes-getting-bad-estimates problem can be
> solved by the new node having estimators passed on from the node we
> would have routed to (either in the seednodes or in the StoreData that
> added it to our references). Of course the problem then is that the
> estimators passed might be grossly inaccurate, perhaps purposefully...

Yeah, there are lots of problems doing it.  The biggest problem is that the
estimators you use to figure out whether a node is good in an area of hashspace are
based on the delay (and to some extent the DNF rate).  If the delays are artificially
modified, it seems nodes would have trouble learning real specializations.
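To make the concern concrete, here's a toy sketch (hypothetical, not Fred's actual
estimator code) using an exponentially weighted moving average of observed response
times. The node names, latency numbers, and the 500 ms padding figure are all made
up; the point is just that uniform artificial delay padding swamps the latency
contrast a node would otherwise use to learn which neighbors specialize in a region
of hashspace:

```python
def ewma_estimate(samples, alpha=0.3):
    """Running EWMA of observed request latencies (ms) for one neighbor."""
    est = samples[0]
    for s in samples[1:]:
        est = alpha * s + (1 - alpha) * est
    return est

# Made-up true latencies: node A specializes in this keyspace region
# (answers fast from its store), node B does not (slow).
a_true = [20, 25, 22, 18, 24]
b_true = [80, 95, 90, 85, 88]

a0, b0 = ewma_estimate(a_true), ewma_estimate(b_true)

# Now suppose both nodes pad every reply by a fixed 500 ms to hide
# whether the file came from the local cache.
pad = 500
a1 = ewma_estimate([t + pad for t in a_true])
b1 = ewma_estimate([t + pad for t in b_true])

# The absolute gap survives, but the relative contrast collapses,
# so the specialization signal is much easier to drown in jitter.
print(b0 / a0)  # roughly 4x contrast without padding
print(b1 / a1)  # barely above 1x with padding
```

Under this (admittedly crude) model, the router still sees some difference, but once
ordinary network jitter scales with total delay, a ~1.1x contrast is much harder to
learn from than a ~4x one.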

Maybe it's not that bad, but I just don't see why it's necessary.  It definitely
seems like one of those times when you pull out that card: "premix + private caching
will take care of all requestor/inserter anonymity."

_______________________________________________
Devl mailing list
[EMAIL PROTECTED]
http://dodo.freenetproject.org/cgi-bin/mailman/listinfo/devl
