On Sat, 2003-05-03 at 19:35, Toad wrote:
> I am not sure that we could safely allow a get file closest to 
> this key command... it might allow some kinds of probing/mapping that
> might lead to attacks we would want to avoid. On the other hand, it is
> an interesting idea.
> 

Thanks for taking the time to read the idea.  I think we might
not need that "get closest file" command.  Instead, the node just
picks some files it happens to have available when it needs to
entangle the data.
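
For example (just a rough Python sketch; the helper names and the
use of SHA-1 are placeholders, and XORing with blocks already on
hand is only one of the manglings I describe below):

import hashlib
import random

def xor_blocks(a, b):
    """XOR two equal-length blocks together."""
    return bytes(x ^ y for x, y in zip(a, b))

def entangle(new_block, local_blocks):
    """Mix a new block with a couple of blocks the node already holds.

    All blocks are assumed to be the same fixed length.  Returns the
    entangled block plus the hashes of the blocks used, so the
    mangling recipe can be distributed separately later on.
    """
    chosen = random.sample(local_blocks, min(2, len(local_blocks)))
    result = new_block
    for blk in chosen:
        result = xor_blocks(result, blk)
    recipe = [hashlib.sha1(blk).hexdigest() for blk in chosen]
    return result, recipe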

---

I'm thinking of alternatives.  Maybe an independent system of
nodes which do almost nothing but exchange fixed-length files
in an attempt to keep them sorted by hash key.
Nodes try to replicate the data by requesting random keys from 
other nodes, checking that the resulting file matches the hash,
mangling the files (XORing them together, encrypting them, etc.) 
and re-inserting the results.  Nodes also accumulate entropy and
insert the resulting random files, together with the claim that
the files are random.  Or they take a file which claims to
be random, encrypt it with a really random key, forget the key,
insert the result, and certify it as encrypted with a forgotten
random key.  Finally, somehow the nodes distribute information 
about how files were mangled.
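
The "forget the key" step could be as simple as the sketch below;
launder_block is an invented name, and the SHA-1 counter-mode
keystream is only a stand-in for a real cipher:

import hashlib
import os

def keystream(key, length):
    """Stand-in stream cipher: hash key || counter until enough bytes exist."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha1(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def launder_block(block):
    """Encrypt a claimed-random block with a fresh key, then forget the key.

    The result can honestly be certified as "encrypted with a
    forgotten random key" even if the original randomness claim
    was false.
    """
    key = os.urandom(20)
    stream = keystream(key, len(block))
    mixed = bytes(b ^ k for b, k in zip(block, stream))
    del key                     # the point: the key is never stored anywhere
    return mixed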

This activity goes on continually in the background, so that
the "real" work can go on unseen.  Nodes accept fixed-length
files from external sources and satisfy requests for files
with a specified hash, or for certified-random files close to
a specified hash.  These activities need some clever schemes
to anonymize the requests in the presence of Trojan nodes.
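
On the lookup side, I picture each node keeping its blocks indexed
by key in sorted order, roughly like this (names invented; the
linear scan in closest_random is only for clarity):

import bisect

class BlockIndex:
    """Blocks held by one node, indexed by hash key (hex strings)."""

    def __init__(self):
        self.keys = []                  # sorted list of keys
        self.blocks = {}                # key -> block data
        self.certified_random = set()   # keys carrying a "random" certificate

    def add(self, key, block, is_random=False):
        if key not in self.blocks:
            bisect.insort(self.keys, key)
        self.blocks[key] = block
        if is_random:
            self.certified_random.add(key)

    def get_exact(self, key):
        """Request for a file with a specified hash."""
        return self.blocks.get(key)

    def closest_random(self, key):
        """Certified-random key nearest the requested one."""
        if not self.certified_random:
            return None
        target = int(key, 16)
        best = min(self.certified_random,
                   key=lambda k: abs(int(k, 16) - target))
        return best, self.blocks[best]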

Files would be deleted at random.  Nodes would accept
requests to delete any given key, but would only do so at a 
limited rate.  Files which are frequently requested would
be more likely to be replicated by mangling.  Requests for
missing files would be satisfied by reconstructing the
file if possible, without disclosing the files used for
the reconstruction.
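
Reconstruction falls straight out of the XOR mangling, since XOR is
its own inverse: if C = A XOR B, a node holding C and B can return
A = C XOR B without ever disclosing B.  A sketch, with a made-up
recipe format:

def xor_blocks(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def reconstruct(missing_key, recipes, store):
    """Rebuild a missing block from an entangled copy.

    recipes maps an entangled block's key to the list of keys that
    were XORed into it; store maps keys to block data (or None if
    the block is missing).  Only the reconstructed result leaves
    the node.
    """
    for entangled_key, parts in recipes.items():
        if missing_key not in parts:
            continue
        others = [k for k in parts if k != missing_key]
        needed = [store.get(entangled_key)] + [store.get(k) for k in others]
        if any(blk is None for blk in needed):
            continue            # this recipe can't help; try another
        result = needed[0]
        for blk in needed[1:]:
            result = xor_blocks(result, blk)
        return result
    return None                 # no recipe could reconstruct the block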

I suspect that this limited feature set would be very useful
for the anonymous distribution of large files, given an
appropriate application layer.  It should also be fairly
easy to implement efficiently.  The nodes would attempt
to find out about _all_ other nodes.  Every node would
give its median key on request.  When inserting a file,
the originating node would send it directly to the node 
which advertises the closest median.  Reputations would
be maintained based on the ability to produce files which
match the specified hash.  When a file is originally
created, the originating node would calculate several
challenge-response values and save them.  These would be
checked later, and the results would contribute to the
reputation of the destination node.
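
The challenge-response bookkeeping could be as simple as hashing a
random nonce together with the file; the exact construction here is
just my assumption, not a settled protocol:

import hashlib
import os

def make_challenges(block, count=4):
    """Originating node precomputes challenges before the block leaves it."""
    challenges = []
    for _ in range(count):
        nonce = os.urandom(16)
        expected = hashlib.sha1(nonce + block).hexdigest()
        challenges.append((nonce, expected))
    return challenges

def respond(block, nonce):
    """What the holding node computes when it is challenged later."""
    return hashlib.sha1(nonce + block).hexdigest()

def check(challenge, response):
    """Compare the response; the result feeds the holder's reputation."""
    _nonce, expected = challenge
    return response == expected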

I can't predict what would happen in court, but the fact
that these nodes never deliver anything but apparently
random bits ought to count for something.  So should the
fact that they will delete any key on request (although
deletion doesn't necessarily prevent subsequent delivery
of the corresponding file).

-- Ed Huff
