On Sun, 2003-05-04 at 15:46, jrandom wrote:
> > I argue that Freenet must provide an interface to automatically 
> > accept DMCA notices demanding suppression of particular CHK's, 
> > or else it will be shut down once it becomes popular enough to 
> > warrant attention.
>   
> Understandable viewpoint for someone in the USA, but the day freenet 
> builds a workable framework to receive - and act on - DMCA notices (or 
> EUCD, or any other rogue nation's legal violation of the international 
> declaration of human rights) is the day freenet gets forked.
>  
> Wouldn't the better route be to spend our time and effort on making it so
> the DMCA, EUCD, or other fascist techniques can't be used to shut down freenet?
>   

I looked at jrandom/8.  I don't have fixed ideas about how to proceed,
but want to discuss it.  [If there are a lot of _actual developers_
on this list who think this discussion should go on elsewhere, I
won't be offended.  I'm new here and I haven't read more than one
month of archives].

This morning, I was thinking about separating the necessary 
functions into _completely separate_ networks of nodes.  Any 
given node contributes only a small portion of the work needed,
and has no knowledge of the big picture or of any actual content.
Different layers of the problem would be handled by different
networks.  The nodes would continue to function just fine even if
all of the content were actually just random bit streams, and
_they wouldn't be able to tell the difference_.

So suppose we create a nice small application called "entropyNet"
or "randomNet" in which the nodes endeavor to create certified
random bitstreams.  They have a filestore, network bandwidth, and
the ability to accept connections one way or another.  They
exchange fixed-length files of random bits, working to keep
the files sorted by hash key.  They also continually generate
entropy using e.g. /dev/random and occasionally insert a file
which really came from /dev/random.  And they XOR files together
at random and insert the results, or encrypt the files and insert
the result.  And they delete files at random.
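To make this concrete, here is a rough sketch of the node-side
operations in Python.  The block size and the hash function are
arbitrary placeholders, not a proposal:

```python
import hashlib
import os

BLOCK_SIZE = 32 * 1024  # hypothetical fixed file length


def block_key(data: bytes) -> str:
    """The hash key that files are kept sorted by."""
    return hashlib.sha1(data).hexdigest()


def xor_blocks(a: bytes, b: bytes) -> bytes:
    """XOR two fixed-length blocks to produce a new insertable block."""
    assert len(a) == len(b) == BLOCK_SIZE
    return bytes(x ^ y for x, y in zip(a, b))


def fresh_entropy() -> bytes:
    """A block drawn directly from the OS entropy source
    (standing in for /dev/random)."""
    return os.urandom(BLOCK_SIZE)
```

Note that XOR is self-inverse, so any block produced this way can
be reconstructed later from its inputs -- which is what makes the
provenance reports below meaningful.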

Oh, and one other thing.  Each node reports how it created each
file in terms of the hash of the input files, any keys, and
the hash of the output file.  This report is securely inserted
into some other system, like freenet, and made available after
a delay.
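The report could be as simple as a hash-only record; since it
contains only digests (plus any keys used), publishing it reveals
nothing about the content itself.  Field names here are just
illustrative:

```python
import hashlib
import json


def sha(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()


def make_report(operation, inputs, output, key=None):
    """Describe how an output file was derived, by hash only.

    operation -- e.g. "xor", "encrypt", "fresh"
    inputs    -- list of input file contents (only their hashes are kept)
    output    -- the resulting file's contents
    key       -- optional key bytes used in the operation
    """
    record = {
        "op": operation,
        "inputs": [sha(d) for d in inputs],
        "key": key.hex() if key else None,
        "output": sha(output),
    }
    return json.dumps(record, sort_keys=True)
```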

On the surface, entropyNet is just a research project into hash
collisions.  The reason the nodes need to sort the files by hash
key is so that they can find out how often hash collisions actually
occur when dealing with random files.  entropyNet is also a useful
service for anyone who needs some random bits.  For this purpose,
a user should be careful to obtain bits from a randomly selected
group of nodes and should then XOR the bits together and encrypt 
them with a random key, since he cannot be sure that any given 
server is really emitting random bits.
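Client-side, that mixing step might look like the following sketch,
where `fetch(node_id, n_bytes)` is a hypothetical retrieval callback
and the node IDs are made up:

```python
import os
import random


def mix_random_bits(fetch, node_ids, n_bytes, sample=3):
    """Combine bits from several randomly chosen nodes so that no
    single (possibly dishonest) node controls the result."""
    chosen = random.sample(node_ids, sample)
    mixed = bytes(n_bytes)
    for node in chosen:
        bits = fetch(node, n_bytes)
        mixed = bytes(a ^ b for a, b in zip(mixed, bits))
    # Whiten with a locally generated one-time key as well, since
    # even the XOR of several streams could be biased if the
    # sampled nodes happen to collude.
    pad = os.urandom(n_bytes)
    return bytes(a ^ b for a, b in zip(mixed, pad))
```

The key property is that the output is uniformly random as long as
*any one* of the contributions (including the local pad) is.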

But, entropyNet will accept any node which applies for admission,
and which earns a good reputation.  So who knows?  Some of these
nodes might be inserting encrypted content and claiming it is
random.  They might be putting the keys into Freenet.  No attacker
can find out unless they hack into the node and look at the
actual entropyNet software that node is running.
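The reason this works is that ciphertext from a decent cipher is
indistinguishable from random bits without the key.  A toy
illustration, using a SHA-256 counter-mode keystream as a stand-in
for a real cipher (not a proposal for what entropyNet should use):

```python
import hashlib


def keystream(key: bytes, n_bytes: bytes) -> bytes:
    """SHA-256 in counter mode -- a stand-in stream cipher."""
    out = b""
    counter = 0
    while len(out) < n_bytes:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n_bytes]


def disguise(content: bytes, key: bytes) -> bytes:
    """XOR content with the keystream; the result looks like random
    bits to anyone who does not hold the key.  Applying it again
    with the same key recovers the content."""
    return bytes(a ^ b for a, b in zip(content, keystream(key, len(content))))
```

A node could insert `disguise(content, key)` as a "random" block and
publish the key through Freenet; without the key, the block is just
another random file in the store.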

I think this method is much more likely to succeed at the
political/legal game than a monolithic freenet in which each node
runs the whole package.  When every node owner is doing only
a small part of the job, and there is a plausible claim that
the software he runs actually is part of something else, it
will be hard to attack the nodes.

-- Ed Huff

_______________________________________________
devl mailing list
devl at freenetproject.org
http://hawk.freenetproject.org:8080/cgi-bin/mailman/listinfo/devl
