Edward J. Huff:
> Not at all.  This is a data compression scheme which avoids
> the problem of incompressibility.  Any 1 meg file can be
> represented as a CHK.  (It fails only when there are hash
> collisions.)

I don't understand how this relates to compression. It doubles the
size of all downloads, and the storage required for a given file
does not change.
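For concreteness, here is a minimal sketch of the kind of scheme I understand is being proposed, under the assumption that a "formula" is an XOR combination of randomly-keyed blocks (the function names and the use of SHA-1 are mine, not anything from the actual codebase). It shows where the doubling comes from: reconstructing the document requires fetching both blocks.

```python
import hashlib
import os

def chk(data: bytes) -> str:
    """Content Hash Key: a block is addressed by the hash of its contents."""
    return hashlib.sha1(data).hexdigest()

def make_formula(document: bytes):
    """Split a document into a one-time pad P and document XOR P.
    Each block on its own is indistinguishable from random data; the
    'formula' is just the pair of CHKs naming the two blocks."""
    pad = os.urandom(len(document))
    masked = bytes(d ^ p for d, p in zip(document, pad))
    store = {chk(pad): pad, chk(masked): masked}
    return (chk(pad), chk(masked)), store

doc = b"stand-in for a 1 meg file"
(key_a, key_b), store = make_formula(doc)

# A reader must fetch BOTH blocks -- twice the document's size in transfer:
recovered = bytes(a ^ b for a, b in zip(store[key_a], store[key_b]))
assert recovered == doc
```

Note that the stored blocks together also occupy twice the document's size, unless blocks are shared between many formulas.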

> Thus, all freenet nodes are continually requesting
> randomly selected CHK's and are continually inserting
> new ones.  No one can tell which traffic is random
> and which is directed at inserting or obtaining 
> some document.

This is nice, but I suspect that it will not make real-world traffic
analysis much more difficult.

> The nodes are also exchanging formulas all the time.
> These circulate among all of the nodes, in a traffic
> analysis resistant fashion.  The copyright "owner"
> can run his own node and whenever he finds a formula
> to his content, he can demand that the formula be
> suppressed under the DMCA.  But it's too late:  by
> now the nodes which are looking for his content have
> obtained it, and also, by now there is another version
> of the formula which he doesn't know yet so he can't
> ask to suppress it.

How do I trust the legitimacy of a given formula? What prevents an
attacker from advertising tons of false formulas for a file?
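The only check I can see, again assuming the XOR-of-blocks reading of "formula" above (the helper name and SHA-1 choice are my own illustration), is to hash the reconstructed document against a digest obtained out of band. That works, but only after the downloader has already wasted the bandwidth on both blocks of a bogus formula:

```python
import hashlib
import os

def verify_formula(blocks, expected_doc_hash: str) -> bool:
    """XOR the advertised blocks together and compare the result against
    a document hash known from some trusted channel.  A false formula
    fails the check -- but only AFTER every block has been downloaded."""
    acc = bytes(len(blocks[0]))
    for block in blocks:
        acc = bytes(a ^ b for a, b in zip(acc, block))
    return hashlib.sha1(acc).hexdigest() == expected_doc_hash

doc = b"the document the formula claims to encode"
pad = os.urandom(len(doc))
masked = bytes(d ^ p for d, p in zip(doc, pad))
known = hashlib.sha1(doc).hexdigest()

honest = verify_formula([pad, masked], known)            # True
bogus = verify_formula([pad, os.urandom(len(doc))], known)  # False
```

So an attacker who floods the network with plausible-looking false formulas can still force every reader to pay the full download cost per attempt.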

> They have to take the whole network down, and erase 
> all of the disks to be sure of getting rid of one 
> document.

One can never be sure, I'll agree, but they can destroy the CHKs
indicated by the redundant formulas as quickly as they discover those
formulas.
_______________________________________________
devl mailing list
devl at freenetproject.org
http://hawk.freenetproject.org:8080/cgi-bin/mailman/listinfo/devl