Yes, duplicating the file under many different keys (which is what this does, with a twist) will of course make this sort of attack more difficult - but I would rather the network itself were more resistant to it than rely on duplication by the client (using IDA or not is a client and user choice).
A way to make this attack more difficult would be:

a) Before routing, SHA all keys again on the node side. This means that the client can't just pick a key out of thin air, which he actually can now even with SVKs and CHKs, since he is only making a request and doesn't have to produce any data.

b) If a request fails, the node keeps a record of the failure and automatically times out any request that comes by within some set interval if its HTL is not greater than the original. Since nodes cut off the HTL above a certain number (100 currently), only a small number of messages can get past the node even if the attacker tries to gradually raise the HTL. After that, the attacker needs to perform another brute-force search of O(n(F)) (n(F) being the number of documents on Freenet) to find another key that gets routed close to the targeted data before he can continue the attack. (Rough sketches of what I mean by both (a) and (b) are appended after my sig.)

BTW, what sort of inflation are you looking at for something like 100 pieces where any 50 are enough to restore the data?

On Mon, 31 Jul 2000, you wrote:
> Oskar Sandberg wrote:
>
> > A better DoS attack is: find a key that is closer to the targeted data's key
> > than any other on Freenet (even for cryptographically derived keys, like all
> > will be from 0.3 onwards, this only takes about twice as many tries as the
> > number of documents on Freenet, which is doable), and do a distributed attack
> > with thousands of clients sending thousands of requests for that key. Because
> > the data can't be found, all/most of the requests will find their way to nodes
> > that the targeted data also clusters to, shutting them down.
>
> If I split a file using IDA, so that there are 100 parts, and any 50 are needed
> to reconstruct the file, this attack becomes 50 times as hard, doesn't it?
>
> The more parts you make, the more difficult it is to target one specific file.
> I could split the same file into 1000 parts, so that any 500 are needed to
> reconstruct the file, and suddenly the attack is another 10 times more
> difficult.
>
> However, if the split parts are being referenced from an SVK, then that SVK has
> to be duplicated as well to different keys, otherwise it can become the focus
> of attack. One way would be to cyclically rotate the string key that represents
> the SVK, and store under every combination, with the number of shifts stored
> along with the key to allow reconstruction of the original SVK key and hence
> verification.

--
\oskar
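PS - a rough sketch of what I mean in (a), since it is easier to show than to describe. This is not actual node code; the class and method names are made up. The only point is that the node derives the value it routes on by hashing the client-supplied key once more, so an attacker cannot choose the routing value directly and would have to brute-force preimages to land near a target.

    import java.security.MessageDigest;
    import java.security.NoSuchAlgorithmException;

    // Sketch (not node code): the node hashes the client-supplied key once
    // more and routes on the digest, while still storing under the original
    // key. The client can no longer pick the routing value out of thin air.
    public class RoutingKey {
        public static byte[] derive(byte[] clientKey)
                throws NoSuchAlgorithmException {
            MessageDigest sha = MessageDigest.getInstance("SHA-1");
            return sha.digest(clientKey); // value actually used for routing
        }
    }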
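And a sketch of the failure table in (b). Again the names are invented and the interval is pulled out of the air; the behaviour I am after is only this: remember the HTL of a failed request for a while, and locally time out any repeat request for the same key whose HTL is not strictly greater than the one that already failed.

    import java.util.HashMap;
    import java.util.Map;

    // Sketch (not node code) of the failure record in (b).
    public class FailureTable {
        private static final long INTERVAL_MS = 10 * 60 * 1000; // made-up interval

        private static class Entry {
            final long when;
            final int htl;
            Entry(long when, int htl) { this.when = when; this.htl = htl; }
        }

        private final Map<String, Entry> failures = new HashMap<String, Entry>();

        // Called when a request for this key fails.
        public synchronized void recordFailure(String key, int htl) {
            failures.put(key, new Entry(System.currentTimeMillis(), htl));
        }

        // True if an incoming request should be timed out locally.
        public synchronized boolean shouldReject(String key, int htl) {
            Entry e = failures.get(key);
            if (e == null) return false;
            if (System.currentTimeMillis() - e.when > INTERVAL_MS) {
                failures.remove(key);   // record has expired
                return false;
            }
            return htl <= e.htl;        // only a strictly higher HTL gets through
        }
    }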
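On my own inflation question above, my back-of-the-envelope guess (to be corrected by whoever has the real figures): with a k-of-n IDA split each piece carries 1/k of the data, so the total stored is roughly n/k times the original, plus per-piece headers and key overhead.

    expansion ~ n / k
    50 of 100:   100 / 50   = 2x the original size
    500 of 1000: 1000 / 500 = 2x as well, just more, smaller pieces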
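Finally, a sketch of the cyclic-rotation idea quoted above, as I understand it: every rotation of the SVK key string becomes a storage key, and the shift count is kept alongside so the original key can be reconstructed and verified. The class below is mine, not anything in the tree, and it ignores the question of how the shift count travels with the data.

    // Sketch of the quoted idea: store under every rotation of the SVK
    // string, remembering the shift so the original key can be recovered.
    public class SvkRotation {
        public static String rotate(String key, int shift) {
            int n = key.length();
            shift = shift % n;
            return key.substring(shift) + key.substring(0, shift);
        }

        public static String[] allRotations(String key) {
            String[] out = new String[key.length()];
            for (int i = 0; i < key.length(); i++) {
                out[i] = rotate(key, i);   // store under each, along with i
            }
            return out;
        }

        // Undo a known shift to get the original SVK key back for verification.
        public static String unrotate(String rotated, int shift) {
            int n = rotated.length();
            return rotate(rotated, n - (shift % n));
        }
    }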
