On Sun, Sep 08, 2013 at 03:15:24PM -0400, Avi wrote:
> As must I. Robert has one of the clearest modes of exposition from
> which I have ever been fortunate to benefit.
I have to agree on this point. The issue is that I disagree with him on his stance: in my opinion, having a schedule stating when key lengths will become insecure is useless; we only need to be able to predict that longer keys will most likely be at least as hard to crack. And this means that, as long as the drawbacks associated with using the key are borne by the key owner alone (as the tables show, encrypt and verify times are almost unchanged), recommending 10-kbit RSA keys does no harm and can only improve overall security, at worst to the detriment of the key owner's own usability. Each key owner can then choose whatever key length best suits their usability/security needs, as long as that choice does not impede other keyholders' usability or security.

By the way, the statement "[Dan Boneh] proved that breaking RSA is not equivalent to factoring" is wrong: he did not prove that breaking RSA is easier than factoring; he only showed that a whole class of ways of proving that breaking RSA is as hard as factoring is ineffective, thereby reducing the search space of potential valid proofs of the conjecture. Hence the title of the article: "Breaking RSA *may* not be equivalent to factoring". Please pardon me if I misunderstood the English used in the abstract.

Oh, and... please correct me if I am mistaken, but I believe the best we can do at the moment, even with a quantum computer, is Shor's algorithm, which runs in time O(log^3 n). Thus, going from 2k keys to 10k keys makes it approximately 125 times harder to break. Sure, not as wonderful as what it is today, but if the constant is large enough (which, AFAIK, we do not know yet), the attack might remain infeasible in practice for a few more years.

So, back to the initial question, I do not understand why the article should be judged so poorly. No, to me the article is not about predicting 87 years ahead of us.
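Just to make the "125 times" figure above checkable: a back-of-the-envelope sketch, assuming Shor's runtime scales as the cube of the bit length of the modulus (i.e. O(log^3 n)), taking "2k" = 2048 bits and "10k" = 10240 bits, and ignoring the unknown constant factor:

```python
# Shor's algorithm runs in O(log^3 n), i.e. cubic in the bit length
# of the modulus n. Ratio of work for a 10240-bit vs a 2048-bit key:
bits_small = 2048
bits_large = 10240
ratio = (bits_large / bits_small) ** 3
print(ratio)  # 125.0 -- the "approx. 125 times harder" figure
```

The same ratio holds for 2000 vs 10000 bits, of course: only the 5x ratio of bit lengths matters, cubed.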
To me, the article is about stating that using 10k RSA keys is not as bad as one might think at first. The gist of the article is, to me, precisely that set of tables. And the first part is, still in my opinion, only there to explain why 10k RSA keys were chosen for the experiment: according to our current estimates, they might potentially resist until 2100, assuming no major breakthrough is made in the cryptanalysis field until then. You might notice that such precautions are taken in the article too.

So... I find the article interesting. I would not have thought "everyday" use of a 10k key would have so few drawbacks.

Finally, I want to recall that a signing key need not be the same as the certifying key. This lets us pay the signature-time cost only for certifications and use "normal" keys the rest of the time, getting the best of both worlds: signing keys can be upgraded to stronger ones without losing one's WoT. The only remaining drawback is when others certify our master key.

Cheers,
Leo
_______________________________________________
Gnupg-users mailing list
Gnupg-users@gnupg.org
http://lists.gnupg.org/mailman/listinfo/gnupg-users