The state of the art in factorization is essentially what was used for
the factorization of RSA-768 [1]; there have not been many advances in
the number field sieve algorithm itself. The current effort, as
Bernstein puts it, goes into speeding up smoothness detection, as part
of the relation collection process.
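
To make that concrete, here is a toy Python sketch (my illustration,
not Bernstein's code; the bound B = 1000 and all names are mine) of the
test underlying his batch smoothness methods: x is B-smooth exactly
when x divides a high enough power of P, the product of the primes up
to B. A real siever amortizes the "P mod x" step over millions of
candidates with product and remainder trees; this shows one candidate.

    from math import gcd, prod

    def primes_up_to(b):
        # Plain sieve of Eratosthenes.
        s = bytearray([1]) * (b + 1)
        s[0:2] = b"\x00\x00"
        for i in range(2, int(b ** 0.5) + 1):
            if s[i]:
                s[i * i :: i] = bytearray(len(range(i * i, b + 1, i)))
        return [i for i, v in enumerate(s) if v]

    def smooth_part(x, P):
        # gcd(x, (P mod x)^(2^e) mod x) is the B-smooth part of x once
        # 2^e >= log2(x), since no prime-power exponent in x can exceed
        # log2(x).
        z = P % x
        k = 1
        while k < x.bit_length():
            z = z * z % x
            k *= 2
        return gcd(x, z)  # equals x (gcd with 0) iff x is B-smooth

    P = prod(primes_up_to(1000))
    print(smooth_part(720, P) == 720)  # True: 720 = 2^4 * 3^2 * 5
    print(smooth_part(2 * 1009, P))    # 2: 1009 is prime and above B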

Both the RSA-768 factorization paper and a previous one by the same
authors [2] try to predict the effort needed for a 1024-bit
factorization, which is estimated to be around 1000 times harder than
factoring a 768-bit modulus. [1] estimates the number of operations in
the RSA-768 factorization to be in the ballpark of 2^67 instructions; a
thousand times harder puts this at about 2^77, which is in the realm of
the doable, but very hard, even for a well-funded organization.
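
For the back-of-the-envelope arithmetic behind that jump: a factor of
1000 is just under 10 doublings.

    from math import log2
    print(67 + log2(1000))  # 76.97, i.e. about 2^77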

We also have to take into account the logistics of doing such a
factorization. Unlike an elliptic curve discrete logarithm computation,
which takes (relatively) negligible storage and communication, the
number field sieve requires massive amounts of data, and the linear
algebra step could become (even more of) a bottleneck.
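
The dominant kernel in that linear algebra step (block Wiedemann, in
the RSA-768 case) is a sparse matrix-vector product over GF(2),
iterated an enormous number of times over a matrix with hundreds of
millions of rows and columns. A toy Python sketch (the names and the
4x4 example are mine, purely illustrative) of that kernel:

    def gf2_matvec(rows, v):
        # rows[i] lists the column indices of the nonzero entries of
        # row i; v is an int used as a bit vector. Output bit i is the
        # XOR (parity) of the selected bits of v.
        out = 0
        for i, cols in enumerate(rows):
            bit = 0
            for j in cols:
                bit ^= (v >> j) & 1
            out |= bit << i
        return out

    # 4x4 toy matrix; the real computation streams the entire matrix
    # through memory on every iteration, which is what makes the step
    # so hard to distribute across machines.
    rows = [[0, 2], [1], [0, 1, 3], [2, 3]]
    print(bin(gf2_matvec(rows, 0b1011)))  # 0b1111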

Best regards,
Samuel Neves

[1] http://eprint.iacr.org/2010/006
[2] http://eprint.iacr.org/2009/389

On 20-04-2010 16:45, Perry E. Metzger wrote:
> I was alerted to some slides from a talk that Dan Bernstein gave a few
> days ago at the University of Montreal on what tools will be needed to
> factor 1024 bit numbers:
>
> http://cr.yp.to/talks/2010.04.16/slides.pdf
>
> It has been a couple of years since there has been serious discussion on
> the list on this topic, and especially in the light of various technical
> decisions being undertaken on the size of DNS signing keys for high
> valued zones (like root), I was curious as to whether anyone had any
> interesting comments on the state of the art in factorization.
>
> Perry

