On 28 May 2014 01:47, mancha <manc...@zoho.com> wrote:
> Fouque and Tibouchi [3] offer the differing view that it's preferable to
> minimize bias and generate primes that are almost uniform "even if it is
> not immediately clear how such biases can help an adversary". They
> suggest a few algorithms that improve on naive discard & repeat by
> discarding only the top N bits of a candidate at each iteration, among
> other innovations.

This paper assumes two things that don't appear to be true:

a) That each prime generation attempt consumes entropy. Why? It seems
fine to me to just stir the pool and try again.

b) That repeated random number generation is much more expensive than,
say, addition. In our experiments, generating a fresh random number for
each candidate is only about half the speed of incrementing.
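For anyone who wants to reproduce that kind of measurement, here is a
rough, hypothetical micro-benchmark (not the experiment cited above, and
not OpenSSL's actual BN_generate_prime code) contrasting the two
strategies: drawing a brand-new random candidate after every failed
primality test versus drawing one random start and stepping by 2.

```python
# Hypothetical sketch: compare "fresh random candidate each try" against
# "increment from one random start" when searching for a probable prime.
# All function names here are illustrative, not from any real library.
import random
import time

def is_probable_prime(n, rounds=20):
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    # Quick trial division by small primes.
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    # Write n-1 as d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness that n is composite
    return True

def prime_by_fresh_random(bits, rng):
    # Discard & repeat: draw a brand-new odd candidate every iteration,
    # with the top bit forced so the result has exactly `bits` bits.
    while True:
        c = rng.getrandbits(bits) | (1 << (bits - 1)) | 1
        if is_probable_prime(c):
            return c

def prime_by_increment(bits, rng):
    # Draw one random odd start, then step by 2 until a prime is hit.
    c = rng.getrandbits(bits) | (1 << (bits - 1)) | 1
    while not is_probable_prime(c):
        c += 2
    return c

if __name__ == "__main__":
    rng = random.Random(1234)  # fixed seed so runs are comparable
    for strategy in (prime_by_fresh_random, prime_by_increment):
        t0 = time.perf_counter()
        for _ in range(5):
            strategy(512, rng)
        elapsed = time.perf_counter() - t0
        print(f"{strategy.__name__}: {elapsed:.3f} s for 5 primes")
```

Since the bulk of the time goes to the modular exponentiations inside
Miller-Rabin, the cost of where the next candidate comes from (fresh
randomness vs. an addition) is a second-order effect, which is
consistent with the roughly 2x figure above.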

I'm guessing these incorrect assumptions are common in the literature?
______________________________________________________________________
OpenSSL Project                                 http://www.openssl.org
Development Mailing List                       openssl-dev@openssl.org
Automated List Manager                           majord...@openssl.org
