On Fri, 1 Jul 2005, Charles M. Hannum wrote:

Most implementations of /dev/random (or so-called "entropy gathering daemons")
rely on disk I/O timings as a primary source of randomness.  This is based on
a CRYPTO '94 paper[1] that analyzed randomness from air turbulence inside the
drive case.

I was recently introduced to Don Davis and, being the sort of person who
rethinks everything, I began to question the correctness of this methodology.
While I have found no fault with the original analysis (and have not actually
considered it much), I have found three major problems with the way it is
implemented in current systems.  I have not written exploits for these
[...]

You may be correct, but readers should also know that, at least in Linux:

/usr/src/linux/drivers/char/random.c:
  * All of these routines try to estimate how many bits of randomness a
  * particular randomness source [has].  They do this by keeping track of the
  * first and second order deltas of the event timings.

The pooled inputs are then run through a SHA hash before any output is released via /dev/random.
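
To make the delta-based estimate concrete, here is a minimal user-space
sketch of the idea.  It is not the kernel's actual code; the names, the
32-bit timestamp, and the 12-bit cap per event are illustrative
assumptions:

  #include <stdint.h>
  #include <stdlib.h>

  struct timing_state {
      uint32_t last_time;    /* previous event timestamp */
      int32_t  last_delta;   /* previous first-order delta */
  };

  /* Credit roughly log2 of the smaller of the first- and second-order
   * timing deltas, capped at 12 bits per event. */
  int estimate_entropy_bits(struct timing_state *s, uint32_t now)
  {
      int32_t delta  = (int32_t)(now - s->last_time);  /* 1st order */
      int32_t delta2 = delta - s->last_delta;          /* 2nd order */

      s->last_time  = now;
      s->last_delta = delta;

      /* Take the smaller magnitude, so perfectly regular timings
       * (near-zero deltas) earn little or no credit. */
      uint32_t d = (uint32_t)abs(delta);
      if ((uint32_t)abs(delta2) < d)
          d = (uint32_t)abs(delta2);

      /* Integer log2 of the remaining jitter. */
      int bits = 0;
      while (d >>= 1)
          bits++;

      return bits > 12 ? 12 : bits;
  }

Taking the minimum of the deltas is the conservative choice: an event
source that fires at a fixed interval produces near-zero deltas and is
credited with almost nothing, and whatever credit is granted, the pool
is still run through SHA before any output is released.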

                                                        -J
