# Re: Linux RNG paper

```
>From: David Malone <[EMAIL PROTECTED]>
>Sent: Mar 23, 2006 3:43 PM
>To: "Travis H." <[EMAIL PROTECTED]>
>Cc: "Heyman, Michael" <[EMAIL PROTECTED]>, cryptography@metzdowd.com, [EMAIL
>PROTECTED], [EMAIL PROTECTED]
>Subject: Re: Linux RNG paper
```

...
>One metric might be guessability (the mean number of guesses required,
>or moments thereof).  As you point out, Arikan and Massey have
>shown that Shannon entropy is not a particularly good estimate of
>guessability. Generalisations of entropy, like Rényi entropy, do seem
>to have meaning. The min-entropy mentioned in RFC 4086 seems
>reasonable, though I don't think the rationale given in the RFC is
>actually correct.

A clarification, to start:

The min-entropy of a probability distribution is

-lg ( P[max] ),

minus the base-two log of the probability of the most likely outcome.
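As a small sketch (not part of the original post), the definition is a one-liner; the example distributions below are made up for illustration:

```python
import math

def min_entropy(probs):
    """Min-entropy of a distribution: -log2 of the largest probability."""
    return -math.log2(max(probs))

# A uniform distribution over 8 outcomes has min-entropy lg(8) = 3 bits.
print(min_entropy([1/8] * 8))          # 3.0
# A skewed distribution is judged entirely by its most likely outcome,
# no matter how the remaining mass is spread.
print(min_entropy([0.5, 0.25, 0.25]))  # 1.0
```

Note that the Shannon entropy of the second distribution is 1.5 bits, so the two measures genuinely disagree on non-uniform distributions.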

The nice thing about min-entropy in the PRNG world is that it leads to
a really clean relationship between how many bits of entropy we need
to seed the PRNG, and how many bits of security (in terms of
resistance to brute force guessing attack) we can get.

Suppose I have a string S with 128 bits of min-entropy.  That means
that the highest-probability guess of S has probability 2^{-128}.  I
somehow hash S to derive a 128-bit key.  The question to ask is, could
you guess S more cheaply than you could guess the key?  When the
min-entropy is 128, it can't be any cheaper to guess S than to guess
the key.  That's true whether we're making one guess, two guesses, ten
guesses, or 2^{127} guesses.

To see why this is so, consider the best case for an attacker: S is a
128-bit uniform random string.  Now, all possible values have the same
probability, and guessing S is exactly the same problem as guessing
the key.  (I'm ignoring any bad things that happen when we hash down
to a key, but those can be important to think about in a different
context.)

Now, why is this the best case for an attacker?  Because it gives the
highest probability of guessing right on the nth guess.  If the
min-entropy is 128, then the highest-probability symbol must have
probability 2^{-128}.  If the next-highest-probability symbol has
probability below 2^{-128}, then the attacker's second guess succeeds
with lower probability than in the uniform case.  And each subsequent
symbol, in decreasing order of probability, is constrained in the same
way.
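A small numeric sketch of this argument (my own illustration, not from the post): fix the min-entropy, and compare the optimal attacker's cumulative success probability against a uniform distribution with the same min-entropy.

```python
def cumulative_guess_prob(probs):
    """Probability of having found the value within n guesses, for
    n = 1..len(probs), assuming the attacker guesses outcomes in
    decreasing order of probability (the optimal strategy)."""
    ordered = sorted(probs, reverse=True)
    out, total = [], 0.0
    for p in ordered:
        total += p
        out.append(total)
    return out

# Both distributions have min-entropy 2 bits (P[max] = 1/4).
uniform = [0.25] * 4
skewed  = [0.25, 0.25, 0.25, 0.125, 0.125]

u = cumulative_guess_prob(uniform)
s = cumulative_guess_prob(skewed)

# At every guess count n, the uniform distribution is at least as easy
# to attack: it is the best case for the guesser.
for n in range(len(u)):
    assert u[n] >= s[n]
```

The same comparison works for any distribution with min-entropy k: after n guesses the attacker's success probability is at most n * 2^{-k}, and the uniform distribution over 2^k values meets that bound with equality.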

This makes it really easy, once you're working in min-entropy terms,
to answer questions like "do I have enough entropy in this string to
initialize a PRNG based on running AES-128 in counter mode?"

>       David.

--John Kelsey, NIST

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]