On Thu, Mar 23, 2006 at 01:55:30AM -0600, Travis H. wrote:
> It's annoying that the random number generator code calls the
> unpredictable stuff entropy.  It's unpredictability that we're
> concerned with, and Shannon entropy is just an upper bound on the
> predictability.  Unpredictability cannot be measured based on outputs
> of sources, it must be based on models of the source and attacker
> themselves.  But we all know that.  Maybe we should invent a term? 
> Untropy?

The problem is that we have to decide what our metric is before we
can give it a name. Shannon entropy is about the asymptotic amount
of data needed to encode an average message. Kolmogorov complexity
(which got a mention here today) is about the shortest program that
can produce a given string. These quantities aren't often what
matters for a PRNG or a /dev/random-like device.

One metric might be guessability (the mean number of guesses
required, or moments thereof). As you point out, Arikan and Massey
have shown that Shannon entropy is not a particularly good estimate
of guessability. Generalisations of entropy, like Rényi entropy, do
seem to have meaning here. The min-entropy mentioned in RFC 4086
seems reasonable, though I don't think the rationale given in the
RFC is actually correct.
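To make the gap concrete, here's a toy numeric sketch (my own
illustration, not from the thread): a skewed 256-value source where
Shannon entropy comes out around 5 bits, yet min-entropy is only
1 bit, because a single value carries half the probability mass.
The "skewed" distribution below is an invented example.

```python
import math

def shannon_entropy(p):
    # Asymptotic bits per symbol needed to encode the source.
    return -sum(q * math.log2(q) for q in p if q > 0)

def min_entropy(p):
    # -log2 of the single most likely outcome (the RFC 4086 measure).
    return -math.log2(max(p))

def expected_guesses(p):
    # Mean number of guesses for an attacker who tries outcomes in
    # decreasing order of probability (Massey's guessing entropy).
    return sum(i * q for i, q in enumerate(sorted(p, reverse=True), start=1))

# Skewed source: one value with probability 1/2, the remaining
# 255 values sharing the other 1/2 uniformly.
skewed = [0.5] + [0.5 / 255] * 255
uniform = [1 / 256] * 256

print(shannon_entropy(skewed))    # ~5.0 bits
print(min_entropy(skewed))        # 1.0 bit
print(expected_guesses(skewed))   # 65.0 guesses on average
print(expected_guesses(uniform))  # 128.5 guesses on average
```

The attacker's first guess succeeds half the time against the skewed
source, which min-entropy captures and Shannon entropy hides.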


The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]
