In article <[EMAIL PROTECTED]> you write:
>The /definition/ of entropy is
>
>               sum_i  P_i log(1/P_i)             [1]
>
>where the sum runs over all symbols (i) in the probability
>distribution, i.e. over all symbols in the ensemble.
>
>Equation [1] is the gold standard.  It is always correct.  Any
>other expression for entropy is:
>  a) equivalent to [1]
>  b) a corollary, valid under some less-than-general conditions, or
>  c) wrong.

I disagree.  In the context of Physics, Shannon entropy may well be
the end-all and be-all of entropy measures, but in the context of
Cryptography, the situation is a little different.  In Cryptography,
there are multiple notions of entropy, and they're each useful in
different situations.

For this particular application, I suspect that Pliam's workfactor
or Massey's "guessing entropy" could well be more accurate.  See, e.g.,
the following for a short summary and for references where you can learn
more:
  http://www.cs.berkeley.edu/~daw/my-posts/entropy-measures
Shannon entropy is often a reasonable first approximation -- it's
usually good enough for practical purposes.  But it's just an approximation,
and in some cases it can be misleading.
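To make the distinction concrete, here is a minimal Python sketch of the
two measures on a toy three-symbol distribution (the function names and
the toy distribution are mine, chosen for illustration; "guessing
entropy" here is Massey's expected number of guesses for an optimal
guesser, not a standard library API):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: sum_i P_i * log2(1/P_i)."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

def guessing_entropy(probs):
    """Massey's guessing entropy: expected number of guesses by an
    attacker who tries symbols in order of decreasing probability."""
    ordered = sorted(probs, reverse=True)
    return sum(i * p for i, p in enumerate(ordered, start=1))

# Toy distribution: one likely symbol, two less likely ones.
dist = [0.5, 0.25, 0.25]
print(shannon_entropy(dist))   # 1.5 bits
print(guessing_entropy(dist))  # 1.75 expected guesses
```

The two numbers measure different things: Shannon entropy is an average
code length, while guessing entropy is the attacker's expected work, and
the gap between them grows as the distribution gets more skewed.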

Example: Suppose I choose a random 256-bit AES key according to the
following distribution.  With probability 1/2, I use the all-zeros key.
Otherwise, I choose a random 256-bit key.  The Shannon entropy of this
distribution is approximately 129 bits.  However, it's a lousy way to
choose a key, because 50% of the time an adversary can break your
crypto immediately.  In other words, just because your crypto key has
129 bits of Shannon entropy doesn't mean that exhaustive keysearch will
require at least 2^129 trial decryptions.  This is one (contrived)
example where the Shannon entropy can be misleading.
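If you want to sanity-check the ~129-bit figure, the entropy can be
computed in closed form rather than by enumerating 2^256 keys.  A quick
Python sketch (floating-point rounding makes the 2^-257 correction to
the all-zeros key's probability vanish, which is harmless at this
precision):

```python
import math

# Probability of the all-zeros key: picked outright with probability
# 1/2, plus its share of the uniform draw the other half of the time.
p_zero = 0.5 + 0.5 / 2**256
# Probability of each of the other 2^256 - 1 keys.
p_other = 0.5 / 2**256

# H = sum_i P_i log2(1/P_i), split into the two kinds of terms.
h = -p_zero * math.log2(p_zero) - (2**256 - 1) * p_other * math.log2(p_other)
print(h)  # approximately 129.0 bits
```

Yet an attacker who guesses the all-zeros key first succeeds with
probability 1/2 on the very first trial, which is the whole point of
the example.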

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]
