[EMAIL PROTECTED] writes:
> | Let me rephrase my sequence. Create a sequence of 256 consecutive  
> | bytes, with the first byte having the value of 0, the second byte the  
> | value of 1, ... and the last byte the value of 255. If you measure  
> | the entropy (according to Shannon) of that sequence of 256 bytes, you  
> | have maximum entropy.
>
> Shannon entropy is a property of a *source*, not a particular sequence
> of values.  The entropy is derived from a sum of equivocations about
> successive outputs.
>
> If we read your "create a sequence...", then you've described a source -
> a source with exactly one possible output.  All the probabilities will
> be 1 for the actual value, 0 for all other values; the equivocations are
> all 0.  So the resulting Shannon entropy is precisely 0.
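The arithmetic behind that zero is easy to check. A minimal sketch (Python here purely for illustration, not anything from the original posts): the entropy of a source that emits one fixed value with probability 1 is 0, while a source emitting each of 256 byte values equiprobably gives the 8 bits/byte that "maximum entropy" suggests.

```python
import math

def shannon_entropy(probs):
    # H(X) = -sum p*log2(p), with the usual convention 0*log(0) = 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A source with exactly one possible output: probability 1 for that value.
print(shannon_entropy([1.0]))          # 0.0 -- no uncertainty, no entropy

# For contrast: a source emitting each of 256 byte values equiprobably.
print(shannon_entropy([1 / 256] * 256))  # 8.0 bits
```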

Shannon information certainly falls to zero as the probability with
which a message is expected approaches 1. Kolmogorov-Chaitin
information cannot fall to zero, though it can get exceedingly small.
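The distinction can be made concrete with a small sketch (again Python, added here only as illustration). The per-byte frequency reading of the 0..255 sequence does come out "maximal" by Shannon's formula, yet the whole sequence is produced by a one-line expression, so its Kolmogorov-Chaitin description is very short:

```python
import math
from collections import Counter

seq = bytes(range(256))  # 0, 1, 2, ..., 255

# Per-byte frequency reading: every value occurs exactly once, so the
# empirical distribution is uniform and the formula yields 8 bits/byte.
n = len(seq)
H = -sum((c / n) * math.log2(c / n) for c in Counter(seq).values())
print(H)  # 8.0

# Yet the entire sequence is generated by the short expression
# bytes(range(256)) -- a very small description, so its
# Kolmogorov-Chaitin complexity is tiny (though not zero: even the
# shortest generating program has some length).
```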

In either case, though, I suspect we're in agreement on what entropy
means; Mr. Perez is simply not using the same definitions as the rest
of us.

Perry

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]