> Let me rephrase my example. Create a sequence of 256 consecutive
> bytes, with the first byte having the value 0, the second byte the
> value 1, ..., and the last byte the value 255. If you measure the
> entropy (according to Shannon) of that sequence of 256 bytes, you get
> maximum entropy.
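Presumably "measure the entropy" there means the usual byte-frequency
calculation. A minimal Python sketch of that calculation (the function
name is my own):

    import math
    from collections import Counter

    def byte_histogram_entropy(data):
        # Shannon entropy, in bits per byte, of the empirical byte
        # frequencies of `data` -- the number that "entropy
        # measurement" tools typically report.
        counts = Counter(data)
        n = len(data)
        return -sum((c / n) * math.log2(c / n)
                    for c in counts.values())

    seq = bytes(range(256))             # 0, 1, 2, ..., 255
    print(byte_histogram_entropy(seq))  # 8.0, the maximum for bytes

That does print 8.0, but is it a statement about the sequence, or
about its histogram? That is exactly where the trouble starts.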

I so often get irritated when non-physicists discuss entropy; the word
is almost always misused. I looked at Shannon's definition, and it is
fine, even from a physics point of view. But if you apply it
thoughtfully to a single fixed sequence, you correctly get the answer
zero.

If your sequence is defined to be { 0, 1, 2, ..., 255 }, the
probability of getting that sequence is 1, and the probability of
getting any other sequence is 0. Plug those values in.
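Spelled out, with the standard convention that 0 * log 0 = 0, Shannon's
formula gives

    H = -\sum_x p(x) \log_2 p(x)
      = -(1 \cdot \log_2 1) - \sum_{x \neq seq} (0 \cdot \log_2 0)
      = 0

Zero bits: a fixed, known sequence carries no uncertainty at all.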

If you had a generator of 8-bit random numbers whose samples were all
independent and uniformly distributed, and you ran it for a gazillion
iterations and wrote to the list one day saying that the special
sequence { 0, 1, 2, ..., 255 } had appeared in its output, that would
be a different story. But even then, we would be talking about the
entropy of your generator, not of one particular sequence of outputs.
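To make the distinction concrete, here is a sketch under that model
(uniform, independent 8-bit samples; the model is the assumption,
Python is just the calculator):

    import math

    # Entropy of the source: one draw from a uniform 8-bit generator.
    H = -sum((1 / 256) * math.log2(1 / 256) for _ in range(256))
    print(H)        # 8.0 bits per sample

    # Surprisal of one specific 256-sample output -- any specific
    # output, { 0, 1, ..., 255 } included:
    print(256 * H)  # 2048.0 bits, the same for every 256-byte sequence

The 8 bits per sample belong to the generator's distribution, and the
2048 bits of surprisal belong equally to the striking sequence and to
every unremarkable one, which is why singling out one output makes no
sense.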

