On Wed, Mar 22, 2006 at 01:58:26PM -0600, Matt Crawford wrote:

> If you have a generator of 8-bit random numbers and every sample is
> independent and uniformly distributed, and you ran this for a
> gazillion iterations and wrote to the list one day saying the special
> sequence { 0, 1, 2, ..., 255 } had appeared in the output, that's a
> different story. But still, we would talk about the entropy of your
> generator, not of one particular sequence of outputs.
We may want to cut the OP some slack: when a sequence is computed from
the output of a generator, it is meaningful to ask how much entropy the
sequence retains. If the sequence is { 0, 1, ..., 255 } regardless of
the generator output, it has zero entropy. Otherwise (and in any case),
the entropy can in theory be computed from the probability distribution
over the possible output sequences, which is in principle obtainable
from the distribution of the generator outputs and the deterministic
functions that produce the sequence. Actually calculating that entropy
for real-world functions and generators may be intractable.

-- 
Victor Duchovni
IT Security, Morgan Stanley

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]
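[Editor's note: the push-forward calculation described in the post above can be sketched in a few lines. This is a minimal illustration, not anything from the original thread; `sequence_entropy` is a hypothetical helper that pushes the generator's distribution through a deterministic function f and takes the Shannon entropy of the result.]

```python
from collections import Counter
from math import log2

def sequence_entropy(outputs, probs, f):
    """Shannon entropy (in bits) of f(X), where X takes each value in
    `outputs` with the corresponding probability in `probs` and f is
    the deterministic function that builds the sequence."""
    dist = Counter()
    for x, p in zip(outputs, probs):
        dist[f(x)] += p          # push the probability mass through f
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# An ideal 8-bit generator: 256 equiprobable outputs.
xs = range(256)
ps = [1 / 256] * 256

# A constant function ignores the generator entirely: 0 bits retained.
h0 = sequence_entropy(xs, ps, lambda x: (0, 1, 2))

# The identity function retains all 8 bits.
h8 = sequence_entropy(xs, ps, lambda x: x)

# A lossy function (x mod 4) retains only 2 bits.
h2 = sequence_entropy(xs, ps, lambda x: x % 4)
```

The toy distribution makes the sums exact; for a real-world generator and function, enumerating the output distribution this way is exactly the step the post calls potentially intractable.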