Right. Entropy is a measure of the information content of the sequence of 
values. Think of it as a document you wish to compress: if the rolls were 
truly random, you won't be able to compress it much. But if 1 were 
significantly more likely a result than 6, there would be opportunity to 
compress the "document" (and the information content would be lower). Using 
logarithms to base 2, you get the entropy (information content) in bits:

H(p) = - Sum p(i)*log2[p(i)]
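As a minimal sketch (not part of the original thread, and in Python rather 
than Racket), here is that formula applied to a sequence of rolls, estimating 
each p(i) from the observed frequencies:

```python
from collections import Counter
from math import log2

def entropy(rolls):
    """Shannon entropy in bits per roll: H = -sum p(i) * log2(p(i)),
    where p(i) is estimated by the observed frequency of face i."""
    counts = Counter(rolls)
    n = len(rolls)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A fair die: all six faces equally likely, so H = log2(6) ~ 2.585 bits.
fair = [1, 2, 3, 4, 5, 6] * 1000
print(round(entropy(fair), 3))  # 2.585

# A loaded "die" where 1 comes up far more often than 6: lower entropy,
# i.e. the sequence is more compressible.
biased = [1] * 5000 + [2, 3, 4, 5] * 250 + [6] * 100
print(entropy(biased) < entropy(fair))  # True
```

With 1000 or 10000 real rolls you could compare the measured entropy against 
the fair-die maximum of log2(6) bits to see how "random" the dice really are.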


On Oct 20, 2012, at 4:49 PM, Robby Findler <ro...@eecs.northwestern.edu> wrote:

> You could phrase it as "so I decided to write down every result my dad
> got and then see how random they really were."
> 
> On Sat, Oct 20, 2012 at 6:43 PM, Gregory Woodhouse <gregwoodho...@me.com> 
> wrote:
>> You could have some fun with this. For example, what is the entropy of a 
>> sequence of 1000 (or 10000) rolls of the dice?
>> 
>> Sent from my iPhone
>> 
>> On Oct 20, 2012, at 4:15 PM, Danny Yoo <d...@hashcollision.org> wrote:
>> 
>>> http://hashcollision.org/vegas-blues/
>>> 
>>> The tone ended up being a bit bluer than I expected, hence the title.
>>> 
>>> Suggestions or comments would be appreciated.  Thanks!
>>> ____________________
>>> Racket Users list:
>>> http://lists.racket-lang.org/users
