For information about information in general, just Google: 
Shannon information theory tutorial

Which brings up various lecture notes such as:
http://hornacek.coa.edu/dave/Tutorial/

I think the general intuition mentioned before in terms of images gets
it right: if you wanted to transmit a string of yes/no answers to
questions, you could shorten the transmission by using ever more
clever compression schemes until it just can't be compressed any
further.  The fully compressed string would then look completely
random, describable only by an exact copy of itself.  Max info => max
entropy.  One of those wonderful, initially non-intuitive "paradoxes"
that only becomes "obvious" after the Shannon types blaze the path.
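That intuition is easy to poke at directly.  Here's a small sketch (mine,
not from the tutorial above) using Python's zlib: a highly patterned
string of yes/no answers compresses dramatically, random bytes barely
compress at all, and compressing already-compressed output gains almost
nothing -- the compressor's output is close to incompressible, i.e. it
looks random.

```python
import os
import zlib

# Very redundant "yes/no answer" stream: low entropy per byte.
patterned = b"yes no yes no " * 1000

# Same length of random bytes: close to maximum entropy per byte.
random_bytes = os.urandom(len(patterned))

small = zlib.compress(patterned)      # shrinks enormously
big = zlib.compress(random_bytes)     # stays roughly the same size

print(len(patterned), len(small), len(big))

# Running the compressor on its own output buys essentially nothing:
twice = zlib.compress(small)
print(len(small), len(twice))
```

The second compression pass is the punchline: once the redundancy is
squeezed out, the result is indistinguishable (to the compressor) from
random noise, which is exactly the max-info => max-entropy point.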


Gary
