Dear all,

I wondered if anyone could help with a paradox at the heart of my
understanding of entropy, information and pattern recognition.

I understand an informative signal as one that contains patterns, as
opposed to randomly distributed numbers, i.e. noise. I therefore
equate information with structure in the signal's distribution. However,
Shannon equates information with entropy, which is maximum when every
symbol in the signal is equally likely, i.e. a distribution
with no 'structure'. These two views seem contradictory.
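
To make the second view concrete, here is a small Python sketch (my own
illustration, with made-up example distributions) that computes Shannon
entropy and shows it is highest for the uniform, "structureless" case:

    import math

    def shannon_entropy(probs):
        # Shannon entropy in bits: H = -sum_i p_i * log2(p_i)
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Uniform distribution over 4 symbols: no structure, maximum entropy (2 bits)
    print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0

    # Peaked ("structured") distribution: much lower entropy (about 0.62 bits)
    print(shannon_entropy([0.9, 0.05, 0.03, 0.02]))

So by Shannon's measure the structured signal carries less information
per symbol, which is exactly the opposite of my intuition above.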

What am I missing in my understanding?

Many thanks in advance,
Riz


Rizwan Choudrey
Robotics Group
Department of Engineering Science
University of Oxford
07956 455380
