> Do you mean this even when "entropy" is used in the context of information
> theory?
> Gustavo

No, Claude Shannon's usage
http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html
of separating noise from information concerns statistical entropy, a measure
of dispersion, which has a different meaning from thermodynamic entropy, the
subject of the environmental question.
From
http://www.csu.edu.au/ci/vol03/finalst3/node3.html#SECTION00030000000000000000

"The entropy is a property of a distribution over a discrete set of symbols.
It is strongly sensitive to the number or variety of the symbols and less so
to their relative probabilities of occurrence. The entropy of the sequence
has a number of equivalent interpretations. It is a measure of the complexity
of the random process that generates the sequence. It is the length of
shortest binary description of the states of the random variable that
generates the sequence, so it is the size of the most compressed description
of the sequence. It is the number of binary questions that need to be asked
(20 questions style) to determine the sequence. It also measures the average
surprise, or information gain, occasioned by the receipt of a symbol. In
other words, the entropy measures the complexity or variety of the random
variable that underlies a process." 
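
As a rough illustration of those interpretations (not from the quoted page),
the entropy of a symbol sequence can be estimated from its empirical
distribution with Shannon's formula H = -sum(p_i * log2 p_i); here is a
minimal sketch in Python:

    # Minimal sketch: Shannon entropy of a symbol sequence, in bits per symbol.
    from collections import Counter
    from math import log2

    def shannon_entropy(sequence):
        """Average surprise, in bits, occasioned by each symbol."""
        counts = Counter(sequence)
        total = len(sequence)
        return -sum((n / total) * log2(n / total) for n in counts.values())

    # Greater variety of symbols raises the entropy; a skewed or constant
    # distribution lowers it:
    print(shannon_entropy("aaaa"))      # 0.0 bits -- no surprise at all
    print(shannon_entropy("abab"))      # 1.0 bit  -- one binary question per symbol
    print(shannon_entropy("abcdabcd"))  # 2.0 bits -- two binary questions per symbol

The values also match the "most compressed description" reading: a sequence
with 1 bit of entropy per symbol needs, on average, one binary digit per
symbol to describe.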

Fred Foldvary

