On Sun, Jan 22, 2012 at 3:04 AM, Evgenii Rudnyi <use...@rudnyi.ru> wrote:

> On 21.01.2012 22:03 Evgenii Rudnyi said the following:
>> On 21.01.2012 21:01 meekerdb said the following:
>>> On 1/21/2012 11:23 AM, Evgenii Rudnyi wrote:
>>>> On 21.01.2012 20:00 meekerdb said the following:
>>>>> On 1/21/2012 4:25 AM, Evgenii Rudnyi wrote:
>>>> ...
>>>>>> 2) If physicists say that information is the entropy, they
>>>>>> must take it literally and then apply experimental
>>>>>> thermodynamics to measure information. This however seems
>>>>>> not to happen.
>>>>> It does happen. The number of states, i.e. the information,
>>>>> available from a black hole is calculated from its
>>>>> thermodynamic properties as calculated by Hawking. At a more
>>>>> conventional level, counting the states available to molecules
>>>>> in a gas can be used to determine the specific heat of the gas
>>>>> and vice versa. The reason the thermodynamic measures and the
>>>>> information measures are treated separately in engineering
>>>>> problems is that the information that is important to
>>>>> engineering is infinitesimal compared to the information stored
>>>>> in the microscopic states. So the latter is considered only in
>>>>> terms of a few macroscopic averages, like temperature and
>>>>> pressure.
>>>>> Brent
>>>> Doesn't this mean that by information engineers mean something
>>>> different than physicists do?
>>> I don't think so. A lot of the work on information theory was done
>>> by communication engineers who were concerned with the effect of
>>> thermal noise on bandwidth. Of course engineers specialize more
>>> narrowly than physicists, so within different fields of engineering
>>> there are different terminologies and different measurement
>>> methods for things that are unified in basic physics, e.g. there
>>> are engineers who specialize in magnetism and who seldom need to
>>> reflect that it is part of EM, there are others who specialize in
>>> RF and don't worry about "static" fields.
>> Do you mean that engineers use experimental thermodynamics to
>> determine information?
>>
>> Evgenii
> To be concrete, here is for example a paper from control theory:
> J.C. Willems and H.L. Trentelman
> H_inf control in a behavioral context: The full information case
> IEEE Transactions on Automatic Control
> Volume 44, pages 521-536, 1999
> http://homes.esat.kuleuven.be/~jwillems/Articles/JournalArticles/1999.4.pdf
> The term information is there, but entropy is not. Could you please
> explain why? Or, alternatively, could you point to papers where
> engineers use the concept of the equivalence between entropy and
> information?

Sure, I can give a few examples, as this somewhat intersects with my line
of work.

The NIST 800-90 recommendation (
http://csrc.nist.gov/publications/nistpubs/800-90A/SP800-90A.pdf ) for
random number generators is a document for engineers implementing secure
pseudo-random number generators.  One example of where it matters is when
considering entropy sources for seeding a random number generator.  If
you use something completely random, like a fair coin toss, each toss
provides 1 bit of entropy.  The formula is -log2(predictability).  With a
coin flip, you have at best a .5 chance of correctly guessing it, and
-log2(.5) = 1.  If you used a die roll, then each die roll would provide
-log2(1/6) = 2.58 bits of entropy.  The ability to measure unpredictability
is necessary to ensure, for example, that predicting the random inputs that
went into generating a cryptographic key is at least as difficult as
brute-forcing the key itself.
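
To make the formula concrete, here is a minimal Python sketch of the
-log2(predictability) calculation above (the function name is my own
illustration, not anything defined in NIST 800-90):

    import math

    def entropy_bits(predictability):
        # Entropy, in bits, of one sample from a source whose outcome
        # can be guessed correctly with the given probability.
        return -math.log2(predictability)

    print(entropy_bits(0.5))    # fair coin toss: 1.0 bit
    print(entropy_bits(1 / 6))  # fair die roll: ~2.58 bits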

In addition to security, entropy is also an important concept in the field
of data compression.  The amount of entropy in a given bit string
represents the theoretical minimum number of bits it takes to represent the
information.  If 100 bits contain 100 bits of entropy, then there is no
compression algorithm that can represent those 100 bits with fewer than 100
bits.  However, if a 100-bit string contains only 50 bits of entropy, you
could compress it to 50 bits.  For example, let's say you had 100 coin
flips from an unfair coin.  This unfair coin comes up heads 90% of the
time.  A heads outcome carries -log2(.9) = 0.152 bits of information, and a
tails outcome carries -log2(.1) = 3.32 bits, so on average each flip carries
.9(0.152) + .1(3.32) = 0.47 bits of entropy.  Thus, a sequence of 100 flips
of this biased coin could be represented with about 47 bits.  There are only
about 47 bits of information / entropy contained in that 100-bit-long
sequence.
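
To check that arithmetic, here is a small Python sketch (my own
illustration, not taken from any compression library) that computes the
average entropy per flip and the resulting lower bound on compressed size:

    import math

    def biased_coin_entropy(p_heads):
        # Average Shannon entropy per flip, in bits:
        # H = -(p*log2(p) + (1 - p)*log2(1 - p))
        p_tails = 1.0 - p_heads
        return -(p_heads * math.log2(p_heads)
                 + p_tails * math.log2(p_tails))

    h = biased_coin_entropy(0.9)
    print(h)        # ~0.47 bits of entropy per flip
    print(100 * h)  # ~47 bits: the shortest possible average
                    # encoding of 100 flips

No algorithm can, on average, represent such a sequence in fewer bits,
though practical coders such as arithmetic coding can get close to this
bound.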

