On Wed, Feb 08, 2012 at 08:32:16PM +0100, Evgenii Rudnyi wrote:

...

> >It sounds to me like you are arguing for a shift back to how
> >thermodynamics was before Boltzmann's theoretical understanding.
> >A "back-to-roots" movement, as it were.
> 
> I would like rather to understand the meaning of your words.
> 
> By the way, in Boltzmann's time the concept of information was not
> there yet. So why before Boltzmann?
> 

Yes, in Boltzmann's time, the concept of information was not
understood. But probability was (at least to some extent). Now, we
know that information is essentially the logarithm of a probability. I
don't know whether information or probability is logically prior - it's
probably a matter of taste.
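
To put a formula on it (a minimal sketch of my own, in Python, using
the usual convention that an event of probability p carries -log2(p)
bits):

    # information content, in bits, of an event with probability p
    from math import log2

    def info_bits(p):
        return -log2(p)

    print(info_bits(1/2))   # a fair coin toss: 1 bit
    print(info_bits(1/6))   # one face of a fair die: about 2.58 bits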

> 
> What I observe personally is that there is information in
> informatics and information in physics (if we say that the
> thermodynamic entropy is information). If you would agree that
> these two kinds of information are different, it would be fine with
> me; I am flexible with definitions.
> 
> Yet, if I understand you correctly, you mean that the information in
> informatics and the thermodynamic entropy are the same. This puzzles
> me, as I believe that the same physical quantity should have the same
> numerical value. Hence my wish to understand what you mean.
> Unfortunately you do not want to disclose it; you do not want to
> apply your theory to the examples that I present.
> 
> Evgenii

Given the above paragraph, I would say we're closer than you've
previously intimated.

Of course there is information in informatics, and there is
information in physics, just as there's information in biology and so
on. These are all the same concept (logarithm of a
probability). Numerically they differ, because the context differs in
each situation.
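
A toy illustration of that context dependence (the numbers are purely
mine, for illustration): the same event gets a different information
value once the probability assignment - the context - changes.

    from math import log2

    # the event "the next byte read is 0x00", under two contexts
    p_uniform = 1 / 256   # context 1: all 256 byte values equally likely
    p_known = 0.9         # context 2: we already know the file is mostly zeros

    print(-log2(p_uniform))   # 8 bits
    print(-log2(p_known))     # roughly 0.15 bits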

Entropy is related in a very simple way to information: S = S_max -
I. So provided an S_max exists (which it will for any finite system),
entropy exists too. In the example of a hard drive, the informatics
S_max is the capacity of the drive, e.g. 100 GB for a 100 GB drive. If
you store 10 GB of data on it, the entropy of the drive is 90 GB.
That's it.
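
In code it is almost too trivial to write down, but to be concrete (a
sketch, treating the GB as the common unit of capacity, information
and entropy):

    # entropy as unused capacity: S = S_max - I
    s_max_gb = 100.0        # drive capacity = maximum information it can hold
    i_gb = 10.0             # information actually stored
    s_gb = s_max_gb - i_gb
    print(s_gb)             # 90.0 GB of entropy left on the drive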

Just as information is context dependent, so must entropy be.

Thermodynamics is just one use (one context) of entropy and
information. Usually, the context is one of homogeneous bulk
materials. If you decide to account for surface effects, you change
the context, and entropy should change accordingly.

PS

Your comment that Jaynes noted the similarity between Gibbs entropy
and Shannon entropy, and that this motivated him to develop the
information-theoretic foundation of statistical mechanics, may well be
historically accurate. But this is not how the subject is presented in
modern treatments, such as that of Denbigh and Denbigh (their book
being fresh off the press the last time I really looked at this
subject).

One could also note that historically, Shannon wrestled with calling
his information quantity "entropy". At that time, it was pure
analogical thinking - the precise connection between his concept and
the thermodynamic one was not elucidated until at least two decades later.

-- 

----------------------------------------------------------------------------
Prof Russell Standish                  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics      hpco...@hpcoders.com.au
University of New South Wales          http://www.hpcoders.com.au
----------------------------------------------------------------------------
