On 27.01.2012 23:03 meekerdb said the following:
On 1/27/2012 12:43 PM, Evgenii Rudnyi wrote:
On 27.01.2012 21:22 meekerdb said the following:
On 1/27/2012 11:21 AM, Evgenii Rudnyi wrote:
On 25.01.2012 21:25 meekerdb said the following:
On 1/25/2012 11:47 AM, Evgenii Rudnyi wrote:
Let me suggest a very simple case to understand better what
you are saying. Let us consider a string "10" for
simplicity. Let us consider the next cases. I will cite
first the thermodynamic properties of Ag and Al from CODATA
tables (we will need them)
S ° (298.15 K) J K-1 mol-1
Ag cr 42.55 ± 0.20 Al cr 28.30 ± 0.10
In J K-1 cm-3 it will be
Ag cr 42.55/107.87*10.49 = 4.14 Al cr 28.30/26.98*2.7 = 2.83
1) An abstract string "10", like the abstract book above.
2) Let us make now an aluminum plate (a page) with "10"
hammered on it (as on a coin) of the total volume 10 cm^3.
The thermodynamic entropy is then 28.3 J/K.
3) Let us make now a silver plate (a page) with "10"
hammered on it (as on a coin) of the total volume 10 cm^3.
The thermodynamic entropy is then 41.4 J/K.
4) We can easily make another aluminum plate (scaling all
dimensions from 2) to the total volume of 100 cm^3. Then
the thermodynamic entropy is 283 J/K.
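The arithmetic in the four cases can be checked with a short script; this is just a sketch using the CODATA values quoted above (molar entropies, molar masses, and densities of Ag and Al):

```python
# Volumetric entropy S_v = S_molar / M * rho, then scale by the plate volume.
# Quoted values: S(Ag) = 42.55 J/(K*mol), S(Al) = 28.30 J/(K*mol);
# M(Ag) = 107.87 g/mol, rho(Ag) = 10.49 g/cm^3;
# M(Al) = 26.98 g/mol, rho(Al) = 2.70 g/cm^3.

def entropy_of_plate(s_molar, molar_mass, density, volume_cm3):
    """Thermodynamic entropy (J/K) of a plate of the given volume."""
    s_per_cm3 = s_molar / molar_mass * density  # J/(K*cm^3)
    return s_per_cm3 * volume_cm3

al_10 = entropy_of_plate(28.30, 26.98, 2.70, 10)     # case 2: ~28.3 J/K
ag_10 = entropy_of_plate(42.55, 107.87, 10.49, 10)   # case 3: ~41.4 J/K
al_100 = entropy_of_plate(28.30, 26.98, 2.70, 100)   # case 4: ~283 J/K

print(round(al_10, 1), round(ag_10, 1), round(al_100))
```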
Now we have four different combinations representing the
string "10", and the thermodynamic entropy is different in
each. If we take the statement literally, then the information
must be different in all four cases and uniquely defined,
since the thermodynamic entropy is already there. Yet in my
view this makes little sense.
Could you please comment on these four cases?
The thermodynamic entropy is a measure of the information
required to locate the possible states of the plates in the
phase space of atomic configurations constituting them. Note
that the thermodynamic entropy you quote is really the
*change* in entropy per degree at the given temperature. It's
a measure of how much more phase space becomes available to
the atomic states when the internal energy is increased. More
available phase space means more uncertainty of the exact
actual state and hence more information entropy. This
information is enormous compared to the "10" stamped on the
plate, the shape of the plate or any other aspects that we
would normally use to convey information. It would only be in
case we cooled the plate to near absolute zero and then tried
to encode information in its microscopic vibrational states
that the thermodynamic and the encoded information entropy
would become similar.
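The scale difference here can be made concrete with the standard conversion N = S / (k_B ln 2), which expresses a thermodynamic entropy as an equivalent number of bits; a minimal sketch, applied to the 10 cm^3 aluminum plate of case 2:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_to_bits(s_joule_per_kelvin):
    """Shannon-equivalent information content of a thermodynamic entropy."""
    return s_joule_per_kelvin / (K_B * math.log(2))

# Case 2 above: S = 28.3 J/K for the aluminum plate.
# The result is on the order of 10^24 bits, versus the 2 bits of "10".
print(f"{entropy_to_bits(28.3):.2e} bits")
```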
I would say that from your answer it follows that engineering
information has nothing to do with the thermodynamic entropy.
Don't you agree?
Obviously not, since I wrote above that the thermodynamic entropy
is a measure of how much information it would take to locate the
exact state within the phase space allowed by the thermodynamic
macrostate.
Is this what engineers use when they develop communication devices?
It would certainly be interesting to consider what happens when
we decrease the temperature (in the limit, to zero kelvin).
According to the Third Law, the entropy will then be zero. What
do you think: can we save less information on a copper plate at
low temperatures as compared with higher temperatures? Or
Are you being deliberately obtuse? Information encoded in the
shape of the plate is not accounted for in the thermodynamic
tables - they are just based on ideal bulk material (ignoring
surface effects).
I am just trying to understand the meaning of the term information
that you use. I would say that there is the thermodynamic entropy
and then the Shannon information entropy. Shannon developed his
theory to help engineers deal with communication (I believe
that you also made a similar statement recently). Yet, in my view,
when we talk about communication devices and mechatronics, the
information that engineers are interested in has nothing to do with
the thermodynamic entropy. Do you agree or disagree with that? If
you disagree, could you please give an example from engineering
where engineers do employ the thermodynamic entropy as an estimate
of information?
I already said I disagreed. You are confusing two different things.
Because structural engineers don't employ the theory of interatomic
forces, it doesn't follow that interatomic forces have nothing to do
with structural properties.
You disagree that engineers do not use thermodynamic entropy, but you
have not yet shown how information in engineering is related to the
thermodynamic entropy. From the Millipede example:
"The earliest generation millipede devices used probes 10 nanometers in
diameter and 70 nanometers in length, producing pits about 40 nm in
diameter on fields 92 µm x 92 µm. Arranged in a 32 x 32 grid, the
resulting 3 mm x 3 mm chip stores 500 megabits of data or 62.5 MB,
resulting in an areal density, the number of bits per square inch, on
the order of 200 Gbit/in²."
It would be much easier to understand you if you said to what
thermodynamic entropy the value of 62.5 MB in Millipede corresponds.
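For what it is worth, that conversion is straightforward in the other direction as well: S = N k_B ln 2. A minimal sketch, taking the quoted 62.5 MB as 5e8 bits:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def bits_to_entropy(n_bits):
    """Thermodynamic-entropy equivalent (J/K) of n_bits of information."""
    return n_bits * K_B * math.log(2)

millipede_bits = 62.5e6 * 8  # 62.5 MB = 5e8 bits
# On the order of 1e-15 J/K -- some 16 orders of magnitude below the
# ~28 J/K thermodynamic entropy of the 10 cm^3 plate discussed above.
print(f"{bits_to_entropy(millipede_bits):.2e} J/K")
```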
The only example of Thermodynamic Entropy == Information from you so
far was the work on a black hole. However, as far as I know, there is
no theory yet to describe a black hole: on one side you need
gravitation, on the other side quantum effects, and a theory that
unites them does not seem to exist.
My example would be Millipede. I am pretty sure that when IBM
engineers developed it, they did not employ the thermodynamic
entropy to estimate its information capacity. Also, an increase in
temperature would destroy the information saved there.
Well, I might be deliberately obtuse indeed, yet with the only goal
of reaching a clear definition of what information is. Right now I
would say that there is information in engineering and information
in physics, and they are different. The first I roughly understand
and the second not.