On 1/25/2012 11:47 AM, Evgenii Rudnyi wrote:
On 23.01.2012 01:26 Russell Standish said the following:
On Sun, Jan 22, 2012 at 07:16:23PM +0100, Evgenii Rudnyi wrote:
On 20.01.2012 05:59 Russell Standish said the following:
On Thu, Jan 19, 2012 at 08:03:41PM +0100, Evgenii Rudnyi wrote:
...
and since information is measured by order, a maximum of order
is conveyed by a maximum of disorder. Obviously, this is a
Babylonian muddle. Somebody or something has confounded our
language."
I would say it is many people, rather than just one. I wrote "On
Complexity and Emergence" in response to the amount of
unmitigated tripe I've seen written about these topics.
Russell,
I have read your paper
http://arxiv.org/abs/nlin/0101006
It is well written. Could you please apply the principles from
your paper to a problem on how to determine information in a book
(for example let us take your book Theory of Nothing)?
Also do you believe earnestly that this information is equal to
the thermodynamic entropy of the book?
These are two quite different questions. To someone who reads my
book, the physical form of the book is unimportant - it could just as
easily be a PDF file or a Kindle e-book as a physical paper copy. The
PDF is a little over 30,000 bytes long. Computing the information
content would be a matter of counting the number of 30,000-byte strings
that generate a recognisable variant of ToN when fed into Acrobat
Reader. Then subtract the logarithm (to base 256) of this figure from
30,000 to get the information content in bytes.
This is quite impractical, of course, not to speak of the expense of
paying an army of people to go through 256^30,000 variants to decide
which ones are the true ToNs. An upper bound can be found by
compressing the file - PDFs are already compressed, so we could
estimate the information content as being between 25KB and 30KB
(say).
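Russell's compression bound can be sketched in a few lines. This is only an illustration, assuming an arbitrary stand-in byte string rather than the actual ToN PDF; the compressed length gives an upper bound on information content, as described above.

```python
import zlib

# Hypothetical stand-in for the ToN PDF: any byte string serves for the demo.
data = b"the quick brown fox jumps over the lazy dog " * 200

# The length of a compressed representation is an upper bound on the
# information content in bytes. Real PDFs are already deflate-compressed,
# so for them this bound sits close to the file size itself.
upper_bound = len(zlib.compress(data, level=9))

print(f"raw size:    {len(data)} bytes")
print(f"upper bound: {upper_bound} bytes")
```

For highly redundant input like the repeated sentence above, the bound is far below the raw size; for an already-compressed PDF, it would be nearly equal to it.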
Yet, this is already information. Hence, if we take the equivalence
between informational and thermodynamic entropy literally, then even in
this case a thermodynamic entropy (one that experimental thermodynamics
should be able to measure) must exist. What is it in this case?
To a physicist, it is the physical form that is important - the fact
that it is made of paper, with a bit of glue to hold it together.
The arrangement of ink on the pages is probably quite unimportant - a
book of the same size and shape, but with blank pages would do just
as well. Even if the arrangement of ink is important, then does
typesetting the book in a different font lead to the same book or a
different book?
That is a good question, and in my view it again shows that
thermodynamic entropy and information are different things, since for
the same object we can define the information differently (see also
below).
To compute the thermodynamic information, one could imagine
performing a massive molecular dynamics simulation, and then count
the number of states that correspond to the physical book, take the
logarithm, then subtract that from the logarithm of the total
possible number of states the molecules could take on (if completely
disassociated).
Do not forget that molecular dynamics simulation is based on Newton's
laws (even quantum-mechanical molecular dynamics). Hence you probably
mean the Monte Carlo method here. Yet it is much simpler to employ
experimental thermodynamics (see below).
This is, of course, completely impractical. Computing the complexity
of something is generally NP-hard. But in principle doable.
Now, how does this relate to the thermodynamic entropy of the book?
It turns out that the information computed by the in-principle
process above is equal to the difference between the maximum entropy
of the molecules making up the book (if completely disassociated) and
the thermodynamic entropy, which could be measured in a calorimeter.
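Russell's identity, information = S_max - S_measured, can be put in Shannon bits by dividing the entropy difference by k_B ln 2. A minimal sketch, where the two entropy values are made-up illustrative numbers, not measurements of any actual book:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_to_bits(s_joules_per_kelvin: float) -> float:
    """Convert a thermodynamic entropy (J/K) to Shannon bits: H = S / (k_B ln 2)."""
    return s_joules_per_kelvin / (K_B * math.log(2))

# Hypothetical numbers for illustration only:
s_max = 5000.0       # maximum entropy of the fully dissociated molecules, J/K
s_measured = 4800.0  # calorimetric entropy of the intact book, J/K

info_bits = entropy_to_bits(s_max - s_measured)
print(f"information: {info_bits:.3e} bits")
```

Even a modest entropy difference in J/K corresponds to an astronomical number of bits, since k_B ln 2 is of order 1e-23 J/K per bit.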
If yes, can one determine the information in the book just by means
of experimental thermodynamics?
One can certainly determine the information of the physical book
(defined however you might like) - but that is not the same as the
information of the abstract book.
Let me suggest a very simple case to understand better what you are saying. Let us
consider a string "10" for simplicity. Let us consider the next cases. I will cite first
the thermodynamic properties of Ag and Al from CODATA tables (we will need them)
S°(298.15 K), in J K-1 mol-1:
Ag (cr)  42.55 ± 0.20
Al (cr)  28.30 ± 0.10
In J K-1 cm-3 this gives:
Ag (cr)  42.55/107.87 * 10.49 = 4.14
Al (cr)  28.30/26.98 * 2.70 = 2.83
1) An abstract string "10", like the abstract book above.
2) Let us now make an aluminum plate (a page) with "10" hammered on it
(as on a coin), with a total volume of 10 cm^3. The thermodynamic
entropy is then 28.3 J/K.
3) Let us now make a silver plate (a page) with "10" hammered on it (as
on a coin), with a total volume of 10 cm^3. The thermodynamic entropy
is then 41.4 J/K.
4) We can easily make another aluminum plate (scaling all dimensions
from 2) up to a total volume of 100 cm^3. The thermodynamic entropy is
then 283 J/K.
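The arithmetic behind these four cases can be checked directly from the CODATA molar entropies quoted above (molar masses and densities are standard handbook values):

```python
# Reproduce the entropy arithmetic for the plates from the quoted values.
# S_molar: S°(298.15 K) in J K^-1 mol^-1; M: molar mass, g/mol; rho: density, g/cm^3.
metals = {
    "Ag": {"S_molar": 42.55, "M": 107.87, "rho": 10.49},
    "Al": {"S_molar": 28.30, "M": 26.98,  "rho": 2.70},
}

for name, p in metals.items():
    s_per_cm3 = p["S_molar"] / p["M"] * p["rho"]  # J K^-1 cm^-3
    print(f"{name}: {s_per_cm3:.2f} J/(K cm^3); "
          f"10 cm^3 plate -> {10 * s_per_cm3:.1f} J/K")
```

This reproduces 28.3 J/K for the 10 cm^3 aluminum plate, 41.4 J/K for the silver one, and (by scaling tenfold) 283 J/K for the 100 cm^3 plate.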
Now we have four different ways to represent the string "10", and the
thermodynamic entropy is different in each. If we take the statement
literally, then the information must be different in all four cases and
uniquely defined, since the thermodynamic entropy is already there. Yet
in my view this makes little sense.
Could you please comment on these four cases?
The thermodynamic entropy is a measure of the information required to locate the possible
states of the plates in the phase space of atomic configurations constituting them. Note
that the thermodynamic entropy you quote is really the *change* in entropy per degree at
the given temperature. It's a measure of how much more phase space becomes available to
the atomic states when the internal energy is increased. More available phase space means
more uncertainty of the exact actual state and hence more information entropy. This
information is enormous compared to the "10" stamped on the plate, the
shape of the plate, or any other aspect we would normally use to convey
information. Only if we cooled the plate to near absolute zero and then
tried to encode information in its microscopic vibrational states would
the thermodynamic and the encoded information entropies become similar.
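Brent's point about scale can be made concrete. A rough sketch, taking the 28.3 J/K of the 10 cm^3 aluminum plate from case 2 above and comparing it with the at most 2 bits carried by the message "10":

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

s_plate = 28.3  # thermodynamic entropy of the 10 cm^3 aluminum plate, J/K
plate_bits = s_plate / (K_B * math.log(2))  # convert J/K to Shannon bits

message_bits = 2  # the string "10": one bit per binary digit, at most

print(f"plate:   {plate_bits:.2e} bits")
print(f"message: {message_bits} bits")
# The plate's microstate entropy is of order 1e24 times larger than the
# stamped message, which is why the "10" is thermodynamically negligible.
```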
Evgenii
P.S. Why is it impossible to state that a random string is
generated by some random generator?
Not sure what you mean, unless you're really asking "Why is it
impossible to state that a random string is generated by some
pseudorandom generator?"
In which case the answer is that a pseudorandom generator is an
algorithm, so by definition doesn't produce random numbers. There is
a lot of knowledge about how to decide if a particular PRNG is
sufficiently random for a particular purpose. No PRNG is
sufficiently random for all purposes - in particular they are very
poor for security purposes, as they're inherently predictable.
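The predictability Russell mentions is easy to demonstrate: a PRNG is an algorithm, so the same seed reproduces the same "random" string every time. A small sketch using Python's standard Mersenne Twister generator:

```python
import random

def bits_from_seed(seed: int, n: int) -> str:
    """Generate an n-character binary string from a seeded PRNG."""
    rng = random.Random(seed)  # deterministic: same seed, same stream
    return "".join(str(rng.randint(0, 1)) for _ in range(n))

a = bits_from_seed(42, 16)
b = bits_from_seed(42, 16)
print(a, a == b)  # the two runs agree exactly: the output is predictable
```

This also illustrates Evgenii's follow-up point below: for any given finite string, one can in principle find some generator and seed that emit it, which is why no finite string can be proved random.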
I understand. Yet if we take a finite random string, then presumably
there should be some random generator with some seed that produces it.
What would be wrong with this?
Yes, that points out that any finite string cannot be known to be random.
Brent
Evgenii
Cheers
--
You received this message because you are subscribed to the Google Groups
"Everything List" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to
[email protected].
For more options, visit this group at
http://groups.google.com/group/everything-list?hl=en.