Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-11 Thread Evgenii Rudnyi

On 11.02.2012 04:27 Russell Standish said the following:

On Fri, Feb 10, 2012 at 09:39:50PM +0100, Evgenii Rudnyi wrote:


Let me ask you the same question that I have recently asked Brent.
Could you please tell me, the thermodynamic entropy of what is
discussed in Jason's example below?

Evgenii



If you're asking what is the conversion constant between bits and
J/K, the answer is k_B log(2) / log(10).

I'm not sure what else to tell you...

Cheers



I am asking what thermodynamic system is to be considered in this 
case. I understand that you can convert it this way; the question is, 
the thermodynamic entropy of what do you obtain this way?


Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-10 Thread Evgenii Rudnyi

On 09.02.2012 00:44 1Z said the following:





On Feb 7, 7:04 pm, Evgenii Rudnyi use...@rudnyi.ru wrote:


Let us take a closed vessel with oxygen and hydrogen at room
temperature. Then we open a platinum catalyst in the vessel and
the reaction starts. Will then the information in the vessel be
conserved?

Evgenii


What's the difference between "in-principle" and "for-all-practical
purposes"?



What is the relationship between your question and mine?

Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-10 Thread Evgenii Rudnyi

On 08.02.2012 22:44 Russell Standish said the following:

On Wed, Feb 08, 2012 at 08:32:16PM +0100, Evgenii Rudnyi wrote:

...



What I observe personally is that there is information in
informatics and information in physics (if we say that the
thermodynamic entropy is the information). If you would agree,
that these two informations are different, it would be fine with
me, I am flexible with definitions.

Yet, if I understand you correctly you mean that the information
in informatics and the thermodynamic entropy are the same. This
puzzles me as I believe that the same physical values should have
the same numerical values. Hence my wish to understand what you
mean. Unfortunately you do not want to disclose it, you do not want
to apply your theory to examples that I present.

Evgenii


Given the above paragraph, I would say we're closer than you've
previously intimated.

Of course there is information in informatics, and there is
information in physics, just as there's information in biology and
so on. These are all the same concept (logarithm of a probability).
Numerically, they differ, because the context differs in each
situation.

Entropy is related in a very simple way to information: S = S_max - I.
So provided an S_max exists (which it will for any finite system), so
does entropy. In the example of a hard drive, the informatics S_max
is the capacity of the drive, e.g. 100GB for a 100GB drive. If you
store 10GB of data on it, the entropy of the drive is 90GB. That's
it.

Just as information is context dependent, so must entropy be.

Thermodynamics is just one use (one context) of entropy and
information. Usually, the context is one of homogenous bulk
materials. If you decide to account for surface effects, you change
the context, and entropy should change accordingly.


Let me ask you the same question that I have recently asked Brent. Could 
you please tell me, the thermodynamic entropy of what is discussed in 
Jason's example below?


Evgenii


On 03.02.2012 00:14 Jason Resch said the following:
...
 Evgenii,

 Sure, I could give a few examples as this somewhat intersects with my
 line of work.

 The NIST 800-90 recommendation (
 http://csrc.nist.gov/publications/nistpubs/800-90A/SP800-90A.pdf )
 for random number generators is a document for engineers implementing
 secure pseudo-random number generators.  An example of where it is
 important is when considering entropy sources for seeding a random
 number generator.  If you use something completely random, like a
 fair coin toss, each toss provides 1 bit of entropy.  The formula is
 -log2(predictability).  With a coin flip, you have at best a .5
 chance of correctly guessing it, and -log2(.5) = 1.  If you used a
 die roll, then each die roll would provide -log2(1/6) = 2.58 bits of
 entropy.  The ability to measure unpredictability is necessary to
 ensure, for example, that it is at least as difficult to predict the
 random inputs that went into generating a cryptographic key as it
 would be to brute force the key.

 In addition to security, entropy is also an important concept in the
 field of data compression.  The amount of entropy in a given bit
 string represents the theoretical minimum number of bits it takes to
 represent the information.  If 100 bits contain 100 bits of entropy,
 then there is no compression algorithm that can represent those 100
 bits with fewer than 100 bits.  However, if a 100 bit string contains
 only 50 bits of entropy, you could compress it to 50 bits.  For
 example, let's say you had 100 coin flips from an unfair coin.  This
 unfair coin comes up heads 90% of the time.  Each flip represents
 -log2(.9) = 0.152 bits of entropy.  Thus, a sequence of 100 coin
 flips with this biased coin could be represented with 16 bits.  There
 are only 15.2 bits of information / entropy contained in that
 100-bit-long sequence.

 Jason
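As a quick illustrative sketch (not from the original message): the figures 
above follow directly from the -log2(predictability) formula, with the fair 
coin, the fair die, and the 90/10 biased coin as the assumed inputs.

import math

def surprisal_bits(p):
    # Bits contributed by an outcome with predictability p: -log2(p)
    return -math.log2(p)

print(surprisal_bits(0.5))         # fair coin toss: 1.0 bit
print(surprisal_bits(1 / 6))       # fair die roll: ~2.58 bits
print(surprisal_bits(0.9))         # biased coin, 90% heads: ~0.152 bits per flip
print(100 * surprisal_bits(0.9))   # ~15.2 bits over 100 flips, as in the example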






Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-10 Thread Evgenii Rudnyi

On 09.02.2012 07:49 meekerdb said the following:

...



There's an interesting paper by Bennett that I ran across, which
discusses the relation of Shannon entropy, thermodynamic entropy, and
 algorithmic entropy in the context of DNA and RNA replication:

http://qi.ethz.ch/edu/qisemFS10/papers/81_Bennett_Thermodynamics_of_computation.pdf


Thank you for the link. I like the first sentence

Computers may be thought of as engines for transforming free energy 
into waste heat and mathematical work.


I am not sure though whether this is more than a metaphor. I will read 
the paper; the abstract looks nice.


I believe that there was a chapter on reversible computation in

Nanoelectronics and Information Technology, ed Rainer Waser

I guess, reversible computation is kind of a strange attractor for 
engineers.


As for DNA, RNA, and proteins, I have recently read

Barbieri, M. (2007). Is the cell a semiotic system? In: Introduction to 
Biosemiotics: The New Biological Synthesis. Eds.: M. Barbieri, Springer: 
179-208.


If the author is right, it may well be that language developed even 
before consciousness. By the way, the paper is written very well and 
I have to think it over.


A related discussion

http://embryogenesisexplained.com/2012/02/is-the-cell-a-semiotic-system.html

Evgenii






Brent






Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-10 Thread Russell Standish
On Fri, Feb 10, 2012 at 09:39:50PM +0100, Evgenii Rudnyi wrote:
 
 Let me ask you the same question that I have recently asked Brent.
 Could you please tell me, the thermodynamic entropy of what is
 discussed in Jason's example below?
 
 Evgenii
 

If you're asking what is the conversion constant between bits and J/K,
the answer is k_B log(2) / log(10).

I'm not sure what else to tell you...

Cheers

-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-08 Thread Evgenii Rudnyi

On 07.02.2012 23:06 Russell Standish said the following:

On Tue, Feb 07, 2012 at 08:15:10PM +0100, Evgenii Rudnyi wrote:

Russell,


This is circular - temperature is usually defined in terms of
entropy:

T^{-1} = dS/dE


This is wrong. The temperature is defined according to the Zeroth
Law. The Second Law just allows us to define the absolute
temperature, but the temperature as such is defined independently
from the entropy.



This is hardly a consensus view. See
http://en.wikipedia.org/wiki/Temperature for a discussion. I don't
personally have a stake in this, having left thermodynamics as a
field more than 20 years ago.


You are right, there are different approaches. You may, for example, want 
to look at


Teaching the Second Law
http://mitworld.mit.edu/video/540

Different people, different options.


But I will point out that the zeroth law definition is limited to
equilibrium situations only, which is probably the main reason why
entropy is taken to be more fundamental in modern formulations of
statistical mechanics.


I am not sure I understand the problem here. First one defines a 
temperature for thermal equilibrium between two subsystems. Yet, after 
that it is not a big deal to introduce a local temperature and the 
thermal field.





dependent. As far as I remember, you have used this term in
respect to informational capacity of some modern information
carrier and its number of physical states. I would suggest to
stay with this example as the definition of context dependent.
Otherwise, it does not make much sense.


It makes just as much sense with Boltzmann-Gibbs entropy. Unless
you're saying that is not connected with thermodynamics
entropy..


Unfortunately I do not get your point. In the example with the
information carrier, we have different numerical values: the
information capacity of the carrier according to the producer, and
the values derived from the thermodynamic entropy.



It sounds to me like you are arguing for a shift back to how
thermodynamics was before Boltzmann's theoretical understanding.
A back-to-roots movement, as it were.


I would like rather to understand the meaning of your words.

By the way at the Boltzmann time the information was not there. So why 
before Boltzmann?





I still do not understand what surface effects on the carrier have
to do with this difference. Do you mean that if you consider
surface effects you derive an exact equation that will connect the
information capacity of the carrier with the thermodynamic
entropy? If yes, could you please give such an equation?

Evgenii



Why do you ask for such an equation when a) the situation being
physically described has not been fully described, and b) it may well
be pragmatically impossible to write, even though it may exist in
principle.

This seems like a cheap rhetorical trick.



As I have mentioned, I would like to understand what you mean. In order 
to achieve this, I suggest considering simple problems to which to apply 
your theory. I think it is best to understand a theory by means of simple 
practical applications. Why do you consider this a cheap rhetorical trick?


What I observe personally is that there is information in informatics 
and information in physics (if we say that the thermodynamic entropy is 
the information). If you would agree, that these two informations are 
different, it would be fine with me, I am flexible with definitions.


Yet, if I understand you correctly you mean that the information in 
informatics and the thermodynamic entropy are the same. This puzzles me 
as I believe that the same physical values should have the same 
numerical values. Hence my wish to understand what you mean. 
Unfortunately you do not want to disclose it, you do not want to apply 
your theory to examples that I present.


Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-08 Thread Russell Standish
On Wed, Feb 08, 2012 at 08:32:16PM +0100, Evgenii Rudnyi wrote:

...

 It sounds to me like you are arguing for a shift back to how
 thermodynamics was before Boltzmann's theoretical understanding.
 A back-to-roots movement, as it were.
 
 I would like rather to understand the meaning of your words.
 
 By the way at the Boltzmann time the information was not there. So
 why before Boltzmann?
 

Yes, in Boltzmann's time, the concept of information was not
understood. But probability was (at least to some extent). Now, we
know that information is essentially the logarithm of a probability. I
don't know whether information or probability is logically prior - it's
probably a matter of taste.

 
 What I observe personally is that there is information in
 informatics and information in physics (if we say that the
 thermodynamic entropy is the information). If you would agree, that
 these two informations are different, it would be fine with me, I am
 flexible with definitions.
 
 Yet, if I understand you correctly you mean that the information in
 informatics and the thermodynamic entropy are the same. This puzzles
 me as I believe that the same physical values should have the same
 numerical values. Hence my wish to understand what you mean.
 Unfortunately you do not want to disclose it, you do not want to
 apply your theory to examples that I present.
 
 Evgenii

Given the above paragraph, I would say we're closer than you've
previously intimated.

Of course there is information in informatics, and there is
information in physics, just as there's information in biology and so
on. These are all the same concept (logarithm of a
probability). Numerically, they differ, because the context differs in
each situation.

Entropy is related in a very simple way to information: S = S_max -
I. So provided an S_max exists (which it will for any finite system), so
does entropy. In the example of a hard drive, the informatics S_max is
the capacity of the drive, e.g. 100GB for a 100GB drive. If you store
10GB of data on it, the entropy of the drive is 90GB. That's it.

Just as information is context dependent, so must entropy be.

Thermodynamics is just one use (one context) of entropy and
information. Usually, the context is one of homogenous bulk
materials. If you decide to account for surface effects, you change
the context, and entropy should change accordingly.
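A minimal sketch of this bookkeeping, assuming only the numbers used in the
hard-drive example above (units are GB of information rather than J/K):

def drive_entropy(capacity_gb, stored_gb):
    # S = S_max - I: remaining entropy once 'stored_gb' of information is on the drive
    s_max = capacity_gb
    i = stored_gb
    return s_max - i

print(drive_entropy(100, 10))   # 90 (GB), matching the example above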

PS

Your comment that Jaynes noted the similarity between Gibbs entropy
and Shannon entropy, which therefore motivated him to develop the
information theoretic foundation of statistical mechanics may well be
historically accurate. But this is not how the subject is presented in
a modern way, such as how Denbigh and Denbigh present it (their book
being fresh off the press the last time I really looked at this subject).

One could also note that historically, Shannon wrestled with calling
his information quantity "entropy". At that time, it was pure
analogical thinking - the precise connection between his concept and
the thermodynamic one was not elucidated until at least two decades later.

-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-08 Thread 1Z




On Feb 7, 7:04 pm, Evgenii Rudnyi use...@rudnyi.ru wrote:

 Let us take a closed vessel with oxygen and hydrogen at room
 temperature. Then we open a platinum catalyst in the vessel and the
 reaction starts. Will then the information in the vessel be conserved?

 Evgenii

What's the difference between "in-principle" and "for-all-practical
purposes"?




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-08 Thread meekerdb

On 2/8/2012 1:44 PM, Russell Standish wrote:

On Wed, Feb 08, 2012 at 08:32:16PM +0100, Evgenii Rudnyi wrote:

...


It sounds to me like you are arguing for a shift back to how
thermodynamics was before Boltzmann's theoretical understanding.
A back-to-roots movement, as it were.

I would like rather to understand the meaning of your words.

By the way at the Boltzmann time the information was not there. So
why before Boltzmann?


Yes, in Boltzmann's time, the concept of information was not
understood. But probability was (at least to some extent). Now, we
know that information is essentially the logarithm of a probability. I
don't know whether information or probability is logically prior - it's
probably a matter of taste.


What I observe personally is that there is information in
informatics and information in physics (if we say that the
thermodynamic entropy is the information). If you would agree, that
these two informations are different, it would be fine with me, I am
flexible with definitions.

Yet, if I understand you correctly you mean that the information in
informatics and the thermodynamic entropy are the same. This puzzles
me as I believe that the same physical values should have the same
numerical values. Hence my wish to understand what you mean.
Unfortunately you do not want to disclose it, you do not want to
apply your theory to examples that I present.

Evgenii

Given the above paragraph, I would say we're closer than you've
previously intimated.

Of course there is information in informatics, and there is
information in physics, just as there's information in biology and so
on. These are all the same concept (logarithm of a
probability). Numerically, they differ, because the context differs in
each situation.

Entropy is related in a very simple way to information: S = S_max -
I. So provided an S_max exists (which it will for any finite system), so
does entropy. In the example of a hard drive, the informatics S_max is
the capacity of the drive, e.g. 100GB for a 100GB drive. If you store
10GB of data on it, the entropy of the drive is 90GB. That's it.

Just as information is context dependent, so must entropy be.

Thermodynamics is just one use (one context) of entropy and
information. Usually, the context is one of homogenous bulk
materials. If you decide to account for surface effects, you change
the context, and entropy should change accordingly.

PS

Your comment that Jaynes noted the similarity between Gibbs entropy
and Shannon entropy, which therefore motivated him to develop the
information theoretic foundation of statistical mechanics may well be
historically accurate. But this is not how the subject is presented in
a modern way, such as how Denbigh and Denbigh present it (their book
being fresh off the press the last time I really looked at this subject).

One could also note that historically, Shannon wrestled with calling
his information quantity "entropy". At that time, it was pure
analogical thinking - the precise connection between his concept and
the thermodynamic one was not elucidated until at least two decades later.



There's an interesting paper by Bennett that I ran across, which discusses the relation of 
Shannon entropy, thermodynamic entropy, and algorithmic entropy in the context of DNA and 
RNA replication:


http://qi.ethz.ch/edu/qisemFS10/papers/81_Bennett_Thermodynamics_of_computation.pdf

Brent




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-07 Thread Bruno Marchal


On 06 Feb 2012, at 20:42, meekerdb wrote:


On 2/6/2012 9:03 AM, 1Z wrote:



 There is also a conservation of information.  It is
 apparently indestructible.

Is there? If there is, it is a physical law, and AFAIK it is hotly
debated.



It's the same as the question of wave-function collapse.  QM without  
collapse is time-reversible and so conserves information.  With  
collapse it doesn't.  But even without collapse information may  
become unavailable to us due to statistical diffusion into the  
environment or crossing an event horizon.



That's why if QM (without collapse) is 100% correct, black holes must  
reversibly evaporate.
Amazingly the presence of (p - []  p) in the material hypostases  
could explain why the core of the apparently primitive physics has to  
be given by a group or a very symmetrical group like object.
It might be related to the modular form in the general math of the  
diophantine equation (like in Fermat theorem).
In term of Smullyan singing birds (= combinators), there are no  
Kestrels (eliminators), nor Starlings (duplicators) in the core  
physical forest.


Kestrel = K. Their law is Kxy = x
Starling = S. Their law is Sxyz = xz(yz)

Then, if that is confirmed, we have the nice feature that the breaking  
of symmetries is only due to first person indeterminacy and the laws  
of big numbers.
Note that such a core physics would not be Turing complete. Forests  
(systems of combinators) without both K and S (or equivalent  
eliminators and duplicators) cannot be Turing universal, although K  
can be simulated in some local way.





Brent



http://iridia.ulb.ac.be/~marchal/






Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-07 Thread Evgenii Rudnyi

On 06.02.2012 20:42 meekerdb said the following:

On 2/6/2012 9:03 AM, 1Z wrote:

There is also a conservation of information. It is

apparently indestructible.

Is there? If there is, it is a physical law, and AFAIK it is
hotly debated.



It's the same as the question of wave-function collapse. QM without
collapse is time-reversible and so conserves information. With
collapse it doesn't. But even without collapse information may become
unavailable to us due to statistical diffusion into the environment
or crossing an event horizon.

Brent



Let us take a closed vessel with oxygen and hydrogen at room 
temperature. Then we open a platinum catalyst in the vessel and the 
reaction starts. Will then the information in the vessel be conserved?


Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-07 Thread Evgenii Rudnyi

On 06.02.2012 22:19 Russell Standish said the following:

On Mon, Feb 06, 2012 at 08:20:53PM +0100, Evgenii Rudnyi wrote:

On 05.02.2012 22:46 Russell Standish said the following:

On Fri, Feb 03, 2012 at 08:56:10PM +0100, Evgenii Rudnyi wrote:


In this respect your question is actually nice, as now, I
believe, we see that it is possible to have a case when the
information capacity will be more than the number of physical
states.

Evgenii


How so?



Take a coin and cool it to zero Kelvin. Here is my question
that you have not answered yet. Do you assume that the text on the
coin will be destroyed during cooling?



No. Previously, I mistakenly assumed that S=0 at T=0, which implies
the text being destroyed. But as I said - I withdraw that comment,
and any comment based on that mistaken assumption.




So, what happens with the entropy of the coin when the temperature goes 
to zero? Even though you have withdrawn your comment, the question remains.


Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-07 Thread Evgenii Rudnyi

Russell,

 This is circular - temperature is usually defined in terms of
 entropy:

 T^{-1} = dS/dE

This is wrong. The temperature is defined according to the Zeroth Law. 
The Second Law just allows us to define the absolute temperature, but 
the temperature as such is defined independently from the entropy.


 dependent. As far as I remember, you have used this term in
 respect to informational capacity of some modern information
 carrier and its number of physical states. I would suggest to stay
 with this example as the definition of context dependent.
 Otherwise, it does not make much sense.

 It makes just as much sense with Boltzmann-Gibbs entropy. Unless
 you're saying that is not connected with thermodynamics entropy..

Unfortunately I do not get your point. In the example with the 
information carrier, we have different numerical values: the 
information capacity of the carrier according to the producer, and 
the values derived from the thermodynamic entropy.


I still do not understand what surface effects on the carrier have to do 
with this difference. Do you mean that if you consider surface effects 
you derive an exact equation that will connect the information capacity 
of the carrier with the thermodynamic entropy? If yes, could you please 
give such an equation?


Evgenii


On 06.02.2012 22:17 Russell Standish said the following:

On Mon, Feb 06, 2012 at 08:36:44PM +0100, Evgenii Rudnyi wrote:

On 05.02.2012 23:05 Russell Standish said the following:


The context is there - you will just have to look for it. I
rather suspect that use of these tables refers to homogenous bulk
samples of the material, in thermal equilibrium with a heat bath
at some given temperature.


I do not get your point. Do you mean that sometimes the surface
effects could be important? Every thermodynamicist knows this.
However I do not understand your problem. The thermodynamics of
surface phenomena is well established and to work with it you need
to extend the JANAF Tables with other tables. What is the problem?


The entropy will depend on what surface effect you consider
significant. Is it significant that the surface's boundary bumps and
dimples are so arranged to spell out a message in English? What if
you happen to not speak English, but only Chinese? Or might they not
be significant at all? All of these are different contexts.

Ignoring surface effects altogether is a perfectly viable model of
the physical system. Whether this is useful or not is going to
depend, well, on the context.



It would be good if you defined better what you mean by context
dependent. As far as I remember, you have used this term in
respect to informational capacity of some modern information
carrier and its number of physical states. I would suggest to stay
with this example as the definition of context dependent.
Otherwise, it does not make much sense.


It makes just as much sense with Boltzmann-Gibbs entropy. Unless
you're saying that is not connected with thermodynamics entropy...




If we were to take you at face value, we would have to conclude
that entropy is ill-defined in nonequilibrium systems.


The entropy is well-defined for a nonequilibrium system as soon as
one can use local temperature. There are some rare occasions where
local temperature is ambiguous, for example in plasma where one
defines different temperatures for electrons and molecules. Yet,
the two temperatures being defined, the entropy becomes again
well-defined.


This is circular - temperature is usually defined in terms of
entropy:

T^{-1} = dS/dE




More to the point - consider milling whatever material you have
chosen into small particles. Then consider what happens to a
container of the stuff in the Earth's gravity well, compared with
the microgravity situation on the ISS. In the former, the stuff
forms a pile on the bottom of the container - in the latter, the
stuff will be more or less uniformly distributed throughout the
containers volume. In the former case, shaking the container will
flatten the pile - but at all stages the material is in thermal
equilibrium.

In your thermodynamic context, the entropy is the same
throughout.


No it is not. As I have mentioned in this case one just must
consider surface effects.


Hence the context.




It only depends on bulk material properties, and temperature.
But most physicists would say that the milled material is in a
higher entropy state in microgravity, and that shaking the pile
in Earth's gravity raises the entropy.



Furthermore, let's assume that the particles are milled in the
form of tiny Penrose replicators (named after Lionel Penrose,
Roger's dad). When shaken, these particles stick together,
forming quite specific structures that replicate, entraining all
the replicators in the material.
(http://docs.huihoo.com/reprap/Revolutionary.pdf).

Most physicists would say that shaking a container of Penrose
replicators actually reduces the system's entropy. Yet, the
thermodynamic entropy of the JANAF context does not change, as that only depends on bulk material properties.

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-07 Thread meekerdb

On 2/7/2012 11:04 AM, Evgenii Rudnyi wrote:

On 06.02.2012 20:42 meekerdb said the following:

On 2/6/2012 9:03 AM, 1Z wrote:

There is also a conservation of information. It is

apparently industrictable.

Is there? if there is , it is a phsycial law, and AFAIK it is
hotly debated.



It's the same as the question of wave-function collapse. QM without
collapse is time-reversible and so conserves information. With
collapse it doesn't. But even without collapse information may become
unavailable to us due to statistical diffusion into the environment
or crossing an event horizon.

Brent



Let us take a closed vessel with oxygen and hydrogen at room temperature. Then we open a 
platinum catalyst in the vessel and the reaction starts. Will then the information in 
the vessel be conserved?


Evgenii



No, because the vessel can't be isolated at the microscopic level.

Brent




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-07 Thread Russell Standish
On Tue, Feb 07, 2012 at 08:15:10PM +0100, Evgenii Rudnyi wrote:
 Russell,
 
  This is circular - temperature is usually defined in terms of
  entropy:
 
  T^{-1} = dS/dE
 
 This is wrong. The temperature is defined according to the Zeroth
 Law. The Second Law just allows us to define the absolute
 temperature, but the temperature as such is defined independently
 from the entropy.
 

This is hardly a consensus view. See
http://en.wikipedia.org/wiki/Temperature for a discussion. I don't
personally have a stake in this, having left thermodynamics as a field
more than 20 years ago.

But I will point out that the zeroth law definition is limited to
equilibrium situations only, which is probably the main reason why
entropy is taken to be more fundamental in modern formulations of
statistical mechaanics.

  dependent. As far as I remember, you have used this term in
  respect to informational capacity of some modern information
  carrier and its number of physical states. I would suggest to stay
  with this example as the definition of context dependent.
  Otherwise, it does not make much sense.
 
  It makes just as much sense with Boltzmann-Gibbs entropy. Unless
  you're saying that is not connected with thermodynamics entropy..
 
 Unfortunately I do not get your point. In the example with the
 information carrier, we have different numerical values: the
 information capacity of the carrier according to the producer, and
 the values derived from the thermodynamic entropy.
 

It sounds to me like you are arguing for a shift back to how
thermodynamics was before Boltzmann's theoretical understanding. A
back-to-roots movement, as it were.

 I still do not understand what surface effects on the carrier have to
 do with this difference. Do you mean that if you consider surface
 effects you derive an exact equation that will connect the
 information capacity of the carrier with the thermodynamic entropy?
 If yes, could you please give such an equation?
 
 Evgenii
 

Why do you ask for such an equation when a) the situation being
physically described has not been fully described, and b) it may well
be pragmatically impossible to write, even though it may exist in
principle.

This seems like a cheap rhetorical trick.

-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-06 Thread Jason Resch
Informational laws and physical laws are, in my mind, closely  
related.  Laws related to information seem to supersede physical law.  
For example, the impossibility of encoding information in fewer  
symbols, or of sending more over a channel in a given time period,  
than allowed.  There is also a conservation of information.  It is  
apparently indestructible.  There is a minimum physical energy  
expenditure associated with irreversible computation, e.g. setting a  
memory register from 1 to 0.  Other informational laws prevent any  
compression algorithm from having any net decrease in size when  
considered over the set of all possible inputs.  You can also do  
really cool things with information, such as forward error correction:  
a file of size 1 MB can be encoded to 1.5 MB.  Then this encoded file  
can be split into 15 equally sized pieces.  The cool part is that any  
10 of these pieces (corresponding to 1 MB of information) may be used  
to recover the entire original file.  Any less than 1 MB worth of  
pieces is insufficient.


Jason
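A short sketch of the "minimum physical energy expenditure" point above: the
figure usually quoted is Landauer's bound, k_B T ln 2 per bit irreversibly
erased. The room temperature of 300 K below is an assumed illustrative value.

import math

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # assumed room temperature, K

# Landauer's bound: minimum heat dissipated when one bit is irreversibly erased
E_min = k_B * T * math.log(2)
print(E_min)                # ~2.87e-21 J per erased bit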

On Feb 5, 2012, at 3:46 PM, Russell Standish li...@hpcoders.com.au  
wrote:



On Fri, Feb 03, 2012 at 08:56:10PM +0100, Evgenii Rudnyi wrote:


First, we must not forget the Third Law, which states that the
change in entropy in any reaction, as well as its derivatives, goes to
zero as the temperature goes to zero Kelvin.

In this respect your question is actually nice, as now, I believe,
we see that it is possible to have a case when the information
capacity will be more than the number of physical states.

Evgenii


How so?

-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au









Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-06 Thread 1Z


On Feb 6, 4:55 pm, Jason Resch jasonre...@gmail.com wrote:
 Informational laws and physical laws are, in my mind, closely
 related.  Laws related to information seem to supersede physical law.
 For example,  the impossibility of encoding information in fewer
 symbols or trying to send more over a channel in a given time period,
 than allowed.

Those transcend physics inasmuch as they are mathematical .

 There is also a conservation of information.  It is
 apparently indestructible.

Is there? If there is, it is a physical law, and AFAIK it is hotly
debated.

 There is a minimum physical energy
 expenditure associated with irreversible computation.  E.g. Setting a
 memory register from 1 to 0.  Other informational laws, prevent any
 compression algorithm from having any net decrease in size when
 considered over the set of all possible inputs.  You can also do
 really cool things with information, such as forward error correction:
 a file of size 1 mb can be encoded to 1.5 mb.  Then this encoded file
 can be split into 15 equally sized pieces.  The cool part is that any
 10 of these pieces (corresponding to 1 mb of information) may be used
 to recover the entire original file.  Any less than 1 mb worth of
 pieces is insufficient.

 Jason

Your information laws seem to have mixed origins.

 On Feb 5, 2012, at 3:46 PM, Russell Standish li...@hpcoders.com.au
 wrote:







  On Fri, Feb 03, 2012 at 08:56:10PM +0100, Evgenii Rudnyi wrote:

  First, we must not forget the Third Law, which states that the
  change in entropy in any reaction, as well as its derivatives, goes to
  zero as the temperature goes to zero Kelvin.

  In this respect your question is actually nice, as now, I believe,
  we see that it is possible to have a case when the information
  capacity will be more than the number of physical states.

  Evgenii

  How so?

  --

  Prof Russell Standish                  Phone 0425 253119 (mobile)
  Principal, High Performance Coders
  Visiting Professor of Mathematics      hpco...@hpcoders.com.au
  University of New South Wales          http://www.hpcoders.com.au





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-06 Thread Evgenii Rudnyi

On 05.02.2012 23:05 Russell Standish said the following:

On Fri, Feb 03, 2012 at 08:50:40PM +0100, Evgenii Rudnyi wrote:


I guess that you have never done a lab in experimental
thermodynamics. There are classical experiments where people
measure heat of combustion, heat capacity, equilibrium pressure,
equilibrium constants and then determine the entropy. If you do it,
you see that you can measure the entropy the same way as other
properties, there is no difference. A good example to this end is
JANAF Thermochemical Tables (Joint Army-Naval-Air Force
Thermochemical Tables). You will find a pdf here

http://www.nist.gov/data/PDFfiles/jpcrdM9.pdf

It is about 230 Mb but I guess it is doable to download it. Please
open it and explain what is the difference between the tabulated
entropy and other properties there. How your personal viewpoint on
a thermodynamic system will influence numerical values of the
entropy tabulated in JANAF? What is the difference with the mass or
length? I do not see it.

You see, the JANAF Tables were started by the military. They needed them
to compute for example the combustion process in rockets and they
have been successful. What part then in a rocket is context
dependent?

This is the main problem with the books on entropy and
information. They do not consider thermodynamic tables, they do not
work out simple thermodynamic examples. For example let us consider
the next problem:

--- Problem. Given
temperature, pressure, and initial number of moles of NH3, N2 and
H2, compute the equilibrium composition.

To solve the problem one should find thermodynamic properties of
NH3, N2 and H2 for example in the JANAF Tables and then compute
the equilibrium constant.

From thermodynamics tables (all values are molar values for the
standard pressure 1 bar, I have omitted the symbol o for simplicity
but it is very important not to forget it):

Del_f_H_298(NH3), S_298(NH3), Cp(NH3), Del_f_H_298(N2), S_298(N2),
Cp(N2), Del_f_H_298(H2), S_298(H2), Cp(H2)

2NH3 = N2 + 3H2

Del_H_r_298 = Del_f_H_298(N2) + 3 Del_f_H_298(H2) - 2
Del_f_H_298(NH3)

Del_S_r_298 = S_298(N2) + 3 S_298(H2) - 2 S_298(NH3)

Del_Cp_r = Cp(N2) + 3 Cp(H2) - 2 Cp(NH3)

To make life simple, I will assume below that Del_Cp_r = 0, but it
is not a big deal to extend the equations to include heat
capacities as well.

Del_G_r_T = Del_H_r_298 - T Del_S_r_298

Del_G_r_T = - R T ln Kp

When Kp, total pressure and the initial number of moles are given,
it is rather straightforward to compute equilibrium composition.
If you need help, please just let me know.
---
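As an illustrative sketch only: the calculation outlined above can be coded in
a few lines of Python. The Del_f_H_298 and S_298 numbers are approximate
textbook values inserted for illustration (they are not taken from this thread
or from JANAF directly), the initial composition of pure NH3 at 1 bar and
298 K is an assumption, Del_Cp_r = 0 as in the text, and scipy's brentq root
finder is used to solve for the extent of reaction.

import math
from scipy.optimize import brentq

R = 8.314            # J/(mol K)
T = 298.15           # K (assumed)
P, P0 = 1.0, 1.0     # total pressure and standard pressure, bar (assumed)

# Approximate illustrative data for 2NH3 = N2 + 3H2 (J/mol and J/(mol K))
Del_H_r_298 = 2 * 45900.0                    # since Del_f_H_298(N2) = Del_f_H_298(H2) = 0
Del_S_r_298 = 191.6 + 3 * 130.7 - 2 * 192.8  # S_298(N2) + 3 S_298(H2) - 2 S_298(NH3)

Del_G_r_T = Del_H_r_298 - T * Del_S_r_298    # Del_Cp_r = 0, as assumed in the text
Kp = math.exp(-Del_G_r_T / (R * T))

n_NH3_0, n_N2_0, n_H2_0 = 1.0, 0.0, 0.0      # assumed initial moles: pure ammonia

def residual(xi):
    # Kp written in terms of the extent of reaction xi, minus its target value
    n_NH3, n_N2, n_H2 = n_NH3_0 - 2 * xi, n_N2_0 + xi, n_H2_0 + 3 * xi
    n_tot = n_NH3 + n_N2 + n_H2
    x_NH3, x_N2, x_H2 = n_NH3 / n_tot, n_N2 / n_tot, n_H2 / n_tot
    return (x_N2 * x_H2**3 / x_NH3**2) * (P / P0)**2 - Kp

xi = brentq(residual, 1e-12, n_NH3_0 / 2 - 1e-12)
print("Kp =", Kp)
print("equilibrium moles NH3, N2, H2:",
      n_NH3_0 - 2 * xi, n_N2_0 + xi, n_H2_0 + 3 * xi)

The bracketing interval for brentq spans the physically allowed extents of
reaction, over which the residual changes sign exactly once.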

So, the entropy is there. What is context dependent here? Where is
the difference with mass and length?

Evgenii



The context is there - you will just have to look for it. I rather
suspect that use of these tables refers to homogenous bulk samples
of the material, in thermal equilibrium with a heat bath at some
given temperature.


I do not get your point. Do you mean that sometimes the surface effects 
could be important? Every thermodynamicist knows this. However I do not 
understand your problem. The thermodynamics of surface phenomena is well 
established and to work with it you need to extend the JANAF Tables with 
other tables. What is the problem?


It would be good if you defined better what you mean by context 
dependent. As far as I remember, you have used this term in respect to 
informational capacity of some modern information carrier and its number 
of physical states. I would suggest to stay with this example as the 
definition of context dependent. Otherwise, it does not make much sense.



If we were to take you at face value, we would have to conclude that
entropy is ill-defined in nonequilibrium systems.


The entropy is well-defined for a nonequilibrium system as soon as one 
can use local temperature. There are some rare occasions where local 
temperature is ambiguous, for example in plasma where one defines 
different temperatures for electrons and molecules. Yet, the two 
temperatures being defined, the entropy becomes again well-defined.



More to the point - consider milling whatever material you have
chosen into small particles. Then consider what happens to a
container of the stuff in the Earth's gravity well, compared with the
microgravity situation on the ISS. In the former, the stuff forms a
pile on the bottom of the container - in the latter, the stuff will
be more or less uniformly distributed throughout the containers
volume. In the former case, shaking the container will flatten the
pile - but at all stages the material is in thermal equilibrium.

In your thermodynamic context, the entropy is the same throughout.


No it is not. As I have mentioned in this case one just must consider 
surface effects.



It only depends on bulk material properties, and temperature. But
most physicists would say that the milled material is in a higher
entropy state in microgravity, and that shaking the pile in Earth's gravity raises the entropy.

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-06 Thread Russell Standish
On Mon, Feb 06, 2012 at 08:36:44PM +0100, Evgenii Rudnyi wrote:
 On 05.02.2012 23:05 Russell Standish said the following:
 
 The context is there - you will just have to look for it. I rather
 suspect that use of these tables refers to homogenous bulk samples
 of the material, in thermal equilibrium with a heat bath at some
 given temperature.
 
 I do not get your point. Do you mean that sometimes the surface
 effects could be important? Every thermodynamicist knows this.
 However I do not understand your problem. The thermodynamics of
 surface phenomena is well established and to work with it you need
 to extend the JANAF Tables with other tables. What is the problem?

The entropy will depend on what surface effect you consider
significant. Is it significant that the surface's boundary bumps and
dimples are so arranged to spell out a message in English? What if you
happen to not speak English, but only Chinese? Or might they not be
significant at all? All of these are different contexts.

Ignoring surface effects altogether is a perfectly viable model of the
physical system. Whether this is useful or not is going to depend,
well, on the context.

 
 It would be good if you defined better what you mean by context
 dependent. As far as I remember, you have used this term in respect
 to informational capacity of some modern information carrier and its
 number of physical states. I would suggest to stay with this example
 as the definition of context dependent. Otherwise, it does not make
 much sense.

It makes just as much sense with Boltzmann-Gibbs entropy. Unless
you're saying that is not connected with thermodynamics entropy...

 
 If we were to take you at face value, we would have to conclude that
 entropy is ill-defined in nonequilibrium systems.
 
 The entropy is well-defined for a nonequilibrium system as soon as
 one can use local temperature. There are some rare occasions where
 local temperature is ambiguous, for example in plasma where one
 defines different temperatures for electrons and molecules. Yet, the
 two temperatures being defined, the entropy becomes again
 well-defined.

This is circular - temperature is usually defined in terms of entropy:

T^{-1} = dS/dE
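For reference, the standard statistical-mechanics form of this definition,
written out as a sketch (the ideal-gas line is added purely as an illustration):

S(E) = k_B \ln \Omega(E), \qquad
\frac{1}{T} = \left(\frac{\partial S}{\partial E}\right)_{V,N}

\text{For an ideal gas, } \Omega(E) \propto E^{3N/2} \;\Rightarrow\;
\frac{1}{T} = \frac{3 N k_B}{2 E} \;\Rightarrow\; E = \tfrac{3}{2} N k_B T .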

 
 More to the point - consider milling whatever material you have
 chosen into small particles. Then consider what happens to a
 container of the stuff in the Earth's gravity well, compared with the
 microgravity situation on the ISS. In the former, the stuff forms a
 pile on the bottom of the container - in the latter, the stuff will
 be more or less uniformly distributed throughout the containers
 volume. In the former case, shaking the container will flatten the
 pile - but at all stages the material is in thermal equilibrium.
 
 In your thermodynamic context, the entropy is the same throughout.
 
 No it is not. As I have mentioned in this case one just must
 consider surface effects.

Hence the context.

 
 It only depends on bulk material properties, and temperature. But
 most physicists would say that the milled material is in a higher
 entropy state in microgravity, and that shaking the pile in Earth's
 gravity raises the entropy.
 
 Furthermore, let's assume that the particles are milled in the form
 of tiny Penrose replicators (named after Lionel Penrose, Roger's
 dad). When shaken, these particles stick together, forming quite
 specific structures that replicate, entraining all the replicators
 in the material. (http://docs.huihoo.com/reprap/Revolutionary.pdf).
 
 Most physicists would say that shaking a container of Penrose
 replicators actually reduces the system's entropy. Yet, the
 thermodynamic entropy of the JANAF context does not change, as that
 only depends on bulk material properties.
 
 We are again at the definition of context dependent. What you are saying
 now is that when you have new physical effects, it is necessary to
 take them into account. What it has to do with your example when
 information on an information carrier was context dependent?
 

Who decides what physical effects to take into account? This is not a
question of pure relativism - I'm well aware that some models are much
better than others at describing the situation, but even in the case
of Penrose replicators described above, their ability to adhere and
fragment may or may not be relevant to the situation you are trying to
model.


-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-06 Thread Russell Standish
On Mon, Feb 06, 2012 at 08:20:53PM +0100, Evgenii Rudnyi wrote:
 On 05.02.2012 22:46 Russell Standish said the following:
 On Fri, Feb 03, 2012 at 08:56:10PM +0100, Evgenii Rudnyi wrote:
 
 In this respect your question is actually nice, as now, I believe,
 we see that it is possible to have a case when the information
 capacity will be more than the number of physical states.
 
 Evgenii
 
 How so?
 
 
 Take a coin and cool it to zero Kelvin. Here is my question that
 you have not answered yet. Do you assume that the text on the coin
 will be destroyed during cooling?
 

No. Previously, I mistakenly assumed that S=0 at T=0, which implies the
text being destroyed. But as I said - I withdraw that comment, and any
comment based on that mistaken assumption.


-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-05 Thread Russell Standish
On Fri, Feb 03, 2012 at 08:56:10PM +0100, Evgenii Rudnyi wrote:
 
 First, we must not forget the Third Law, which states that the
 change in entropy in any reaction, as well as its derivatives, goes to
 zero as the temperature goes to zero Kelvin.
 
 In this respect your question is actually nice, as now, I believe,
 we see that it is possible to have a case when the information
 capacity will be more than the number of physical states.
 
 Evgenii

How so?

-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-05 Thread Russell Standish
On Fri, Feb 03, 2012 at 08:50:40PM +0100, Evgenii Rudnyi wrote:
 
 I guess that you have never done a lab in experimental
 thermodynamics. There are classical experiments where people measure
 heat of combustion, heat capacity, equilibrium pressure, equilibrium
 constants and then determine the entropy. If you do it, you see that
 you can measure the entropy the same way as other properties, there
 is no difference. A good example to this end is JANAF Thermochemical
 Tables (Joint Army-Naval-Air Force Thermochemical Tables). You will
 find a pdf here
 
 http://www.nist.gov/data/PDFfiles/jpcrdM9.pdf
 
 It is about 230 Mb but I guess it is doable to download it. Please
 open it and explain what is the difference between the tabulated
 entropy and other properties there. How your personal viewpoint on a
 thermodynamic system will influence numerical values of the entropy
 tabulated in JANAF? What is the difference with the mass or length?
 I do not see it.
 
 You see, the JANAF Tables were started by the military. They needed them to
 compute for example the combustion process in rockets and they have
 been successful. What part then in a rocket is context dependent?
 
 This is the main problem with the books on entropy and information.
 They do not consider thermodynamic tables, they do not work out
 simple thermodynamic examples. For example let us consider the next
 problem:
 
 ---
 Problem. Given temperature, pressure, and initial number of moles of
 NH3, N2 and H2, compute the equilibrium composition.
 
 To solve the problem one should find thermodynamic properties of
 NH3, N2 and H2 for example in the JANAF Tables and then compute the
 equilibrium constant.
 
 From thermodynamics tables (all values are molar values for the
 standard pressure 1 bar, I have omitted the symbol o for simplicity but
 it is very important not to forget it):
 
 Del_f_H_298(NH3), S_298(NH3), Cp(NH3), Del_f_H_298(N2), S_298(N2),
 Cp(N2), Del_f_H_298(H2), S_298(H2), Cp(H2)
 
 2NH3 = N2 + 3H2
 
 Del_H_r_298 = Del_f_H_298(N2) + 3 Del_f_H_298(H2) - 2 Del_f_H_298(NH3)
 
 Del_S_r_298 = S_298(N2) + 3 S_298(H2) - 2 S_298(NH3)
 
 Del_Cp_r = Cp(N2) + 3 Cp(H2) - 2 Cp(NH3)
 
 To make life simple, I will assume below that Del_Cp_r = 0, but it is
 not a big deal to extend the equations to include heat capacities as well.
 
 Del_G_r_T = Del_H_r_298 - T Del_S_r_298
 
 Del_G_r_T = - R T ln Kp
 
 When Kp, total pressure and the initial number of moles are given,
 it is rather straightforward to compute equilibrium composition. If
 you need help, please just let me know.
 ---
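
A minimal numerical sketch of the problem stated above (Python; it assumes
Del_Cp_r = 0 as in the problem statement, uses a simple bisection solver, and
takes approximate standard-state values purely for illustration - for real
work the Del_f_H_298 and S_298 inputs should come from the JANAF tables):

import math

R = 8.314  # J/(mol K)

# approximate molar values at 298.15 K, standard pressure 1 bar (illustrative)
Del_f_H_298 = {"NH3": -45.9e3, "N2": 0.0, "H2": 0.0}      # J/mol
S_298       = {"NH3": 192.8,   "N2": 191.6, "H2": 130.7}  # J/(mol K)

Del_H_r_298 = Del_f_H_298["N2"] + 3*Del_f_H_298["H2"] - 2*Del_f_H_298["NH3"]
Del_S_r_298 = S_298["N2"] + 3*S_298["H2"] - 2*S_298["NH3"]

def equilibrium(T, P, n_NH3, n_N2, n_H2):
    """Equilibrium moles of (NH3, N2, H2) at T [K] and total pressure P [bar]."""
    Del_G_r_T = Del_H_r_298 - T * Del_S_r_298      # Del_Cp_r = 0 approximation
    Kp = math.exp(-Del_G_r_T / (R * T))            # from Del_G_r_T = -R T ln Kp

    def residual(x):                               # x = extent of reaction, mol
        nA, nN, nH = n_NH3 - 2*x, n_N2 + x, n_H2 + 3*x
        n_tot = nA + nN + nH
        pA, pN, pH = nA/n_tot*P, nN/n_tot*P, nH/n_tot*P   # partial pressures / 1 bar
        return pN * pH**3 / pA**2 - Kp

    # bisection over the physically allowed extent of reaction
    lo, hi = -min(n_N2, n_H2/3.0) + 1e-12, n_NH3/2.0 - 1e-12
    for _ in range(200):
        mid = 0.5*(lo + hi)
        if residual(lo) * residual(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    x = 0.5*(lo + hi)
    return n_NH3 - 2*x, n_N2 + x, n_H2 + 3*x

print(equilibrium(T=500.0, P=1.0, n_NH3=1.0, n_N2=0.0, n_H2=0.0))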
 
 So, the entropy is there. What is context dependent here? Where is
 the difference with mass and length?
 
 Evgenii
 

The context is there - you will just have to look for it. I rather
suspect that use of these tables refers to homogeneous bulk samples of
the material, in thermal equilibrium with a heat bath at some given
temperature.

If we were to take you at face value, we would have to conclude that
entropy is ill-defined in nonequilibrium systems.

More to the point - consider milling whatever material you have chosen
into small particles. Then consider what happens to a container of the
stuff in the Earth's gravity well, compared with the microgravity
situation on the ISS. In the former, the stuff forms a pile on the
bottom of the container - in the latter, the stuff will be more or
less uniformly distributed throughout the container's volume. In the
former case, shaking the container will flatten the pile - but at all
stages the material is in thermal equilibrium.

In your thermodynamic context, the entropy is the same
throughout. It only depends on bulk material properties, and
temperature. But most physicists would say that the milled material is
in a higher entropy state in microgravity, and that shaking the pile
in Earth's gravity raises the entropy.

Furthermore, let's assume that the particles are milled in the form of
tiny Penrose replicators (named after Lionel Penrose, Roger's
dad). When shaken, these particles stick together, forming quite
specific structures that replicate, entraining all the replicators in
the material. (http://docs.huihoo.com/reprap/Revolutionary.pdf). 

Most physicists would say that shaking a container of Penrose
replicators actually reduces the system's entropy. Yet, the
thermodynamic entropy of the JANAF context does not change, as that
only depends on bulk material properties.

We can follow your line of thinking and have a word entropy that is
only useful in certain contexts, but then we'll need to make up a
different word for other contexts.  Alternatively, we can have a word
that applies over all macroscopic contexts, and explicitly qualify
what that context is. The underlying concept is the same in all cases,
though. It appears to me that standard scientific usage has been to
use the same word for that concept, rather than coin different words
to 

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-03 Thread Evgenii Rudnyi

On 03.02.2012 00:14 Jason Resch said the following:

On Sun, Jan 22, 2012 at 3:04 AM, Evgenii Rudnyiuse...@rudnyi.ru
wrote:


On 21.01.2012 22:03 Evgenii Rudnyi said the following:

On 21.01.2012 21:01 meekerdb said the following:



On 1/21/2012 11:23 AM, Evgenii Rudnyi wrote:


On 21.01.2012 20:00 meekerdb said the following:


On 1/21/2012 4:25 AM, Evgenii Rudnyi wrote:





...

2) If physicists say that information is the entropy, they

must take it literally and then apply experimental
thermodynamics to measure information. This however
seems not to happen.



It does happen. The number of states, i.e. the
information, available from a black hole is calculated from
its thermodynamic properties as calculated by Hawking. At
a more conventional level, counting the states available to
molecules in a gas can be used to determine the specific
heat of the gas and vice versa. The reason the
thermodynamic measures and the information measures are
treated separately in engineering problems is that the
information that is important to engineering is
infinitesimal compared to the information stored in the
microscopic states. So the latter is considered only in
terms of a few macroscopic averages, like temperature and
pressure.

Brent
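
For the black-hole case mentioned above, a rough sketch using the standard
Bekenstein-Hawking formula S = k_B c^3 A / (4 G hbar), with approximate
constants and a solar-mass example chosen purely for illustration:

import math

G     = 6.674e-11      # m^3 kg^-1 s^-2
c     = 2.998e8        # m/s
hbar  = 1.055e-34      # J s
k_B   = 1.381e-23      # J/K
M_sun = 1.989e30       # kg

r_s = 2 * G * M_sun / c**2        # Schwarzschild radius, ~3 km
A   = 4 * math.pi * r_s**2        # horizon area, m^2

S_thermo = k_B * c**3 * A / (4 * G * hbar)   # ~1.4e54 J/K
S_bits   = S_thermo / (k_B * math.log(2))    # ~1.5e77 bits

print(S_thermo, S_bits)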



Doesn't this mean that by information engineers mean
something different than physicists?



I don't think so. A lot of the work on information theory was
done by communication engineers who were concerned with the
effect of thermal noise on bandwidth. Of course engineers
specialize more narrowly than physics, so within different
fields of engineering there are different terminologies and
different measurement methods for things that are unified in
basic physics, e.g. there are engineers who specialize in
magnetism and who seldom need to reflect that it is part of EM,
there are others who specialize in RF and don't worry about
static fields.



Do you mean that engineers use experimental thermodynamics to
determine information?


Evgenii


To be concrete. This is for example a paper from control

J.C. Willems and H.L. Trentelman H_inf control in a behavioral
context: The full information case IEEE Transactions on Automatic
Control Volume 44, pages 521-536, 1999
http://homes.esat.kuleuven.be/~jwillems/Articles/JournalArticles/1999.4.pdf




The term information is there but the entropy is not. Could you please

explain why? Or alternatively, could you please point to papers
where engineers use the concept of the equivalence between the
entropy and information?




Evgenii,

Sure, I could give a few examples as this somewhat intersects with my
line of work.

The NIST 800-90 recommendation (
http://csrc.nist.gov/publications/nistpubs/800-90A/SP800-90A.pdf )
for random number generators is a document for engineers implementing
secure pseudo-random number generators.  An example of where it is
important is when considering entropy sources for seeding a random
number generator.  If you use something completely random, like a
fair coin toss, each toss provides 1 bit of entropy.  The formula is
-log2(predictability).  With a coin flip, you have at best a .5
chance of correctly guessing it, and -log2(.5) = 1.  If you used a
die roll, then each die roll would provide -log2(1/6) = 2.58 bits of
entropy.  The ability to measure unpredictability is necessary to
ensure, for example, that predicting the random inputs that went into
generating a cryptographic key is at least as difficult as
brute-forcing the key.

In addition to security, entropy is also an important concept in the
field of data compression.  The amount of entropy in a given bit
string represents the theoretical minimum number of bits it takes to
represent the information.  If 100 bits contain 100 bits of entropy,
then there is no compression algorithm that can represent those 100
bits with fewer than 100 bits.  However, if a 100 bit string contains
only 50 bits of entropy, you could compress it to 50 bits.  For
example, let's say you had 100 coin flips from an unfair coin.  This
unfair coin comes up heads 90% of the time.  Each flip represents
-log2(.9) = 0.152 bits of entropy.  Thus, a sequence of 100 coin
flips with this biased coin could be represented with 16 bits.  There
are only 15.2 bits of information / entropy contained in that 100 bit
long sequence.

Jason



Jason,

Sorry for being unclear. In my statement I meant the thermodynamic 
entropy. No doubt, in information theory engineers, starting with 
Shannon, use the information entropy. Yet, I wanted to point out that I 
have not seen engineering works where engineers employ the equivalence 
between the thermodynamic entropy and the informational entropy.


Evgenii


Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-03 Thread Evgenii Rudnyi

On 02.02.2012 22:18 Russell Standish said the following:

On Thu, Feb 02, 2012 at 07:45:53PM +0100, Evgenii Rudnyi wrote:

On 01.02.2012 21:51 Stephen P. King said the following:

On 2/1/2012 3:10 PM, Evgenii Rudnyi wrote:

First the thermodynamic entropy is not context dependent. This
must mean that if it is the same as information, then the
latter must not be context dependent as well. Could you please
give me an example of a physical property that is context
dependent?



Temperature is context dependent. If we consider physics at the
level of atoms there is no such quantity as temperature.
Additionally, thermodynamic entropy does require Boltzmann's
constant to be defined, which is a form of context dependency as it
specifies the level at which we are to take micro-states as
macroscopically indistinguishable.


The Boltzmann's constant, as far as I understand, is defined
uniquely. If you talk about some other universe (or Platonia)
where one could imagine something else, then it could be. Yet, in
the world that we know according to empirical scientific studies,
the Boltzmann constant is a fundamental constant. Hence I do not
understand you in this respect.


Boltzmann's constant is a unit conversion constant like c and Planck's
constant, nothing more. It has no fundamental significance.



Indeed, temperature is not available directly at the level of
particles obeying classical or quantum laws. However, the problem
could lie not with temperature itself but rather with the
description at the particle level.

Anyway, I would suggest sticking to the empirical scientific knowledge
that we have. Then I do not understand what you mean when you say that
temperature is context dependent either.



Temperature is an averaged quantity, so whilst technically an
example of emergence, it is the weakest form of emergence.

Evgenii is stating an oft-repeated meme that entropy is not
context-dependent.

It is context dependent because it (possibly implicitly) depends on
what we mean by a thermodynamic state. In thermodynamics, we usually
mean a state defined by temperature, pressure, volume, number of
particles, and so on. The and so on is the context dependent part.
There are actually an enormous number of possible independent
thermodynamic variables that may be relevant in different situations.
In an electrical device, the arrangement of charges might be another
such thermodynamic variable.

Also, even in classic schoolbook thermodynamics, not all of
temperature, pressure, volume and particle number are relevant.
Dropping various of these terms leads to different ensembles
(microcanonical, canonical and grand canonical).

Of course, context dependence does not mean subjective. If two
observers agree on the context, the entropy is quite objective. But
it is a little more complex than something like mass or length.

This is explained very well in Denbigh & Denbigh.




I guess that you have never done a lab in experimental thermodynamics. 
There are classical experiments where people measure heat of combustion, 
heat capacity, equilibrium pressure, equilibrium constants and then 
determine the entropy. If you do it, you see that you can measure the 
entropy the same way as other properties, there is no difference. A good 
example to this end is JANAF Thermochemical Tables (Joint Army-Navy-Air 
Force Thermochemical Tables). You will find a pdf here


http://www.nist.gov/data/PDFfiles/jpcrdM9.pdf

It is about 230 Mb but I guess it is doable to download it. Please open 
it and explain what the difference is between the tabulated entropy and 
the other properties there. How would your personal viewpoint on a thermodynamic 
system influence the numerical values of the entropy tabulated in 
JANAF? What is the difference from mass or length? I do not see it.


You see, the JANAF Tables were started by the military. They needed them to 
compute, for example, the combustion process in rockets, and they have been 
successful. What part of a rocket, then, is context dependent?


This is the main problem with the books on entropy and information. They 
do not consider thermodynamic tables, they do not work out simple 
thermodynamic examples. For example let us consider the next problem:


---
Problem. Given temperature, pressure, and initial number of moles of 
NH3, N2 and H2, compute the equilibrium composition.


To solve the problem one should find thermodynamic properties of NH3, N2 
and H2 for example in the JANAF Tables and then compute the equilibrium 
constant.


From thermodynamics tables (all values are molar values for the
standard pressure 1 bar, I have omitted the symbol o for simplicity but
it is very important not to forget it):

Del_f_H_298(NH3), S_298(NH3), Cp(NH3), Del_f_H_298(N2), S_298(N2),
Cp(N2), Del_f_H_298(H2), S_298(H2), Cp(H2)

2NH3 = N2 + 3H2

Del_H_r_298 = Del_f_H_298(N2) + 3 Del_f_H_298(H2) - 2 Del_f_H_298(NH3)

Del_S_r_298 = S_298(N2) + 3 S_298(H2) - 2 S_298(NH3)

Del_Cp_r = Cp(N2) 

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-03 Thread Evgenii Rudnyi

On 02.02.2012 22:35 Russell Standish said the following:

On Wed, Feb 01, 2012 at 09:17:41PM +0100, Evgenii Rudnyi wrote:

On 29.01.2012 23:06 Russell Standish said the following:


Absolutely! But at zero kelvin, the information storage capacity
of the device is precisely zero, so cooling only works to a
certain point.



I believe that you have mentioned once that information is
negentropy. If yes, could you please comment on that? What
negentropy would mean?


Schrödinger first pointed out that living systems must export
entropy, and coined the term negative entropy to refer to this.
Brillouin shortened this to negentropy.

The basic formula is S_max = S + I.

S_max is the maximum possible value for entropy to take - the value
of entropy at thermodynamic equilibrium for a microcanonical
ensemble. S is the usual entropy, which for non-equilibrium systems
will be typically lower than S_max, and even for equilibrium systems
can be held lower by physical constraints. I is the difference, and
this is what Brillouin called negentropy. It is an information - the
information encoded in that state.

Try looking up http://en.wikipedia.org/wiki/Negentropy


Could you please explain how the negentropy is related to experimental 
thermodynamics? You will find in the previous message the link to the 
JANAF tables and a basic thermodynamic problem. Could you please 
demonstrate how the negentropy will help there?






In general, I do not understand what it means that information
at zero Kelvin is zero. Let us take a coin and cool it down. Do
you mean that the text on the coin will disappear? Or do you mean that
no device can read this text at zero Kelvin?



I vaguely remembered that S_max=0 at absolute zero. If it were, then
both S and I must be zero, because these are all nonnegative
quantities. But http://en.wikipedia.org/wiki/Absolute_zero states
only that entropy is at a minimum, not strictly zero. In which case,
I withdraw that comment.

Cheers


First, we must not forget the Third Law, which states that the change 
in entropy in any reaction, as well as its derivatives, goes to zero as the 
temperature goes to zero Kelvin.


In this respect your question is actually nice, as now, I believe, we 
see that it is possible to have a case when the information capacity 
will be more than the number of physical states.


Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-02 Thread John Clark
On Wed, Feb 1, 2012 at 3:10 PM, Evgenii Rudnyi use...@rudnyi.ru wrote:

 Could you please give me an example of a physical property that is
 context dependent?


Off the top of my head, mass, velocity, duration and length.

 John K Clark
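
Presumably frame dependence is the sense intended; a minimal sketch (special
relativity, with an assumed relative speed of 0.8 c chosen only for illustration):

import math

c = 299792458.0            # m/s
v = 0.8 * c                # relative speed of the observer (illustrative)
gamma = 1.0 / math.sqrt(1.0 - (v/c)**2)    # Lorentz factor, 5/3 here

proper_length = 1.0        # metre stick, measured in its own rest frame
proper_time   = 1.0        # one second on a clock, in its own rest frame

print(proper_length / gamma)   # length measured by the moving observer: 0.6 m
print(proper_time * gamma)     # elapsed time measured by the moving observer: ~1.67 s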




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-02 Thread Evgenii Rudnyi

On 01.02.2012 22:51 John Mikes said the following:

Evgenii, I am not sure if it is your text, or Russell's:

***In general, I do not understand what it means that
information at zero Kelvin is zero. Let us take a coin and cool it
down. Do you mean that the text on the coin will disappear? Or do you
mean that no device can read this text at zero Kelvin?*


This was my question to Russell.


 ** I
doubt that the text embossed on a coin is its *information*. It
is part of the physical structure as e.g. the roundness, size, or
material(?) characteristics - all, what nobody can imagine how to
change for the condition of 0-Kelvin. The abs. zero temp. conditions


Yes, but when we speak about information carrier (book, a hard drive, 
DVD, flash memory) it is exactly the same. And it has nothing to do with 
the total number of physical states in the device, as this example with 
zero temperature nicely shows.


Evgenii


are extrapolated the best way we could muster. A matter of (sci.)
faith. Maybe the so called 'interstitial' spaces also collapse? I am
not for a 'physicalistic' worldview - rather an agnostic about
'explanations' of diverse epochs based on then recent 'findings'
(mostly mathematically justified??? - realizing that we may be up to
lots of novelties we have no idea about today, not even of the
directions they may shove our views into. I say that in comparison to
our 'conventional scientific' - even everyday's - views of the world
in the past, before and after fundamental knowledge-domains were
added to our inventory. I do not condone evidences that must be,
because THERE IS NO OTHER WAY - in our existing ignorance of course.
Atoms? well, if there *is* 'matter'? (MASS??) even my
(macro)molecules I invented are suspect. So 'entropy' is a nice term
in (classical?) thermodynamics what I coined in 1942 as *the science
that tells us how things would proceed wouldn't they proceed as they
do indeed* thinking of Carnot and the isotherm/reversible
equilibria, etc. - way before the irreversible kind was taught in
college courses. Information is another rather difficult term, I like
to use 'relation' and leave it open what so far unknown relations may
affect our processes we assign to 'causes' known within the model of
the world we think we are in. The rest (including our misunderstood
model - domain) is what I may call an 'infinite complexity' of which
we are part - mostly ignorant about the 'beyond model' everything.

We 'fabricate' our context, try to explain by the portion we know of
- as if it was the totality - and live in our happy conventional
scientific terms. Human ingenuity constructed a miraculous science
and technology that is ALMOST good (some mistakes notwithstanding
occurring), then comes M. Curie, Watson-Crick, Fleming, Copernicus,
Volta, etc. and we re-write the schoolbooks.

John M

**

On Wed, Feb 1, 2012 at 3:10 PM, Evgenii Rudnyiuse...@rudnyi.ru
wrote:


On 29.01.2012 22:49 Russell Standish said the following:


On Sun, Jan 29, 2012 at 04:23:12PM +0100, Evgenii Rudnyi wrote:


On 28.01.2012 23:26 meekerdb said the following:


On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:



A good suggestion. It well might be that I express my thoughts
unclear, sorry for that. Yet, I think that my examples show
that

1) There is information


and entropy

that engineers employ.





Some engineers employ information, some the thermodynamic entropy.
I have not seen though an engineering paper where both information
and the thermodynamic entropy have been used as synonyms.

2) There is the thermodynamic entropy.




+ thermodynamic information



3) Numerical values in 1) and 2) are not related to each
other.



Fixed that for you. Why should you expect the different types of
information that come from different contexts to have the same
numerical value? The whole point of On complexity and emergence
is that notions of information and entropy are completely context
sensitive (that is not to say they're subjective as such - people
agreeing on the context will agree on the numerical values).




First the thermodynamic entropy is not context dependent. This must
mean that if it is the same as information, then the latter must
not be context dependent as well. Could you please give me an
example of a physical property that is context dependent?

Second, when I have different numerical values, this could mean
that the units are different. Yet, if this is not the case, then in
my view we are talking about two different entities.

Could you please explain then what is common between 1) and 2)?

Evgenii






Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-02 Thread Evgenii Rudnyi

On 01.02.2012 21:51 Stephen P. King said the following:

On 2/1/2012 3:10 PM, Evgenii Rudnyi wrote:

On 29.01.2012 22:49 Russell Standish said the following:

On Sun, Jan 29, 2012 at 04:23:12PM +0100, Evgenii Rudnyi wrote:

On 28.01.2012 23:26 meekerdb said the following:

On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:


A good suggestion. It well might be that I express my thoughts
unclear, sorry for that. Yet, I think that my examples show
that

1) There is information

and entropy


that engineers employ.


Some engineers employ information, some the thermodynamic entropy.
I have not seen though an engineering paper where both information
and the thermodynamic entropy have been used as synonyms.


2) There is the thermodynamic entropy.


+ thermodynamic information



3) Numerical values in 1) and 2) are not related to each
other.



Fixed that for you. Why should you expect the different types of
information that come from different contexts to have the same
numerical value? The whole point of On complexity and emergence
is that notions of information and entropy are completely context
sensitive (that is not to say they're subjective as such - people
agreeing on the context will agree on the numerical values).



First the thermodynamic entropy is not context dependent. This must
 mean that if it is the same as information, then the latter must
not be context dependent as well. Could you please give me an
example of a physical property that is context dependent?



Temperature is context dependent. If we consider physics at the level
of atoms there is no such quantity as temperature. Additionally,
thermodynamic entropy does require Boltzmann's constant to be defined,
 which is a form of context dependency as it specifies the level at
which we are to take micro-states as macroscopically
indistinguishable.


The Boltzmann's constant, as far as I understand, is defined uniquely. 
If you talk about some other universe (or Platonia) where one could 
imagine something else, then it could be. Yet, in the world that we know 
according to empirical scientific studies, the Boltzmann constant is a 
fundamental constant. Hence I do not understand you in this respect.


Indeed, temperature is not available directly at the level of particles 
obeying classical or quantum laws. However, the problem could lie not with 
temperature itself but rather with the description at the 
particle level.


Anyway, I would suggest sticking to the empirical scientific knowledge that 
we have. Then I do not understand what you mean when you say that temperature is 
context dependent either.


We can imagine very different worlds indeed. Yet, right now we discuss 
the question (I will repeat from the email to John) as follows:


When Russell says that information is context dependent, we talk about 
for example a DVD. Then information capacity as defined by the company 
and the number of physical states are completely different. Hence the 
notion from Russell that information is context dependent.


If you mean that the temperature and the Boltzmann constant are context 
dependent in the same way, could you please give practical examples?


Evgenii



Onward!

Stephen



Second, when I have different numerical values, this could mean
that the units are different. Yet, if this is not the case, then in
my view we are talking about two different entities.

Could you please explain then what is common between 1) and 2)?

Evgenii












Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-02 Thread meekerdb

On 2/2/2012 10:35 AM, Evgenii Rudnyi wrote:
Yes, but when we speak about information carrier (book, a hard drive, DVD, flash memory) 
it is exactly the same. And it has nothing to do with the total number of physical 
states in the device, as this example with zero temperature nicely shows.


That's not true.  The arrangement of ink on the page, the embossed face of the coin, do 
contribute to the physical states.  It's just that the information encoded by them is 
infinitesimal compared to the information required to define the microscopic states, e.g. 
the vibrational mode of every atom.  So when we're concerned with heat energy that changes 
the vibrational modes we neglect the pattern of ink and the embossing.  And when we're 
reading we are only interested in the information conveyed by a well defined channel, and 
we ignore what information might be encoded in the microscopic states.  But the two are 
both present.


Brent.




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-02 Thread Evgenii Rudnyi

On 02.02.2012 20:00 meekerdb said the following:

On 2/2/2012 10:35 AM, Evgenii Rudnyi wrote:

Yes, but when we speak about information carrier (book, a hard
drive, DVD, flash memory) it is exactly the same. And it has
nothing to do with the total number of physical states in the
device, as this example with zero temperature nicely shows.


That's not true. The arrangement of ink on the page, the embossed
face of the coin, do contribute to the physical states. It's just
that the information encoded by them is infinitesimal compared to
the information required to define the microscopic states, e.g. the
vibrational mode of every atom. So when we're concerned with heat
energy that changes the vibrational modes we neglect the pattern of
ink and the embossing. And when we're reading we are only interested
in the information conveyed by a well defined channel, and we ignore
what information might be encoded in the microscopic states. But the
two are both present.

Brent.



Yes, I agree with this, but I think it changes nothing about the term 
information. We have a number of physical states in a carrier (that is 
influenced indeed by for example the arrangement of ink on the page) and 
we have the information capability as defined by the company that sells 
the carrier.


By the way, the example with the zero temperature (or strictly speaking 
with temperature going to zero Kelvin) seems to show that the 
information capability could be even more than the number of physical 
states.


Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-02 Thread Russell Standish
On Thu, Feb 02, 2012 at 07:45:53PM +0100, Evgenii Rudnyi wrote:
 On 01.02.2012 21:51 Stephen P. King said the following:
 On 2/1/2012 3:10 PM, Evgenii Rudnyi wrote:
 First the thermodynamic entropy is not context dependent. This must
  mean that if it is the same as information, then the latter must
 not be context dependent as well. Could you please give me an
 example of a physical property that is context dependent?
 
 
 Temperature is context dependent. If we consider physics at the level
 of atoms there is no such quantity as temperature. Additionally,
 thermodynamic entropy does require Boltzmann's constant to be defined,
  which is a form of context dependency as it specifies the level at
 which we are to take micro-states as macroscopically
 indistinguishable.
 
 The Boltzmann's constant, as far as I understand, is defined
 uniquely. If you talk about some other universe (or Platonia) where
 one could imagine something else, then it could be. Yet, in the
 world that we know according to empirical scientific studies, the
 Boltzmann's constant is a fundamental constant. Hence I do not
 understand you in this respect.

Boltzmann's constant is a unit conversion constant like c and Planck's
constant, nothing more. It has no fundamental significance.
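
On that reading the conversion is a single multiplicative constant; a minimal
sketch (one bit of entropy corresponds to k_B ln 2 in J/K; the 1 TB figure
below is purely illustrative):

import math

k_B = 1.380649e-23                   # J/K
J_per_K_per_bit = k_B * math.log(2)  # ~9.57e-24 J/K per bit

print(J_per_K_per_bit)
# e.g. the nominal 8e12 bits of a 1 TB drive, expressed as an entropy:
print(8e12 * J_per_K_per_bit)        # ~7.7e-11 J/K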

 
 Indeed, temperature is not available directly at the level of
 particles obeying classical or quantum laws. However, the problem could
 lie not with temperature itself but rather with the
 description at the particle level.
 
 Anyway, I would suggest sticking to the empirical scientific knowledge
 that we have. Then I do not understand what you mean when you say that
 temperature is context dependent either.
 

Temperature is an averaged quantity, so whilst technically an example
of emergence, it is the weakest form of emergence.

Evgenii is stating an oft-repeated meme that entropy is not
context-dependent. 

It is context dependent because it (possibly implicitly) depends on
what we mean by a thermodynamic state. In thermodynamics, we usually
mean a state defined by temperature, pressure, volume, number of
particles, and so on. The and so on is the context dependent
part. There are actually an enormous number of possible independent
thermodynamic variables that may be relevant in different
situations. In an electrical device, the arrangement of charges might
be another such thermodynamic variable.

Also, even in classic schoolbook thermodynamics, not all of
temperature, pressure, volume and particle number are
relevant. Dropping various of these terms leads to different ensembles
(microcanonical, canonical and grand canonical).

Of course, context dependence does not mean subjective. If two
observers agree on the context, the entropy is quite objective. But it
is a little more complex than something like mass or length.

This is explained very well in Denbigh & Denbigh.







Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-02 Thread Russell Standish
On Wed, Feb 01, 2012 at 09:17:41PM +0100, Evgenii Rudnyi wrote:
 On 29.01.2012 23:06 Russell Standish said the following:
 
 Absolutely! But at zero kelvin, the information storage capacity of
 the device is precisely zero, so cooling only works to a certain
 point.
 
 
 I believe that you have mentioned once that information is
 negentropy. If yes, could you please comment on that? What
 negentropy would mean?

Schrödinger first pointed out that living systems must export entropy,
and coined the term negative entropy to refer to this. Brillouin
shortened this to negentropy.

The basic formula is S_max = S + I.

S_max is the maximum possible value for entropy to take - the value of
entropy at thermodynamic equilibrium for a microcanonical ensemble. S
is the usual entropy, which for non-equilibrium systems will be
typically lower than S_max, and even for equilibrium systems can be
held lower by physical constraints. I is the difference, and this is what
Brillouin called negentropy. It is an information - the information
encoded in that state.

Try looking up http://en.wikipedia.org/wiki/Negentropy
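
A toy illustration of S_max = S + I, in bits (a sketch only; the system is an
array of n two-state cells, and the constrained state is taken, arbitrarily,
to have each cell set with probability 0.1; multiply by k_B ln 2 for J/K):

import math

def shannon_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 1000
S_max = n * math.log2(2)          # all 2**n microstates equally likely: n bits

# a constrained, non-equilibrium macrostate: each cell is 1 with probability 0.1
S = n * shannon_bits([0.1, 0.9])  # ~469 bits
I = S_max - S                     # the negentropy, ~531 bits

print(S_max, S, I)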

 
 In general, I do not understand what it means that information
 at zero Kelvin is zero. Let us take a coin and cool it down. Do you
 mean that the text on the coin will disappear? Or do you mean that no
 device can read this text at zero Kelvin?
 

I vaguely remembered that S_max=0 at absolute zero. If it were, then
both S and I must be zero, because these are all nonnegative
quantities. But http://en.wikipedia.org/wiki/Absolute_zero states only
that entropy is at a minimum, not strictly zero. In which case, I
withdraw that comment.

Cheers





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-02 Thread Jason Resch
On Sun, Jan 22, 2012 at 3:04 AM, Evgenii Rudnyi use...@rudnyi.ru wrote:

 On 21.01.2012 22:03 Evgenii Rudnyi said the following:

  On 21.01.2012 21:01 meekerdb said the following:

 On 1/21/2012 11:23 AM, Evgenii Rudnyi wrote:

 On 21.01.2012 20:00 meekerdb said the following:

 On 1/21/2012 4:25 AM, Evgenii Rudnyi wrote:



 ...

  2) If physicists say that information is the entropy, they
 must take it literally and then apply experimental
 thermodynamics to measure information. This however seems
 not to happen.


 It does happen. The number of states, i.e. the information,
 available from a black hole is calculated from its
 thermodynamic properties as calculated by Hawking. At a more
 conventional level, counting the states available to molecules
 in a gas can be used to determine the specific heat of the gas
 and vice versa. The reason the thermodynamic measures and the
 information measures are treated separately in engineering
 problems is that the information that is important to
 engineering is infinitesimal compared to the information stored
 in the microscopic states. So the latter is considered only in
 terms of a few macroscopic averages, like temperature and
 pressure.

 Brent


 Doesn't this mean that by information engineers mean something
 different than physicists?


 I don't think so. A lot of the work on information theory was done
 by communication engineers who were concerned with the effect of
 thermal noise on bandwidth. Of course engineers specialize more
 narrowly than physics, so within different fields of engineering
 there are different terminologies and different measurement
 methods for things that are unified in basic physics, e.g. there
 are engineers who specialize in magnetism and who seldom need to
 reflect that it is part of EM, there are others who specialize in
 RF and don't worry about static fields.


 Do you mean that engineers use experimental thermodynamics to
 determine information?

 
  Evgenii

 To be concrete. This is for example a paper from control

 J.C. Willems and H.L. Trentelman
 H_inf control in a behavioral context: The full information case
 IEEE Transactions on Automatic Control
 Volume 44, pages 521-536, 1999
 http://homes.esat.kuleuven.be/~jwillems/Articles/JournalArticles/1999.4.pdf

 The term information is there but the entropy is not. Could you please
 explain why? Or alternatively, could you please point to papers where
 engineers use the concept of the equivalence between the entropy and
 information?



Evgenii,

Sure, I could give a few examples as this somewhat intersects with my line
of work.

 The NIST 800-90 recommendation (
http://csrc.nist.gov/publications/nistpubs/800-90A/SP800-90A.pdf ) for
random number generators is a document for engineers implementing secure
pseudo-random number generators.  An example of where it is important is
when considering entropy sources for seeding a random number generator.  If
you use something completely random, like a fair coin toss, each toss
provides 1 bit of entropy.  The formula is -log2(predictability).  With a
coin flip, you have at best a .5 chance of correctly guessing it, and
-log2(.5) = 1.  If you used a die roll, then each die roll would provide
-log2(1/6) = 2.58 bits of entropy.  The ability to measure unpredictability
is necessary to ensure, for example, that predicting the random inputs that
went into generating a cryptographic key is at least as difficult as
brute-forcing the key.

In addition to security, entropy is also an important concept in the field
of data compression.  The amount of entropy in a given bit string
represents the theoretical minimum number of bits it takes to represent the
information.  If 100 bits contain 100 bits of entropy, then there is no
compression algorithm that can represent those 100 bits with fewer than 100
bits.  However, if a 100 bit string contains only 50 bits of entropy, you
could compress it to 50 bits.  For example, let's say you had 100 coin
flips from an unfair coin.  This unfair coin comes up heads 90% of the
time.  Each flip represents -log2(.9) = 0.152 bits of entropy.  Thus, a
sequence of 100 coin flips with this biased coin could be represented with 16
bits.  There are only 15.2 bits of information / entropy contained in that
100 bit long sequence.

Jason
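
A small sketch of the -log2(predictability) arithmetic above (illustrative
only; the 0.152 figure for the 90/10 coin is the conservative per-flip
"min-entropy", while its Shannon entropy works out to about 0.47 bits per flip):

import math

def surprisal_bits(p_best_guess):
    """Entropy credited per sample, given the best single-guess probability."""
    return -math.log2(p_best_guess)

print(surprisal_bits(0.5))    # fair coin toss    -> 1.0 bit
print(surprisal_bits(1/6))    # fair die roll     -> ~2.585 bits
print(surprisal_bits(0.9))    # 90/10 biased coin -> ~0.152 bits per flip

# Shannon entropy of the 90/10 coin, for comparison: ~0.469 bits per flip
print(-(0.9*math.log2(0.9) + 0.1*math.log2(0.1)))

# e.g. fair die rolls needed to seed a 128-bit key at ~2.585 bits per roll:
print(math.ceil(128 / surprisal_bits(1/6)))   # 50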




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-01 Thread Evgenii Rudnyi

On 29.01.2012 22:49 Russell Standish said the following:

On Sun, Jan 29, 2012 at 04:23:12PM +0100, Evgenii Rudnyi wrote:

On 28.01.2012 23:26 meekerdb said the following:

On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:


A good suggestion. It well might be that I express my thoughts
unclear, sorry for that. Yet, I think that my examples show that

1) There is information

and entropy


that engineers employ.


Some engineers employ information, some the thermodynamic entropy. I 
have not seen though an engineering paper where both information and the 
thermodynamic entropy have been used as synonyms.



2) There is the thermodynamic entropy.


+ thermodynamic information



3) Numerical values in 1) and 2) are not related to each other.



Fixed that for you. Why should you expect the different types of
information that come from different contexts to have the same
numerical value? The whole point of On complexity and emergence is
that notions of information and entropy are completely context
sensitive (that is not to say they're subjective as such - people
agreeing on the context will agree on the numerical values).



First the thermodynamic entropy is not context dependent. This must mean 
that if it is the same as information, then the latter must not be 
context dependent as well. Could you please give me an example of a 
physical property that is context dependent?


Second, when I have different numerical values, this could mean that the 
units are different. Yet, if this is not the case, then in my view we 
are talking about two different entities.


Could you please explain then what is common between 1) and 2)?

Evgenii








Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-01 Thread Evgenii Rudnyi

On 29.01.2012 23:00 Russell Standish said the following:

On Sun, Jan 29, 2012 at 04:30:38PM +0100, Evgenii Rudnyi wrote:


The problem that I see is that the entropy changes when the
temperature changes. Or do you claim that the entropy of the
memory stick/DVD/hard disc remains the same when its temperature
changes for example from 15 to 25 degrees?


The entropy changes.



Anyway, I do not see how one can obtain the information capacity
of the storage devices from the thermodynamic entropy and this is
my point.



Who was ever claiming that? The theoretical maximum possible
information storage is related, though.


Do you claim that the information capacity of a memory stick/DVD/hard
disk, the capacity for which we pay money, is equivalent to the thermodynamic
entropy of the device?



Never. The best you have is I=S_max-S, where I is the theoretical


What are S_max and S in this equation?

Evgenii


maximum possible information storage. The value C (capacity of the
storage device) must satisfy

C <= I.

Usually C < I, for technological reasons. Also, it is undesirable
to have C vary with temperature, whereas I does vary in general
(particularly across phase transitions).

The information content of a drive is another number D <= C, usually
much less, and very dependent on the user of that drive. If the
drive is encrypted, and the user has lost the key, the information
content is close to zero.

The quantities I, C and D are all numerical quantities having the
name information.

Cheers





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-01 Thread Evgenii Rudnyi

On 29.01.2012 23:06 Russell Standish said the following:

On Sat, Jan 28, 2012 at 09:41:27PM -0800, meekerdb wrote:

On 1/28/2012 3:42 PM, Russell Standish wrote:

On the other hand, if you just gave me the metallic platter from
the hard disk, and did not restrict in any way the technology
used to read and write the data, then in principle, the higher
the temperature, the more information is capable of being encoded
on the disk.


I don't think this is quite right. A higher temperature means that
there are more energy states available.  But the concept of
'temperature' implies that these are occupied in a random way
(according to the micro-canonical ensemble). For us to read and
write data requires that the act of reading or writing a bit moves
the distribution of states in phase space enough that it is
distinguishable from the random fluctuations due to temperature.
So if the medium is hotter, you need to use more energy to read
and write a bit.  This of course runs into the problems you note
below.


Hence the requirement that technology not be fixed. It is a
theoretician's answer :).


So in practice it is often colder systems that allow us to store
more data because then we can use small energy differences to
encode bits.


Absolutely! But at zero kelvin, the information storage capacity of
the device is precisely zero, so cooling only works to a certain
point.



I believe that you have mentioned once that information is negentropy. 
If yes, could you please comment on that? What negentropy would mean?


In general, I do not understand what it means that information at 
zero Kelvin is zero. Let us take a coin and cool it down. Do you mean 
that the text on the coin will disappear? Or do you mean that no device 
can read this text at zero Kelvin?


Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-01 Thread Stephen P. King

On 2/1/2012 3:10 PM, Evgenii Rudnyi wrote:

On 29.01.2012 22:49 Russell Standish said the following:

On Sun, Jan 29, 2012 at 04:23:12PM +0100, Evgenii Rudnyi wrote:

On 28.01.2012 23:26 meekerdb said the following:

On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:


A good suggestion. It well might be that I express my thoughts
unclear, sorry for that. Yet, I think that my examples show that

1) There is information

and entropy


that engineers employ.


Some engineers employ information, some the thermodynamic entropy. I 
have not seen though an engineering paper where both information and 
the thermodynamic entropy have been used as synonyms.



2) There is the thermodynamic entropy.


+ thermodynamic information



3) Numerical values in 1) and 2) are not related to each other.



Fixed that for you. Why should you expect the different types of
information that come from different contexts to have the same
numerical value? The whole point of On complexity and emergence is
that notions of information and entropy are completely context
sensitive (that is not to say they're subjective as such - people
agreeing on the context will agree on the numerical values).



First the thermodynamic entropy is not context dependent. This must 
mean that if it is the same as information, then the latter must not 
be context dependent as well. Could you please give me an example of a 
physical property that is context dependent?




Temperature is context dependent. If we consider physics at the 
level of atoms there is no such quantity as temperature. Additionally, 
thermodynamic entropy does require Boltzmann's constant to be defined, 
which is a form of context dependency as it specifies the level at which 
we are to take micro-states as macroscopically indistinguishable.


Onward!

Stephen


Second, when I have different numerical values, this could mean that 
the units are different. Yet, if this is not the case, then in my view 
we are talking about two different entities.


Could you please explain then what is common between 1) and 2)?

Evgenii










Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-01 Thread John Mikes
Evgenii, I am not sure if it is your text, or Russell's:

   ***In general, I do not understand what it means that information
at zero Kelvin is zero. Let us take a coin and cool it down. Do you mean
that the text on the coin will disappear? Or do you mean that no device
can read this text at zero Kelvin?*
**
I doubt that the text embossed on a coin is its *information*. It is
part of the physical structure as e.g. the roundness, size, or
material(?) characteristics - all, what nobody can imagine how to change
for the condition of 0-Kelvin. The abs. zero temp. conditions are
extrapolated the best way we could muster. A matter of (sci.) faith. Maybe
the so called 'interstitial' spaces also collapse? I am not for a
'physicalistic' worldview - rather an agnostic about 'explanations' of
diverse epochs based on then recent 'findings' (mostly mathematically
justified??? -
realizing that we may be up to lots of novelties we have no idea about
today, not even of the directions they may shove our views into. I say that
in comparison to our 'conventional scientific' - even everyday's - views of
the world in the past, before and after fundamental knowledge-domains were
added to our inventory.
I do not condone evidences that must be, because THERE IS NO OTHER WAY -
in our existing ignorance of course. Atoms? well, if there *is* 'matter'?
(MASS??) even my (macro)molecules I invented are suspect.
So 'entropy' is a nice term in (classical?) thermodynamics what I coined in
1942 as *the science that tells us how things would proceed wouldn't they
proceed as they do indeed* thinking of Carnot and the isotherm/reversible
equilibria, etc. - way before the irreversible kind was taught in college
courses. Information is another rather difficult term, I like to use
'relation' and leave it open what so far unknown relations may affect our
processes we assign to 'causes' known within the model of the world we
think we are in. The rest (including our misunderstood model - domain) is
what I may call an 'infinite complexity' of which we are part - mostly
ignorant about the 'beyond model' everything.

We 'fabricate' our context, try to explain by the portion we know of - as
if it was the totality - and live in our happy conventional scientific
terms.
Human ingenuity constructed a miraculous science and technology that is
ALMOST good (some mistakes notwithstanding occurring), then comes M. Curie,
Watson-Crick, Fleming, Copernicus, Volta, etc. and we re-write the
schoolbooks.

John M

**

On Wed, Feb 1, 2012 at 3:10 PM, Evgenii Rudnyi use...@rudnyi.ru wrote:

 On 29.01.2012 22:49 Russell Standish said the following:

 On Sun, Jan 29, 2012 at 04:23:12PM +0100, Evgenii Rudnyi wrote:

 On 28.01.2012 23:26 meekerdb said the following:

 On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:


 A good suggestion. It well might be that I express my thoughts
 unclear, sorry for that. Yet, I think that my examples show that

 1) There is information

 and entropy

 that engineers employ.


 Some engineers employ information, some the thermodynamic entropy. I have
 not seen though an engineering paper where both information and the
 thermodynamic entropy have been used as synonyms.

  2) There is the thermodynamic entropy.


 + thermodynamic information


 3) Numerical values in 1) and 2) are not related to each other.


 Fixed that for you. Why should you expect the different types of
 information that come from different contexts to have the same
 numerical value? The whole point of On complexity and emergence is
 that notions of information and entropy are completely context
 sensitive (that is not to say they're subjective as such - people
 agreeing on the context will agree on the numerical values).



 First the thermodynamic entropy is not context dependent. This must mean
 that if it is the same as information, then the latter must not be context
 dependent as well. Could you please give me an example of a physical
 property that is context dependent?

 Second, when I have different numerical values, this could mean that the
 units are different. Yet, if this is not the case, then in my view we are
 talking about two different entities.

 Could you please explain then what is common between 1) and 2)?

 Evgenii







Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-31 Thread John Clark
On Mon, Jan 30, 2012  Craig Weinberg whatsons...@gmail.com wrote:

 I just explained


3 days after learning that the subject even existed here we sit at your
feet while you explain all about it to us.


  that Shannon information has nothing to do with anything except data
 compression.


Except for data compression? Except for identifying the core, must have,
part of any message. Except for telling us exactly what's important and
what is not. Except for showing how to build things like the internet.

Except for that Mrs. Lincoln how did you like the play?

Shannon can tell you how many books can be sent over a noisy wire in a
given amount of time without error, and if you're willing to tolerate a few
errors Shannon can tell you how to send even more. If the contents of books
is not information what do you call the contents of books?

 Nothing can become a 'file' without irreversible loss.


Ah, well, that explains why I can't make heads or tails out of your ideas,
all I've seen is your mail files, now if I'd seen your original glorious
Email just as it was as you typed it on your computer screen with no
irreversible loss I would have long ago become convinced you were right and
were in fact the second coming of Isaac Newton. So when you respond to this
post please don't send me a file full of irreversible loss, send me your
ORIGINAL, send me the real deal.


  The terms signal and noise refer to information (signal) and entropy
 (noise). Get it straight.


One man's signal is another man's noise, to a fan of hisses and clicks and
pops the music is the noise.  First you decide what you want to call the
signal and then Shannon can tell you what the signal to noise ratio is and
he can show you ways to improve it.

 And your way of dealing with it is to say it (bits electrons information
 logic etc) does not exist. I would never have guessed that coming up with a
 theory of everything could be so easy.


 If you understand my hypothesis then you will see there is no reason to
 think they exist.


Then I dearly hope my mind never goes so soft that I understand your
hypothesis.

 Just as you think free will has no reason to exist.


No no a thousand times no! Free will would have to improve dramatically
before it could have the lofty property of nonexistence; free will is an
idea so bad it's not even wrong.


  I thought Foucault's Discipline and Punish was one of the most
 interesting books I've ever read.


I don't consider social criticism a part of philosophy even if I agree with
it because it always includes matters of taste. Professional philosophers
might write interesting books about history or about what society should or
should not do, but none of them have contributed to our understanding of
the nature of reality in centuries. That's not to say philosophy hasn't
made progress, it just wasn't made by philosophers.

 Feynman I think would have been intrigued by my ideas


Delusions of grandeur.

 John K Clark




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-30 Thread Craig Weinberg
On Jan 30, 12:03 am, John Clark johnkcl...@gmail.com wrote:
 On Sun, Jan 29, 2012 Craig Weinberg whatsons...@gmail.com wrote:

  I'm not talking about fluid flow,

 OK

  I'm talking about simulating everything - potential and actual chemical
  reactions, etc.

 OK

  Water can be described by multiplying the known interactions of H2O,

 But many, probably most, of water's interactions are unknown to this
 day. Virtually all of organic chemistry (including DNA reactions!)
 involves water somewhere in the chain of reaction, but organic chemistry
 is very far from a closed subject, there is still much to learn.

Cool, I didn't know that. What about DNA, though? Why would it be any
less mysterious?

 Another
 example, up to now nobody has derived the temperature that water freezes
 at from first principles because the resulting quantum mechanical
 equations are so mathematically complicated that nobody has yet figured
 out how to solve them.

Water is strange stuff. Its blue color comes from inside it too:
intramolecular absorption rather than reflection.


  DNA would need many more variables.

 BULLSHIT!


Why?

  Non-Shannon information would be anything that is not directly involved
  in the compression of a digitally sampled description into another digital
  description.

 In other words non-Shannon information is gaseous philosophical flatulence.

Uhh, what? I just explained that Shannon information has nothing to do
with anything except data compression. It's like I just explained what
a catalytic converter is and you said 'in other words non-catalytic
converters are gaseous philosophical flatulence.'


         Shannon information is not information in general, it is [...]



 Shannon published his work in 1948 but you never even heard about it
 until 3 days ago, and now you're a great world authority on the subject
 telling us all exactly what it does and does not mean.

I'm only the expert compared to you, since your explanation which you
argued with all the authority of a seasoned expert was dead wrong.
Precisely wrong.

 I don't mind
 ignorance, I'm ignorant about a lot of stuff myself, but there is a
 certain kind of arrogant aggressive ignorance that I find very distasteful.

That sentence embodies it perfectly.


 In contrast Richard Feynman displayed humble ignorance, he did as much
 as anyone to develop Quantum Mechanics but he said I think it's safe to
 say that nobody understands Quantum Mechanics, in describing the work
 that won him the Nobel Prize he said he found a way to sweep
 mathematical difficulties under the rug. He also said I know how hard
 it is to really know something; how careful you have to be about
 checking the experiments; how easy it is to make mistakes and fool
 yourself. I know what it means to know something.

Yes, I'm familiar with Feynman.


         Compression and encryption are deformations.



 If you can get the exact same file out after compression or encryption
 then obviously nothing has been lost and all deformations or shrinkage
 are reversible.

Nothing can become a 'file' without irreversible loss. Once it's a
file, sure you can do all kinds of transformations to it, but you'll
never get the original live band playing a song off of an mp3.


         I understand what you mean completely



 Apparently not

No, I have understood you from the start. I knew you were wrong about
information and entropy and you were. You don't understand my position
though, so you assume it's senseless and throw things in my general
direction.


         White noise is called noise for a reason.



And it's called white for a reason: an evil occidental mindset
conspiracy created by round-eyed white devils.

I would imagine it's called white because it is additive interference.
My point still stands. The terms signal and noise refer to information
(signal) and entropy (noise). Get it straight. Or don't.


  How do you expect mathematics to deal with anything as subjective as
  quality? A novel that's high quality to you may be junk to me.

  I don't expect mathematics to deal with it. I expect a theory of
  everything to deal with it.

 And your way of dealing with it is to say it (bits electrons information
 logic etc) does not exist. I would never have guessed that coming up
 with a theory of everything could be so easy.

If you understand my hypothesis then you will see there is no reason
to think they exist. Just as you think free will has no reason to
exist.


  I'm not a big philosophy or religion fan myself but Wittgenstein,
 Heidegger, Sartre, Foucault, Kierkegaard were recent and had some
  impressive things to say.

 As I've said before nearly everything they and all other recent
 philosophers say can be put into one of four categories:

 1) False.
 2) True but obvious, a truism disguised in pretentious language.
 3) True and deep but discovered first and explained better by a
 mathematician or scientist or someone else who didn't write
philosopher in the box labeled occupation on his tax form.
4) So bad it's not even wrong.

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-29 Thread Evgenii Rudnyi

On 28.01.2012 23:26 meekerdb said the following:

On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:


...


You disagree that engineers do not use thermodynamic entropy



Yes. I disagreed that information has nothing to do with
thermodynamic entropy, as you wrote above. You keep switching
formulations. You write X and ask if I agree. I disagree. Then you
claim I've disagreed with Y. Please pay attention to your own
writing. There's a difference between X is used in place of Y and
X has nothing to do with Y.


A good suggestion. It may well be that I express my thoughts unclearly; 
sorry for that. Yet I think that my examples show that:


1) There is information that engineers employ.

2) There is the thermodynamic entropy.

3) Numerical values in 1) and 2) are not related to each other.

Otherwise I would appreciate it if you expressed the relationship between 
the information that engineers use and the thermodynamic entropy in your own 
words, as this is the question that I would like to understand.


I understand you when you talk about the number of microstates. I do not 
understand, though, how they are related to the information employed by 
engineers. I would be glad to hear your comment on that.


Evgenii


but you have not yet shown how information in engineering is
related to the thermodynamic entropy. From the Millipede example


http://en.wikipedia.org/wiki/Millipede_memory


The earliest generation millipede devices used probes 10
nanometers in diameter and 70 nanometers in length, producing pits
about 40 nm in diameter on fields 92 µm x 92 µm. Arranged in a 32 x
32 grid, the resulting 3 mm x 3 mm chip stores 500 megabits of data
or 62.5 MB, resulting in an areal density, the number of bits per
square inch, on the order of 200 Gbit/in².

It would be much easier to understand you if you said what thermodynamic
entropy the value of 62.5 MB in Millipede corresponds to.



The Shannon information capacity is 5e8 bits. The thermodynamic
entropy depends on the energy used to switch a memory element. I'd
guess it must correspond to at least a few tens of thousands of
electrons at 9 V, so

S ~ [5e8 * 9e4 eV] / [8.6e-5 eV/K * 300 K] ~ 1.7e15

So the total entropy is about 1.7e15 + 5e8 (in units of k_B), and the
information portion is numerically (but not functionally) negligible
compared to the thermodynamic part.

Brent
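
A quick numeric check of the estimate above, under its stated inputs (5e8
bits, about 10^4 electrons switched at 9 V per bit, k_B = 8.6e-5 eV/K,
T = 300 K); order-of-magnitude only, in Python:

k_B_eV = 8.617e-5           # Boltzmann constant, eV/K
T = 300.0                   # temperature, K
bits = 5e8                  # Shannon capacity of the device
electrons_per_bit = 1e4     # assumed "tens of thousands of electrons" per switch
volts = 9.0

switch_energy_eV = bits * electrons_per_bit * volts   # total switching energy
S_thermo = switch_energy_eV / (k_B_eV * T)            # entropy in units of k_B

print("thermodynamic term ~ %.1e (units of k_B)" % S_thermo)   # ~ 1.7e15
print("information term   ~ %.1e bits" % bits)                 # 5e8, negligible next to it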



The only example of Thermodynamic Entropy == Information from you so
far was the work on a black hole. However, as far as I know, there is
no theory yet to describe a black hole, as on one side you need
gravitation and on the other side quantum effects. The theory that
unites them does not seem to exist.

Evgenii




My example would be Millipede

http://en.wikipedia.org/wiki/Millipede_memory

I am pretty sure that when IBM engineers develop it, they do
not employ the thermodynamic entropy to estimate its
information capabilities. Also, an increase of temperature
would destroy the saved information there.

Well, I might be deliberately obtuse indeed, yet only with the
goal of reaching a clear definition of what information is.
Right now I would say that there is information in engineering
and in physics and that they are different. The first I roughly
understand, the second not.

Evgenii




Brent














Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-29 Thread Evgenii Rudnyi

On 29.01.2012 00:42 Russell Standish said the following:

On Sat, Jan 28, 2012 at 12:05:57PM +0100, Evgenii Rudnyi wrote:


...


In general we are surrounded devices that store information (hard
discs, memory sticks, DVD, etc.). The information that these
devices can store, I believe, is known with accuracy to one bit.


Because they're engineered that way. It would be rather inconvenient
if one's information storage varied with temperature.


Can you suggest a thermodynamic state which entropy gives us
exactly that amount of information?

Here would be again a question about temperature. If I operate my
memory stick in some reasonable range of temperatures, the
information it contains does not change. Yet, the entropy in my
view changes.


Sure - because they're engineered that way, and they operate a long
way from the theoretical maximum storage capability of that matter.
What's the problem with that?


The problem that I see is that the entropy changes when the temperature 
changes. Or do you claim that the entropy of the memory stick/DVD/hard 
disc remains the same when its temperature changes for example from 15 
to 25 degrees?


Anyway, I do not see how one can obtain the information capacity of the 
storage devices from the thermodynamic entropy and this is my point.


Do you claim that the information capacity of a memory stick/DVD/hard 
disk, for which we pay money, is equivalent to the thermodynamic entropy of 
the device?


Evgenii



So these are my doubts for which I do not see an answer.

Evgenii








Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-29 Thread Russell Standish
On Sun, Jan 29, 2012 at 04:23:12PM +0100, Evgenii Rudnyi wrote:
 On 28.01.2012 23:26 meekerdb said the following:
 On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:
 
 A good suggestion. It well might be that I express my thoughts
 unclear, sorry for that. Yet, I think that my examples show that
 
 1) There is information 
and entropy 

 that engineers employ.
 
 2) There is the thermodynamic entropy.

+ thermodynamic information

 
 3) Numerical values in 1) and 2) are not related to each other.
 

Fixed that for you. Why should you expect the different types of
information that come from different contexts to have the same
numerical value? The whole point of On complexity and emergence is
that notions of information and entropy are completely context sensitive
(that is not to say they're subjective as such - people agreeing on the
context will agree on the numerical values).


-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-29 Thread Russell Standish
On Sun, Jan 29, 2012 at 04:30:38PM +0100, Evgenii Rudnyi wrote:
 
 The problem that I see is that the entropy changes when the
 temperature changes. Or do you claim that the entropy of the memory
 stick/DVD/hard disc remains the same when its temperature changes
 for example from 15 to 25 degrees?

The entropy changes.

 
 Anyway, I do not see how one can obtain the information capacity of
 the storage devices from the thermodynamic entropy and this is my
 point.
 

Who was ever claiming that? The theoretical maximum possible
information storage is related, though.

 Do you claim, that the information capacity for which we pay money
 of a memory stick/DVD/hard disk is equivalent to the thermodynamic
 entropy of the device?
 

Never. The best you have is I = S_max - S, where I is the theoretical
maximum possible information storage. The value C (capacity of the
storage device) must satisfy

C <= I.

Usually C < I, for technological reasons. Also, it is undesirable to
have C vary with temperature, whereas I does vary in general
(particularly across phase transitions).

The information content of a drive is another number D <= C, usually
much less, and very dependent on the user of that drive. If the drive
is encrypted, and the user has lost the key, the information content
is close to zero.

The quantities I, C and D are all numerical quantities having the name
information. 
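
A toy numeric illustration of these three quantities; every number below is
invented and serves only to show the ordering D <= C <= I (Python):

S_max = 1.0e24     # bits: maximum entropy the platter's matter could hold (assumed)
S = 9.99e23        # bits: its actual thermodynamic entropy (assumed)

I = S_max - S      # theoretical maximum information storage (~1e21 bits here)
C = 4.0e12         # bits: engineered capacity of the drive (assumed, ~0.5 TB)
D = 1.0e12         # bits: what a user has actually written to it (assumed)

assert D <= C <= I
print("I = %.2e, C = %.2e, D = %.2e bits" % (I, C, D))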

Cheers
-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-29 Thread Russell Standish
On Sat, Jan 28, 2012 at 09:41:27PM -0800, meekerdb wrote:
 On 1/28/2012 3:42 PM, Russell Standish wrote:
 On the other hand, if you just gave me the metallic platter from the
 hard disk, and did not restrict in any way the technology used to read
 and write the data, then in principle, the higher the temperature, the
 more information is capable of being encoded on the disk.
 
 I don't think this is quite right. A higher temperature means that
 there are more energy states available.  But the concept of
 'temperature' implies that these are occupied in a random way
 (according to the micro-canonical ensemble). For us to read and
 write data requires that the act of reading or writing a bit moves
 the distribution of states in phase space enough that it is
 distinguishable from the random fluctuations due to temperature. 
  So
 if the medium is hotter, you need to use more energy to read and
 write a bit.  This of course runs into the problems you note below.

Hence the requirement that technology not be fixed. It is a
theoretician's answer :).

 So in practice it is often colder systems that allow us to store
 more data because then we can use small energy differences to encode
 bits.

Absolutely! But at zero kelvin, the information storage capacity of the
device is precisely zero, so cooling only works to a certain point.

 
 Brent
 
 
 In practice, various phase transitions will make this more difficult
 to achieve as temperature is increased. Passing the curie point, for
 instance, will mean we can no longer rely on magnetism, although
 presumably even below the curie point we can increase the information
 storage in some other way (eg moving atoms around by an STM) and
 ignoring the ferromagnetic behaviour. By the same token, passing the
 freezing and boiling points will make it even harder - but still
 doable with sufficiently advanced technology.
 
  From an engineering viewpoint it looks a bit strange.
 How so?
 
 If engineers would take the statement the maximum possible value
 for information increases with temperature literally, they should
 operate a hard disk at higher temperatures (the higher the better
 according to such a statement). Yet this does not happens. Do you
 know why?
 
 In general we are surrounded devices that store information (hard
 discs, memory sticks, DVD, etc.). The information that these devices
 can store, I believe, is known with accuracy to one bit.
 Because they're engineered that way. It would be rather inconvenient if
 one's information storage varied with temperature.
 
 Can you
 suggest a thermodynamic state which entropy gives us exactly that
 amount of information?
 
 Here would be again a question about temperature. If I operate my
 memory stick in some reasonable range of temperatures, the
 information it contains does not change. Yet, the entropy in my view
 changes.
 Sure - because they're engineered that way, and they operate a long
 way from the theoretical maximum storage capability of that
 matter. What's the problem with that?
 
 So these are my doubts for which I do not see an answer.
 
 Evgenii
 
 

-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-29 Thread John Clark
On Sun, Jan 29, 2012 Craig Weinberg whatsons...@gmail.com wrote:

 I'm not talking about fluid flow,


OK

 I'm talking about simulating everything - potential and actual chemical
 reactions, etc.


OK

 Water can be described by multiplying the known interactions of H2O,


But many, probably most, of water's interactions are unknown to this
day. Virtually all of organic chemistry (including DNA reactions!)
involves water somewhere in the chain of reaction, but organic chemistry
is very far from a closed subject, there is still much to learn. Another
example, up to now nobody has derived the temperature that water freezes
at from first principles because the resulting quantum mechanical
equations are so mathematically complicated that nobody has yet figured
out how to solve them.

 DNA would need many more variables.


BULLSHIT!

 Non-Shannon information would be anything that is not directly involved
 in the compression of a digitally sampled description into another digital
 description.


In other words non-Shannon information is gaseous philosophical flatulence.

Shannon information is not information in general, it is [...]


Shannon published his work in 1948 but you never even heard about it
until 3 days ago, and now you're a great world authority on the subject
telling us all exactly what it does and does not mean. I don't mind
ignorance, I'm ignorant about a lot of stuff myself, but there is a
certain kind of arrogant aggressive ignorance that I find very distasteful.

In contrast Richard Feynman displayed humble ignorance, he did as much
as anyone to develop Quantum Mechanics but he said I think it's safe to
say that nobody understands Quantum Mechanics, in describing the work
that won him the Nobel Prize he said he found a way to sweep
mathematical difficulties under the rug. He also said I know how hard
it is to really know something; how careful you have to be about
checking the experiments; how easy it is to make mistakes and fool
yourself. I know what it means to know something.

Compression and encryption are deformations.


If you can get the exact same file out after compression or encryption
then obviously nothing has been lost and all deformations or shrinkage
are reversible.

I understand what you mean completely


Apparently not

White noise is called noise for a reason.


And it's called white for a reason: an evil occidental mindset
conspiracy created by round-eyed white devils.


 How do you expect mathematics to deal with anything as subjective as
 quality? A novel that's high quality to you may be junk to me.


 I don't expect mathematics to deal with it. I expect a theory of
 everything to deal with it.


And your way of dealing with it is to say it (bits electrons information
logic etc) does not exist. I would never have guessed that coming up
with a theory of everything could be so easy.

 I'm not a big philosophy or religion fan myself but Wittgenstein,
 Heidegger, Sartre, Foucault, Kierkegaard were recent and had some
 impressive things to say.


As I've said before nearly everything they and all other recent
philosophers say can be put into one of four categories:

1) False.
2) True but obvious, a truism disguised in pretentious language.
3) True and deep but discovered first and explained better by a
mathematician or scientist or someone else who didn't write
philosopher in the box labeled occupation on his tax form.
4) So bad it's not even wrong.

 Here's some sample articles on the subject:


I know how to look up things on Google too, and I wonder how many of the
authors of those articles graduated from high school.

 Science begins when you distrust experts. - Richard Feynman. You're
 right, I'll trust Feynman.


If you think Feynman would treat your ideas with anything other than
contempt you're nuts. And you should look at the short one minute video
by Feynman called You don't like it? Go somewhere else!:

http://www.youtube.com/watch?v=iMDTcMD6pOw


 John K Clark



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-28 Thread Evgenii Rudnyi

On 28.01.2012 00:24 Craig Weinberg said the following:

On Jan 27, 1:31 pm, John Clarkjohnkcl...@gmail.com  wrote:

On Thu, Jan 26, 2012 at 8:03 PM, Craig
Weinbergwhatsons...@gmail.comwrote:


With the second law of thermodynamics, it seems like heat could
only dissipate by heating something else up.


The second law says that energy will tend to get diluted in space
over time, and heat conducting to other matter is one way for this
to happen but it is not the only way. Photons radiating outward in
all directions from a hot object is another way energy can get
diluted. But among many other things, you don't think photons, or
logic, exist so I doubt this answer will satisfy you.


It would satisfy me if you had some examples, but I don't think
that you know the answer for sure. If a vacuum is a good insulator
(like a vacuum thermos) and a perfect vacuum, as far as I have been
able to read online, is a perfect insulator. Electricity and heat
pass from object to object, not from space to space. Please point out
any source you can find to the contrary. What little I find agrees
with vacuums being insulators of heat and electricity.


Craig,

Radiation does happen. If you need more detail, there is a nice free 
book from MIT


A Heat Transfer Textbook,  4th edition
John H. Lienhard IV, Professor, University of Houston
John H. Lienhard V, Professor, Massachusetts Institute of Technology

http://web.mit.edu/lienhard/www/ahtt.html

A disadvantage is that it is thick, but you can go directly to Part IV, 
Thermal Radiation Heat Transfer. Vacuum is a good insulator, but thermal 
radiation gets through.


It is pretty important, for example, to include radiation in the case of 
free convection, as it may account for up to 40% of the heat transfer in that case.
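
A minimal sketch of the radiative term, using the Stefan-Boltzmann law
P = eps * sigma * A * (T_hot^4 - T_surr^4); the emissivity, area and
temperatures are assumed values for illustration, not figures from the textbook:

sigma = 5.670e-8     # Stefan-Boltzmann constant, W m^-2 K^-4
eps = 0.3            # assumed emissivity of the surface
area = 0.05          # m^2, assumed surface area
T_hot = 1800.0       # K, e.g. a blob of molten metal (assumed)
T_surr = 3.0         # K, essentially empty space

P = eps * sigma * area * (T_hot**4 - T_surr**4)
print("net radiated power ~ %.0f W" % P)   # heat leaves by radiation even across a vacuum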


Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-28 Thread Russell Standish
On Sat, Jan 28, 2012 at 08:58:54AM +0100, Evgenii Rudnyi wrote:
 On 27.01.2012 23:46 Russell Standish said the following:
 For one thing, it indicates that storing just two bits of information
 on these physical substrates is grossly inefficient!
 
 Well, you could contact governments then and try to convince them
 that coins in use are highly inefficient. It would be a good chance
 to have funding.

Chuckle. Maybe we can persuade them to get behind bitcoin :).

 
 By the way, at what temperature will it be possible to save more
 information, at a higher or at a lower one? 

What does this mean?

 Brent and John are talking
 about the entropy, and the higher the temperature, the higher the entropy.

True. But information has no such relationship with temperature, other
than that the maximum possible value for information increases with temperature.

Remember the equation S+I = S_max. S_max obviously increases with
temperature. So usually does S, but S can be decreased by organisation
of the matter - by the input of information.

 From an engineering viewpoint it looks a bit strange.

How so?

 
 Evgenii
 
 Cheers
 
 

-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-28 Thread Evgenii Rudnyi

On 28.01.2012 11:20 Russell Standish said the following:

On Sat, Jan 28, 2012 at 08:58:54AM +0100, Evgenii Rudnyi wrote:

On 27.01.2012 23:46 Russell Standish said the following:

For one thing, it indicates that storing just two bits of
information on these physical substrates is grossly inefficient!


Well, you could contact governments then and try to convince them
that coins in use are highly inefficient. It would be a good
chance to have funding.


Chuckle. Maybe we can persuade them to get behind bitcoin :).



By the way, at what temperature will it be possible to save
more information, at a higher or at a lower one?


What does this mean?


Let us take a hard disk. Can I save more information on it at higher or 
lower temperatures?





Brent and John are talking about the entropy, and the higher the
temperature, the higher the entropy.


True. But information has no such relationship with temperature,
other than that the maximum possible value for information increases
with temperature.

Remember the equation S+I = S_max. S_max obviously increases with
temperature. So usually does S, but S can be decreased by
organisation of the matter - by the input of information.


From an engineering viewpoint it looks a bit strange.


 How so?


If engineers took the statement the maximum possible value for 
information increases with temperature literally, they would operate a 
hard disk at higher temperatures (the higher the better, according to 
such a statement). Yet this does not happen. Do you know why?


In general we are surrounded by devices that store information (hard discs, 
memory sticks, DVDs, etc.). The information that these devices can store, 
I believe, is known to an accuracy of one bit. Can you suggest a 
thermodynamic state whose entropy gives us exactly that amount of 
information?


Here there would again be a question about temperature. If I operate my memory 
stick in some reasonable range of temperatures, the information it 
contains does not change. Yet, the entropy in my view changes.


So these are my doubts for which I do not see an answer.

Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-28 Thread meekerdb

On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:

On 27.01.2012 23:03 meekerdb said the following:

On 1/27/2012 12:43 PM, Evgenii Rudnyi wrote:

On 27.01.2012 21:22 meekerdb said the following:

On 1/27/2012 11:21 AM, Evgenii Rudnyi wrote:

On 25.01.2012 21:25 meekerdb said the following:

On 1/25/2012 11:47 AM, Evgenii Rudnyi wrote:

...

Let me suggest a very simple case to understand better what
you are saying. Let us consider a string 10 for
simplicity. Let us consider the next cases. I will cite
first the thermodynamic properties of Ag and Al from CODATA
tables (we will need them)

S° (298.15 K), in J K^-1 mol^-1:

Ag cr 42.55 ± 0.20; Al cr 28.30 ± 0.10

In J K^-1 cm^-3 it will be:

Ag cr 42.55/107.87*10.49 = 4.14; Al cr 28.30/26.98*2.7 = 2.83

1) An abstract string 10 as the abstract book above.

2) Let us make now an aluminum plate (a page) with 10
hammered on it (as on a coin) of the total volume 10 cm^3.
The thermodynamic entropy is then 28.3 J/K.

3) Let us make now a silver plate (a page) with 10
hammered on it (as on a coin) of the total volume 10 cm^3.
The thermodynamic entropy is then 41.4 J/K.

4) We can easily make another aluminum plate (scaling all
dimensions from 2) to the total volume of 100 cm^3. Then
the thermodynamic entropy is 283 J/K.

Now we have four different combinations to represent a
string 10 and the thermodynamic entropy is different. If
we take the statement literally then the information must
be different in all four cases and defined uniquely as the
thermodynamic entropy is already there. Yet in my view this
makes little sense.

Could you please comment on these four cases?


The thermodynamic entropy is a measure of the information
required to locate the possible states of the plates in the
phase space of atomic configurations constituting them. Note
that the thermodynamic entropy you quote is really the
*change* in entropy per degree at the given temperature. It's
a measure of how much more phase space becomes available to
the atomic states when the internal energy is increased. More
available phase space means more uncertainty of the exact
actual state and hence more information entropy. This
information is enormous compared to the 01 stamped on the
plate, the shape of the plate or any other aspects that we
would normally use to convey information. It would only be in
case we cooled the plate to near absolute zero and then tried
to encode information in its microscopic vibrational states
that the thermodynamic and the encoded information entropy
would become similar.



I would say that from your answer it follows that engineering
information has nothing to do with the thermodynamic entropy.
Don't you agree?


Obviously not since I wrote above that the thermodynamic entropy
is a measure of how much information it would take to locate the
exact state within the phase space allowed by the thermodynamic
parameters.


Does this what engineers use when they develop communication
devices?





It would certainly interesting to consider what happens when
we decrease the temperature (in the limit to zero Kelvin).
According to the Third Law the entropy will be zero then. What
do you think, can we save less information on a copper plate at
low temperatures as compared with higher temperatures? Or
more?


Are you being deliberately obtuse? Information encoded in the
shape of the plate is not accounted for in the thermodynamic
tables - they are just based on ideal bulk material (ignoring
boundaries).


I am just trying to understand the meaning of the term information
 that you use. I would say that there is the thermodynamic entropy
and then the Shannon information entropy. The Shannon has developed
a theory to help engineers to deal with communication (I believe
that you have also recently a similar statement). Yet, in my view
when we talk about communication devices and mechatronics, the
information that engineers are interested in has nothing to do with
the thermodynamic entropy. Do you agree or disagree with that? If
you disagree, could you please give an example from engineering
where engineers do employ the thermodynamic entropy as the estimate
of information.


I already said I disagreed. You are confusing two different things.
Because structural engineers don't employ the theory of interatomic
forces it doesn't follow that interatomic forces have nothing to do
with structural properties.

Brent


You disagree that engineers do not use thermodynamic entropy



Yes. I disagreed that information has nothing to do with thermodynamic entropy, as you 
wrote above. You keep switching formulations.  You write X and ask if I agree. I 
disagree.  Then you claim I've disagreed with Y. Please pay attention to your own 
writing.  There's a difference between X is used in place of Y and X has nothing to do 
with Y.


but you have not yet shown how information in engineering is related to the 
thermodynamic entropy. From the Millipede example


 http://en.wikipedia.org/wiki/Millipede_memory


Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-28 Thread meekerdb

On 1/27/2012 11:58 PM, Evgenii Rudnyi wrote:

On 27.01.2012 23:46 Russell Standish said the following:

On Fri, Jan 27, 2012 at 08:27:31PM +0100, Evgenii Rudnyi wrote:

On 26.01.2012 12:00 Russell Standish said the following:

If you included these two bits, the thermodynamic entropy is two
bits less, = 4.15 x 10^{-24} J/K less

This is so many orders of magnitude less than the entropy due to
the material, its probably not worth including, but it is there.


I do not believe that effects below the experimental noise are
important for empirical science. You probably mean then some other
science, it would be good if you define what science you mean.

Evgenii


For one thing, it indicates that storing just two bits of information
on these physical substrates is grossly inefficient!


Well, you could contact governments then and try to convince them that coins in use are 
highly inefficient. It would be a good chance to have funding.


By the way, at what temperature will it be possible to save more information, at 
a higher or at a lower one? Brent and John are talking about the entropy, and the higher 
the temperature, the higher the entropy. From an engineering viewpoint it looks a bit strange.


At a higher temperature there are more microstates accessible and hence more uncertainty 
as to which state is actually realized.  But if you're storing information, which you want 
to retrieve, this uncertainty is noise and you have to use larger increments of energy to 
reliably switch states.  So for storage it is more efficient (takes less energy per bit) 
to be colder.
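
A small sketch of that trade-off: if a stored bit flips spontaneously with
probability about exp(-E/kT), the energy barrier E per bit must be several
tens of kT, so a colder medium needs less energy per bit. The target flip
probability below is an assumed illustrative figure:

import math

k_B = 1.381e-23          # J/K
p_flip = 1e-15           # assumed acceptable spontaneous flip probability

def min_barrier_J(T):
    # Barrier E such that exp(-E / (k_B * T)) <= p_flip.
    return k_B * T * math.log(1.0 / p_flip)

for T in (77.0, 300.0, 400.0):   # liquid nitrogen, room temperature, a hot drive
    E = min_barrier_J(T)
    print("T = %3.0f K: barrier >= %.2e J (about %.0f kT)" % (T, E, E / (k_B * T)))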


Brent



Evgenii


Cheers








Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-28 Thread Russell Standish
On Sat, Jan 28, 2012 at 12:05:57PM +0100, Evgenii Rudnyi wrote:
 
 Let us take a hard disk. Can I save more information on it at higher
 or lower temperatures?

This is a strictly ambiguous question. If we take the usual meaning of
hard disk as including a particular apparatus (heads, controller
logic, SATA interface and so on) to read and write the data, then
there will be a limited range of temperatures over which that
apparatus will operate. Outside of that range, (both higher and lower)
the information storage will fall to zero. That is a purely
engineering question.

On the other hand, if you just gave me the metallic platter from the
hard disk, and did not restrict in any way the technology used to read
and write the data, then in principle, the higher the temperature, the
more information is capable of being encoded on the disk. 

In practice, various phase transitions will make this more difficult
to achieve as temperature is increased. Passing the curie point, for
instance, will mean we can no longer rely on magnetism, although
presumably even below the curie point we can increase the information
storage in some other way (eg moving atoms around by an STM) and
ignoring the ferromagnetic behaviour. By the same token, passing the
freezing and boiling points will make it even harder - but still
doable with sufficiently advanced technology.

 
 From an engineering viewpoint it looks a bit strange.
 
  How so?
 
 
 If engineers would take the statement the maximum possible value
 for information increases with temperature literally, they should
 operate a hard disk at higher temperatures (the higher the better
 according to such a statement). Yet this does not happens. Do you
 know why?
 
 In general we are surrounded devices that store information (hard
 discs, memory sticks, DVD, etc.). The information that these devices
 can store, I believe, is known with accuracy to one bit. 

Because they're engineered that way. It would be rather inconvenient if
one's information storage varied with temperature. 

 Can you
 suggest a thermodynamic state which entropy gives us exactly that
 amount of information?
 
 Here would be again a question about temperature. If I operate my
 memory stick in some reasonable range of temperatures, the
 information it contains does not change. Yet, the entropy in my view
 changes.

Sure - because they're engineered that way, and they operate a long
way from the theoretical maximum storage capability of that
matter. What's the problem with that?

 
 So these are my doubts for which I do not see an answer.
 
 Evgenii
 

-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-28 Thread meekerdb

On 1/28/2012 3:42 PM, Russell Standish wrote:

On Sat, Jan 28, 2012 at 12:05:57PM +0100, Evgenii Rudnyi wrote:

Let us take a hard disk. Can I save more information on it at higher
or lower temperatures?

This is a strictly ambiguous question. If we take the usual meaning of
hard disk as including a particular apparatus (heads, controller
logic, SATA interface and so on) to read and write the data, then
there will be a limited range of temperatures over which that
apparatus will operate. Outside of that range, (both higher and lower)
the information storage will fall to zero. That is a purely
engineering question.

On the other hand, if you just gave me the metallic platter from the
hard disk, and did not restrict in any way the technology used to read
and write the data, then in principle, the higher the temperature, the
more information is capable of being encoded on the disk.


I don't think this is quite right. A higher temperature means that there are more energy 
states available.  But the concept of 'temperature' implies that these are occupied in a 
random way (according to the micro-canonical ensemble). For us to read and write data 
requires that the act of reading or writing a bit moves the distribution of states in 
phase space enough that it is distinguishable from the random fluctuations due to 
temperature.  So if the medium is hotter, you need to use more energy to read and write a 
bit.  This of course runs into the problems you note below.  So in practice it is often 
colder systems that allow us to store more data because then we can use small energy 
differences to encode bits.


Brent



In practice, various phase transitions will make this more difficult
to achieve as temperature is increased. Passing the curie point, for
instance, will mean we can no longer rely on magnetism, although
presumably even below the curie point we can increase the information
storage in some other way (eg moving atoms around by an STM) and
ignoring the ferromagnetic behaviour. By the same token, passing the
freezing and boiling points will make it even harder - but still
doable with sufficiently advanced technology.


 From an engineering viewpoint it looks a bit strange.
How so?


If engineers would take the statement the maximum possible value
for information increases with temperature literally, they should
operate a hard disk at higher temperatures (the higher the better
according to such a statement). Yet this does not happens. Do you
know why?

In general we are surrounded devices that store information (hard
discs, memory sticks, DVD, etc.). The information that these devices
can store, I believe, is known with accuracy to one bit.

Because they're engineered that way. It would be rather inconvenient if
one's information storage varied with temperature.


Can you
suggest a thermodynamic state which entropy gives us exactly that
amount of information?

Here would be again a question about temperature. If I operate my
memory stick in some reasonable range of temperatures, the
information it contains does not change. Yet, the entropy in my view
changes.

Sure - because they're engineered that way, and they operate a long
way from the theoretical maximum storage capability of that
matter. What's the problem with that?


So these are my doubts for which I do not see an answer.

Evgenii






Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-28 Thread Craig Weinberg
On Jan 28, 1:48 pm, John Clark johnkcl...@gmail.com wrote:
 On Fri, Jan 27, 2012 at 5:51 PM, Craig Weinberg whatsons...@gmail.comwrote:

  You could much more easily write a probabilistic equation to simulate any
  given volume of water than the same volume of DNA, especially

 The motion of both can be well described by Navier-Stokes equations which
 describe fluid flow using Newton's laws, and DNA being more viscous than
 water the resulting equations would be simpler than the ones for water.

I'm not talking about fluid flow, I'm talking about simulating
everything - potential and actual chemical reactions, etc. Water can
be described by multiplying the known interactions of H2O, DNA would
need many more variables.


   when you get into secondary and tertiary structure.

 You've got to play fair, it you talk about micro states for DNA I get to
 talk about micro states for water.

  I had not heard of Shannon information.

 Somehow I'm not surprised, and it's Shannon Information Theory.

No, I've heard of Shannon Information Theory. I didn't realize that it
was such an instrumental special case theory though.


  The key phrase for me here is the thermodynamic entropy is interpreted as
  being proportional to the amount of further Shannon information needed to
  define the detailed microscopic state of the
  system.

 OK, although I don't see what purpose the word further serves in the
 above, and although I know all about Claude Shannon the term Shannon
 information is nonstandard. What would Non-Shannon information be?

Non-Shannon information would be anything that is not directly
involved in the compression of a digitally sampled description into
another digital description. Further means that if you add x calories
of heat, you need x more units of Shannon information to define the
effect of the added heat/motion.


   This confirms what I have been saying and is the opposite of what you
  are saying.

 What on Earth are you talking about?? The more entropy a system has the
 more information needed to describe it.

Yes. It is information that lets you describe patterns more easily.
The more pattern there is, the more you can say 'yes, I get it, add
500 0s and then another 1'. When there is less information, less
pattern, more energy, it takes more information to describe it. There
are no patterns to give your compression a shortcut.


   This means that DNA, having low entropy compared with pure water, has
  high pattern content, high information, and less Shannon information

 I see, it has high information and less information. No I take that back, I
 don't see, although it is consistent with your usual logical standards.

Shannon information is not information in general, it is a specific
kind of information about information which is really inversely
proportional to information in any other sense. It's uninformability
is what it is. Drag. Entropy. Resistance to the process (not
thermodynamic resistance).


  Easier to compress does *not* mean less information

 It means a message has been inflated with useless gas and a compression
 program can remove that gas and recover the small kernel of information
 undamaged.

Hahaha. The useless gas is what separates coherence and sanity from
garbage. It's useless to a computer, sure, but without the gas it's
useless to us. Next time you want to look at a picture, try viewing it
in its compressed form in a hex editor. Get rid of all that useless
gas.

 White noise has no gas in it for a compression program to
 deflate, that's why if you don't know the specific compression program used
 the resulting file ( like a zip or gif file) would look like random white
 noise, and yet its full of useful information if you know how to get it.
 The same thing is true of encrypted files, if the encription is good then
 the file will look completely random, just white noise, to anyone who does
 not have the secret key.

I understand what you mean completely, and that is indeed how
computers treat data, but it is the opposite of what it means to
inform in general terms. Compression and encryption are deformations.
Decryption is how we get any information out of it. White noise is
called noise for a reason. The opposite of noise is signal. Signals
are signifying and informing, thus information.


  The compressibility of a novel or picture does not relate to the quality
  of information

 How do you expect mathematics to deal with anything as subjective as
 quality? A novel that's high quality to you may be junk to me.

I don't expect mathematics to deal with it. I expect a theory of
everything to deal with it.


  Knowledge and wisdom are already owned by philosophy and religion,

 I've never heard of religion saying anything wise, philosophy does contain
 wisdom but none of it came from professional philosophers, at least not in
 the last 300 years.

I'm not a big philosophy or religion fan myself but Wittgenstein,
Heidegger, Sartre, Foucault, Kierkegaard were recent and had some
impressive things to say.

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread Craig Weinberg
On Jan 26, 11:11 pm, meekerdb meeke...@verizon.net wrote:
 On 1/26/2012 5:03 PM, Craig Weinberg wrote:

  Ok, so how does it effect the entropy of the structures? The red
  house, the white house, and the mixed house (even if an interesting
  pattern is made in the bricks), all behave in a physically identical
  way, do they not?
  No they don't.  They reflect photons differently; which is why you could 
  use the pattern
  to send a message.
  True, although it's only relevant if you have photons to reflect. If I
  turn out the lights (completely) does that change the entropy of the
  red house? What if I turn the lights back on, has entropy been
  suddenly reduced? Would a brighter light put more information or less
  entropy onto the white house than the red house, ie, does the pattern
  cost something in photons?

 Yes.

That doesn't make sense to me. I think if two houses had two different
patterns with the same numbers of each brick, neither one could
possibly have a different cost in photons than the other. In a house
of four bricks, Red Red White White cannot have a different photon
absorption than Red White White Red.




  I'm just curious, not trying to argue with you about it. On a similar
  note, I was wondering about heat loss in a vacuum today. With the
  second law of thermodynamics, it seems like heat could only dissipate
  by heating something else up. If there was nothing in the universe
  except a blob of molten nickel, would it cool off over time in an
  infinite vacuum? It seems like it wouldn't. It seems like you would
  need some other matter at a different temperature to seek a common
  equilibrium with. Or is the heat just lost over time no matter what?

 The heat would be lost by infrared radiation.

Lost to where? Energy is neither created nor...lost.

Craig




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread John Clark
On Thu, Jan 26, 2012  Craig Weinberg whatsons...@gmail.com wrote:

 If a bucket of water has more of it than DNA, then the word information
 is meaningless.


You would need to send more, far far more, dots and dashes down a wire to
inform an intelligent entity what the position and velocity of every
molecule in a bucket of water is than to inform it exactly what the human
genome is. Now, what word didn't you understand?


  A symphony then would have less information and more entropy than random
 noise.


No, a symphony would have less information but LESS entropy than random
white noise. That's why lossless computer image and sound compression
programs don't work with white noise, there is no redundancy to remove
because white noise has no redundancy.  It would take many more dots and
dashes sent down a wire to describe every pop and click in a piece of white
noise than to describe a symphony of equal length.
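
A quick sketch of that claim: a patterned byte string compresses to a small
fraction of its size under a lossless compressor, while uniformly random bytes
of the same length barely compress at all (zlib stands in here for any lossless
compressor; the test data are arbitrary):

import os
import zlib

patterned = b"do-re-mi " * 10000        # highly redundant, "symphony-like" data
noise = os.urandom(len(patterned))      # white noise of the same length

for name, data in (("patterned", patterned), ("white noise", noise)):
    packed = zlib.compress(data, 9)     # 9 = maximum compression level
    print("%11s: %d -> %d bytes" % (name, len(data), len(packed)))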

 If the word information is to have any meaning, quantity and
 compressibility of data must be distinguished from quality of it's
 interpretation.



If you want to clearly distinguish these things, and I agree that is a very
good idea, then you need separate words for the separate ideas. Quality is
subjective so mathematics can not deal with it, mathematics can work with
quantity however, so if quality comes into play you can not use the word
information because mathematics already owns that word; but there are
plenty of other words that you can use, words like knowledge or
wisdom.


 Let's say your definition were true though. What does it have to do with
 information being directly proportionate to entropy?


The larger the entropy something has the more information it has.

 If entropy were equal or proportionate to information, then are saying
 that the more information something contains, the less it matters.


Whether it matters or not is subjective so you should not use the word
information in the above. A bucket of water contains far more information
than the human genome but the human genome has far more knowledge, at least
I think so, although a bucket of water might disagree with me.

 John K Clark




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread John Clark
On Thu, Jan 26, 2012 at 8:03 PM, Craig Weinberg whatsons...@gmail.comwrote:

 With the second law of thermodynamics, it seems like heat could only
 dissipate by heating something else up.


The second law says that energy will tend to get diluted in space over
time, and heat conducting to other matter is one way for this to happen but
it is not the only way. Photons radiating outward in all directions from a
hot object is another way energy can get diluted. But among many other
things, you don't think photons, or logic, exist so I doubt this answer
will satisfy you.

 John K Clark




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread Evgenii Rudnyi

On 25.01.2012 21:25 meekerdb said the following:

On 1/25/2012 11:47 AM, Evgenii Rudnyi wrote:

...

Let me suggest a very simple case to understand better what you are
 saying. Let us consider a string 10 for simplicity. Let us
consider the next cases. I will cite first the thermodynamic
properties of Ag and Al from CODATA tables (we will need them)

S° (298.15 K), J K-1 mol-1

Ag cr 42.55 ± 0.20
Al cr 28.30 ± 0.10

In J K-1 cm-3 it will be

Ag cr 42.55/107.87*10.49 = 4.14
Al cr 28.30/26.98*2.7 = 2.83

1) An abstract string 10 as the abstract book above.

2) Let us make now an aluminum plate (a page) with 10 hammered on
it (as on a coin) of the total volume 10 cm^3. The thermodynamic
entropy is then 28.3 J/K.

3) Let us make now a silver plate (a page) with 10 hammered on it
 (as on a coin) of the total volume 10 cm^3. The thermodynamic
entropy is then 41.4 J/K.

4) We can easily make another aluminum plate (scaling all
dimensions from 2) to the total volume of 100 cm^3. Then the
thermodynamic entropy is 283 J/K.

Now we have four different combinations to represent a string 10
and the thermodynamic entropy is different. If we take the
statement literally then the information must be different in all
four cases and defined uniquely as the thermodynamic entropy is
already there. Yet in my view this makes little sense.

Could you please comment on these four cases?


The thermodynamic entropy is a measure of the information required to
 locate the possible states of the plates in the phase space of
atomic configurations constituting them. Note that the thermodynamic
entropy you quote is really the *change* in entropy per degree at the
given temperature. It's a measure of how much more phase space
becomes available to the atomic states when the internal energy is
increased. More available phase space means more uncertainty of the
exact actual state and hence more information entropy. This
information is enormous compared to the 01 stamped on the plate,
the shape of the plate or any other aspects that we would normally
use to convey information. It would only be in case we cooled the
plate to near absolute zero and then tried to encode information in
its microscopic vibrational states that the thermodynamic and the
encoded information entropy would become similar.



I would say that from your answer it follows that engineering 
information has nothing to do with the thermodynamic entropy. Don't you 
agree?


It would certainly be interesting to consider what happens when we decrease 
the temperature (in the limit to zero Kelvin). According to the Third 
Law the entropy will be zero then. What do you think, can we save less 
information on a copper plate at low temperatures as compared with 
higher temperatures? Or more?


Evgenii
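
As a numerical aside on the four cases above, a minimal Python sketch (assuming only the CODATA values quoted there, and the common convention of k_B ln 2 of entropy per bit for the comparison Brent makes) that reproduces the plate entropies and shows how many bits they correspond to:

    import math

    k_B = 1.380649e-23                    # J/K, Boltzmann constant

    S_molar = {"Ag": 42.55, "Al": 28.30}  # J/(K mol), CODATA values quoted above
    M = {"Ag": 107.87, "Al": 26.98}       # g/mol
    rho = {"Ag": 10.49, "Al": 2.70}       # g/cm^3

    S_vol = {m: S_molar[m] / M[m] * rho[m] for m in S_molar}
    print(S_vol)                          # {'Ag': ~4.14, 'Al': ~2.83} J/(K cm^3)

    S_Al_10 = 10 * S_vol["Al"]            # case 2: ~28.3 J/K
    S_Ag_10 = 10 * S_vol["Ag"]            # case 3: ~41.4 J/K
    S_Al_100 = 100 * S_vol["Al"]          # case 4: ~283 J/K

    bits = S_Al_10 / (k_B * math.log(2))  # equivalent number of bits
    print(f"{bits:.1e}")                  # ~3e24 bits, against the 2 bits stamped on the plate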




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread meekerdb

On 1/27/2012 3:56 AM, Craig Weinberg wrote:

On Jan 26, 11:11 pm, meekerdbmeeke...@verizon.net  wrote:

On 1/26/2012 5:03 PM, Craig Weinberg wrote:


Ok, so how does it affect the entropy of the structures? The red
house, the white house, and the mixed house (even if an interesting
pattern is made in the bricks), all behave in a physically identical
way, do they not?

No they don't.  They reflect photons differently; which is why you could use 
the pattern
to send a message.

True, although it's only relevant if you have photons to reflect. If I
turn out the lights (completely) does that change the entropy of the
red house? What if I turn the lights back on, has entropy been
suddenly reduced? Would a brighter light put more information or less
entropy onto the white house than the red house, ie, does the pattern
cost something in photons?

Yes.

That doesn't make sense to me. I think if two houses had two different
patterns with the same numbers of each brick, neither one could
possibly have a different cost in photons than the other. In a house
of four bricks, Red Red White White cannot have a different photon
absorption than Red White White Red.





I'm just curious, not trying to argue with you about it. On a similar
note, I was wondering about heat loss in a vacuum today. With the
second law of thermodynamics, it seems like heat could only dissipate
by heating something else up. If there was nothing in the universe
except a blob of molten nickel, would it cool off over time in an
infinite vacuum? It seems like it wouldn't. It seems like you would
need some other matter at a different temperature to seek a common
equilibrium with. Or is the heat just lost over time no matter what?

The heat would be lost by infrared radiation.

Lost to where? Energy is neither created nor...lost.


The reason I seldom respond to your posts is that you seem unwilling to put any effort 
into understanding what is written to you.


Lost to the photons.

Brent
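
To make the radiative-cooling point concrete, a rough sketch under assumed numbers (a 10 cm radius nickel blob, density ~8.9 g/cm^3, specific heat ~440 J/(kg K), emissivity 0.3, phase change ignored) that integrates the Stefan-Boltzmann law P = eps * sigma * A * T^4:

    import math

    sigma = 5.670e-8                        # W m^-2 K^-4, Stefan-Boltzmann constant
    eps, r = 0.3, 0.1                       # assumed emissivity and radius (m)
    A = 4 * math.pi * r**2                  # radiating surface area
    m = 8900.0 * (4 / 3) * math.pi * r**3   # mass from assumed density, kg
    c = 440.0                               # J/(kg K), rough specific heat of nickel

    T, t, dt = 1800.0, 0.0, 10.0            # start near molten nickel, step in seconds
    while T > 300.0:
        P = eps * sigma * A * T**4          # power radiated into the vacuum
        T -= P * dt / (m * c)               # energy balance: dT = -P dt / (m c)
        t += dt
    print(t / 3600)                         # on the order of a day to cool to ~300 K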




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread Evgenii Rudnyi

On 26.01.2012 12:00 Russell Standish said the following:

On Wed, Jan 25, 2012 at 08:47:03PM +0100, Evgenii Rudnyi wrote:


Let me suggest a very simple case to understand better what you
are saying. Let us consider a string 10 for simplicity. Let us
consider the next cases. I will cite first the thermodynamic
properties of Ag and Al from CODATA tables (we will need them)

S° (298.15 K), J K-1 mol-1

Ag cr 42.55 ± 0.20
Al cr 28.30 ± 0.10

In J K-1 cm-3 it will be

Ag cr 42.55/107.87*10.49 = 4.14
Al cr 28.30/26.98*2.7 = 2.83

1) An abstract string 10 as the abstract book above.

2) Let us make now an aluminum plate (a page) with 10 hammered
on it (as on a coin) of the total volume 10 cm^3. The
thermodynamic entropy is then 28.3 J/K.

3) Let us make now a silver plate (a page) with 10 hammered on
it (as on a coin) of the total volume 10 cm^3. The thermodynamic
entropy is then 41.4 J/K.

4) We can easily make another aluminum plate (scaling all
dimensions from 2) to the total volume of 100 cm^3. Then the
thermodynamic entropy is 283 J/K.

Now we have four different combinations to represent a string 10
and the thermodynamic entropy is different. If we take the
statement literally then the information must be different in all
four cases and defined uniquely as the thermodynamic entropy is
already there. Yet in my view this makes little sense.

Could you please comment on these four cases?



Brent commented quite aptly on these cases in another post. The fact
that you calculate the thermodynamic entropy the way you do implies
you are disregarding the information contained in the symbols
embossed on the coin.


Well, I do disregard the surface effects. However, the statement was 
that the informational entropy is the same as thermodynamic entropy, so 
we must consider the total entropy.



If you included these two bits, the thermodynamic entropy is two
bits less, = 4.15 x 10^{-24} J/K less

This is so many orders of magnitude less than the entropy due to the
material, it's probably not worth including, but it is there.


I do not believe that effects below the experimental noise are important 
for empirical science. You probably mean some other science then; it 
would be good if you defined what science you mean.


Evgenii





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread Evgenii Rudnyi

On 26.01.2012 19:01 John Clark said the following:

On Thu, Jan 19, 2012 at 5:28 PM, Craig
Weinbergwhatsons...@gmail.comwrote:


...


If I have red legos and white legos, and I build two opposite
monochrome

houses and one of mixed blocks, how in the world does that affect
the entropy of the plastic bricks in any way?



It does not affect the entropy of the plastic bricks but it does
change the entropy of the structures built with those plastic bricks.


This change in the entropy is below the experimental noise. Just estimate 
what difference it makes and in which digit of the total entropy the 
difference shows up. Hence the talk about the thermodynamic entropy as 
the information source in this case is just meaningless, as you cannot 
experimentally measure what you are talking about.


Evgenii


For a single part in isolation entropy is not defined, a single water
molecule has no entropy but a trillion trillion of them in a drop of
water does.

John K Clark






Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread Evgenii Rudnyi

On 27.01.2012 05:11 meekerdb said the following:

On 1/26/2012 5:03 PM, Craig Weinberg wrote:


...



I'm just curious, not trying to argue with you about it. On a
similar note, I was wondering about heat loss in a vacuum today.
With the second law of thermodynamics, it seems like heat could
only dissipate by heating something else up. If there was nothing
in the universe except a blob of molten nickel, would it cool off
over time in an infinite vacuum? It seems like it wouldn't. It
seems like you would need some other matter at a different
temperature to seek a common equilibrium with. Or is the heat just
lost over time no matter what?


The heat would be lost by infrared radiation.



Brent,

if we consider a heated block in an infinite universe, does its 
temperature then go to zero Kelvin?


Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread meekerdb

On 1/27/2012 11:21 AM, Evgenii Rudnyi wrote:

On 25.01.2012 21:25 meekerdb said the following:

On 1/25/2012 11:47 AM, Evgenii Rudnyi wrote:

...

Let me suggest a very simple case to understand better what you are
 saying. Let us consider a string 10 for simplicity. Let us
consider the next cases. I will cite first the thermodynamic
properties of Ag and Al from CODATA tables (we will need them)

S° (298.15 K), J K-1 mol-1

Ag cr 42.55 ± 0.20
Al cr 28.30 ± 0.10

In J K-1 cm-3 it will be

Ag cr 42.55/107.87*10.49 = 4.14
Al cr 28.30/26.98*2.7 = 2.83

1) An abstract string 10 as the abstract book above.

2) Let us make now an aluminum plate (a page) with 10 hammered on
it (as on a coin) of the total volume 10 cm^3. The thermodynamic
entropy is then 28.3 J/K.

3) Let us make now a silver plate (a page) with 10 hammered on it
 (as on a coin) of the total volume 10 cm^3. The thermodynamic
entropy is then 41.4 J/K.

4) We can easily make another aluminum plate (scaling all
dimensions from 2) to the total volume of 100 cm^3. Then the
thermodynamic entropy is 283 J/K.

Now we have four different combinations to represent a string 10
and the thermodynamic entropy is different. If we take the
statement literally then the information must be different in all
four cases and defined uniquely as the thermodynamic entropy is
already there. Yet in my view this makes little sense.

Could you please comment on these four cases?


The thermodynamic entropy is a measure of the information required to
 locate the possible states of the plates in the phase space of
atomic configurations constituting them. Note that the thermodynamic
entropy you quote is really the *change* in entropy per degree at the
given temperature. It's a measure of how much more phase space
becomes available to the atomic states when the internal energy is
increased. More available phase space means more uncertainty of the
exact actual state and hence more information entropy. This
information is enormous compared to the 01 stamped on the plate,
the shape of the plate or any other aspects that we would normally
use to convey information. It would only be in case we cooled the
plate to near absolute zero and then tried to encode information in
its microscopic vibrational states that the thermodynamic and the
encoded information entropy would become similar.



I would say that from your answer it follows that engineering information has nothing to 
do with the thermodynamic entropy. Don't you agree?


Obviously not since I wrote above that the thermodynamic entropy is a measure of how much 
information it would take to locate the exact state within the phase space allowed by the 
thermodynamic parameters.




It would certainly be interesting to consider what happens when we decrease the temperature 
(in the limit to zero Kelvin). According to the Third Law the entropy will be zero then. 
What do you think, can we save less information on a copper plate at low temperatures as 
compared with higher temperatures? Or more?


Are you being deliberately obtuse?  Information encoded in the shape of the plate is not 
accounted for in the thermodynamic tables - they are just based on ideal bulk material 
(ignoring boundaries).


Brent




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread Evgenii Rudnyi

On 27.01.2012 21:22 meekerdb said the following:

On 1/27/2012 11:21 AM, Evgenii Rudnyi wrote:

On 25.01.2012 21:25 meekerdb said the following:

On 1/25/2012 11:47 AM, Evgenii Rudnyi wrote:

...

Let me suggest a very simple case to understand better what you
are saying. Let us consider a string 10 for simplicity. Let
us consider the next cases. I will cite first the
thermodynamic properties of Ag and Al from CODATA tables (we
will need them)

S° (298.15 K), J K-1 mol-1

Ag cr 42.55 ± 0.20
Al cr 28.30 ± 0.10

In J K-1 cm-3 it will be

Ag cr 42.55/107.87*10.49 = 4.14
Al cr 28.30/26.98*2.7 = 2.83

1) An abstract string 10 as the abstract book above.

2) Let us make now an aluminum plate (a page) with 10
hammered on it (as on a coin) of the total volume 10 cm^3. The
thermodynamic entropy is then 28.3 J/K.

3) Let us make now a silver plate (a page) with 10 hammered
on it (as on a coin) of the total volume 10 cm^3. The
thermodynamic entropy is then 41.4 J/K.

4) We can easily make another aluminum plate (scaling all
dimensions from 2) to the total volume of 100 cm^3. Then the
thermodynamic entropy is 283 J/K.

Now we have four different combinations to represent a string
10 and the thermodynamic entropy is different. If we take
the statement literally then the information must be different
in all four cases and defined uniquely as the thermodynamic
entropy is already there. Yet in my view this makes little
sense.

Could you please comment on these four cases?


The thermodynamic entropy is a measure of the information
required to locate the possible states of the plates in the phase
space of atomic configurations constituting them. Note that the
thermodynamic entropy you quote is really the *change* in entropy
per degree at the given temperature. It's a measure of how much
more phase space becomes available to the atomic states when the
internal energy is increased. More available phase space means
more uncertainty of the exact actual state and hence more
information entropy. This information is enormous compared to the
01 stamped on the plate, the shape of the plate or any other
aspects that we would normally use to convey information. It
would only be in case we cooled the plate to near absolute zero
and then tried to encode information in its microscopic
vibrational states that the thermodynamic and the encoded
information entropy would become similar.



I would say that from your answer it follows that engineering
information has nothing to do with the thermodynamic entropy. Don't
 you agree?


Obviously not since I wrote above that the thermodynamic entropy is a
 measure of how much information it would take to locate the exact
state within the phase space allowed by the thermodynamic
parameters.


Is this what engineers use when they develop communication devices?





It would certainly be interesting to consider what happens when we
decrease the temperature (in the limit to zero Kelvin). According
to the Third Law the entropy will be zero then. What do you think,
can we save less information on a copper plate at low temperatures
as compared with higher temperatures? Or more?


Are you being deliberately obtuse? Information encoded in the shape
of the plate is not accounted for in the thermodynamic tables - they
are just based on ideal bulk material (ignoring boundaries).


I am just trying to understand the meaning of the term information that 
you use. I would say that there is the thermodynamic entropy and then 
the Shannon information entropy. Shannon developed a theory to 
help engineers deal with communication (I believe you also made a 
similar statement recently). Yet, in my view when we talk about 
communication devices and mechatronics, the information that engineers 
are interested in has nothing to do with the thermodynamic entropy. Do 
you agree or disagree with that? If you disagree, could you please give 
an example from engineering where engineers do employ the thermodynamic 
entropy as the estimate of information. My example would be Millipede


http://en.wikipedia.org/wiki/Millipede_memory

I am pretty sure that when IBM engineers develop it, they do not employ 
the thermodynamic entropy to estimate its information capabilities. 
Also, an increase of temperature would destroy the saved information there.


Well, I might be deliberately obtuse indeed, yet only with the goal of 
reaching a clear definition of what information is. Right now I would 
say that there is information in engineering and in physics and they are 
different. The first I roughly understand and the second not.


Evgenii




Brent






Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread meekerdb

On 1/27/2012 12:43 PM, Evgenii Rudnyi wrote:

On 27.01.2012 21:22 meekerdb said the following:

On 1/27/2012 11:21 AM, Evgenii Rudnyi wrote:

On 25.01.2012 21:25 meekerdb said the following:

On 1/25/2012 11:47 AM, Evgenii Rudnyi wrote:

...

Let me suggest a very simple case to understand better what you
are saying. Let us consider a string 10 for simplicity. Let
us consider the next cases. I will cite first the
thermodynamic properties of Ag and Al from CODATA tables (we
will need them)

S° (298.15 K), J K-1 mol-1

Ag cr 42.55 ± 0.20
Al cr 28.30 ± 0.10

In J K-1 cm-3 it will be

Ag cr 42.55/107.87*10.49 = 4.14
Al cr 28.30/26.98*2.7 = 2.83

1) An abstract string 10 as the abstract book above.

2) Let us make now an aluminum plate (a page) with 10
hammered on it (as on a coin) of the total volume 10 cm^3. The
thermodynamic entropy is then 28.3 J/K.

3) Let us make now a silver plate (a page) with 10 hammered
on it (as on a coin) of the total volume 10 cm^3. The
thermodynamic entropy is then 41.4 J/K.

4) We can easily make another aluminum plate (scaling all
dimensions from 2) to the total volume of 100 cm^3. Then the
thermodynamic entropy is 283 J/K.

Now we have four different combinations to represent a string
10 and the thermodynamic entropy is different. If we take
the statement literally then the information must be different
in all four cases and defined uniquely as the thermodynamic
entropy is already there. Yet in my view this makes little
sense.

Could you please comment on these four cases?


The thermodynamic entropy is a measure of the information
required to locate the possible states of the plates in the phase
space of atomic configurations constituting them. Note that the
thermodynamic entropy you quote is really the *change* in entropy
per degree at the given temperature. It's a measure of how much
more phase space becomes available to the atomic states when the
internal energy is increased. More available phase space means
more uncertainty of the exact actual state and hence more
information entropy. This information is enormous compared to the
01 stamped on the plate, the shape of the plate or any other
aspects that we would normally use to convey information. It
would only be in case we cooled the plate to near absolute zero
and then tried to encode information in its microscopic
vibrational states that the thermodynamic and the encoded
information entropy would become similar.



I would say that from your answer it follows that engineering
information has nothing to do with the thermodynamic entropy. Don't
 you agree?


Obviously not since I wrote above that the thermodynamic entropy is a
 measure of how much information it would take to locate the exact
state within the phase space allowed by the thermodynamic
parameters.


Is this what engineers use when they develop communication devices?





It would certainly be interesting to consider what happens when we
decrease the temperature (in the limit to zero Kelvin). According
to the Third Law the entropy will be zero then. What do you think,
can we save less information on a copper plate at low temperatures
as compared with higher temperatures? Or more?


Are you being deliberately obtuse? Information encoded in the shape
of the plate is not accounted for in the thermodynamic tables - they
are just based on ideal bulk material (ignoring boundaries).


I am just trying to understand the meaning of the term information that you use. I would 
say that there is the thermodynamic entropy and then the Shannon information entropy. 
Shannon developed a theory to help engineers deal with communication (I believe 
you also made a similar statement recently). Yet, in my view when we talk 
about communication devices and mechatronics, the information that engineers are 
interested in has nothing to do with the thermodynamic entropy. Do you agree or disagree 
with that? If you disagree, could you please give an example from engineering where 
engineers do employ the thermodynamic entropy as the estimate of information. 


I already said I disagreed.  You are confusing two different things.  Because structural 
engineers don't employ the theory of interatomic forces it doesn't follow that 
interatomic forces have nothing to do with structural properties.


Brent



My example would be Millipede

http://en.wikipedia.org/wiki/Millipede_memory

I am pretty sure that when IBM engineers develop it, they do not employ the 
thermodynamic entropy to estimate its information capabilities. Also, an increase of 
temperature would destroy the saved information there.


Well, I might be deliberately obtuse indeed, yet only with the goal of reaching a clear 
definition of what information is. Right now I would say that there is information 
in engineering and in physics and they are different. The first I roughly understand and 
the second not.


Evgenii




Brent





--
You received this message because you are subscribed to the Google 

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread Russell Standish
On Fri, Jan 27, 2012 at 08:27:31PM +0100, Evgenii Rudnyi wrote:
 On 26.01.2012 12:00 Russell Standish said the following:
 If you included these two bits, the thermodynamic entropy is two
 bits less, = 4.15 x 10^{-24} J/K less
 
 This is so many orders of magnitude less than the entropy due to the
 material, it's probably not worth including, but it is there.
 
 I do not believe that effects below the experimental noise are
 important for empirical science. You probably mean some other science
 then; it would be good if you defined what science you mean.
 
 Evgenii

For one thing, it indicates that storing just two bits of information on
these physical substrates is grossly inefficient!

Cheers

-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread Craig Weinberg
On Jan 27, 11:42 am, John Clark johnkcl...@gmail.com wrote:
 On Thu, Jan 26, 2012  Craig Weinberg whatsons...@gmail.com wrote:

  If a bucket of water has more of it than DNA, then the word information
  is meaningless.

 You would need to send more, far far more, dots and dashes down a wire to
 inform a intelligent entity what the position and velocity of every
 molecule in bucket of water is than to inform it exactly what the human
 genome is.

It depends what kind of compression you are using. You could much more
easily write a probabilistic equation to simulate any given volume of
water than the same volume of DNA, especially when you get into
secondary and tertiary structure.

 Now what word didn't you understand.

Condescension doesn't impress me. I understand your words perfectly,
it's just that what they are saying seems to be incorrect.


   A symphony then would have less information and more entropy than random
  noise.

 No, a symphony would have less information but LESS entropy than random
 white noise.

Ok, I think I see what the confusion is. We are operating not only
different definitions of entropy but different assumptions about the
universe which directly relate to information.

This Q&A: 
http://stackoverflow.com/questions/651135/shannons-entropy-formula-help-my-confusion
was the only page I could find that was written simply enough to make
sense to me. Your definition assumes that the universe is a platform
for encoding and decoding information and mine does not. You are
talking about entropy in terms of resistance to compression of
redundancy. Ok, but the relationship of Shannon entropy and
thermodynamic entropy is not what you are implying it is. The Wiki was
helpful: http://en.wikipedia.org/wiki/Entropy_(information_theory)

At an everyday practical level the links between information entropy
and thermodynamic entropy are not evident. Physicists and chemists are
apt to be more interested in changes in entropy as a system
spontaneously evolves away from its initial conditions, in accordance
with the second law of thermodynamics, rather than an unchanging
probability distribution. And, as the minuteness of Boltzmann's
constant kB indicates, the changes in S / kB for even tiny amounts of
substances in chemical and physical processes represent amounts of
entropy which are so large as to be off the scale compared to anything
seen in data compression or signal processing. Furthermore, in
classical thermodynamics the entropy is defined in terms of
macroscopic measurements and makes no reference to any probability
distribution, which is central to the definition of information
entropy.

But, at a multidisciplinary level, connections can be made between
thermodynamic and informational entropy, although it took many years
in the development of the theories of statistical mechanics and
information theory to make the relationship fully apparent. In fact,
in the view of Jaynes (1957), thermodynamic entropy, as explained by
statistical mechanics, should be seen as an application of Shannon's
information theory: the thermodynamic entropy is interpreted as being
proportional to the amount of further Shannon information needed to
define the detailed microscopic state of the system, that remains
uncommunicated by a description solely in terms of the macroscopic
variables of classical thermodynamics, with the constant of
proportionality being just the Boltzmann constant.

The key phrase for me here is the thermodynamic entropy is
interpreted as being proportional to the amount of further Shannon
information needed to define the detailed microscopic state of the
system. This confirms what I have been saying and is the opposite of
what you are saying. Thermodynamic entropy is proportional to the
amount of Shannon information *needed* to (encode/compress/extract
redundancy) from a given description to arrive at a maximally
compressed description. The more entropy or patternlessness you have,
ie the more equilibrium of probability and lack of redundancy you
have, the less information you have and the more Shannon information
you need to avoid lossy compression.

This means that DNA, having low entropy compared with pure water, has
high pattern content, high information, and less Shannon information
is required to describe it. Easier to compress does *not* mean less
information, it means more information is present already because in
essence the job is already partially done for you. Shannon entropy
then, is a measure of drag on compression, a figurative use of the
term entropy for the specific purpose of encoding and decoding. I am
using the literal thermodynamic sense of entropy, as well as the
figurative vernacular sense of entropy as degradation of order or
coherence, both of these are loosely inversely proportional to Shannon
entropy. The compressibility of a novel or picture does not relate to
the quality of information, not to mention qualities of significance.
Weighing art by the pound is not a 

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread meekerdb

On 1/27/2012 2:51 PM, Craig Weinberg wrote:

On Jan 27, 11:42 am, John Clarkjohnkcl...@gmail.com  wrote:

On Thu, Jan 26, 2012  Craig Weinbergwhatsons...@gmail.com  wrote:


If a bucket of water has more of it than DNA, then the word information
is meaningless.

You would need to send more, far far more, dots and dashes down a wire to
inform a intelligent entity what the position and velocity of every
molecule in bucket of water is than to inform it exactly what the human
genome is.

It depends what kind of compression you are using. You could much more
easily write a probabilistic equation to simulate any given volume of
water than the same volume of DNA, especially when you get into
secondary and tertiary structure.


Now what word didn't you understand.

Condescension doesn't impress me. I understand your words perfectly,
it's just that what they are saying seems to be incorrect.


A symphony then would have less information and more entropy than random

noise.

No, a symphony would have less information but LESS entropy than random
white noise.

Ok, I think I see what the confusion is. We are operating not only
different definitions of entropy but different assumptions about the
universe which directly relate to information.

This Q&A: 
http://stackoverflow.com/questions/651135/shannons-entropy-formula-help-my-confusion
was the only page I could find that was written simply enough to make
sense to me. Your definition assumes that the universe is a platform
for encoding and decoding information and mine does not. You are
talking about entropy in terms of resistance to compression of
redundancy. Ok, but the relationship of Shannon entropy and
thermodynamic entropy is not what you are implying it is. The Wiki was
helpful: http://en.wikipedia.org/wiki/Entropy_(information_theory)

At an everyday practical level the links between information entropy
and thermodynamic entropy are not evident. Physicists and chemists are
apt to be more interested in changes in entropy as a system
spontaneously evolves away from its initial conditions, in accordance
with the second law of thermodynamics, rather than an unchanging
probability distribution. And, as the minuteness of Boltzmann's
constant kB indicates, the changes in S / kB for even tiny amounts of
substances in chemical and physical processes represent amounts of
entropy which are so large as to be off the scale compared to anything
seen in data compression or signal processing. Furthermore, in
classical thermodynamics the entropy is defined in terms of
macroscopic measurements and makes no reference to any probability
distribution, which is central to the definition of information
entropy.

But, at a multidisciplinary level, connections can be made between
thermodynamic and informational entropy, although it took many years
in the development of the theories of statistical mechanics and
information theory to make the relationship fully apparent. In fact,
in the view of Jaynes (1957), thermodynamic entropy, as explained by
statistical mechanics, should be seen as an application of Shannon's
information theory: the thermodynamic entropy is interpreted as being
proportional to the amount of further Shannon information needed to
define the detailed microscopic state of the system, that remains
uncommunicated by a description solely in terms of the macroscopic
variables of classical thermodynamics, with the constant of
proportionality being just the Boltzmann constant.

The key phrase for me here is the thermodynamic entropy is
interpreted as being proportional to the amount of further Shannon
information needed to define the detailed microscopic state of the
system. This confirms what I have been saying and is the opposite of
what you are saying. Thermodynamic entropy is proportional to the
amount of Shannon information *needed* to (encode/compress/extract
redundancy) from a given description to arrive at a maximally
compressed description. The more entropy or patternlessness you have,
ie the more equilibrium of probability and lack of redundancy you
have, the less information you have and the more Shannon information
you need to avoid lossy compression.

This means that DNA, having low entropy compared with pure water, has
high pattern content, high information, and less Shannon information
is required to describe it. Easier to compress does *not* mean less
information,


You're switching meanings of information.  Something highly compressible, like a 
long string of A's, doesn't convey much information in either the 
colloquial or Shannon sense.  I think it's important to keep in mind that these measures 
of information are relative to some context.  Removed from its cellular environment, the 
code for a strand of DNA would not convey much information in the colloquial sense, but 
its Shannon information would be the same.




it means more information is present already because in
essence the job is already partially done for you. Shannon 

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread Craig Weinberg
On Jan 27, 1:31 pm, John Clark johnkcl...@gmail.com wrote:
 On Thu, Jan 26, 2012 at 8:03 PM, Craig Weinberg whatsons...@gmail.comwrote:

  With the second law of thermodynamics, it seems like heat could only
  dissipate by heating something else up.

 The second law says that energy will tend to get diluted in space over
 time, and heat conducting to other matter is one way for this to happen but
 it is not the only way. Photons radiating outward in all directions from a
 hot object is another way energy can get diluted. But among many other
 things, you don't think photons, or logic, exist so I doubt this answer
 will satisfy you.

It would satisfy me if you had some examples, but I don't think that
you know the answer for sure. A vacuum is a good insulator (like a
vacuum thermos), and a perfect vacuum, as far as I have been able to
read online, is a perfect insulator. Electricity and heat pass from
object to object, not from space to space. Please point out any source
you can find to the contrary. What little I find agrees with vacuums
being insulators of heat and electricity.

Craig




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread Craig Weinberg
On Jan 27, 2:22 pm, meekerdb meeke...@verizon.net wrote:
 On 1/27/2012 3:56 AM, Craig Weinberg wrote:









  On Jan 26, 11:11 pm, meekerdbmeeke...@verizon.net  wrote:
  On 1/26/2012 5:03 PM, Craig Weinberg wrote:

  Ok, so how does it affect the entropy of the structures? The red
  house, the white house, and the mixed house (even if an interesting
  pattern is made in the bricks), all behave in a physically identical
  way, do they not?
  No they don't.  They reflect photons differently; which is why you could 
  use the pattern
  to send a message.
  True, although it's only relevant if you have photons to reflect. If I
  turn out the lights (completely) does that change the entropy of the
  red house? What if I turn the lights back on, has entropy been
  suddenly reduced? Would a brighter light put more information or less
  entropy onto the white house than the red house, ie, does the pattern
  cost something in photons?
  Yes.
  That doesn't make sense to me. I think if two houses had two different
  patterns with the same numbers of each brick, neither one could
  possibly have a different cost in photons than the other. In a house
  of four bricks, Red Red White White cannot have a different photon
  absorption than Red White White Red.

  I'm just curious, not trying to argue with you about it. On a similar
  note, I was wondering about heat loss in a vacuum today. With the
  second law of thermodynamics, it seems like heat could only dissipate
  by heating something else up. If there was nothing in the universe
  except a blob of molten nickel, would it cool off over time in an
  infinite vacuum? It seems like it wouldn't. It seems like you would
  need some other matter at a different temperature to seek a common
  equilibrium with. Or is the heat just lost over time no matter what?
  The heat would be lost by infrared radiation.
  Lost to where? Energy is neither created nor...lost.

 The reason I seldom respond to your posts is that you seem unwilling to put 
 any effort
 into understanding what is written to you.


I understand completely, and I apologize, but I am not here to
understand second hand summaries of authoritative knowledge from the
past. I am only interested in first hand, common sense realities
because my hypothesis presents a radical challenge of the post-
Enlightenment and pre-Enlightenment worldviews. EVERYTHING must be
questioned anew.

It's hard to find any first hand information on experiments on the
basics of modern physics as the accounts all take the interpretation
as a foregone conclusion. You never see any documentation of double slit
tests which don't presume photons to begin with. If I had access to a
lab there are a lot of experiments I would want to run that might be
revealing in a completely new way.

Craig




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread Craig Weinberg
On Jan 27, 2:33 pm, Evgenii Rudnyi use...@rudnyi.ru wrote:
 On 26.01.2012 19:01 John Clark said the following:

  On Thu, Jan 19, 2012 at 5:28 PM, Craig
  Weinbergwhatsons...@gmail.comwrote:

 ...

  If I have red legos and white legos, and I build two opposite
  monochrome
  houses and one of mixed blocks, how in the world does that affect
  the entropy of the plastic bricks in any way?

  It does not affect the entropy of the plastic bricks but it does
  change the entropy of the structures built with those plastic bricks.

 This change in the entropy is below the experimental noise. Just estimate
 what difference it makes and in which digit of the total entropy the
 difference shows up. Hence the talk about the thermodynamic entropy as
 the information source in this case is just meaningless, as you cannot
 experimentally measure what you are talking about.

 Evgenii

Thanks, that's what I thought.




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread Evgenii Rudnyi

On 27.01.2012 23:03 meekerdb said the following:

On 1/27/2012 12:43 PM, Evgenii Rudnyi wrote:

On 27.01.2012 21:22 meekerdb said the following:

On 1/27/2012 11:21 AM, Evgenii Rudnyi wrote:

On 25.01.2012 21:25 meekerdb said the following:

On 1/25/2012 11:47 AM, Evgenii Rudnyi wrote:

...

Let me suggest a very simple case to understand better what
you are saying. Let us consider a string 10 for
simplicity. Let us consider the next cases. I will cite
first the thermodynamic properties of Ag and Al from CODATA
tables (we will need them)

S° (298.15 K), J K-1 mol-1

Ag cr 42.55 ± 0.20
Al cr 28.30 ± 0.10

In J K-1 cm-3 it will be

Ag cr 42.55/107.87*10.49 = 4.14
Al cr 28.30/26.98*2.7 = 2.83

1) An abstract string 10 as the abstract book above.

2) Let us make now an aluminum plate (a page) with 10
hammered on it (as on a coin) of the total volume 10 cm^3.
The thermodynamic entropy is then 28.3 J/K.

3) Let us make now a silver plate (a page) with 10
hammered on it (as on a coin) of the total volume 10 cm^3.
The thermodynamic entropy is then 41.4 J/K.

4) We can easily make another aluminum plate (scaling all
dimensions from 2) to the total volume of 100 cm^3. Then
the thermodynamic entropy is 283 J/K.

Now we have four different combinations to represent a
string 10 and the thermodynamic entropy is different. If
we take the statement literally then the information must
be different in all four cases and defined uniquely as the
thermodynamic entropy is already there. Yet in my view this
makes little sense.

Could you please comment on these four cases?


The thermodynamic entropy is a measure of the information
required to locate the possible states of the plates in the
phase space of atomic configurations constituting them. Note
that the thermodynamic entropy you quote is really the
*change* in entropy per degree at the given temperature. It's
a measure of how much more phase space becomes available to
the atomic states when the internal energy is increased. More
available phase space means more uncertainty of the exact
actual state and hence more information entropy. This
information is enormous compared to the 01 stamped on the
plate, the shape of the plate or any other aspects that we
would normally use to convey information. It would only be in
case we cooled the plate to near absolute zero and then tried
to encode information in its microscopic vibrational states
that the thermodynamic and the encoded information entropy
would become similar.



I would say that from your answer it follows that engineering
information has nothing to do with the thermodynamic entropy.
Don't you agree?


Obviously not since I wrote above that the thermodynamic entropy
is a measure of how much information it would take to locate the
exact state within the phase space allowed by the thermodynamic
parameters.


Is this what engineers use when they develop communication
devices?





It would certainly be interesting to consider what happens when
we decrease the temperature (in the limit to zero Kelvin).
According to the Third Law the entropy will be zero then. What
do you think, can we save less information on a copper plate at
low temperatures as compared with higher temperatures? Or
more?


Are you being deliberately obtuse? Information encoded in the
shape of the plate is not accounted for in the thermodynamic
tables - they are just based on ideal bulk material (ignoring
boundaries).


I am just trying to understand the meaning of the term information
 that you use. I would say that there is the thermodynamic entropy
and then the Shannon information entropy. Shannon developed
a theory to help engineers deal with communication (I believe
you also made a similar statement recently). Yet, in my view
when we talk about communication devices and mechatronics, the
information that engineers are interested in has nothing to do with
the thermodynamic entropy. Do you agree or disagree with that? If
you disagree, could you please give an example from engineering
where engineers do employ the thermodynamic entropy as the estimate
of information.


I already said I disagreed. You are confusing two different things.
Because structural engineers don't employ the theory of interatomic
forces it doesn't follow that interatomic forces have nothing to do
 with structural properties.

Brent


You disagree that engineers do not use thermodynamic entropy, but you 
have not yet shown how information in engineering is related to the 
thermodynamic entropy. From the Millipede example


 http://en.wikipedia.org/wiki/Millipede_memory

The earliest generation millipede devices used probes 10 nanometers in 
diameter and 70 nanometers in length, producing pits about 40 nm in 
diameter on fields 92 µm x 92 µm. Arranged in a 32 x 32 grid, the 
resulting 3 mm x 3 mm chip stores 500 megabits of data or 62.5 MB, 
resulting in an areal density, the number of bits per square inch, on 
the order of 200 Gbit/in².


It would be much easier 

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread Evgenii Rudnyi

On 27.01.2012 23:46 Russell Standish said the following:

On Fri, Jan 27, 2012 at 08:27:31PM +0100, Evgenii Rudnyi wrote:

On 26.01.2012 12:00 Russell Standish said the following:

If you included these two bits, the thermodynamic entropy is two
bits less, = 4.15 x 10^{-24} J/K less

This is so many orders of magnitude less than the entropy due to
the material, it's probably not worth including, but it is there.


I do not believe that effects below the experimental noise are
important for empirical science. You probably mean some other science
then; it would be good if you defined what science you mean.

Evgenii


For one thing, it indicates that storing just two bits of information
on these physical substrates is grossly inefficient!


Well, you could contact governments then and try to convince them that 
coins in use are highly inefficient. It would be a good chance to have 
funding.


By the way, at what temperature will it be possible to save more 
information, at a higher or at a lower one? Brent and John are talking about 
the entropy, and the higher the temperature the higher the entropy. From an 
engineering viewpoint it looks a bit strange.


Evgenii


Cheers






Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-26 Thread Russell Standish
On Wed, Jan 25, 2012 at 08:47:03PM +0100, Evgenii Rudnyi wrote:
 
 Let me suggest a very simple case to understand better what you are
 saying. Let us consider a string 10 for simplicity. Let us
 consider the next cases. I will cite first the thermodynamic
 properties of Ag and Al from CODATA tables (we will need them)
 
 S ° (298.15 K)
 J K-1 mol-1
 
 Ag  cr  42.55 ± 0.20
 Al  cr  28.30 ± 0.10
 
 In J K-1 cm-3 it will be
 
 Ag  cr  42.55/107.87*10.49 = 4.14
 Al  cr  28.30/26.98*2.7 = 2.83
 
 1) An abstract string 10 as the abstract book above.
 
 2) Let us make now an aluminum plate (a page) with 10 hammered on
 it (as on a coin) of the total volume 10 cm^3. The thermodynamic
 entropy is then 28.3 J/K.
 
 3) Let us make now a silver plate (a page) with 10 hammered on it
 (as on a coin) of the total volume 10 cm^3. The thermodynamic
 entropy is then 41.4 J/K.
 
 4) We can easily make another aluminum plate (scaling all dimensions
 from 2) to the total volume of 100 cm^3. Then the thermodynamic
 entropy is 283 J/K.
 
 Now we have four different combinations to represent a string 10
 and the thermodynamic entropy is different. If we take the statement
 literally then the information must be different in all four cases
 and defined uniquely as the thermodynamic entropy is already there.
 Yet in my view this makes little sense.
 
 Could you please comment on these four cases?
 

Brent commented quite aptly on these cases in another post. The fact
that you calculate the thermodynamic entropy the way you do implies
you are disregarding the information contained in the symbols embossed
on the coin.

If you included these two bits, the thermodynamic entropy is two bits
less, = 4.15 x 10^{-24} J/K less 

This is so many orders of magnitude less than the entropy due to the
material, it's probably not worth including, but it is there.
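
For orientation, a minimal sketch of the bit-to-J/K conversion (using the k_B ln 2 per bit convention; the exact prefactor depends on the logarithm base chosen, but either way the result is a couple of dozen orders of magnitude below the bulk entropy):

    import math

    k_B = 1.380649e-23               # J/K, Boltzmann constant

    def bits_to_entropy(n_bits):
        # one common convention: k_B * ln(2) of thermodynamic entropy per bit
        return n_bits * k_B * math.log(2)

    print(bits_to_entropy(2))        # ~1.9e-23 J/K, negligible next to ~28-41 J/K of bulk entropy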

-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-26 Thread John Clark
On Thu, Jan 19, 2012 at 5:28 PM, Craig Weinberg whatsons...@gmail.comwrote:

 I thought that the whole point of information theory is to move beyond
 quality into pure quantification.


 Yes.


  the suggestion that information can be defined as not having anything to
 do with the difference between order and the absence of order is laughably
 preposterous


Yes.

 The idea that a bucket of water has more 'information' than DNA is
 meaningless.


What word didn't you understand?

  No, if it's repeating then it would have less information, that is to
 say it would take less information to describe the result.



  Of course, but how does that jibe with the notion that information
  is molecular entropy? How does A-T A-T A-T or G-T G-T G-T guarantee less
 internal degrees of freedom within a DNA molecule then A-T G-C A-T?


It would take little information to describe a repeating sequence like
A-T-A-T-A-T and few ways to change its micro-state without altering
its macro orderly appearance, so it has a very low entropy,  but it would
take a lot of information to describe a random sequence A-T G-C A-T... and
lots of ways to alter its micro-state with it still looking random, so it
has a high entropy.

 I see no reason to use the word information at all for this. It sounds
 like you are just talking about entropy to me.


As I said, think about entropy as a measure of the number of ways you can
change the micro-structure of something without changing its large scale
macro appearance.


  If I have red legos and white legos, and I build two opposite monochrome
 houses and one of mixed blocks, how in the world does that affect the
 entropy of the plastic bricks in any way?


It does not affect the entropy of the plastic bricks but it does change the
entropy of the structures built with those plastic bricks. For a single
part in isolation entropy is not defined, a single water molecule has no
entropy but a trillion trillion of them in a drop of water does.
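
A small sketch of that way of counting (purely illustrative: a toy structure of 100 bricks, 50 red and 50 white, where the macro description fixes only the two counts):

    import math

    k_B = 1.380649e-23               # J/K

    n, n_red = 100, 50
    W = math.comb(n, n_red)          # micro-arrangements sharing the same macro description
    print(math.log2(W))              # ~96 bits of missing information about the exact pattern
    print(k_B * math.log(W))         # the same amount expressed as entropy, ~9e-22 J/K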

  John K Clark




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-26 Thread Craig Weinberg
On Jan 26, 1:01 pm, John Clark johnkcl...@gmail.com wrote:
 On Thu, Jan 19, 2012 at 5:28 PM, Craig Weinberg whatsons...@gmail.comwrote:

  I thought that the whole point of information theory is to move beyond
  quality into pure quantification.

  Yes.

   the suggestion that information can be defined as not having anything to
  do with the difference between order and the absence of order is laughably
  preposterous

 Yes.

  The idea that a bucket of water has more 'information' than DNA is
  meaningless.

 What word didn't you understand?

Information. If a bucket of water has more of it than DNA, then the
word information is meaningless.


   No, if its repeating then it would have less information, that is to
  say it would take less information to describe the result.

   Of course, but how does that jibe with the notion that information
   is molecular entropy? How does A-T A-T A-T or G-T G-T G-T guarantee less
  internal degrees of freedom within a DNA molecule then A-T G-C A-T?

 It would take little information to describe a repeating sequence like
  A-T-A-T-A-T and few ways to change its micro-state without altering
 its macro orderly appearance,

Describe it to who? Macro appearance to what? If you live alone on a
planet that is only liquid, how does one 'describe' a repeating
sequence? Besides your own mind, what would tell you that A-T-A-T-A-
T... can be expressed in any other way other than what it literally
is?

 so it has a very low entropy,  but it would
 take a lot of information to describe a random sequence A-T G-C A-T... and
 lots of ways to alter it's micro-state with it still looking random, so it
 has a high entropy.

So you are saying water has more information than DNA, but DNA that is
completely random has the same amount (or less) information than the
DNA that belonged to Beethoven. A symphony then would have less
information and more entropy than random noise. If the word
information is to have any meaning, quantity and compressibility of
data must be distinguished from quality of its interpretation. Which
of course parallels the AI treatment of intelligence (trivial or
quantitative processing capacity) and cognitive awareness
(consciousness).


  I see no reason to use the word information at all for this. It sounds
  like you are just talking about entropy to me.

 As I said, think about entropy as a measure of the number of ways you can
 change the micro-structure of something without changing its large scale
 macro appearance.

I don't think it's a good definition because micro and macro are
relative to an observer, not to the universe, but I understand what
you mean. There really is no definition related to order or pattern
that isn't subjective. The degree to which something's 'large scale
macro appearance' changes is contingent entirely on our ability to
perceive and recognize the changes.

Let's say your definition were true though. What does it have to do
with information being directly proportionate to entropy? If entropy
were equal or proportionate to information, then you are saying that the
more information something contains, the less it matters. The more
information you have on the micro level, the less you can tell at the
macro. It seems obvious that they are inversely proportional. To
inform something is to reduce its entropy (which necessarily means
increasing entropy somewhere else...entropy is all about space). I
build a sand castle and it has lower entropy than the rest of the
beach. Over time, the sand will return to the beach and we say the
entropy has returned to the higher beach level. If I encase the
sandcastle in lucite, it will slow down that process tremendously
because the form has no space to fall away from the castle.


   If I have red legos and white legos, and I build two opposite monochrome
  houses and one of mixed blocks, how in the world does that affect the
  entropy of the plastic bricks in any way?

 It does not affect the entropy of the plastic bricks, but it does change the
 entropy of the structures built with those plastic bricks. For a single
 part in isolation entropy is not defined: a single water molecule has no
 entropy, but a trillion trillion of them in a drop of water do.


Ok, so how does it affect the entropy of the structures? The red
house, the white house, and the mixed house (even if an interesting
pattern is made in the bricks), all behave in a physically identical
way, do they not? That would seem to preclude information itself from
having any objective material presence.

Craig




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-26 Thread meekerdb

On 1/26/2012 3:32 PM, Craig Weinberg wrote:

Ok, so how does it affect the entropy of the structures? The red
house, the white house, and the mixed house (even if an interesting
pattern is made in the bricks), all behave in a physically identical
way, do they not?


No they don't.  They reflect photons differently; which is why you could use the pattern 
to send a message.


There seems to be a lot of confusion about information as defined by Shannon.  Shannon's 
information is relative to the uncertainty in a message.  So it depends on how you define 
the possible messages.  If different patterns of red and white legos constitute the 
possible messages, then you can measure the information capacity of this message system by 
Shannon's formula.  It's *not* the measure of some particular message - it's the measure 
of the *capacity* of the message system.
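
As a minimal sketch of that distinction (Python; the pattern length and
symbol probabilities below are made-up): the capacity is fixed by Shannon's
formula applied to the message system as a whole, not by any particular
pattern one happens to build.

    from math import log2

    def binary_entropy(p):
        # Shannon entropy, in bits per brick, of a two-colour source with P(red) = p
        if p in (0.0, 1.0):
            return 0.0
        return -p * log2(p) - (1.0 - p) * log2(1.0 - p)

    n_bricks = 100   # hypothetical pattern length
    for p in (0.5, 0.9, 1.0):
        print(p, n_bricks * binary_entropy(p), "bits")
    # p = 0.5 gives the full 100-bit capacity; p = 1.0 (all one colour) can carry no message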


Brent




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-26 Thread Craig Weinberg
On Jan 26, 6:54 pm, meekerdb meeke...@verizon.net wrote:
 On 1/26/2012 3:32 PM, Craig Weinberg wrote:

  Ok, so how does it affect the entropy of the structures? The red
  house, the white house, and the mixed house (even if an interesting
  pattern is made in the bricks), all behave in a physically identical
  way, do they not?

 No they don't.  They reflect photons differently; which is why you could use 
 the pattern
 to send a message.

True, although it's only relevant if you have photons to reflect. If I
turn out the lights (completely) does that change the entropy of the
red house? What if I turn the lights back on, has entropy been
suddenly reduced? Would a brighter light put more information or less
entropy onto the white house than the red house, ie, does the pattern
cost something in photons?

I'm just curious, not trying to argue with you about it. On a similar
note, I was wondering about heat loss in a vacuum today. With the
second law of thermodynamics, it seems like heat could only dissipate
by heating something else up. If there was nothing in the universe
except a blob of molten nickel, would it cool off over time in an
infinite vacuum? It seems like it wouldn't. It seems like you would
need some other matter at a different temperature to seek a common
equilibrium with. Or is the heat just lost over time no matter what?


 There seems to be a lot of confusion about information as defined by Shannon. 
  Shannon's
 information is relative to the uncertainty in a message.  So it depends on 
 how you define
 the possible messages.  If different patterns of red and white legos 
 constitute the
 possible messages, then you can measure the information capacity of this 
 message system by
 Shannon's formula.  It's *not* the measure of some particular message - it's 
 the measure
 of the *capacity* of the message system.

That makes more sense. As long as the possibility of messages is
subjective, I don't have a problem with it. It's when information is
treated as an objective entity that I vote no,

Craig




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-26 Thread meekerdb

On 1/26/2012 5:03 PM, Craig Weinberg wrote:

On Jan 26, 6:54 pm, meekerdbmeeke...@verizon.net  wrote:

On 1/26/2012 3:32 PM, Craig Weinberg wrote:


Ok, so how does it affect the entropy of the structures? The red
house, the white house, and the mixed house (even if an interesting
pattern is made in the bricks), all behave in a physically identical
way, do they not?

No they don't.  They reflect photons differently; which is why you could use 
the pattern
to send a message.

True, although it's only relevant if you have photons to reflect. If I
turn out the lights (completely) does that change the entropy of the
red house? What if I turn the lights back on, has entropy been
suddenly reduced? Would a brighter light put more information or less
entropy onto the white house than the red house, ie, does the pattern
cost something in photons?


Yes.



I'm just curious, not trying to argue with you about it. On a similar
note, I was wondering about heat loss in a vacuum today. With the
second law of thermodynamics, it seems like heat could only dissipate
by heating something else up. If there was nothing in the universe
except a blob of molten nickel, would it cool off over time in an
infinite vacuum? It seems like it wouldn't. It seems like you would
need some other matter at a different temperature to seek a common
equilibrium with. Or is the heat just lost over time no matter what?


The heat would be lost by infrared radiation.

Brent




There seems to be a lot of confusion about information as defined by Shannon.  
Shannon's
information is relative to the uncertainty in a message.  So it depends on how 
you define
the possible messages.  If different patterns of red and white legos constitute 
the
possible messages, then you can measure the information capacity of this 
message system by
Shannon's formula.  It's *not* the measure of some particular message - it's 
the measure
of the *capacity* of the message system.

That makes more sense. As long as the possibility of messages is
subjective, I don't have a problem with it. It's when information is
treated as an objective entity that I vote no,

Craig






Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-25 Thread Evgenii Rudnyi

On 23.01.2012 01:26 Russell Standish said the following:

On Sun, Jan 22, 2012 at 07:16:23PM +0100, Evgenii Rudnyi wrote:

On 20.01.2012 05:59 Russell Standish said the following:

On Thu, Jan 19, 2012 at 08:03:41PM +0100, Evgenii Rudnyi wrote:


...


and since information is measured by order, a maximum of order
is conveyed by a maximum of disorder. Obviously, this is a
Babylonian muddle. Somebody or something has confounded our
language.



I would say it is many people, rather than just one. I wrote On
Complexity and Emergence in response to the amount of
unmitigated tripe I've seen written about these topics.



Russell,

I have read your paper

http://arxiv.org/abs/nlin/0101006

It is well written. Could you please apply the principles from
your paper to a problem on how to determine information in a book
(for example let us take your book Theory of Nothing)?

Also do you believe earnestly that this information is equal to
the thermodynamic entropy of the book?


These are two quite different questions. To someone who reads my
book, the physical form of the book is unimportant - it could just as
easily be a PDF file or a Kindle e-book as a physical paper copy. The
PDF is a little over 30,000 bytes long. Computing the information
content would be a matter of counting the number of 30,000-byte-long
strings that generate a recognisable variant of ToN when fed into
Acrobat reader. Then subtract the logarithm (to base 256) of this
figure from 30,000 to get the information content in bytes.

This is quite impractical, of course, not to speak of expense in
paying for an army of people to go through 256^30,000 variants to
decide which ones are the true ToN's. An upper bound can be found by
compressing the file - PDFs are already compressed, so we could
estimate the information content as being between 25KB and 30KB
(say).


Yet, this is already information. Hence if we take the equivalence between
the informational and thermodynamic entropies literally, then even in
this case the thermodynamic entropy (which should be possible to measure
by experimental thermodynamics) must exist. What is it in this case?



To a physicist, it is the physical form that is important - the fact
that it is made of paper, with a bit of glue to hold it together.
The arrangement of ink on the pages is probably quite unimportant - a
book of the same size and shape, but with blank pages would do just
as well. Even if the arrangement of ink is important, then does
typesetting the book in a different font lead to the same book or a
different book?


It is a good question and in my view it again shows that thermodynamic 
entropy and information are different things, as for the same
object we can define the information differently (see also below).



To compute the thermodynamic information, one could imagine
performing a massive molecular dynamics simulation, and then count
the number of states that correspond to the physical book, take the
logarithm, then subtract that from the logarithm of the total
possible number of states the molecules could take on (if completely
disassociated).


Do not forget that molecular dynamics simulation is based on Newton's
laws (even quantum-mechanical molecular dynamics). Hence you probably
mean the Monte Carlo method here. Yet, it is much simpler to employ
experimental thermodynamics (see below).



This is, of course, completely impractical. Computing the complexity
of something is generally NP-hard. But in principle doable.

Now, how does this relate to the thermodynamic entropy of the book?
It turns out that the information computed by the in-principle
process above is equal to the difference between the maximum entropy
of the molecules making up the book (if completely disassociated) and
the thermodynamic entropy, which could be measured in a calorimeter.



If yes, can one determine the information in the book just by means
of experimental thermodynamics?



One can certainly determine the information of the physical book
(defined however you might like) - but that is not the same as the
information of the abstract book.


Let me suggest a very simple case to understand better what you are 
saying. Let us consider a string 10 for simplicity. Let us consider 
the following cases. I will first cite the thermodynamic properties of Ag and
Al from CODATA tables (we will need them)


S ° (298.15 K)
J K-1 mol-1

Ag  cr  42.55 ± 0.20
Al  cr  28.30 ± 0.10

In J K-1 cm-3 it will be

Ag  cr  42.55/107.87*10.49 = 4.14
Al  cr  28.30/26.98*2.7 = 2.83

1) An abstract string 10 as the abstract book above.

2) Let us make now an aluminum plate (a page) with 10 hammered on it 
(as on a coin) of the total volume 10 cm^3. The thermodynamic entropy is 
then 28.3 J/K.


3) Let us make now a silver plate (a page) with 10 hammered on it (as 
on a coin) of the total volume 10 cm^3. The thermodynamic entropy is 
then 41.4 J/K.


4) We can easily make another aluminum plate (scaling all dimensions 
from 2) to the total 
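
For what it is worth, the arithmetic behind cases 2) and 3) can be spelled
out, together with the bit-equivalent one obtains if the thermodynamic
entropy is read literally as information (a sketch in Python, using only the
CODATA values quoted above and the standard conversion S/(k_B ln 2); note
that the result depends only on the metal and the volume, not on what is
hammered on the plate):

    K_B = 1.380649e-23       # Boltzmann constant, J/K
    LN2 = 0.6931471805599453

    # CODATA standard molar entropy, molar mass and density quoted above
    metals = {
        "Ag": (42.55, 107.87, 10.49),   # S / (J/K/mol), M / (g/mol), rho / (g/cm^3)
        "Al": (28.30, 26.98, 2.70),
    }

    volume_cm3 = 10.0
    for name, (s_molar, molar_mass, density) in metals.items():
        s_per_cm3 = s_molar / molar_mass * density   # J/K per cm^3
        s_plate = s_per_cm3 * volume_cm3             # J/K for the 10 cm^3 plate
        bits = s_plate / (K_B * LN2)                 # bit-equivalent, if taken literally
        print(f"{name}: {s_per_cm3:.2f} J/K/cm^3, {s_plate:.1f} J/K, {bits:.1e} bits")
    # Ag: 4.14 J/K/cm^3, 41.4 J/K, ~4.3e24 bits
    # Al: 2.83 J/K/cm^3, 28.3 J/K, ~3.0e24 bits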

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-25 Thread Evgenii Rudnyi

On 24.01.2012 13:49 Craig Weinberg said the following:


If you are instead saying that they are inversely proportional then
I would agree in general - information can be considered negentropy.
Sorry, I thought you were saying that they are directly proportional
measures (Brent and Evgenii seem to be talking about it that way). I


I am not an expert in informational entropy. For me it does not
matter how they define it in information theory, whether as entropy
or negentropy. My point is that this has nothing to do with the 
thermodynamic entropy (see my previous message with four cases for the 
string 10).


Evgenii


think that we can go further in understanding information though.
Negentropy is a good beginning but it does not address significance.
The degree to which information has the capacity to inform is even
more important than the energy cost to generate. Significance of
information is a subjective quality which is independent of entropy
but essential to the purpose of information. In fact, information
itself could be considered the quantitative shadow of the quality of
significance. Information that does not inform something is not
information.

Craig






Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-25 Thread Evgenii Rudnyi

On 24.01.2012 22:56 meekerdb said the following:


 In thinking about how to answer this I came across an excellent paper
by Roman Frigg and Charlotte Werndl
http://www.romanfrigg.org/writings/EntropyGuide.pdf which explicates
the relation more comprehensively than I could and which also gives
some historical background and extensions: specifically look at
section 4.

Brent



Thanks for the link. I will try to work through it to see if they have an
answer to the four cases with the string 10 that I have described in 
my reply to Russell.


Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-25 Thread meekerdb

On 1/25/2012 11:47 AM, Evgenii Rudnyi wrote:

On 23.01.2012 01:26 Russell Standish said the following:

On Sun, Jan 22, 2012 at 07:16:23PM +0100, Evgenii Rudnyi wrote:

On 20.01.2012 05:59 Russell Standish said the following:

On Thu, Jan 19, 2012 at 08:03:41PM +0100, Evgenii Rudnyi wrote:


...


and since information is measured by order, a maximum of order
is conveyed by a maximum of disorder. Obviously, this is a
Babylonian muddle. Somebody or something has confounded our
language.



I would say it is many people, rather than just one. I wrote On
Complexity and Emergence in response to the amount of
unmitigated tripe I've seen written about these topics.



Russell,

I have read your paper

http://arxiv.org/abs/nlin/0101006

It is well written. Could you please apply the principles from
your paper to a problem on how to determine information in a book
(for example let us take your book Theory of Nothing)?

Also do you believe earnestly that this information is equal to
the thermodynamic entropy of the book?


These are two quite different questions. To someone who reads my
book, the physical form of the book is unimportant - it could just as
easily be a PDF file or a Kindle e-book as a physical paper copy. The
PDF is a little over 30,000 bytes long. Computing the information
content would be a matter of counting the number of 30,000-byte-long
strings that generate a recognisable variant of ToN when fed into
Acrobat reader. Then subtract the logarithm (to base 256) of this
figure from 30,000 to get the information content in bytes.

This is quite impractical, of course, not to speak of expense in
paying for an army of people to go through 256^30,000 variants to
decide which ones are the true ToN's. An upper bound can be found by
compressing the file - PDFs are already compressed, so we could
estimate the information content as being between 25KB and 30KB
(say).


Yet, this is already information. Hence if we take the equivalence between the
informational and thermodynamic entropies literally, then even in this case the
thermodynamic entropy (which should be possible to measure by experimental
thermodynamics) must exist. What is it in this case?



To a physicist, it is the physical form that is important - the fact
that it is made of paper, with a bit of glue to hold it together.
The arrangement of ink on the pages is probably quite unimportant - a
book of the same size and shape, but with blank pages would do just
as well. Even if the arrangement of ink is important, then does
typesetting the book in a different font lead to the same book or a
different book?


It is a good question and in my view it again shows that thermodynamic entropy and 
information are different things, as for the same object we can define the
information differently (see also below).



To compute the thermodynamic information, one could imagine
performing a massive molecular dynamics simulation, and then count
the number of states that correspond to the physical book, take the
logarithm, then subtract that from the logarithm of the total
possible number of states the molecules could take on (if completely
disassociated).


Do not forget that molecular dynamics simulation is based on Newton's laws (even
quantum-mechanical molecular dynamics). Hence you probably mean the Monte Carlo
method here. Yet, it is much simpler to employ experimental thermodynamics (see below).



This is, of course, completely impractical. Computing the complexity
of something is generally NP-hard. But in principle doable.

Now, how does this relate to the thermodynamic entropy of the book?
It turns out that the information computed by the in-principle
process above is equal to the difference between the maximum entropy
of the molecules making up the book (if completely disassociated) and
the thermodynamic entropy, which could be measured in a calorimeter.



If yes, can one determine the information in the book just by means
of experimental thermodynamics?



One can certainly determine the information of the physical book
(defined however you might like) - but that is not the same as the
information of the abstract book.


Let me suggest a very simple case to understand better what you are saying. Let us 
consider a string 10 for simplicity. Let us consider the following cases. I will first cite
the thermodynamic properties of Ag and Al from CODATA tables (we will need them)


S ° (298.15 K)
J K-1 mol-1

Ag  cr  42.55 ± 0.20
Al  cr  28.30 ± 0.10

In J K-1 cm-3 it will be

Ag  cr  42.55/107.87*10.49 = 4.14
Al  cr  28.30/26.98*2.7 = 2.83

1) An abstract string 10 as the abstract book above.

2) Let us make now an aluminum plate (a page) with 10 hammered on it (as on a coin) of 
the total volume 10 cm^3. The thermodynamic entropy is then 28.3 J/K.


3) Let us make now a silver plate (a page) with 10 hammered on it (as on a coin) of 
the total volume 10 cm^3. The thermodynamic entropy is then 41.4 J/K.


4) We can easily make another aluminum plate 

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-24 Thread Craig Weinberg
On Jan 23, 11:25 pm, Russell Standish li...@hpcoders.com.au wrote:
 On Mon, Jan 23, 2012 at 05:20:28AM -0800, Craig Weinberg wrote:

  Besides, any such quantitative measure does not take sequence into
  account. A book or file which is completely scrambled down to the
  level of characters or pixels has the same quantity of entropy
  displacement as the in tact text. To reduce information to quantity
  alone means that a 240k text file can be rearranged to be 40kb of
  nothing but 1s and then 200kb of nothing but 0s and have the same
  amount of information and entropy. It's a gross misunderstanding of
  how information works.

  Craig

 Rearranging the text file to have 40KB of 1s and 200KB of 0s
 dramatically reduces the information and increases the entropy by the
 same amount, although not nearly as much as completely scrambling the
 file. I'd say you have a gross misunderstanding of how these measures
 work if you think otherwise.

All this time I thought that you have been saying that entropy and
information are the same thing:

   This suggests to me that a molecule of DNA belonging to a
kangaroo could
have no more information than the same molecule with the primary
sequence
scrambled into randomness

   That is correct, it would have the same quantity of
information, but most
   would be of the opinion that the quality has changed.

If you are instead saying that they are inversely proportional then I
would agree in general - information can be considered negentropy.
Sorry, I thought you were saying that they are directly proportional
measures (Brent and Evgenii seem to be talking about it that way). I
think that we can go further in understanding information though.
Negentropy is a good beginning but it does not address significance.
The degree to which information has the capacity to inform is even
more important than the energy cost to generate. Significance of
information is a subjective quality which is independent of entropy
but essential to the purpose of information. In fact, information
itself could be considered the quantitative shadow of the quality of
significance. Information that does not inform something is not
information.

Craig




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-24 Thread meekerdb

On 1/22/2012 1:04 AM, Evgenii Rudnyi wrote:

On 21.01.2012 22:03 Evgenii Rudnyi said the following:

On 21.01.2012 21:01 meekerdb said the following:

On 1/21/2012 11:23 AM, Evgenii Rudnyi wrote:

On 21.01.2012 20:00 meekerdb said the following:

On 1/21/2012 4:25 AM, Evgenii Rudnyi wrote:




...


2) If physicists say that information is the entropy, they
must take it literally and then apply experimental
thermodynamics to measure information. This however seems
not to happen.


It does happen. The number of states, i.e. the information,
available from a black hole is calculated from its
thermodynamic properties as calculated by Hawking. At a more
conventional level, counting the states available to molecules
in a gas can be used to determine the specific heat of the gas
and vice versa. The reason the thermodynamic measures and the
information measures are treated separately in engineering
problems is that the information that is important to
engineering is infinitesimal compared to the information stored
in the microscopic states. So the latter is considered only in
terms of a few macroscopic averages, like temperature and
pressure.

Brent


Doesn't this mean that by information engineers mean something
different from physicists?


I don't think so. A lot of the work on information theory was done
by communication engineers who were concerned with the effect of
thermal noise on bandwidth. Of course engineers specialize more
narrowly than physics, so within different fields of engineering
there are different terminologies and different measurement
methods for things that are unified in basic physics, e.g. there
are engineers who specialize in magnetism and who seldom need to
reflect that it is part of EM, there are others who specialize in
RF and don't worry about static fields.


Do you mean that engineers use experimental thermodynamics to
determine information?


 Evgenii

To be concrete, this is for example a paper from the control field:

J.C. Willems and H.L. Trentelman
H_inf control in a behavioral context: The full information case
IEEE Transactions on Automatic Control
Volume 44, pages 521-536, 1999
http://homes.esat.kuleuven.be/~jwillems/Articles/JournalArticles/1999.4.pdf

The term information is there but entropy is not. Could you please explain why? Or
alternatively could you please point to papers where engineers use the concept of
the equivalence between entropy and information?


Evgenii



In thinking about how to answer this I came across an excellent paper by Roman Frigg and 
Charlotte Werndl http://www.romanfrigg.org/writings/EntropyGuide.pdf which explicates the 
relation more comprehensively than I could and which also gives some historical background 
and extensions: specifically look at section 4.


Brent




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-23 Thread Craig Weinberg
On Jan 22, 7:26 pm, Russell Standish li...@hpcoders.com.au wrote:


 Now, how does this relate to the thermodynamic entropy of the book? It
 turns out that the information computed by the in-principle process
 above is equal to the difference between the maximum entropy of the
 molecules making up the book (if completely disassociated) and the
 thermodynamic entropy, which could be measured in a calorimeter.

  If yes, can one determine the
  information in the book just by means of experimental
  thermodynamics?

 One can certainly determine the information of the physical book
 (defined however you might like) - but that is not the same as the
 information of the abstract book.

This would only work if the information were meaningless and a-
signifying. I can write a whole book with just the words "The movie
Goodfellas." Anyone who has seen that movie has a rich text of
memories from which to inform themselves through that association.
That is what being informed actually is, associating and integrating
presented texts with a body of accumulated texts and contexts. If you
conflate information with the data that happens to be associated with
a particular text in a particular language-media context, you are
literally weighing stories by the pound (or gram).

Besides, any such quantitative measure does not take sequence into
account. A book or file which is completely scrambled down to the
level of characters or pixels has the same quantity of entropy
displacement as the intact text. To reduce information to quantity
alone means that a 240k text file can be rearranged to be 40kb of
nothing but 1s and then 200kb of nothing but 0s and have the same
amount of information and entropy. It's a gross misunderstanding of
how information works.

Craig




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-22 Thread Evgenii Rudnyi

On 21.01.2012 22:03 Evgenii Rudnyi said the following:

On 21.01.2012 21:01 meekerdb said the following:

On 1/21/2012 11:23 AM, Evgenii Rudnyi wrote:

On 21.01.2012 20:00 meekerdb said the following:

On 1/21/2012 4:25 AM, Evgenii Rudnyi wrote:




...


2) If physicists say that information is the entropy, they
must take it literally and then apply experimental
thermodynamics to measure information. This however seems
not to happen.


It does happen. The number of states, i.e. the information,
available from a black hole is calculated from its
thermodynamic properties as calculated by Hawking. At a more
conventional level, counting the states available to molecules
in a gas can be used to determine the specific heat of the gas
and vice versa. The reason the thermodynamic measures and the
information measures are treated separately in engineering
problems is that the information that is important to
engineering is infinitesimal compared to the information stored
in the microscopic states. So the latter is considered only in
terms of a few macroscopic averages, like temperature and
pressure.

Brent


Doesn't this mean that by information engineers mean something
different from physicists?


I don't think so. A lot of the work on information theory was done
by communication engineers who were concerned with the effect of
thermal noise on bandwidth. Of course engineers specialize more
narrowly than physics, so within different fields of engineering
there are different terminologies and different measurement
methods for things that are unified in basic physics, e.g. there
are engineers who specialize in magnetism and who seldom need to
reflect that it is part of EM, there are others who specialize in
RF and don't worry about static fields.


Do you mean that engineers use experimental thermodynamics to
determine information?


 Evgenii

To be concrete, this is for example a paper from the control field:

J.C. Willems and H.L. Trentelman
H_inf control in a behavioral context: The full information case
IEEE Transactions on Automatic Control
Volume 44, pages 521-536, 1999
http://homes.esat.kuleuven.be/~jwillems/Articles/JournalArticles/1999.4.pdf

The term information is there but entropy is not. Could you please
explain why? Or alternatively could you please point to papers where
engineers use the concept of the equivalence between entropy and
information?


Evgenii




Brent



Evgenii










Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-22 Thread Evgenii Rudnyi

On 20.01.2012 05:59 Russell Standish said the following:

On Thu, Jan 19, 2012 at 08:03:41PM +0100, Evgenii Rudnyi wrote:


...


and since information is measured by order, a maximum of order is
conveyed by a maximum of disorder. Obviously, this is a Babylonian
muddle. Somebody or something has confounded our language.



I would say it is many people, rather than just one. I wrote On
Complexity and Emergence in response to the amount of unmitigated
tripe I've seen written about these topics.



Russell,

I have read your paper

http://arxiv.org/abs/nlin/0101006

It is well written. Could you please apply the principles from your 
paper to a problem on how to determine information in a book (for 
example let us take your book Theory of Nothing)?


Also do you believe earnestly that this information is equal to the 
thermodynamic entropy of the book? If yes, can one determine the 
information in the book just by means of experimental thermodynamics?


Evgenii

P.S. Why is it impossible to state that a random string is generated by
some random generator?






Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-22 Thread Russell Standish
On Sun, Jan 22, 2012 at 07:16:23PM +0100, Evgenii Rudnyi wrote:
 On 20.01.2012 05:59 Russell Standish said the following:
 On Thu, Jan 19, 2012 at 08:03:41PM +0100, Evgenii Rudnyi wrote:
 
 ...
 
 and since information is measured by order, a maximum of order is
 conveyed by a maximum of disorder. Obviously, this is a Babylonian
 muddle. Somebody or something has confounded our language.
 
 
 I would say it is many people, rather than just one. I wrote On
 Complexity and Emergence in response to the amount of unmitigated
 tripe I've seen written about these topics.
 
 
 Russell,
 
 I have read your paper
 
 http://arxiv.org/abs/nlin/0101006
 
 It is well written. Could you please apply the principles from your
 paper to a problem on how to determine information in a book (for
 example let us take your book Theory of Nothing)?
 
 Also do you believe earnestly that this information is equal to the
 thermodynamic entropy of the book? 

These are two quite different questions. To someone who reads my book,
the physical form of the book is unimportant - it could just as easily
be a PDF file or a Kindle e-book as a physical paper copy. The PDF is
a little over 30,000 bytes long. Computing the information content
would be a matter of counting the number of 30,000-byte-long strings that
generate a recognisable variant of ToN when fed into Acrobat
reader. Then subtract the logarithm (to base 256) of this figure from
30,000 to get the information content in bytes.

This is quite impractical, of course, not to speak of expense in
paying for an army of people to go through 256^30,000 variants to
decide which ones are the true ToN's. An upper bound can be
found by compressing the file - PDFs are already compressed, so we
could estimate the information content as being between 25KB and 30KB (say).
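
As a rough illustration of that compression bound (a sketch, with a
hypothetical filename; a general-purpose compressor is only a crude
stand-in for the counting procedure described above):

    import zlib

    def compressed_size_bytes(path):
        # length of the zlib-compressed file: a crude upper bound on its
        # information content in bytes (already-compressed formats such
        # as PDF will barely shrink further)
        with open(path, "rb") as f:
            data = f.read()
        return len(zlib.compress(data, 9))

    # e.g. compressed_size_bytes("theory_of_nothing.pdf")  # hypothetical filename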

To a physicist, it is the physical form that is important - the fact
that it is made of paper, with a bit of glue to hold it together. The
arrangement of ink on the pages is probably quite unimportant - a book
of the same size and shape, but with blank pages would do just as
well. Even if the arrangement of ink is important, then does
typesetting the book in a different font lead to the same book or a
different book? 

To compute the thermodynamic information, one could imagine performing
a massive molecular dynamics simulation, and then count the number of
states that correspond to the physical book, take the logarithm, then
subtract that from the logarithm of the total possible number of
states the molecules could take on (if completely disassociated).

This is, of course, completely impractical. Computing the complexity
of something is generally NP-hard. But in principle doable.

Now, how does this relate to the thermodynamic entropy of the book? It
turns out that the information computed by the in-principle process
above is equal to the difference between the maximum entropy of the
molecules making up the book (if completely disassociated) and the
thermodynamic entropy, which could be measured in a calorimeter.
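
In symbols, and assuming only the standard conversion factor k_B ln 2
between J/K and bits, the relation just described is I = S_max - S_thermo;
a small helper (Python) makes the bookkeeping explicit:

    from math import log

    def information_bits(s_max, s_thermo, k_b=1.380649e-23):
        # I = S_max - S_thermo, converted from J/K to bits
        return (s_max - s_thermo) / (k_b * log(2))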


 If yes, can one determine the
 information in the book just by means of experimental
 thermodynamics?
 

One can certainly determine the information of the physical book
(defined however you might like) - but that is not the same as the
information of the abstract book.

 Evgenii
 
 P.S. Why is it impossible to state that a random string is generated
 by some random generator?
 

Not sure what you mean, unless you're really asking "Why is it
impossible to state that a random string is generated by some
pseudorandom generator?"

In which case the answer is that a pseudorandom generator is an
algorithm, so by definition doesn't produce random numbers. There is a
lot of knowledge about how to decide if a particular PRNG is
sufficiently random for a particular purpose. No PRNG is sufficiently
random for all purposes - in particular they are very poor for
security purposes, as they're inherently predictable.
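
A minimal illustration of that predictability (Python's built-in
Mersenne-Twister generator; the seed value is arbitrary):

    import random

    # two generators seeded identically produce identical "random" streams,
    # which is the sense in which a PRNG is an algorithm rather than a
    # source of randomness
    a = random.Random(12345)
    b = random.Random(12345)
    print([a.randint(0, 9) for _ in range(10)])
    print([b.randint(0, 9) for _ in range(10)])   # exactly the same list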

Cheers

-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-21 Thread Evgenii Rudnyi

On 20.01.2012 05:59 Russell Standish said the following:

On Thu, Jan 19, 2012 at 08:03:41PM +0100, Evgenii Rudnyi wrote:


...



Basically I do not understand what the term information then
brings. One can certainly state that information is the same as the
entropy (we are free with definitions after all). Yet I miss the
meaning of that. Let me put it this way, we have the thermodynamic
entropy and then the informational entropy as defined by Shannon.
The first is used to design a motor and the second to design a
controller. Now let us suppose that these two entropies are the
same. What does this change in the design of a motor and a controller? In
my view nothing.



I can well recommend Denbigh & Denbigh's book from the 80s - it's a
bit more of a modern understanding of the topic than Jaynes :)

@book{Denbigh-Denbigh87, author = {Denbigh, K. G. and Denbigh, J.},
publisher = { Cambridge UP}, title = { Entropy in Relation to
Incomplete Knowledge}, year = { 1987}, }


Thanks. On biotaconv they have recommended John Avery's Information 
Theory and Evolution but I think I have already satisfied my curiosity 
with Jaynes's two papers. My personal feeling is as follows:


1) The concept of information is useless in conventional thermodynamic 
problems. Let us take for example the Fe-C phase diagram


http://www.calphad.com/graphs/Metastable%20Fe-C%20Phase%20Diagram.gif

What does information have to do with the entropies of the phases in this
phase diagram? Do you mean that I will find an answer in Denbigh's book?


2) If physicists say that information is the entropy, they must take it 
literally and then apply experimental thermodynamics to measure 
information. This however seems not to happen.


3) I am working with engineers developing mechatronics products. 
Thermodynamics (hence the entropy) is there as well as information. 
However, I have not met a practitioner yet who makes a connection 
between the entropy and information.





By the way, have you seen the answer to my question:


Also remember that at constant volume dS = (Cv/T) dT and dU =
CvdT. If the entropy is information then its derivative must
be related to information as well. Hence Cv must be related to
information. This however means that the energy also somehow
related to information.


If the entropy is the same as information, then through the
derivatives all thermodynamic properties are related to
information as well. I am not sure if this makes sense in respect
for example to design a self-driving car.
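
Spelled out, the quoted relation integrates (for a temperature-independent
Cv) to delta S = Cv ln(T2/T1); a sketch with made-up numbers, using the
Dulong-Petit estimate Cv ~ 3R for one mole of a solid purely for
illustration:

    from math import log

    K_B = 1.380649e-23   # J/K
    R = 8.314            # J/(K mol)

    def delta_s(cv, t1, t2):
        # entropy change at constant volume: integrate dS = (Cv/T) dT with constant Cv
        return cv * log(t2 / t1)

    cv = 3 * R                          # Dulong-Petit estimate (assumption)
    ds = delta_s(cv, 298.15, 308.15)    # heat 1 mol of solid by 10 K
    print(ds, "J/K")                    # ~0.82 J/K
    print(ds / (K_B * log(2)), "bits")  # ~8.6e22 bits, if read literally as information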



The information embodied in the thermodynamic state is presumably
not relevant to the design of a self-driving car. By the same token,
thermodynamic treatment (typically) discards a lot of information
useful for engineering.


Sorry, I do not understand what this means.


I am aware of works that estimated the thermodynamic limit (kT) to
process information. I do not see however, how this proves the
equivalence of information and entropy.

Evgenii


...


and since information is measured by order, a maximum of order is
conveyed by a maximum of disorder. Obviously, this is a Babylonian
muddle. Somebody or something has confounded our language.



I would say it is many people, rather than just one. I wrote On
Complexity and Emergence in response to the amount of unmitigated
tripe I've seen written about these topics.




I have found your work on arxiv.org and I will look at it. Thank you
for mentioning it.


Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-21 Thread meekerdb

On 1/21/2012 4:25 AM, Evgenii Rudnyi wrote:

On 20.01.2012 05:59 Russell Standish said the following:

On Thu, Jan 19, 2012 at 08:03:41PM +0100, Evgenii Rudnyi wrote:


...



Basically I do not understand what the term information then
brings. One can certainly state that information is the same as the
entropy (we are free with definitions after all). Yet I miss the
meaning of that. Let me put it this way, we have the thermodynamic
entropy and then the informational entropy as defined by Shannon.
The first is used to design a motor and the second to design a
controller. Now let us suppose that these two entropies are the
same. What does this change in the design of a motor and a controller? In
my view nothing.



I can well recommend Denbigh & Denbigh's book from the 80s - it's a
bit more of a modern understanding of the topic than Jaynes :)

@book{Denbigh-Denbigh87, author = {Denbigh, K. G. and Denbigh, J.},
publisher = { Cambridge UP}, title = { Entropy in Relation to
Incomplete Knowledge}, year = { 1987}, }


Thanks. On biotaconv they have recommended John Avery's Information Theory and 
Evolution but I think I have already satisfied my curiosity with Jaynes's two papers. 
My personal feeling is as follows:


1) The concept of information is useless in conventional thermodynamic problems. Let us 
take for example the Fe-C phase diagram


http://www.calphad.com/graphs/Metastable%20Fe-C%20Phase%20Diagram.gif

What does information have to do with the entropies of the phases in this phase diagram? Do
you mean that I will find an answer in Denbigh's book?


2) If physicists say that information is the entropy, they must take it literally and 
then apply experimental thermodynamics to measure information. This however seems not to 
happen.


It does happen.  The number of states, i.e. the information, available from a black hole 
is calculated from its thermodynamic properties as calculated by Hawking.  At a more
conventional level, counting the states available to molecules in a gas can be used to
determine the specific heat of the gas and vice versa.  The reason the thermodynamic
measures and the information measures are treated separately in engineering problems is 
that the information that is important to engineering is infinitesimal compared to the 
information stored in the microscopic states.  So the latter is considered only in terms 
of a few macroscopic averages, like temperature and pressure.


Brent



3) I am working with engineers developing mechatronics products. Thermodynamics (hence 
the entropy) is there as well as information. However, I have not met a practitioner yet 
who makes a connection between the entropy and information.





By the way, have you seen the answer to my question:


Also remember that at constant volume dS = (Cv/T) dT and dU =
CvdT. If the entropy is information then its derivative must
be related to information as well. Hence Cv must be related to
information. This however means that the energy also somehow
related to information.


If the entropy is the same as information, then through the
derivatives all thermodynamic properties are related to
information as well. I am not sure if this makes sense in respect
for example to design a self-driving car.



The information embodied in the thermodynamic state is presumably
not relevant to the design of a self-driving car. By the same token,
thermodynamic treatment (typically) discards a lot of information
useful for engineering.


Sorry, I do not understand what this means.


I am aware of works that estimated the thermodynamic limit (kT) to
process information. I do not see however, how this proves the
equivalence of information and entropy.

Evgenii


...


and since information is measured by order, a maximum of order is
conveyed by a maximum of disorder. Obviously, this is a Babylonian
muddle. Somebody or something has confounded our language.



I would say it is many people, rather than just one. I wrote On
Complexity and Emergence in response to the amount of unmitigated
tripe I've seen written about these topics.




I have found your work on arxiv.org and I will look at it. Thank you for
mentioning it.

Evgenii






Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-21 Thread Evgenii Rudnyi

On 21.01.2012 20:00 meekerdb said the following:

On 1/21/2012 4:25 AM, Evgenii Rudnyi wrote:




...


2) If physicists say that information is the entropy, they must
take it literally and then apply experimental thermodynamics to
measure information. This however seems not to happen.


It does happen. The number of states, i.e. the information, available
 from a black hole is calculated from its thermodynamic properties
as calculated by Hawking. At a more conventional level, counting the
states available to molecules in a gas can be used to determine the
specific heat of the gas and vice versa. The reason the thermodynamic
measures and the information measures are treated separately in
engineering problems is that the information that is important to
engineering is infinitesimal compared to the information stored in
the microscopic states. So the latter is considered only in terms of
a few macroscopic averages, like temperature and pressure.

Brent


Doesn't this mean that by information engineers mean something
different from physicists?


Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-21 Thread meekerdb

On 1/21/2012 11:23 AM, Evgenii Rudnyi wrote:

On 21.01.2012 20:00 meekerdb said the following:

On 1/21/2012 4:25 AM, Evgenii Rudnyi wrote:




...


2) If physicists say that information is the entropy, they must
take it literally and then apply experimental thermodynamics to
measure information. This however seems not to happen.


It does happen. The number of states, i.e. the information, available
 from a black hole is calculated from its thermodynamic properties
as calculated by Hawking. At a more conventional level, counting the
states available to molecules in a gas can be used to determine the
specific heat of the gas and vice versa. The reason the thermodynamic
measures and the information measures are treated separately in
engineering problems is that the information that is important to
engineering is infinitesimal compared to the information stored in
the microscopic states. So the latter is considered only in terms of
a few macroscopic averages, like temperature and pressure.

Brent


Doesn't this mean that by information engineers mean something different from
physicists?


I don't think so. A lot of the work on information theory was done by communication 
engineers who were concerned with the effect of thermal noise on bandwidth.  Of course 
engineers specialize more narrowly than physics, so within different fields of engineering 
there are different terminologies and different measurement methods for things that are 
unified in basic physics, e.g. there are engineers who specialize in magnetism and who 
seldom need to reflect that it is part of EM, there are others who specialize in RF and 
don't worry about static fields.


Brent



Evgenii






Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-21 Thread Evgenii Rudnyi

On 21.01.2012 21:01 meekerdb said the following:

On 1/21/2012 11:23 AM, Evgenii Rudnyi wrote:

On 21.01.2012 20:00 meekerdb said the following:

On 1/21/2012 4:25 AM, Evgenii Rudnyi wrote:




...


2) If physicists say that information is the entropy, they
must take it literally and then apply experimental
thermodynamics to measure information. This however seems not
to happen.


It does happen. The number of states, i.e. the information,
available from a black hole is calculated from its thermodynamic
properties as calculated by Hawking. At a more conventional
level, counting the states available to molecules in a gas can be
used to determine the specific heat of the gas and vice versa.
The reason the thermodynamic measures and the information
measures are treated separately in engineering problems is that
the information that is important to engineering is infinitesimal
compared to the information stored in the microscopic states. So
the latter is considered only in terms of a few macroscopic
averages, like temperature and pressure.

Brent


Doesn't this mean that by information engineers mean something
different from physicists?


I don't think so. A lot of the work on information theory was done by
 communication engineers who were concerned with the effect of
thermal noise on bandwidth. Of course engineers specialize more
narrowly than physics, so within different fields of engineering
there are different terminologies and different measurement methods
for things that are unified in basic physics, e.g. there are
engineers who specialize in magnetism and who seldom need to reflect
that it is part of EM, there are others who specialize in RF and
don't worry about static fields.


Do you mean that engineers use experimental thermodynamics to determine 
information?


Evgenii


Brent



Evgenii








Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-20 Thread Craig Weinberg
On Jan 19, 5:40 pm, meekerdb meeke...@verizon.net wrote:
 On 1/19/2012 2:05 PM, Craig Weinberg wrote:

  How is any one form of information more or less likely to be causally
  effective than any other form?

 Would you rather have an instruction manual in English or Urdu?

Since I tend to put instruction manuals in a drawer and never look at
them, I would rather have the Urdu one as a novelty.

What difference does it make what I would rather have though? Both the
English and Urdu manuals are equally informative or non-informative
objectively (assuming they are equivalent translations), and neither
of them is causally effective without a subjective interpreter who is
causally effective.

Craig




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-19 Thread Craig Weinberg
On Jan 19, 12:37 am, meekerdb meeke...@verizon.net wrote:
 On 1/18/2012 11:13 AM, Evgenii Rudnyi wrote:









  On 18.01.2012 18:47 John Clark said the following:
  On Sun, Jan 15, 2012 at 3:54 PM, Evgenii Rudnyiuse...@rudnyi.ru
  wrote:

   Some physicists say that information is related to the entropy

  That is incorrect, ALL physicists say that information is related to
  entropy. There are quite a number of definitions of entropy, one I
  like, although not as rigorous as some it does convey the basic idea:
  entropy is a measure of the number of ways the microscopic structure
  of something can be changed without changing the macroscopic
  properties. Thus, the living human body has very low entropy because
  there are relatively few changes that could be made in it without a
  drastic change in macroscopic properties, like being dead; a bucket
  of water has a much higher entropy because there are lots of ways you
  could change the microscopic position of all those water molecules
  and it would still look like a bucket of water; cool the water and
  form ice and you have less entropy because the molecules line up into
  a orderly lattice so there are fewer changes you could make. The
  ultimate in high entropy objects is a Black Hole because, whatever is
  inside one, on the outside any Black Hole can be completely described
  with just 3 numbers: its mass, spin and electrical charge.

  John K Clark

  If you look around you may still find species of scientists who still are 
  working with
  classical thermodynamics (search for example for CALPHAD). Well, if you 
  refer to them as
  physicists or not, it is your choice. Anyway in experimental thermodynamics 
  people
  determine entropies, for example from CODATA tables

 http://www.codata.org/resources/databases/key1.html

  S°(298.15 K) / J K^-1 mol^-1

  Ag  cr  42.55 ± 0.20
  Al  cr  28.30 ± 0.10

  Do you mean that 1 mole of Ag has more information than 1 mole of Al at 
  298.15 K?

 Yes, it has more internal degrees of freedom so that it takes addition of 
 more energy in
 order to increase those we measure as temperature.

This suggests to me that a molecule of DNA belonging to a kangaroo
could have no more information than the same molecule with the primary
sequence scrambled into randomness or 'blanked out' with a single
repeating A-T base pair. That would seem to make this definition of
information the exact opposite of the colloquial meaning of the term.
A blank hard drive could have more information than one full of billions
of documents if the platters were at different temperatures?
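
(For scale, a minimal back-of-the-envelope sketch in Python, assuming only
the usual identification of one bit with k_B ln 2 joules per kelvin; the
CODATA values are the ones quoted above, the rest is illustrative
arithmetic, not anyone's design method:)

  import math

  k_B = 1.380649e-23        # Boltzmann constant, J/K
  N_A = 6.02214076e23       # Avogadro constant, 1/mol
  bit = k_B * math.log(2)   # entropy of one bit, ~9.57e-24 J/K

  for name, S in (("Ag", 42.55), ("Al", 28.30)):   # CODATA S(298.15 K), J/(K mol)
      print(name, "%.2e bits/mol, %.1f bits/atom" % (S / bit, S / bit / N_A))
  # Ag: ~4.4e24 bits/mol (~7.4 bits per atom); Al: ~3.0e24 bits/mol (~4.9).
  # Read this way, a mole of either metal "contains" vastly more bits than
  # any hard drive stores, which is exactly the tension in the question above.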

Craig




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-19 Thread meekerdb

On 1/19/2012 7:21 AM, Craig Weinberg wrote:

On Jan 19, 12:37 am, meekerdbmeeke...@verizon.net  wrote:

On 1/18/2012 11:13 AM, Evgenii Rudnyi wrote:










On 18.01.2012 18:47 John Clark said the following:

On Sun, Jan 15, 2012 at 3:54 PM, Evgenii Rudnyiuse...@rudnyi.ru
wrote:
 Some physicists say that information is related to the entropy
That is incorrect, ALL physicists say that information is related to
entropy. There are quite a number of definitions of entropy, one I
like, although not as rigorous as some it does convey the basic idea:
entropy is a measure of the number of ways the microscopic structure
of something can be changed without changing the macroscopic
properties. Thus, the living human body has very low entropy because
there are relatively few changes that could be made in it without a
drastic change in macroscopic properties, like being dead; a bucket
of water has a much higher entropy because there are lots of ways you
could change the microscopic position of all those water molecules
and it would still look like a bucket of water; cool the water and
form ice and you have less entropy because the molecules line up into
an orderly lattice so there are fewer changes you could make. The
ultimate in high-entropy objects is a Black Hole because, whatever is
inside one, on the outside any Black Hole can be completely described
with just 3 numbers: its mass, spin and electrical charge.
John K Clark

If you look around you may still find species of scientists who still are 
working with
classical thermodynamics (search for example for CALPHAD). Well, if you refer 
to them as
physicists or not, it is your choice. Anyway in experimental thermodynamics 
people
determine entropies, for example from CODATA tables
http://www.codata.org/resources/databases/key1.html
S°(298.15 K) / J K^-1 mol^-1
Ag  cr  42.55 ± 0.20
Al  cr  28.30 ± 0.10
Do you mean that 1 mole of Ag has more information than 1 mole of Al at 298.15 
K?

Yes, it has more internal degrees of freedom so that it takes addition of more 
energy in
order to increase those we measure as temperature.

This suggests to me that a molecule of DNA belonging to a kangaroo
could have no more information than the same molecule with the primary
sequence scrambled into randomness or 'blanked out' with a single
repeating A-T base pair. That would seem to make this definition of
information the exact opposite of the colloquial meaning of the term.


That's because the colloquial meaning of the term takes into account the
environment and which form of information can be causally effective.


Brent


A blank hard drive could have more information than one full of billions
of documents if the platters were at different temperatures?

Craig






Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-19 Thread Evgenii Rudnyi

Russell,

I know that many physicists identify the entropy with information. 
Recently I had a nice discussion on biotaconv and people pointed out 
that presumably Edwin T. Jaynes was the first to make such a connection 
(Information theory and statistical mechanics, 1957). Google Scholar 
shows that his paper has been cited more than 5000 times, which is
impressive and indicates that this view is indeed, in a way, mainstream.


I have studied Jaynes's papers, but I have been stuck with, for example:

“With such an interpretation the expression “irreversible process” 
represents a semantic confusion; it is not the physical process that is 
irreversible, but rather our ability to follow it. The second law of 
thermodynamics then becomes merely the statement that although our 
information as to the state of a system may be lost in a variety of 
ways, the only way in which it can be gained is by carrying out further 
measurements.”


“It is important to realize that the tendency of entropy to increase is 
not a consequence of the laws of physics as such, … . An entropy 
increase may occur unavoidably, due to our incomplete knowledge of the 
forces acting on a system, or it may be an entirely voluntary act on our part.”


This is beyond my understanding. As I have mentioned, I do not buy it;
I still consider the entropy as it has been defined by, for example, Gibbs.


Basically I do not understand what the term 'information' then adds. One
can certainly state that information is the same as the entropy (we are
free with definitions, after all), yet I miss the meaning of that. Let me
put it this way: we have the thermodynamic entropy and the informational
entropy as defined by Shannon. The first is used to design a motor, the
second to design a controller. Now suppose that these two entropies are
the same. What does this change in the design of the motor and the
controller? In my view, nothing.
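
(To make the comparison concrete, a small sketch assuming nothing beyond
Shannon's formula and the standard conversion of one bit into k_B ln 2 J/K;
the message string is of course arbitrary:)

  import math
  from collections import Counter

  k_B = 1.380649e-23   # Boltzmann constant, J/K

  def shannon_bits(msg):
      # Shannon entropy of the symbol frequencies, in bits per symbol
      n = len(msg)
      return -sum(c / n * math.log2(c / n) for c in Counter(msg).values())

  msg = "0110100110010110"             # a controller-style bit string
  H = shannon_bits(msg) * len(msg)     # total informational entropy, in bits
  S = H * k_B * math.log(2)            # the same number expressed in J/K

  print("%.0f bits = %.2e J/K" % (H, S))
  # 16 bits, i.e. about 1.5e-22 J/K: some twenty orders of magnitude below
  # the J/K-scale entropy changes that matter when designing a motor.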


By the way, have you seen the answer to my question:

 Also remember that at constant volume dS = (Cv/T) dT and dU =
 CvdT. If the entropy is information then its derivative must be
 related to information as well. Hence Cv must be related to
 information. This however means that the energy is also somehow
 related to information.

If the entropy is the same as information, then through the derivatives
all thermodynamic properties are related to information as well. I am
not sure this makes sense with respect to, for example, the design of a
self-driving car.
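
(The same derivative relation worked numerically, a sketch only: Cv is
treated as constant over the range, roughly 3R for a simple solid, which is
an illustrative assumption rather than tabulated data:)

  import math

  k_B = 1.380649e-23    # Boltzmann constant, J/K
  Cv = 24.9             # J/(K mol), ~3R, assumed constant
  T1, T2 = 298.15, 398.15

  dS = Cv * math.log(T2 / T1)    # from integrating dS = (Cv/T) dT
  print("dS = %.2f J/(K mol) = %.2e bits/mol" % (dS, dS / (k_B * math.log(2))))
  # ~7.2 J/(K mol), i.e. ~7.5e23 "bits" per mole attributed to nothing but
  # heating; this is the kind of number one gets if Cv-related entropy is
  # read as information.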


I am aware of works that estimate the thermodynamic limit (of order kT)
for processing information. I do not see, however, how this proves the
equivalence of information and entropy.
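
(Presumably the limit referred to here is the Landauer-type bound; a
one-line sketch of that arithmetic, with the temperature an assumed 300 K:)

  import math

  k_B = 1.380649e-23              # Boltzmann constant, J/K
  T = 300.0                       # K, assumed room temperature
  E_bit = k_B * T * math.log(2)   # minimum dissipation per erased bit
  print("%.2e J per bit" % E_bit)
  # ~2.9e-21 J per bit: a bound on the cost of handling information, which
  # is a different claim from information being thermodynamic entropy.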


Evgenii

P.S. For a long time, people have identified the entropy with chaos. I
have recently read a nice book on this, Entropy and Art by Arnheim
(1971). One quote:


The absurd consequences of neglecting structure but using the concept 
of order just the same are evident if one examines the present 
terminology of information theory. Here order is described as the 
carrier of information, because information is defined as the opposite 
of entropy, and entropy is a measure of disorder. To transmit 
information means to induce order. This sounds reasonable enough. Next, 
since entropy grows with the probability of a state of affairs, 
information does the opposite: it increases with its improbability. The 
less likely an event is to happen, the more information does its 
occurrence represent. This again seems reasonable. Now what sort of 
sequence of events will be least predictable and therefore carry a 
maximum of information? Obviously a totally disordered one, since when 
we are confronted with chaos we can never predict what will happen next. 
The conclusion is that total disorder provides a maximum of information; 
and since information is measured by order, a maximum of order is 
conveyed by a maximum of disorder. Obviously, this is a Babylonian 
muddle. Somebody or something has confounded our language.


--
http://blog.rudnyi.ru




On 18.01.2012 23:42 Russell Standish said the following:

On Wed, Jan 18, 2012 at 08:13:07PM +0100, Evgenii Rudnyi wrote:

On 18.01.2012 18:47 John Clark said the following:

On Sun, Jan 15, 2012 at 3:54 PM, Evgenii
Rudnyiuse...@rudnyi.ru wrote:

 Some physicists say that information is related to the
entropy




That is incorrect, ALL physicists say that information is related
to entropy. There are quite a number of definitions of entropy,
one I like, although not as rigorous as some it does convey the
basic idea: entropy is a measure of the number of ways the
microscopic structure of something can be changed without
changing the macroscopic properties. Thus, the living human body
has very low entropy because there are relatively few changes
that could be made in it without a drastic change in macroscopic
properties, like being dead; a bucket of water has a much higher
entropy because there are lots of ways you could change the
microscopic 

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-19 Thread Evgenii Rudnyi

On 19.01.2012 06:37 meekerdb said the following:

On 1/18/2012 11:13 AM, Evgenii Rudnyi wrote:


...


If you look around you may still find species of scientists who
still are working with classical thermodynamics (search for example
for CALPHAD). Well, if you refer to them as physicists or not, it
is your choice. Anyway in experimental thermodynamics people
determine entropies, for example from CODATA tables

http://www.codata.org/resources/databases/key1.html

S°(298.15 K) / J K^-1 mol^-1

Ag cr 42.55 ± 0.20   Al cr 28.30 ± 0.10

Do you mean that 1 mole of Ag has more information than 1 mole of
Al at 298.15 K?


Yes, it has more internal degrees of freedom so that it takes
addition of more energy in order to increase those we measure as
temperature.


Could you please explain then why engineers do not use the CODATA/JANAF
Tables to find the best material to store information?


Evgenii



Brent



Also remember that at constant volume dS = (Cv/T) dT and dU = CvdT.
If the entropy is information then its derivative must be related
to information as well. Hence Cv must be related to information.
This however means that the energy is also somehow related to
information.

Finally, the entropy is defined by the Second Law and the best
would be to stick to this definition. Only in this case, it is
possible to understand what we are talking about.

Evgenii







Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-19 Thread meekerdb

On 1/19/2012 11:06 AM, Evgenii Rudnyi wrote:

On 19.01.2012 06:37 meekerdb said the following:

On 1/18/2012 11:13 AM, Evgenii Rudnyi wrote:


...


If you look around you may still find species of scientists who
still are working with classical thermodynamics (search for example
for CALPHAD). Well, if you refer to them as physicists or not, it
is your choice. Anyway in experimental thermodynamics people
determine entropies, for example from CODATA tables

http://www.codata.org/resources/databases/key1.html

S°(298.15 K) / J K^-1 mol^-1

Ag cr 42.55 ± 0.20   Al cr 28.30 ± 0.10

Do you mean that 1 mole of Ag has more information than 1 mole of
Al at 298.15 K?


Yes, it has more internal degrees of freedom so that it takes
addition of more energy in order to increase those we measure as
temperature.


Could you please explain then why engineers do not use the CODATA/JANAF Tables to find
the best material to store information?


Because they are interested in information that they can insert and retrieve.  I once 
invented write-only-memory, but it didn't sell. :-)


Brent



Evgenii



Brent



Also remember that at constant volume dS = (Cv/T) dT and dU = CvdT.



If the entropy is information then its derivative must be related
to information as well. Hence Cv must be related to information.
This however means that the energy is also somehow related to
information.

Finally, the entropy is defined by the Second Law and the best
would be to stick to this definition. Only in this case, it is
possible to understand what we are talking about.

Evgenii








