Panpsychist emergence

2019-12-07 Thread Philip Thrift

I am now - it seems - in the category "panpsychist friends & colleagues".

https://twitter.com/smellosopher/status/1203343875173158913 






@philipthrift



Re: On the emergence of the solidly real from the realm of the abstract

2019-02-23 Thread 'Chris de Morsella' via Everything List
"Quantum error correction may be how the emergent fabric of space-time achieves 
its robustness, despite being woven out of fragile quantum particles."

An intriguing suggestion for the exquisite first-person experiential stability of 
this mysterious space-time emerging from quantum soup in a holographic 
universe, if this hypothesis maps to actual reality, that is.
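
As a toy illustration of the error-correction idea itself -- a minimal sketch 
of the textbook three-qubit bit-flip code, not the holographic codes the 
article describes -- here is what protecting quantum information by redundancy 
looks like (Python with numpy assumed):

    # Toy 3-qubit bit-flip code (illustrative sketch only).
    import numpy as np

    zero = np.array([1, 0], dtype=complex)            # |0>
    one  = np.array([0, 1], dtype=complex)            # |1>
    I2   = np.eye(2, dtype=complex)
    X    = np.array([[0, 1], [1, 0]], dtype=complex)  # bit flip
    Z    = np.diag([1, -1]).astype(complex)

    def kron(*ops):
        # Tensor product across the three qubits.
        out = ops[0]
        for op in ops[1:]:
            out = np.kron(out, op)
        return out

    # Encode a logical qubit a|0> + b|1> redundantly as a|000> + b|111>.
    a, b = 0.6, 0.8
    logical = a * kron(zero, zero, zero) + b * kron(one, one, one)

    # A bit-flip error hits the middle qubit.
    corrupted = kron(I2, X, I2) @ logical

    # Parity checks Z0Z1 and Z1Z2 locate the error without reading out a, b.
    s01 = np.real(corrupted.conj() @ kron(Z, Z, I2) @ corrupted)  # ~ -1
    s12 = np.real(corrupted.conj() @ kron(I2, Z, Z) @ corrupted)  # ~ -1

    # Syndrome (-1, -1) means the middle qubit flipped; undo it.
    recovered = kron(I2, X, I2) @ corrupted
    print(s01, s12, np.allclose(recovered, logical))  # -1.0 -1.0 True

The moral in miniature: the encoded amplitudes survive a local error because 
the information lives in collective correlations rather than in any single 
qubit, which is the same moral the holographic story draws for space-time.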
Dong, Silverstein and Torroba’s dS/dS model seems like an important step to 
conceptually enable a holographic universe with emergent properties of 
space-time (and maybe emergent gravity as well) in positively curved space-time 
-- e.g. our universe -- in which the boundary layer must, by virtue of our 
ever so slight positive curvature, lie infinitely far in the future. This is 
a problem for any holographic projection (where is the screen, the lower 
dimensional boundary layer, if it is infinitely far off?).

I am still wrapping my head around this conceptual model, which uses mathematics 
from string theory (theoretical Randall-Sundrum throats) to help "uplift" each AdS, 
transforming the two saddle-shaped AdS spaces into bowl-shaped dS spaces, which 
are subsequently "glued" together. The CFTs (lower dimensional boundary layers) 
describing the two hemispheres become coupled with each other, forming a single 
quantum system that is holographically dual to the entire spherical de Sitter 
space.
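For orientation, the standard dS/dS slicing behind this construction (my 
summary, so treat the details with care) writes d-dimensional de Sitter space 
as a warped product over a (d-1)-dimensional de Sitter slice:

    ds^2_{dS_d} = dw^2 + \sin^2(w/L) \, ds^2_{dS_{d-1}},   0 <= w <= \pi L,

where L is the de Sitter radius. The warp factor \sin^2(w/L) peaks on the 
central slice w = \pi L/2 and falls off toward w = 0 and w = \pi L, so the 
geometry looks like two warped throats (each resembling an AdS throat, with 
sin in place of sinh) glued along a central dS_{d-1} -- precisely the two 
hemispheres whose boundary theories get coupled.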
Quite a neat trick that may help to further investigate the holographic 
universe hypothesis.
Patrick Hayden, a theoretical physicist and computer scientist at Stanford who 
studies the AdS/CFT correspondence and its relationship to quantum error 
correction, said he and other experts are mulling over Dong, Silverstein 
and Torroba’s dS/dS model. He said it’s too soon to tell whether insights about 
how space-time is woven and how quantum gravity works in AdS space will carry 
over to a de Sitter model. “But there’s a path — something to be done,” Hayden 
said. “You can formulate concrete mathematical questions. I think a lot is 
going to happen in the next few years.”
How Our Universe Could Emerge as a Hologram | Quanta Magazine
Physicists have devised a holographic model of “de Sitter space,” the term for 
a universe like ours, that could give us new clues about the origin of space 
and time.



Re: Re: Emergence of Properties

2012-11-17 Thread Roger Clough
Hi Stephen P. King 

OK if you're satisfied with a vague feeling of agreement among
multiple observers. That of course would cause you to see fuzzy
or incomplete objects.

The Turing Test was suggested to try to wake you up.

[Roger Clough], [rclo...@verizon.net]
11/17/2012 
Forever is a long time, especially near the end. -Woody Allen

- Receiving the following content - 
From: Stephen P. King 
Receiver: everything-list 
Time: 2012-11-16, 11:37:13
Subject: Re: Emergence of Properties


On 11/16/2012 8:48 AM, Roger Clough wrote:

Hi Stephen P. King 

But how could one know if the others are telling the truth ?

Umm, I only assume the barest appearance of interactions. All of this is 
fully consistent with Leibniz' monadology. Monads have no windows and do not 
exchange substances. All interactions are only mutual synchronizations of their 
percepts.


The surest test could only be a Turing Test. 

I am not sure how that is related...



Plus I have another difficulty with solipsism. If perception
must precede existence, then one could never be stabbed
in the back. 

Existence must be primitive ontologically, or else how are properties to be 
extracted from it by perception? There are no knives (or spoons), only 
phenomena of mutual agreements.



[Roger Clough], [rclo...@verizon.net]
11/16/2012 
Forever is a long time, especially near the end. -Woody Allen

- Receiving the following content - 
From: Stephen P. King 
Receiver: everything-list 
Time: 2012-11-16, 07:25:39
Subject: Re: Emergence of Properties


On 11/16/2012 6:44 AM, Roger Clough wrote:

Hi Stephen P. King 

How is the agreement of many minds known if they are all solipsists ? 

Hi Roger,

The agreement is known by the appearance of a common world. It is the 
manifestation of their mutual truth. 



-- 
Onward!

Stephen




Re: Re: Emergence of Properties

2012-11-16 Thread Roger Clough
Hi Stephen P. King 

How is the agreement of many minds known if they are all solipsists ? 


[Roger Clough], [rclo...@verizon.net]
11/16/2012 
Forever is a long time, especially near the end. -Woody Allen

- Receiving the following content - 
From: Stephen P. King 
Receiver: everything-list 
Time: 2012-11-15, 16:46:15
Subject: Re: Emergence of Properties


On 11/15/2012 11:27 AM, Roger Clough wrote:

Hi Stephen P. King 

But many minds are in agreement that God exists, so that must be true ?

Hi Roger,

In my proposed definitions, 'must' follows if and only if there is no 
accessible possible world where a contradiction of the agreement occurs. Put 
more simply, a statement is true iff there is no knowable contradiction of the 
statement. The possible existence of an unknowable contradiction to the truth 
of a statement acts to support the idea of fallibility.
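
Read in standard Kripke semantics (my formalization, not necessarily the one 
intended here), with R the accessibility relation, the proposal is

    \mathrm{True}(\varphi) \text{ at } w \iff \neg\exists w' \, (w R w' \wedge w' \Vdash \neg\varphi),

which is just \mathrm{True}(\varphi) \equiv \Box\varphi: truth as necessity 
over the accessible worlds. Fallibility then lives in the inaccessible worlds, 
where \neg\varphi may still hold unknowably.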



And must unicorns exist because I believe that they do ?

The existence or non-existence is not contingent on anything, especially 
the belief of one person. Your question should be phrased as: Must unicorns be 
physical creatures because of my belief in such? The answer might be yes if 
there is some means by which your belief has the causal power to generate a 
physical being.


-- 
Onward!

Stephen




Re: Emergence of Properties

2012-11-16 Thread Stephen P. King

On 11/16/2012 6:44 AM, Roger Clough wrote:

Hi Stephen P. King
How is the agreement of many minds known if they are all solipsists ?


Hi Roger,

The agreement is known by the appearance of a common world. It is 
the manifestation of their mutual truth.



[Roger Clough], [rclo...@verizon.net]
11/16/2012
Forever is a long time, especially near the end. -Woody Allen

- Receiving the following content -
*From:* Stephen P. King <stephe...@charter.net>
*Receiver:* everything-list <everything-list@googlegroups.com>
*Time:* 2012-11-15, 16:46:15
*Subject:* Re: Emergence of Properties

On 11/15/2012 11:27 AM, Roger Clough wrote:

Hi Stephen P. King
But many minds are in agreement that God exists, so that must be
true ?


Hi Roger,

In my proposed definitions, 'must' follows if and only if
there is no accessible possible world where a contradiction of
the agreement occurs. Put more simply, a statement is true iff
there is no knowable contradiction of the statement. The possible
existence of an unknowable contradiction to the truth of a
statement acts to support the idea of fallibility.


And must unicorns exist because I believe that they do ?


The existence or non-existence is not contingent on anything,
especially the belief of one person. Your question should be
phrased as: Must unicorns be physical creatures because of my
belief in such? The answer might be yes if there is some means
by which your belief has the causal power to generate a physical
being.




--
Onward!

Stephen




Re: Re: Emergence of Properties

2012-11-16 Thread Roger Clough
Hi Stephen P. King 

But how could one know if the others are telling the truth ?
The surest test could only be a Turing Test. 

Plus I have another difficulty with solipsism. If perception
must precede existence, then one could never be stabbed
in the back. 

[Roger Clough], [rclo...@verizon.net]
11/16/2012 
Forever is a long time, especially near the end. -Woody Allen

- Receiving the following content - 
From: Stephen P. King 
Receiver: everything-list 
Time: 2012-11-16, 07:25:39
Subject: Re: Emergence of Properties


On 11/16/2012 6:44 AM, Roger Clough wrote:

Hi Stephen P. King 

How is the agreement of many minds known if they are all solipsists ? 

Hi Roger,

The agreement is known by the appearance of a common world. It is the 
manifestation of their mutual truth. 




[Roger Clough], [rclo...@verizon.net]
11/16/2012 
Forever is a long time, especially near the end. -Woody Allen

- Receiving the following content - 
From: Stephen P. King 
Receiver: everything-list 
Time: 2012-11-15, 16:46:15
Subject: Re: Emergence of Properties


On 11/15/2012 11:27 AM, Roger Clough wrote:

Hi Stephen P. King 

But many minds are in agreement that God exists, so that must be true ?

Hi Roger,

In my proposed definitions, 'must' follows if and only if there is no 
accessible possible world where a contradiction of the agreement occurs. Put 
more simply, a statement is true iff there is no knowable contradiction of the 
statement. The possible existence of an unknowable contradiction to the truth 
of a statement acts to support the idea of fallibility.



And must unicorns exist because I believe that they do ?

The existence or non-existence is not contingent on anything, especially 
the belief of one person. Your question should be phrased as: Must unicorns be 
physical creatures because of my belief in such? The answer might be yes if 
there is some means by which your belief has the causal power to generate a 
physical being.




-- 
Onward!

Stephen




Re: Emergence of Properties

2012-11-16 Thread Stephen P. King

On 11/16/2012 8:48 AM, Roger Clough wrote:

Hi Stephen P. King
But how could one know if the others are telling the truth ?


Umm, I only assume the barest appearance of interactions. All of 
this is fully consistent with Leibniz' monadology. Monads have no 
windows and do not exchange substances. All interactions are only mutual 
synchronizations of their percepts.



The surest test could only be a Turing Test.


I am not sure how that is related...


Plus I have another difficulty with solipsism. If perception
must precede existence, then one could never be stabbed
in the back.


Existence must be primitive ontologically, or else how are 
properties to be extracted from it by perception? There are no knives 
(or spoons), only phenomena of mutual agreements.



[Roger Clough], [rclo...@verizon.net]
11/16/2012
Forever is a long time, especially near the end. -Woody Allen

- Receiving the following content -
*From:* Stephen P. King <stephe...@charter.net>
*Receiver:* everything-list <everything-list@googlegroups.com>
*Time:* 2012-11-16, 07:25:39
*Subject:* Re: Emergence of Properties

On 11/16/2012 6:44 AM, Roger Clough wrote:

Hi Stephen P. King
How is the agreement of many minds known if they are all
solipsists ?


Hi Roger,

The agreement is known by the appearance of a common world. It
is the manifestation of their mutual truth.




--
Onward!

Stephen




Re: Re: Emergence of Properties

2012-11-15 Thread Roger Clough
Hi Stephen P. King 

But many minds are in agreement that God exists, so that must be true ?

And must unicorns exist because I believe that they do ?


[Roger Clough], [rclo...@verizon.net]
11/15/2012 
Forever is a long time, especially near the end. -Woody Allen

- Receiving the following content - 
From: Stephen P. King 
Receiver: everything-list 
Time: 2012-11-03, 12:31:14
Subject: Re: Emergence of Properties


On 11/3/2012 8:57 AM, Roger Clough wrote:

The properties of spacetime things are what can be measured (i.e. facts).
The properties of beyond spacetime things are propositions that can't be 
contradicted (necessary truths).

Hi Roger,

I do not assume that the 'can't be contradicted' is an a priori fixed 
apartheid on truths. I define necessary truths to be contingent on many minds 
in agreement.


-- 
Onward!

Stephen




Re: Emergence of Properties

2012-11-15 Thread Stephen P. King

On 11/15/2012 11:27 AM, Roger Clough wrote:

Hi Stephen P. King
But many minds are in agreement that God exists, so that must be true ?


Hi Roger,

In my proposed definitions, 'must' follows if and only if 
there is no accessible possible world where a contradiction of the 
agreement occurs. Put more simply, a statement is true iff there is no 
knowable contradiction of the statement. The possible existence of an 
unknowable contradiction to the truth of a statement acts to support the 
idea of fallibility.



And must unicorns exist because I believe that they do ?


The existence or non-existence is not contingent on anything, 
especially the belief of one person. Your question should be phrased as: 
Must unicorns be physical creatures because of my belief in such? The 
answer might be yes if there is some means by which your belief has 
the causal power to generate a physical being.


--
Onward!

Stephen




Re: Re: Emergence of Properties

2012-11-05 Thread Roger Clough
Hi Stephen P. King 

I have no problem with that, although
I do think that there are some eternal truths
external to those minds.

Roger Clough, rclo...@verizon.net
11/5/2012 
Forever is a long time, especially near the end. -Woody Allen


- Receiving the following content - 
From: Stephen P. King 
Receiver: everything-list 
Time: 2012-11-03, 13:31:14
Subject: Re: Emergence of Properties


On 11/3/2012 8:57 AM, Roger Clough wrote:

The properties of spacetime things are what can be measured (i.e. facts).
The properties of beyond spacetime things are propositions that can't be 
contradicted (necessary truths).

Hi Roger,

I do not assume that the 'can't be contradicted' is an a priori fixed 
apartheid on truths. I define necessary truths to be contingent on many minds 
in agreement.


-- 
Onward!

Stephen




Re: Re: Emergence of Properties

2012-11-05 Thread Roger Clough
Hi Stephen P. King  

In the end, we must accept a truth, so in the end,
all truth is pragmatic. We must cast our own vote.


Roger Clough, rclo...@verizon.net 
11/5/2012  
Forever is a long time, especially near the end. -Woody Allen 


- Receiving the following content -  
From: Stephen P. King  
Receiver: everything-list  
Time: 2012-11-03, 13:31:14 
Subject: Re: Emergence of Properties 


On 11/3/2012 8:57 AM, Roger Clough wrote: 

The properties of spacetime things are what can be measured (i.e. facts). 
The properties of beyond spacetime things are propositions that can't be 
contradicted (necessary truths). 

Hi Roger, 

I do not assume that the 'can't be contradicted' is an a priori fixed 
apartheid on truths. I define necessary truths to be contingent on many minds 
in agreement. 


--  
Onward! 

Stephen




Re: Emergence of Properties

2012-11-05 Thread Stephen P. King

On 11/5/2012 1:17 PM, Roger Clough wrote:

Hi Stephen P. King
I have no problem with that, although
I do think that there are some eternal truths
external to those minds.


Dear Roger,

OK, but what allows those 'external truths' to be knowable? Maybe 
they are unknowable, and if so, what difference does their existence make? 
Think about what my claim below implies as we take the number of minds 
to infinity. Does the truth value of an arbitrary statement increase to 
certainty or not? Is it possible for an infinite number of minds to 
agree on the truth value of more than one sentence?
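
A toy way to sharpen the question (a sketch assuming minds judge 
independently, which nothing above guarantees): if each of n minds affirms a 
given statement with probability p, unanimous agreement has probability p^n, 
and

    \lim_{n \to \infty} p^n = 0  for any p < 1,

so in this model unanimity on a contingent statement becomes impossible in 
the infinite-minds limit unless p = 1 exactly, i.e. unless the agreement was 
already necessary rather than contingent.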



Roger Clough, rclo...@verizon.net
11/5/2012
Forever is a long time, especially near the end. -Woody Allen

- Receiving the following content -
*From:* Stephen P. King <stephe...@charter.net>
*Receiver:* everything-list <everything-list@googlegroups.com>
*Time:* 2012-11-03, 13:31:14
*Subject:* Re: Emergence of Properties

On 11/3/2012 8:57 AM, Roger Clough wrote:

The properties of spacetime things are what can be measured (ie facts).
The properties of beyond spacetime things are propositions that can't be 
contradicted (necessary truths).

Hi Roger,

I do not assume that the can't be contradicted is an a
priori fixed apartheid on truths. I define necessary truths to be
contingent on _many minds_ in agreement.

-- 




--
Onward!

Stephen




Re: Emergence of Properties

2012-11-05 Thread Stephen P. King

On 11/5/2012 1:19 PM, Roger Clough wrote:

Hi Stephen P. King

In the end, we must accept a truth, so in the end,
all truth is pragmatic. We must cast our own vote.

Dear Roger,

Are you familiar with Kenneth Arrow's impossibility theorem 
http://en.wikipedia.org/wiki/Arrow%27s_impossibility_theorem and the 
voting paradox?


http://mindyourdecisions.com/blog/2008/02/12/game-theory-tuesdays-someone-is-going-to-be-unhappy-an-illustration-of-the-voting-paradox/

The executive summary is that whenever there are at least 2 people and 
at least 3 options, it's impossible to aggregate individual preferences 
without violating some desired conditions, like Pareto efficiency. You 
either have to accept that society will not act rationally like an 
individual would, or you have to accept that society's preferences will 
exactly mimic one person's preferences. In a sense, that makes the 
individual a dictator.
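
A concrete toy version of the voting paradox itself (my sketch of the classic 
Condorcet profile, Python assumed):

    # Three voters, three options: pairwise majorities form a cycle.
    ballots = [("A", "B", "C"),   # voter 1: A > B > C
               ("B", "C", "A"),   # voter 2: B > C > A
               ("C", "A", "B")]   # voter 3: C > A > B

    def majority_prefers(x, y):
        # True if a strict majority ranks x above y.
        wins = sum(1 for b in ballots if b.index(x) < b.index(y))
        return wins > len(ballots) / 2

    for x, y in [("A", "B"), ("B", "C"), ("C", "A")]:
        print(x, ">", y, ":", majority_prefers(x, y))
    # True for all three pairs: A beats B, B beats C, C beats A,
    # so the majority preference is cyclic and yields no coherent ranking.

Each ballot is individually rational, yet the aggregate is not -- which is the 
sense in which society fails to act rationally like an individual would.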


I suspect that this impossibility might explain why people are so 
easily seduced by arguments like Einstein's quip: "The moon still exists 
if I am not looking at it!" We always over-value our own individual 
contribution to the definiteness of properties that we observe in the 
physical universe. It also might have something to do with the problem of 
Free Will http://plato.stanford.edu/entries/freewill/ and the absurd 
implications of the Quantum Suicide argument 
http://en.wikipedia.org/wiki/Quantum_suicide_and_immortality.


--
Onward!

Stephen




Re: Re: Emergence of Properties

2012-11-05 Thread Roger Clough
Hi Stephen P. King  

Hmmm. Spacetime is xyzt and so extended, 
1p is inextended and so not part of that.
Thus, contrary to you and Berkeley,
1p and the physical universe do not need
each other. xyzt does fine on its own.


Roger Clough, rclo...@verizon.net 
11/5/2012  
Forever is a long time, especially near the end. -Woody Allen 


- Receiving the following content -  
From: Stephen P. King  
Receiver: everything-list  
Time: 2012-11-03, 13:35:50 
Subject: Re: Emergence of Properties 


On 11/3/2012 9:18 AM, Roger Clough wrote: 
 Yes, Aristotle's substances and their properties do not change with time. 
 But Leibniz's do very rapidly. And they are individual to each substance, 
 meaning to each monad (from his aspect). The actual properties are 
 collective data of the universe. 
Hi Roger, 

 I do not assume a single physical universe that is independent of  
entities with 1p. I call this idea the Fish bowl model. I see the  
physical universe as a dream that is the same for many 1p, a literal  
mass delusion! 

--  
Onward! 

Stephen 






Re: Re: Emergence of Properties

2012-11-05 Thread Roger Clough
Hi Stephen P. King  

Simple. All truths can probably only be known by the One who
it seems generated them (not sure). 


Roger Clough, rclo...@verizon.net 
11/5/2012  
Forever is a long time, especially near the end. -Woody Allen 


- Receiving the following content -  
From: Stephen P. King  
Receiver: everything-list  
Time: 2012-11-05, 13:43:57 
Subject: Re: Emergence of Properties 


On 11/5/2012 1:17 PM, Roger Clough wrote: 

Hi Stephen P. King  

I have no problem with that, although 
I do think that there are some eternal truths 
external to those minds. 

Dear Roger, 

OK, but what allows those 'external truths' to be knowable? Maybe they are 
unknowable, and if so, what difference does their existence make? Think about 
what my claim below implies as we take the number of minds to infinity. Does 
the truth value of an arbitrary statement increase to certainty or not? Is it 
possible for an infinite number of minds to agree on the truth value of more 
than one sentence? 



Roger Clough, rclo...@verizon.net 
11/5/2012  
Forever is a long time, especially near the end. -Woody Allen 


- Receiving the following content -  
From: Stephen P. King  
Receiver: everything-list  
Time: 2012-11-03, 13:31:14 
Subject: Re: Emergence of Properties 


On 11/3/2012 8:57 AM, Roger Clough wrote: 

The properties of spacetime things are what can be measured (i.e. facts). 
The properties of beyond spacetime things are propositions that can't be 
contradicted (necessary truths). 

Hi Roger, 

I do not assume that the 'can't be contradicted' is an a priori fixed 
apartheid on truths. I define necessary truths to be contingent on many minds 
in agreement. 


--  



--  
Onward! 

Stephen




Re: Re: Emergence of Properties

2012-11-05 Thread Roger Clough
Hi Stephen P. King  

Thanks for the heads up. We ask that question every four
years in the USA -- namely, should the popular vote or should
the votes from the individual states (the electoral vote) decide
who becomes president?

In George W. Bush's first election (2000), Gore won the popular vote, but Bush
at the last moment narrowly squeezed out the electoral vote and 
so won, at least officially. 


Roger Clough, rclo...@verizon.net 
11/5/2012  
Forever is a long time, especially near the end. -Woody Allen 


- Receiving the following content -  
From: Stephen P. King  
Receiver: everything-list  
Time: 2012-11-05, 13:51:48 
Subject: Re: Emergence of Properties 


On 11/5/2012 1:19 PM, Roger Clough wrote: 

Hi Stephen P. King   

In the end, we must accept a truth, so in the end, 
all truth is pragmatic. We must cast our own vote. 

Dear Roger, 

Are you familiar with Kenneth Arrow's impossibility theorem and the voting 
paradox? 

http://mindyourdecisions.com/blog/2008/02/12/game-theory-tuesdays-someone-is-going-to-be-unhappy-an-illustration-of-the-voting-paradox/
 

The executive summary is that whenever there are at least 2 people and at 
least 3 options, it's impossible to aggregate individual preferences without 
violating some desired conditions, like Pareto efficiency. You either have to 
accept that society will not act rationally like an individual would, or you 
have to accept that society's preferences will exactly mimic one person's 
preferences. In a sense, that makes the individual a dictator. 

I suspect that this impossibility might explain why people are so easily 
seduced by arguments like Einstein's quip: "The moon still exists if I am not 
looking at it!" We always over-value our own individual contribution to the 
definiteness of properties that we observe in the physical universe. It also 
might have something to do with the problem of Free Will and the absurd 
implications of the Quantum Suicide argument. 


--  
Onward! 

Stephen




Re: Emergence of Properties

2012-11-05 Thread meekerdb

On 11/5/2012 12:51 PM, Stephen P. King wrote:

On 11/5/2012 1:19 PM, Roger Clough wrote:

Hi Stephen P. King

In the end, we must accept a truth, so in the end,
all truth is pragmatic. We must cast our own vote.

Dear Roger,

Are you familiar with Kenneth Arrow's impossibility theorem 
http://en.wikipedia.org/wiki/Arrow%27s_impossibility_theorem and the voting paradox?


http://mindyourdecisions.com/blog/2008/02/12/game-theory-tuesdays-someone-is-going-to-be-unhappy-an-illustration-of-the-voting-paradox/

The executive summary is that whenever there are at least 2 people and at least 3 
options, it's impossible to aggregate individual preferences without violating some 
desired conditions, like Pareto efficiency. You either have to accept that society will 
not act rationally like an individual would, or you have to accept that society's 
preferences will exactly mimic one person's preferences. In a sense, that makes the 
individual a dictator.


Which is why science is successful in reaching agreements.  It seeks to persuade people by 
evidence instead of just aggregating opinions.


Brent




Re: Re: Emergence of Properties

2012-11-04 Thread Roger Clough
Hi Stephen P. King  

All that we can know of reality is in the experience of now.



Roger Clough, rclo...@verizon.net 
11/4/2012  
Forever is a long time, especially near the end. -Woody Allen 


- Receiving the following content -  
From: Stephen P. King  
Receiver: everything-list  
Time: 2012-11-03, 13:26:12 
Subject: Re: Emergence of Properties 


On 11/3/2012 8:22 AM, Bruno Marchal wrote: 



On 03 Nov 2012, at 12:17, Stephen P. King wrote: 



After I wrote the above I can see how you would think of properties as being 
innate, 


I meant independent of us. Not innate in the sense of psychology. 


Dear Bruno, 

Please elaborate on what this independence implies that has to do with the 
definiteness of properties. 




but I see this as just a mental crutch that you are using to not think too 
deeply about the concept of property.  


I agree with what Leibniz said, and what Frege and the logicians have done with 
it. 

Any elaboration or link on this? 




The situation is the same for your difficulty with my hypothesis of meaning. We 
learn to associate meanings to words so that words are more than just 
combinations of letters, but this is just the internalization of the 
associations and relations within our thinking process. 



You are too unclear for me. I can agree and disagree. As long as you 
don't present your theory it is hard to find out what you mean. 


Bruno 



Please understand that I am still developing my thesis; it is not yet born. 
It is like a jig-saw puzzle with most of the Big Picture on the box missing...  



Even today I realized a new piece of the picture, but I don't know how to 
explain it... It has to do with the way that the duality permutes under 
exponentiation in Pratt's theory, in a way that might be a better way to 
connect it with comp. 
The canonical transformation of the duality, in Pratt's theory, is an exact 
or bijective chain of transformations ... → body → mind → body → mind → ... 
This makes the isomorphism between the Stone spaces and Boolean algebras 
into a bijective map equivalent to an automorphism. If we consider the case 
where the transformation is almost but not quite bijective, then we get 
orbits that tend to stay near the automorphism, like the orbits of a strange 
attractor, not exactly periodic in space/time. This can be taken to 
something like an ergodic map where the orbits of the transformation are never 
periodic and every body and mind in the chain is different. 


--  
Onward! 

Stephen




Re: Emergence of Properties

2012-11-04 Thread Stephen P. King

On 11/4/2012 7:40 AM, Roger Clough wrote:

Hi Stephen P. King

All that we can know of reality is in the experience of now.


Hi Roger,

Yes, in our mutual consistency and individually, but we have to 
start with a 'now' at the 1p for each observer. Every observer perceives 
itself at the center of its own universe.






Roger Clough, rclo...@verizon.net
11/4/2012
Forever is a long time, especially near the end. -Woody Allen


- Receiving the following content -
From: Stephen P. King
Receiver: everything-list
Time: 2012-11-03, 13:26:12
Subject: Re: Emergence of Properties


On 11/3/2012 8:22 AM, Bruno Marchal wrote:



On 03 Nov 2012, at 12:17, Stephen P. King wrote:



After I wrote the above I can see how you would think of properties as being 
innate,


I meant independent of us. Not innate in the sense of psychology.


Dear Bruno,

Please elaborate on what this independence implies that has to do with the 
definiteness of properties.




but I see this as just a mental crutch that you are using to not think too 
deeply about the concept of property.


I agree with what Leibniz said, and what Frege and the logicians have done with 
it.

Any elaboration or link on this?




The situation is the same for your difficulty with my hypothesis of meaning. We 
learn to associate meanings to words so that words are more than just 
combinations of letters, but this is just the internalization of the 
associations and relations within our thinking process.



You are too unclear for me. I can agree and disagree. As long as you 
don't present your theory it is hard to find out what you mean.


Bruno



Please understand that I am still developing my thesis; it is not yet born. 
It is like a jig-saw puzzle with most of the Big Picture on the box missing...



Even today I realized a new piece of the picture, but I don't know how to 
explain it... It has to do with the way that the duality permutes under 
exponentiation in Pratt's theory, in a way that might be a better way to 
connect it with comp.
The canonical transformation of the duality, in Pratt's theory, is an exact 
or bijective chain of transformations ... → body → mind → body → mind → ... 
This makes the isomorphism between the Stone spaces and Boolean algebras into 
a bijective map equivalent to an automorphism. If we consider the case where 
the transformation is almost but not quite bijective, then we get orbits that 
tend to stay near the automorphism, like the orbits of a strange attractor, 
not exactly periodic in space/time. This can be taken to something like an 
ergodic map where the orbits of the transformation are never periodic and 
every body and mind in the chain is different.




--
Onward!

Stephen





Re: Emergence of Properties

2012-11-03 Thread Bruno Marchal


On 02 Nov 2012, at 20:48, Stephen P. King wrote:


On 11/2/2012 12:23 PM, Bruno Marchal wrote:

How can anything emerge from something having no properties? Magic?


Dear Bruno,

Why do you consider magic as a potential answer to your  
question? After thinking about your question while I was waiting to  
pick up my daughter from school, it occurred to me that we see in  
the Big Bang model and in almost all cosmogenesis myths before it,  
an attempt to answer your question. Do you believe that properties  
are innate in objects?


The arithmetical properties of numbers are innate to the numbers, logic  
and the laws we assume.




If so, how do you propose the dependency on measurement, to 'make  
definite' the properties of objects that we see in quantum theory,  
works?


QM is not part of the theory.



My pathetic claim is that properties emerge from a 'subtractive  
process' (hat tip to Craig) between observers and that the One  
(totality of what exists) has all possible properties simultaneously  
(hat tip to Russell Standish).


?



I have never understood what aspects of QM theory are derivable  
from COMP.


Then study UDA. You must understand that the *whole* of physics is  
derivable, not from comp, but from elementary arithmetic only. This is  
what is proved from comp. Ask questions if you have a problem with any  
step.




Do you have any result that shows the general non-commutativity  
between observables of QM,


Yes. That is testable in the Z1* comp quantum logic. It has not yet  
been completely justified, as the statements involve too much nesting  
of modal operators to be currently tractable.




or do you just show that the linear algebraic structure of  
observables (as we see in Hilbert spaces) can be derived from 1p  
indeterminacy?


Both.


The linear properties and the general non-commutativity properties  
of operators (representing physical observables) are not the same  
thing...


Of course. But the whole physics is given by the first order extension  
of the Z and X logic. This is necessary if we assume comp and the  
classical theory of knowledge (S4).


Bruno

http://iridia.ulb.ac.be/~marchal/






Re: Emergence of Properties

2012-11-03 Thread Stephen P. King

On 11/3/2012 5:26 AM, Bruno Marchal wrote:
The arithmetical properties of numbers are innate to the numbers, logic 
and the laws we assume.



Dear Bruno,

How? How are properties innate? This idea makes no sense to me; it 
never has, as it does not allow for any explanation of the apprehension of 
properties in my consideration... The only explanation of properties 
that makes sense to me is that of Leibniz: Properties are given by 
relations. We might think of objects as bundles of properties but this 
is problematic as it implies that properties are objects themselves. I 
think of properties similar to what Leibniz did: 
http://plato.stanford.edu/entries/substance/#DesSpiLei


Leibniz's substances, however, are the bearers of change (criterion 
(iv)) in a very different way from Aristotle's individual substances. An 
Aristotelian individual possesses some properties essentially and some 
accidentally. The accidental properties of an object are ones that can 
be gained and lost over time, and which it might never have possessed at 
all: its essential properties are the only ones it had to possess and 
which it possesses throughout its existence. The situation is different 
for Leibniz's monads—which is the name he gives to individual 
substances, created or uncreated (so God is a monad). Whereas, for 
Aristotle, the properties that an object has to possess and those that 
it possesses throughout its existence coincide, they do not do so for 
Leibniz. That is, for Leibniz, even the properties that an object 
possesses only for a part of its existence are essential to it. Every 
monad bears each of its properties as part of its nature, so if it were 
to have been different in any respect, it would have been a different 
entity.


Furthermore, there is a sense in which all monads are exactly similar to 
each other, for they all reflect the whole world. They each do so, 
however, from a different perspective.


   For God, so to speak, turns on all sides and considers in all ways
   the general system of phenomena which he has found it good to
   produce...And he considers all the faces of the world in all
   possible ways...the result of each view of the universe, as looked
   at from a certain position, is...a substance which expresses the
   universe in conformity with that view. (1998: 66)

So each monad reflects the whole system, but with its own perspective 
emphasized. If a monad is at place p at time t, it will contain all the 
features of the universe at all times, but with those relating to its 
own time and place most vividly, and others fading out roughly in 
accordance with temporal and spatial distance. Because there is a 
continuum of perspectives on reality, there is an infinite number of 
these substances. Nevertheless, there is internal change in the monads, 
because the respect in which its content is vivid varies with time and 
with action. Indeed, the passage of time just is the change in which of 
the monad's contents are most vivid.


The difference in my thinking to that of Leibniz is that a monad is 
never at place p at time t (location is defined solely in terms of 
mutuality of perspectives) and monads are only substances in that they 
are eternal. I find it best to drop the idea of substance altogether as 
it can be completely defined in terms of invariances.


After I wrote the above I can see how you would think of properties 
as being innate, but I see this as just a mental crutch that you are 
using to not think too deeply about the concept of property. The 
situation is the same for your difficulty with my hypothesis of meaning. 
We learn to associate meanings to words so that words are more than just 
combinations of letters, but this is just the internalization of the 
associations and relations within our thinking process.


--
Onward!

Stephen




Re: Emergence of Properties

2012-11-03 Thread Stephen P. King

On 11/3/2012 5:26 AM, Bruno Marchal wrote:
The arithmetical properties of numbers are innate to the numbers, logic 
and the laws we assume.

Hi,

This paper might be interesting to anyone who would like to see a 
nice discussion of how it is that we come to understand numbers: 
http://web.media.mit.edu/~stefanm/society/som_final.html


--
Onward!

Stephen





Re: Emergence of Properties

2012-11-03 Thread Bruno Marchal


On 03 Nov 2012, at 12:17, Stephen P. King wrote:


On 11/3/2012 5:26 AM, Bruno Marchal wrote:
The arithmetical properties of numbers are innate to the numbers,  
logic and the laws we assume.



Dear Bruno,

How? How are properties innate? This idea makes no sense to me;  
it never has, as it does not allow for any explanation of the  
apprehension of properties in my consideration... The only  
explanation of properties that makes sense to me is that of Leibniz:  
Properties are given by relations. We might think of objects as  
bundles of properties but this is problematic as it implies that  
properties are objects themselves. I think of properties similar to  
what Leibniz did: http://plato.stanford.edu/entries/substance/#DesSpiLei


Leibniz's substances, however, are the bearers of change (criterion  
(iv)) in a very different way from Aristotle's individual  
substances. An Aristotelian individual possesses some properties  
essentially and some accidentally. The accidental properties of an  
object are ones that can be gained and lost over time, and which it  
might never have possessed at all: its essential properties are the  
only ones it had to possess and which it possesses throughout its  
existence. The situation is different for Leibniz's monads—which is  
the name he gives to individual substances, created or uncreated (so  
God is a monad). Whereas, for Aristotle, the properties that an  
object has to possess and those that it possesses throughout its  
existence coincide, they do not do so for Leibniz. That is, for  
Leibniz, even the properties that an object possesses only for a  
part of its existence are essential to it. Every monad bears each of  
its properties as part of its nature, so if it were to have been  
different in any respect, it would have been a different entity.


Furthermore, there is a sense in which all monads are exactly  
similar to each other, for they all reflect the whole world. They  
each do so, however, from a different perspective.


For God, so to speak, turns on all sides and considers in all ways  
the general system of phenomena which he has found it good to  
produce…And he considers all the faces of the world in all possible  
ways…the result of each view of the universe, as looked at from a  
certain position, is…a substance which expresses the universe in  
conformity with that view. (1998: 66)
So each monad reflects the whole system, but with its own  
perspective emphasized. If a monad is at place p at time t, it will  
contain all the features of the universe at all times, but with  
those relating to its own time and place most vividly, and others  
fading out roughly in accordance with temporal and spatial distance.  
Because there is a continuum of perspectives on reality, there is an  
infinite number of these substances. Nevertheless, there is internal  
change in the monads, because the respect in which its content is  
vivid varies with time and with action. Indeed, the passage of time  
just is the change in which of the monad's contents are most vivid.


The difference in my thinking to that of Leibniz is that a monad  
is never at place p at time t (location is defined solely in terms  
of mutuality of perspectives) and monads are only substances in  
that they are eternal. I find it best to drop the idea of substance  
altogether as it can be completely defined in terms of invariances.


After I wrote the above I can see how you would think of  
properties as being innate,


I meant independent of us. Not innate in the sense of psychology.





but I see this as just a mental crutch that you are using to not  
think too deeply about the concept of property.


I agree with what Leibniz said, and what Frege and the logicians have  
done with it.




The situation is the same for your difficulty with my hypothesis of  
meaning. We learn to associate meanings to words so that words are  
more than just combinations of letters, but this is just the  
internalization of the associations and relations within our  
thinking process.


You are too unclear for me. I can agree and disagree. As long as  
you don't present your theory it is hard to find out what you mean.


Bruno




--
Onward!

Stephen



http://iridia.ulb.ac.be/~marchal/




Re: Re: Emergence of Properties

2012-11-03 Thread Roger Clough
Hi Stephen P. King  


The properties of spacetime things are what can be measured (i.e. facts).
The properties of beyond spacetime things are propositions that can't be 
contradicted (necessary truths).

Roger Clough, rclo...@verizon.net 
11/3/2012  
Forever is a long time, especially near the end. -Woody Allen 


- Receiving the following content -  
From: Stephen P. King  
Receiver: everything-list  
Time: 2012-11-03, 07:17:58 
Subject: Re: Emergence of Properties 


On 11/3/2012 5:26 AM, Bruno Marchal wrote: 

The arithmetical properties of numbers are innate to the numbers, logic and the 
laws we assume. 


Dear Bruno, 

How? How are properties innate? This idea makes no sense to me; it never 
has, as it does not allow for any explanation of the apprehension of properties 
in my consideration... The only explanation of properties that makes sense to me 
is that of Leibniz: Properties are given by relations. We might think of 
objects as bundles of properties but this is problematic as it implies that 
properties are objects themselves. I think of properties similar to what 
Leibniz did: http://plato.stanford.edu/entries/substance/#DesSpiLei 


Leibniz's substances, however, are the bearers of change (criterion (iv)) in a 
very different way from Aristotle's individual substances. An Aristotelian 
individual possesses some properties essentially and some accidentally. The 
accidental properties of an object are ones that can be gained and lost over 
time, and which it might never have possessed at all: its essential properties 
are the only ones it had to possess and which it possesses throughout its 
existence. The situation is different for Leibniz's monads—which is the name he 
gives to individual substances, created or uncreated (so God is a monad). 
Whereas, for Aristotle, the properties that an object has to possess and those 
that it possesses throughout its existence coincide, they do not do so for 
Leibniz. That is, for Leibniz, even the properties that an object possesses 
only for a part of its existence are essential to it. Every monad bears each of 
its properties as part of its nature, so if it were to have been different in 
any respect, it would have been a different entity. 
Furthermore, there is a sense in which all monads are exactly similar to each 
other, for they all reflect the whole world. They each do so, however, from a 
different perspective. 
For God, so to speak, turns on all sides and considers in all ways the general 
system of phenomena which he has found it good to produce…And he considers all 
the faces of the world in all possible ways…the result of each view of the 
universe, as looked at from a certain position, is…a substance which expresses 
the universe in conformity with that view. (1998: 66) 
So each monad reflects the whole system, but with its own perspective 
emphasized. If a monad is at place p at time t, it will contain all the 
features of the universe at all times, but with those relating to its own time 
and place most vividly, and others fading out roughly in accordance with 
temporal and spatial distance. Because there is a continuum of perspectives on 
reality, there is an infinite number of these substances. Nevertheless, there 
is internal change in the monads, because the respect in which its content is 
vivid varies with time and with action. Indeed, the passage of time just is the 
change in which of the monad's contents are most vivid. 
The difference in my thinking to that of Leibniz is that a monad is never 
at place p at time t (location is defined solely in terms of mutuality of 
perspectives) and monads are only substances in that they are eternal. I find 
it best to drop the idea of substance altogether as it can be completely 
defined in terms of invariances. 

After I wrote the above I can see how you would think of properties as 
being innate, but I see this as just a mental crutch that you are using to not 
think too deeply about the concept of property. The situation is the same for 
your difficulty with my hypothesis of meaning. We learn to associate meanings 
to words so that words are more than just combinations of letters, but this is 
just the internalization of the associations and relations within our thinking 
process. 

--  
Onward! 

Stephen




Re: Re: Emergence of Properties

2012-11-03 Thread Roger Clough
Hi Stephen P. King  

Those are psychological versions of numbers etc.
The innate properties are arithmetical.


Roger Clough, rclo...@verizon.net 
11/3/2012  
Forever is a long time, especially near the end. -Woody Allen 


- Receiving the following content -  
From: Stephen P. King  
Receiver: everything-list  
Time: 2012-11-03, 07:20:37 
Subject: Re: Emergence of Properties 


On 11/3/2012 5:26 AM, Bruno Marchal wrote: 
The arithmetical properties of numbers are innate to the numbers, logic  
and the laws we assume. 
Hi, 

This paper might be interesting to anyone who would like to see a  
nice discussion of how it is that we come to understand numbers:  
http://web.media.mit.edu/~stefanm/society/som_final.html  


--  
Onward! 

Stephen 






Re: Re: Emergence of Properties

2012-11-03 Thread Roger Clough
Hi Stephen, 

Yes, Aristotle's substances and their properties do not change with time. 
But Leibniz's do very rapidly. And they are individual to each substance,
meaning to each monad (from his aspect). The actual properties are
collective data of the universe.

Roger Clough, rclo...@verizon.net 
11/3/2012  
Forever is a long time, especially near the end. -Woody Allen 


- Receiving the following content -  
From: Bruno Marchal  
Receiver: everything-list  
Time: 2012-11-03, 08:22:27 
Subject: Re: Emergence of Properties 




On 03 Nov 2012, at 12:17, Stephen P. King wrote: 


On 11/3/2012 5:26 AM, Bruno Marchal wrote: 

The arithmetical properties of numbers are innate to the numbers, logic and the 
laws we assume. 


Dear Bruno, 

How? How are properties innate? This idea makes no sense to me; it never 
has, as it does not allow for any explanation of the apprehension of properties 
in my consideration... The only explanation of properties that makes sense to me 
is that of Leibniz: Properties are given by relations. We might think of 
objects as bundles of properties but this is problematic as it implies that 
properties are objects themselves. I think of properties similar to what 
Leibniz did: http://plato.stanford.edu/entries/substance/#DesSpiLei 


Leibniz's substances, however, are the bearers of change (criterion (iv)) in a 
very different way from Aristotle's individual substances. An Aristotelian 
individual possesses some properties essentially and some accidentally. The 
accidental properties of an object are ones that can be gained and lost over 
time, and which it might never have possessed at all: its essential properties 
are the only ones it had to possess and which it possesses throughout its 
existence. The situation is different for Leibniz's monads, which is the name he 
gives to individual substances, created or uncreated (so God is a monad). 
Whereas, for Aristotle, the properties that an object has to possess and those 
that it possesses throughout its existence coincide, they do not do so for 
Leibniz. That is, for Leibniz, even the properties that an object possesses 
only for a part of its existence are essential to it. Every monad bears each of 
its properties as part of its nature, so if it were to have been different in 
any respect, it would have been a different entity. 
Furthermore, there is a sense in which all monads are exactly similar to each 
other, for they all reflect the whole world. They each do so, however, from a 
different perspective. 
For God, so to speak, turns on all sides and considers in all ways the general 
system of phenomena which he has found it good to produce... and he considers all 
the faces of the world in all possible ways... the result of each view of the 
universe, as looked at from a certain position, is a substance which expresses 
the universe in conformity with that view. (1998: 66) 
So each monad reflects the whole system, but with its own perspective 
emphasized. If a monad is at place p at time t, it will contain all the 
features of the universe at all times, but with those relating to its own time 
and place most vividly, and others fading out roughly in accordance with 
temporal and spatial distance. Because there is a continuum of perspectives on 
reality, there is an infinite number of these substances. Nevertheless, there 
is internal change in the monads, because the respect in which its content is 
vivid varies with time and with action. Indeed, the passage of time just is the 
change in which of the monad's contents are most vivid. 
The difference in my thinking from that of Leibniz is that a monad is never 
at place p at time t (location is defined solely in terms of mutuality of 
perspectives) and monads are only substances in that they are eternal. I find 
it best to drop the idea of substance altogether as it can be completely 
defined in terms of invariances. 

After I wrote the above I can see how you would think of properties as 
being innate,  


I meant independent of us. Not innate in the sense of psychology. 

but I see this as just a mental crutch that you are using to not think too 
deeply about the concept of property.  


I agree with what Leibniz said, and what Frege and the logicians have done with 
it. 

The situation is the same for your difficulty with my hypothesis of meaning. We 
learn to associate meanings to words so that words are more than just 
combinations of letters, but this is just the internalization of the 
associations and relations within our thinking process. 



You are too unclear for me. I can agree and disagree. As long as you 
don't present your theory, it is hard to find out what you mean.  


Bruno 

--  
Onward! 

Stephen 


--  
You received this message because you are subscribed to the Google Groups 
Everything List group. 
To post to this group, send email to everything-list@googlegroups.com. 
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.

Re: Emergence of Properties

2012-11-03 Thread Stephen P. King

On 11/3/2012 8:57 AM, Roger Clough wrote:

The properties of spacetime things are what can be measured (ie facts).
The properties of beyond spacetime things are propositions that can't be 
contradicted (necessary truths).

Hi Roger,

I do not assume that "can't be contradicted" is an a priori 
fixed apartheid on truths. I define necessary truths to be contingent on 
_many minds_ in agreement.


--
Onward!

Stephen

--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Emergence of Properties

2012-11-03 Thread Stephen P. King

On 11/3/2012 9:18 AM, Roger Clough wrote:

Yes, Aristotle's substances and their properties do not change with time.
But Leibniz's do very rapidly. And they are individual to each substance,
meaning to each monad (from his aspect).   The actual properties are
collective data of the universe.

Hi Roger,

I do not assume a single physical universe that is independent of 
entities with 1p. I call this idea the Fish bowl model. I see the 
physical universe as a dream that is the same for many 1p, a literal 
mass delusion!


--
Onward!

Stephen


--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Emergence of Properties

2012-11-02 Thread Stephen P. King

On 11/2/2012 12:23 PM, Bruno Marchal wrote:

How can anything emerge from something having no properties? Magic?


Dear Bruno,

Why do you consider magic as a potential answer to your question? 
After thinking about your question while I was waiting to pick up my 
daughter from school, it occurred to me that we see in the Big Bang 
model and in almost all cosmogenesis myths 
https://www.google.com/#hl=ensugexp=les%3Bgs_nf=3tok=1XoTsmBbCpme0mnC57FQ9Qcp=18gs_id=3xhr=tq=cosmogenesis+mythspf=poutput=searchsclient=psy-aboq=cosmogenesis+mythsgs_l=pbx=1bav=on.2,or.r_gc.r_pw.r_cp.r_qf.fp=41b2cca49596839ebpcl=37189454biw=1527bih=812 
before it, an attempt to answer your question. Do you believe that 
properties are innate in objects? If so, how do you propose the 
dependency on measurement, to 'make definite' the properties of objects 
that we see in quantum theory, works?
My pathetic claim is that properties emerge from a 'subtractive 
process' (hat tip to Craig) between observers and that the One (totality 
of what exists) has all possible properties simultaneously (hat tip to 
Russell Standish).
I have never understood what aspects of QM theory are derivable 
from COMP. Do you have any results that show the general 
non-commutativity between observables of QM, or do you just show that 
the linear algebraic structure of observables (as we see in Hilbert 
spaces) can be derived from 1p indeterminacy? The linear properties and 
the general non-commutativity properties of operators (representing 
physical observables) are not the same thing...


--
Onward!

Stephen

--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Emergence

2012-08-25 Thread Bruno Marchal


On 24 Aug 2012, at 12:39, Roger Clough wrote:


Hi Stephen P. King

Hmm. I guess I should have known this, but if there are  
unprovable statements,
couldn't that also mean that the axioms needed to prove them have  
simply been
overlooked in inventorying (or constructing) the a priori? If so,  
then couldn't these
missing axioms be suggested by simply asking what additional axioms  
are needed

to prove the supposedly unproveable propositions?


You can add the new statement, but then you get a transformed machine,  
and it will have new unprovable statements, or become inconsistent.


Take the machine/theory having the beliefs/axioms:

1)
2)

Suppose the machine is consistent.

Then the following is a new consistent machine, much richer in  
provability abilities:


1)
2)
3) 1) + 2) is consistent.

But the one below:

1)
2)
3) 1) + 2) +  3) is consistent.

which can be defined (the circularity can be eliminated by use of some  
trick, such as the diagonal lemma) will be inconsistent, as no consistent  
machine can ever prove its own consistency.
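
To make the pattern explicit (a sketch in standard notation, assuming T is a
sound, recursively axiomatized theory extending arithmetic; soundness, a little
more than bare consistency, is what guarantees each extension below stays
consistent):

\[
T_0 = T, \qquad T_{n+1} = T_n + \mathrm{Con}(T_n), \qquad T_{n+1} \nvdash \mathrm{Con}(T_{n+1})
\]

Each T_{n+1} is consistent and strictly stronger, yet by Godel's second
incompleteness theorem it still cannot prove its own consistency. The
self-referential theory T* = T + Con(T*), obtained via the diagonal lemma, is
the third machine above: if T* were consistent it would prove Con(T*) outright,
contradicting the theorem, so T* is inconsistent.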


Bruno





Roger Clough, rclo...@verizon.net
8/24/2012
Leibniz would say, If there's no God, we'd have to invent him so  
everything could function.

- Receiving the following content -
From: Stephen P. King
Receiver: everything-list
Time: 2012-08-23, 13:28:00
Subject: Re: Emergence

Hi Richard,

You mean provable statements, not truths per se... I guess.  
OK, I haven't given that trope much thought; I try to keep  
Godel's theorems reserved for special occasions. It has been my  
experience that they can be very easily misapplied.



On 8/23/2012 1:24 PM, Richard Ruquist wrote:

Stephan,

Strong emergence follows from Godel's incompleteness because in any  
consistent system there are truths that cannot be derived from the  
axioms of the system. That is what is meant by incompleteness.


Sounds like what you just said. No?
Richard

On Thu, Aug 23, 2012 at 1:20 PM, Stephen P. King stephe...@charter.net 
 wrote:

Hi Richard,

Ah! http://en.wikipedia.org/wiki/Strong_emergence

Strong emergence is a type of emergence in which the emergent  
property is irreducible to its individual constituents.


OK, but irreducibility would have almost the same meaning as  
implying the non-existence of relations between the constituents  
and the emergent. It makes a mathematical description of the pair  
impossible... I don't think that I agree that it is derivable from  
Godel Incompleteness; I will be agnostic on this for now. Could you  
explain how it might?




On 8/23/2012 1:10 PM, Richard Ruquist wrote:

It is said that strong emergence comes from Godel incompleteness.
Weak emergence is like your grains of sand.

On Thu, Aug 23, 2012 at 12:48 PM, Stephen P. King stephe...@charter.net 
 wrote:

Hi Richard,

Pratt's theory does not address this. Could emergence be the  
result of inter-communications between monads and not an objective  
process at all? It is useful to think about how to solve the  
Sorites paradox to see what I mean here. A heap is said to emerge  
from a collection of grains, but is there a number or discrete or  
smooth process that generates the heap? No! The heap is just an  
abstract category that we assign. It is a name.


On 8/23/2012 9:44 AM, Richard Ruquist wrote:
Now if only someone could explain how emergence works.
Can Pratt theory do that?







--
Onward!

Stephen

Nature, to be commanded, must be obeyed.
~ Francis Bacon

--
You received this message because you are subscribed to the Google  
Groups Everything List group.

To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to everything-list+unsubscr...@googlegroups.com 
.
For more options, visit this group at http://groups.google.com/group/everything-list?hl=en 
.


http://iridia.ulb.ac.be/~marchal/



--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Re: Emergence

2012-08-24 Thread Roger Clough
Hi Stephen P. King 

Hmm. I guess I should have known this, but if there are unprovable 
statements,
couldn't that also mean that the axioms needed to prove them have simply been
overlooked in inventorying (or constructing) the a priori? If so, then 
couldn't these
missing axioms be suggested by simply asking what additional axioms are needed 
to prove the supposedly unproveable propositions?

Roger Clough, rclo...@verizon.net
8/24/2012 
Leibniz would say, If there's no God, we'd have to invent him so everything 
could function.
- Receiving the following content - 
From: Stephen P. King 
Receiver: everything-list 
Time: 2012-08-23, 13:28:00
Subject: Re: Emergence


Hi Richard,

You mean provable statements, not truths per se... I guess. OK, I 
haven't given that trope much thought; I try to keep Godel's theorems 
reserved for special occasions. It has been my experience that they can be very 
easily misapplied.


On 8/23/2012 1:24 PM, Richard Ruquist wrote:

Stephan,


Strong emergence follows from Godel's incompleteness because in any consistent 
system there are truths that cannot be derived from the axioms of the system. 
That is what is meant by incompleteness. 


Sounds like what you just said. No?
Richard


On Thu, Aug 23, 2012 at 1:20 PM, Stephen P. King stephe...@charter.net wrote:

Hi Richard,

Ah! http://en.wikipedia.org/wiki/Strong_emergence 

Strong emergence is a type of emergence in which the emergent property is 
irreducible to its individual constituents.

OK, but irreducibility would have almost the same meaning as implying the 
non-existence of relations between the constituents and the emergent. It makes 
a mathematical description of the pair impossible... I don't think that I agree 
that it is derivable from Godel Incompleteness; I will be agnostic on this for 
now. Could you explain how it might? 



On 8/23/2012 1:10 PM, Richard Ruquist wrote:

It is said that strong emergence comes from Godel incompleteness. 
Weak emergence is like your grains of sand.


On Thu, Aug 23, 2012 at 12:48 PM, Stephen P. King stephe...@charter.net wrote:

Hi Richard,

Pratt's theory does not address this. Could emergence be the result of 
inter-communications between monads and not an objective process at all? It is 
useful to think about how to solve the Sorites paradox to see what I mean here. 
A heap is said to emerge from a collection of grains, but is there a number or 
discrete or smooth process that generates the heap? No! The heap is just an 
abstract category that we assign. It is a name.

On 8/23/2012 9:44 AM, Richard Ruquist wrote:

Now if only someone could explain how emergence works.
Can Pratt theory do that?









-- 
Onward!

Stephen

Nature, to be commanded, must be obeyed. 
~ Francis Bacon

-- 
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Emergence

2012-08-24 Thread Stephen P. King

Hi Roger,

The point is that there exist (provably!) statements that range over an 
infinite domain and thus would require proofs that can effectively inspect 
their infinite extent. We could argue that induction allows us to 
shorten the length to a finite version, but this does not cover all cases. For 
instance, consider a proposed theorem that states that there exists a 
certain sequence of digits in the n-ary expansion of pi. How does one 
consider the proof of such a theorem? Constructability (by finite means) 
is the key to our notions of understanding, etc., and this has led some 
people to reject all math that does not admit constructable proofs. This 
is a HUGE problem in mathematics and, by extension, philosophy.
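
The asymmetry can be made concrete (a minimal sketch, assuming the mpmath
library; find_in_pi is a hypothetical helper name, not anything from the
thread): a finite search can confirm that a digit sequence occurs in pi, but a
return of -1 only ever means "not in this window", so the universal negative
cannot be settled by any finite search.

    import mpmath

    def find_in_pi(pattern: str, digits: int = 100000) -> int:
        """Search the first `digits` decimal digits of pi for `pattern`.

        Returns a 0-based offset into the decimals if found, else -1.
        Note: -1 says nothing about digits beyond the window (and the
        final digit of the window may be rounded)."""
        mpmath.mp.dps = digits + 10                # working precision
        s = mpmath.nstr(mpmath.mp.pi, digits + 1)  # "3." followed by the decimals
        decimals = s.split(".", 1)[1][:digits]
        return decimals.find(pattern)

    print(find_in_pi("999999"))  # 761: six consecutive 9s starting at decimal place 762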



On 8/24/2012 6:39 AM, Roger Clough wrote:

Hi Stephen P. King
Hmm. I guess I should have known this, but if there are unprovable 
statements,
couldn't that also mean that the axioms needed to prove them have 
simply been
overlooked in inventorying (or constructing) the a priori? If so, 
then couldn't these
missing axioms be suggested by simply asking what additional axioms 
are needed

to prove the supposedly unproveable propositions?
Roger Clough, rclo...@verizon.net mailto:rclo...@verizon.net
8/24/2012
Leibniz would say, If there's no God, we'd have to invent him so 
everything could function.


- Receiving the following content -
*From:* Stephen P. King mailto:stephe...@charter.net
*Receiver:* everything-list mailto:everything-list@googlegroups.com
*Time:* 2012-08-23, 13:28:00
*Subject:* Re: Emergence

Hi Richard,

You mean provable statements, not truths per se... I guess.
OK, I haven't given that trope much thought; I try to keep
Godel's theorems reserved for special occasions. It has been my
experience that they can be very easily misapplied.


On 8/23/2012 1:24 PM, Richard Ruquist wrote:

Stephan,

Strong emergence follows from Godel's incompleteness because in
any consistent system there are truths that cannot be derived
from the axioms of the system. That is what is meant by
incompleteness.

Sounds like what you just said. No?
Richard

On Thu, Aug 23, 2012 at 1:20 PM, Stephen P. King
stephe...@charter.net mailto:stephe...@charter.net wrote:

Hi Richard,

Ah! http://en.wikipedia.org/wiki/Strong_emergence

Strong emergence is a type of emergence in which the
emergent property is irreducible to its individual constituents.

OK, but irreducibility would have almost the same meaning
as implying the non-existence of relations between the
constituents and the emergent. It makes a mathematical
description of the pair impossible... I don't think that I
agree that it is derivable from Godel Incompleteness; I will
be agnostic on this for now. Could you explain how it might?



On 8/23/2012 1:10 PM, Richard Ruquist wrote:

It is said that strong emergence comes from Godel
incompleteness.
Weak emergence is like your grains of sand.

On Thu, Aug 23, 2012 at 12:48 PM, Stephen P. King
stephe...@charter.net mailto:stephe...@charter.net wrote:

Hi Richard,

Pratt's theory does not address this. Could
emergence be the result of inter-communications between
monads and not an objective process at all? It is useful
to think about how to solve the Sorites paradox to see
what I mean here. A heap is said to emerge from a
collection of grains, but is there a number or discrete
or smooth process that generates the heap? No! The heap
is just an abstract category that we assign. It is a name.

On 8/23/2012 9:44 AM, Richard Ruquist wrote:

Now if only someone could explain how emergence works.
Can Pratt theory do that?







-- 
Onward!


Stephen

Nature, to be commanded, must be obeyed.
~ Francis Bacon

--
You received this message because you are subscribed to the Google 
Groups Everything List group.

To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



--
Onward!

Stephen

http://webpages.charter.net/stephenk1/Outlaw/Outlaw.html

--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Emergence

2012-08-23 Thread Stephen P. King

Hi Richard,

Pratt's theory does not address this. Could emergence be the result 
of inter-communications between monads and not an objective process at 
all? It is useful to think about how to solve the Sorites paradox to see 
what I mean here. A heap is said to emerge from a collection of grains, 
but is there a number or discrete or smooth process that generates the 
heap? No! The heap is just an abstract category that we assign. It is a 
name.


On 8/23/2012 9:44 AM, Richard Ruquist wrote:

Now if only someone could explain how emergence works.
Can Pratt theory do that?




--
Onward!

Stephen

Nature, to be commanded, must be obeyed.
~ Francis Bacon


--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Re: Emergence

2012-08-23 Thread Roger Clough
Hi Stephen P. King 

Complexity seems to be the threshold of a magical transformation.
The more commonsense solution or explanation is to invoke Leibniz-like
downward causation.
 


Roger Clough, rclo...@verizon.net
8/23/2012 
Leibniz would say, If there's no God, we'd have to invent him so everything 
could function.
- Receiving the following content - 
From: Stephen P. King 
Receiver: everything-list 
Time: 2012-08-23, 12:48:51
Subject: Re: Emergence


Hi Richard,

 Pratt's theory does not address this. Could emergence be the result 
of inter-communications between monads and not an objective process at 
all? It is useful to think about how to solve the Sorites paradox to see 
what I mean here. A heap is said to emerge from a collection of grains, 
but is there a number or discrete or smooth process that generates the 
heap? No! The heap is just an abstract category that we assign. It is a 
name.

On 8/23/2012 9:44 AM, Richard Ruquist wrote:
 Now if only someone could explain how emergence works.
 Can Pratt theory do that?



-- 
Onward!

Stephen

Nature, to be commanded, must be obeyed.
~ Francis Bacon


-- 
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.

-- 
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Emergence

2012-08-23 Thread Richard Ruquist
It is said that strong emergence comes from Godel incompleteness.
Weak emergence is like your grains of sand.

On Thu, Aug 23, 2012 at 12:48 PM, Stephen P. King stephe...@charter.net wrote:

 Hi Richard,

 Pratt's theory does not address this. Could emergence be the result of
 inter-communications between monads and not an objective process at all? It
 is useful to think about how to solve the Sorites paradox to see what I
 mean here. A heap is said to emerge from a collection of grains, but is
 there a number or discrete or smooth process that generates the heap? No!
 The heap is just an abstract category that we assign. It is a name.

 On 8/23/2012 9:44 AM, Richard Ruquist wrote:

 Now if only someone could explain how emergence works.
 Can Pratt theory do that?



 --
 Onward!

 Stephen

 Nature, to be commanded, must be obeyed.
 ~ Francis Bacon


 --
 You received this message because you are subscribed to the Google Groups
 Everything List group.
 To post to this group, send email to everything-list@googlegroups.com.
 To unsubscribe from this group, send email to
 everything-list+unsubscr...@googlegroups.com.
 For more options, visit this group at
 http://groups.google.com/group/everything-list?hl=en.



-- 
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Emergence

2012-08-23 Thread Stephen P. King

Hi Richard,

Ah! http://en.wikipedia.org/wiki/Strong_emergence

Strong emergence is a type of emergence in which the emergent property 
is irreducible to its individual constituents.


OK, but irreducibility would have almost the same meaning as implying 
the non-existence of relations between the constituents and the 
emergent. It makes a mathematical description of the pair impossible... 
I don't think that I agree that it is derivable from Godel 
Incompleteness; I will be agnostic on this for now. Could you explain 
how it might?



On 8/23/2012 1:10 PM, Richard Ruquist wrote:

It is said that strong emergence comes from Godel incompleteness.
Weak emergence is like your grains of sand.

On Thu, Aug 23, 2012 at 12:48 PM, Stephen P. King 
stephe...@charter.net mailto:stephe...@charter.net wrote:


Hi Richard,

Pratt's theory does not address this. Could emergence be the
result of inter-communications between monads and not an objective
process at all? It is useful to think about how to solve the
Sorites paradox to see what I mean here. A heap is said to emerge
from a collection of grains, but is there a number or discrete or
smooth process that generates the heap? No! The heap is just an
abstract category that we assign. It is a name.

On 8/23/2012 9:44 AM, Richard Ruquist wrote:

Now if only someone could explain how emergence works.
Can Pratt theory do that?




--
Onward!

Stephen

Nature, to be commanded, must be obeyed.
~ Francis Bacon

--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Emergence

2012-08-23 Thread Richard Ruquist
Stephan,

Strong emergence follows from Godel's incompleteness because in any
consistent system there are truths that cannot be derived from the axioms
of the system. That is what is meant by incompleteness.

Sounds like what you just said. No?
Richard

On Thu, Aug 23, 2012 at 1:20 PM, Stephen P. King stephe...@charter.net wrote:

  Hi Richard,

 Ah! http://en.wikipedia.org/wiki/Strong_emergence

 Strong emergence is a type of emergence in which the emergent property is
 irreducible to its individual constituents.

 OK, but irreducibility would have almost the same meaning as implying
 the non-existence of relations between the constituents and the emergent.
 It makes a mathematical description of the pair impossible... I don't think
 that I agree that it is derivable from Godel Incompleteness; I will be
 agnostic on this for now. Could you explain how it might?



 On 8/23/2012 1:10 PM, Richard Ruquist wrote:

 It is said that strong emergence comes from Godel incompleteness.
 Weak emergence is like your grains of sand.

 On Thu, Aug 23, 2012 at 12:48 PM, Stephen P. King 
 stephe...@charter.net wrote:

 Hi Richard,

 Pratt's theory does not address this. Could emergence be the result
 of inter-communications between monads and not an objective process at all?
 It is useful to think about how to solve the Sorites paradox to see what I
 mean here. A heap is said to emerge from a collection of grains, but is
 there a number or discrete or smooth process that generates the heap? No!
 The heap is just an abstract category that we assign. It is a name.

 On 8/23/2012 9:44 AM, Richard Ruquist wrote:

 Now if only someone could explain how emergence works.
 Can Pratt theory do that?



 --
 Onward!

 Stephen

 Nature, to be commanded, must be obeyed.
 ~ Francis Bacon

  --
 You received this message because you are subscribed to the Google Groups
 Everything List group.
 To post to this group, send email to everything-list@googlegroups.com.
 To unsubscribe from this group, send email to
 everything-list+unsubscr...@googlegroups.com.
 For more options, visit this group at
 http://groups.google.com/group/everything-list?hl=en.


-- 
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Emergence

2012-08-23 Thread Stephen P. King

Hi Richard,

You mean provable statements, not truths per se... I guess. OK, 
I haven't given that trope much thought; I try to keep Godel's 
theorems reserved for special occasions. It has been my experience that they 
can be very easily misapplied.



On 8/23/2012 1:24 PM, Richard Ruquist wrote:

Stephan,

Strong emergence follows from Godel's incompleteness because in any 
consistent system there are truths that cannot be derived from the 
axioms of the system. That is what is meant by incompleteness.


Sounds like what you just said. No?
Richard

On Thu, Aug 23, 2012 at 1:20 PM, Stephen P. King 
stephe...@charter.net mailto:stephe...@charter.net wrote:


Hi Richard,

Ah! http://en.wikipedia.org/wiki/Strong_emergence

Strong emergence is a type of emergence in which the emergent
property is irreducible to its individual constituents.

OK, but irreducibility would have almost the same meaning as
implying the non-existence of relations between the constituents
and the emergent. It makes a mathematical description of the pair
impossible... I don't think that I agree that it is derivable from
Godel Incompleteness; I will be agnostic on this for now. Could
you explain how it might?



On 8/23/2012 1:10 PM, Richard Ruquist wrote:

It is said that strong emergence comes from Godel incompleteness.
Weak emergence is like your grains of sand.

On Thu, Aug 23, 2012 at 12:48 PM, Stephen P. King
stephe...@charter.net mailto:stephe...@charter.net wrote:

Hi Richard,

Pratt's theory does not address this. Could emergence be
the result of inter-communications between monads and not an
objective process at all? It is useful to think about how to
solve the Sorites paradox to see what I mean here. A heap is
said to emerge from a collection of grains, but is there a
number or discrete or smooth process that generates the heap?
No! The heap is just an abstract category that we assign. It
is a name.

On 8/23/2012 9:44 AM, Richard Ruquist wrote:

Now if only someone could explain how emergence works.
Can Pratt theory do that?







--
Onward!

Stephen

Nature, to be commanded, must be obeyed.
~ Francis Bacon

--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-11 Thread Evgenii Rudnyi

On 11.02.2012 04:27 Russell Standish said the following:

On Fri, Feb 10, 2012 at 09:39:50PM +0100, Evgenii Rudnyi wrote:


Let me ask you the same question that I have recently asked Brent.
Could you please tell me, the thermodynamic entropy of what is
discussed in Jason's example below?

Evgenii



If you're asking what is the conversion constant between bits and
J/K, the answer is k_B log(2) / log(10).

I'm not sure what else to tell you...

Cheers



I am asking what thermodynamic system is to be considered in this 
case. I understand that you can convert it this way; the question is what 
the result of this conversion is the thermodynamic entropy of.


Evgenii

--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-10 Thread Evgenii Rudnyi

On 09.02.2012 00:44 1Z said the following:





On Feb 7, 7:04 pm, Evgenii Rudnyiuse...@rudnyi.ru  wrote:


Let us take a closed vessel with oxygen and hydrogen at room
temperature. Then we open a platinum catalyst in the vessel and
the reaction starts. Will then the information in the vessel be
conserved?

Evgenii


What's the difference between in-principle and for-all-practical 
purposes?



What is the relationship between your question and mine?

Evgenii

--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-10 Thread Evgenii Rudnyi

On 08.02.2012 22:44 Russell Standish said the following:

On Wed, Feb 08, 2012 at 08:32:16PM +0100, Evgenii Rudnyi wrote:

...



What I observe personally is that there is information in
informatics and information in physics (if we say that the
thermodynamic entropy is the information). If you would agree,
that these two informations are different, it would be fine with
me, I am flexible with definitions.

Yet, if I understand you correctly you mean that the information
in informatics and the thermodynamic entropy are the same. This
puzzles me as I believe that the same physical values should have
the same numerical values. Hence my wish to understand what you
mean. Unfortunately you do not want to disclose it, you do not want
to apply your theory to examples that I present.

Evgenii


Given the above paragraph, I would say we're closer than you've
previously intimated.

Of course there is information in informatics, and there is
information in physics, just as there's information in biology and
so on. These are all the same concept (logarithm of a probability).
Numerically, they differ, because the context differs in each
situation.

Entropy is related in a very simple way to information. S=S_max - I.
So provided an S_max exists (which it will for any finite system), so 
does entropy. In the example of a hard drive, the informatics S_max
is the capacity of the drive eg 100GB for a 100GB drive. If you
store 10GB of data on it, the entropy of the drive is 90GB. That's
it.

Just as information is context dependent, then so must entropy.

Thermodynamics is just one use (one context) of entropy and
information. Usually, the context is one of homogenous bulk
materials. If you decide to account for surface effects, you change
the context, and entropy should change accordingly.


Let me ask you the same question that I have recently asked Brent. Could 
you please tell me, the thermodynamic entropy of what is discussed in 
Jason's example below?


Evgenii


On 03.02.2012 00:14 Jason Resch said the following:
...
 Evgenii,

 Sure, I could give a few examples as this somewhat intersects with my
 line of work.

 The NIST 800-90 recommendation (
 http://csrc.nist.gov/publications/nistpubs/800-90A/SP800-90A.pdf )
 for random number generators is a document for engineers implementing
 secure pseudo-random number generators.  An example of where it is
 important is when considering entropy sources for seeding a random
 number generator.  If you use something completely random, like a
 fair coin toss, each toss provides 1 bit of entropy.  The formula is
 -log2(predictability).  With a coin flip, you have at best a .5
 chance of correctly guessing it, and -log2(.5) = 1.  If you used a
 die roll, then each die roll would provide -log2(1/6) = 2.58 bits of
 entropy.  The ability to measure unpredictability is necessary to
 ensure, for example, that a cryptographic key is at least as
 difficult to predict the random inputs that went into generating it
 as it would be to brute force the key.

 In addition to security, entropy is also an important concept in the
 field of data compression.  The amount of entropy in a given bit
 string represents the theoretical minimum number of bits it takes to
 represent the information.  If 100 bits contain 100 bits of entropy,
 then there is no compression algorithm that can represent those 100
 bits with fewer than 100 bits.  However, if a 100 bit string contains
 only 50 bits of entropy, you could compress it to 50 bits.  For
 example, let's say you had 100 coin flips from an unfair coin.  This
 unfair coin comes up heads 90% of the time.  Each flip represents
 -log2(.9) = 0.152 bits of entropy.  Thus, a sequence of 100 coin
 flips with this biased coin could be represented with 16 bits.  There
 is only 15.2 bits of information / entropy contained in that 100 bit
 long sequence.

 Jason
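
The arithmetic in the quoted passage is easy to check (a minimal sketch; the
function names are mine, not NIST's). One hedge on the final example:
-log2(0.9) = 0.152 bits is the surprisal of a single heads outcome, while the
expected bits per flip of a 90/10 coin is the full Shannon entropy, about
0.469 bits, so 100 such flips carry roughly 47 bits on average.

    import math

    def surprisal(p: float) -> float:
        """Information content, in bits, of one outcome with probability p."""
        return -math.log2(p)

    def shannon_entropy(probs) -> float:
        """Expected bits per symbol for a probability distribution."""
        return sum(-p * math.log2(p) for p in probs if p > 0)

    print(surprisal(0.5))                      # 1.0   bit  per fair coin flip
    print(surprisal(1 / 6))                    # 2.585 bits per fair die roll
    print(surprisal(0.9))                      # 0.152 bits: one heads from the 90% coin
    print(shannon_entropy([0.9, 0.1]))         # 0.469 bits expected per biased flip
    print(100 * shannon_entropy([0.9, 0.1]))   # ~46.9 bits in 100 biased flips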



--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-10 Thread Evgenii Rudnyi

On 09.02.2012 07:49 meekerdb said the following:

...



There's an interesting paper by Bennett that I ran across, which
discusses the relation of Shannon entropy, thermodynamic entropy, and
 algorithmic entropy in the context of DNA and RNA replication:

http://qi.ethz.ch/edu/qisemFS10/papers/81_Bennett_Thermodynamics_of_computation.pdf


Thank you for the link. I like the first sentence

Computers may be thought of as engines for transforming free energy 
into waste heat and mathematical work.


I am not sure though whether this is more than a metaphor. I will read the 
paper; the abstract looks nice.


I believe that there was a chapter on reversible computation in

Nanoelectronics and Information Technology, ed Rainer Waser

I guess, reversible computation is kind of a strange attractor for 
engineers.


As for DNA, RNA, and proteins, I have recently read

Barbieri, M. (2007). Is the cell a semiotic system? In: Introduction to 
Biosemiotics: The New Biological Synthesis. Eds.: M. Barbieri, Springer: 
179-208.


If the author is right, it might well be that language developed 
even before consciousness. By the way, the paper is written very 
well and I have to think it over.


A related discussion

http://embryogenesisexplained.com/2012/02/is-the-cell-a-semiotic-system.html

Evgenii






Brent



--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-10 Thread Russell Standish
On Fri, Feb 10, 2012 at 09:39:50PM +0100, Evgenii Rudnyi wrote:
 
 Let me ask you the same question that I have recently asked Brent.
 Could you please tell me, the thermodynamic entropy of what is
 discussed in Jason's example below?
 
 Evgenii
 

If you're asking what is the conversion constant between bits and J/K,
the answer is k_B log(2) / log(10).
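
A quick numerical check (a sketch; one hedge: with the usual conventions
S = k_B ln W and I = log2 W, the bits-to-J/K factor comes out as k_B ln 2, so
the two candidate constants are printed side by side):

    import math

    k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

    print(k_B * math.log(2))                 # ~9.57e-24 J/K per bit, from S = (k_B ln 2) I
    print(k_B * math.log(2) / math.log(10))  # ~4.16e-24 J/K, the constant as stated above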

I'm not sure what else to tell you...

Cheers

-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au


-- 
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-08 Thread Evgenii Rudnyi

On 07.02.2012 23:06 Russell Standish said the following:

On Tue, Feb 07, 2012 at 08:15:10PM +0100, Evgenii Rudnyi wrote:

Russell,


This is circular - temperature is usually defined in terms of
entropy:

T^{-1} = dS/dE


This is wrong. The temperature is defined according to the Zeroth
Law. The Second Law just allows us to define the absolute
temperature, but the temperature as such is defined independently
from the entropy.



This is hardly a consensus view. See
http://en.wikipedia.org/wiki/Temperature for a discussion. I don't
personally have a stake in this, having left thermodynamics as a
field more than 20 years ago.


You are right, there are different approaches. You may, for example, want to 
look at


Teaching the Second Law
http://mitworld.mit.edu/video/540

Different people, different options.


But I will point out that the zeroth law definition is limited to
equilibrium situations only, which is probably the main reason why
entropy is taken to be more fundamental in modern formulations of
statistical mechanics.


I am not sure I understand the problem here. First one defines a 
temperature for thermal equilibrium between two subsystems. Yet, after 
that it is not a big deal to introduce a local temperature and the 
thermal field.





dependent. As far as I remember, you have used this term in
respect to informational capacity of some modern information
carrier and its number of physical states. I would suggest to
stay with this example as the definition of context dependent.
Otherwise, it does not make much sense.


It makes just as much sense with Boltzmann-Gibbs entropy. Unless
you're saying that is not connected with thermodynamic 
entropy...


Unfortunately I do not get your point. In the example with the 
information carrier, we have different numerical values for the 
information capacity of the carrier: the one according to the producer and 
the one derived from the thermodynamic entropy.



It sounds to me like you are arguing for a shift back to how
thermodynamics was before Boltzmann's theoretical understanding.
A back-to-roots movement, as it were.


I would like rather to understand the meaning of your words.

By the way, in Boltzmann's time the concept of information was not there. So why 
before Boltzmann?





I still do not understand what surface effects on the carrier have 
to do with this difference. Do you mean that if you consider
surface effects you derive an exact equation that will connect the
information capacity of the carrier with the thermodynamic
entropy? If yes, could you please give such an equation?

Evgenii



Why do you ask for such an equation when a) the situation being 
physically described has not been fully described, and b) it may well 
be pragmatically impossible to write, even though it may exist in 
principle.

This seems like a cheap rhetorical trick.



As I have mentioned, I would like to understand what you mean. In order 
to achieve this, I suggest considering simple problems to which to apply your 
theory. I think it is best to understand a theory by means of simple 
practical applications. Why do you consider this a cheap rhetorical trick?


What I observe personally is that there is information in informatics 
and information in physics (if we say that the thermodynamic entropy is 
the information). If you would agree, that these two informations are 
different, it would be fine with me, I am flexible with definitions.


Yet, if I understand you correctly you mean that the information in 
informatics and the thermodynamic entropy are the same. This puzzles me 
as I believe that the same physical values should have the same 
numerical values. Hence my wish to understand what you mean. 
Unfortunately you do not want to disclose it, you do not want to apply 
your theory to examples that I present.


Evgenii

--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-08 Thread Russell Standish
On Wed, Feb 08, 2012 at 08:32:16PM +0100, Evgenii Rudnyi wrote:

...

 It sounds to me like you are arguing for a shift back to how
 thermodynamics was before Boltzmann's theoretical understanding.
 A back-to-roots movement, as it were.
 
 I would like rather to understand the meaning of your words.
 
 By the way, in Boltzmann's time the concept of information was not there. So
 why before Boltzmann?
 

Yes, in Boltzmann's time, the concept of information was not
understood. But probability was (at least to some extent). Now, we
know that information is essentially the logarithm of a probability. I
don't know whether information or probability is logically prior - it's
probably a matter of taste.

 
 What I observe personally is that there is information in
 informatics and information in physics (if we say that the
 thermodynamic entropy is the information). If you would agree, that
 these two informations are different, it would be fine with me, I am
 flexible with definitions.
 
 Yet, if I understand you correctly you mean that the information in
 informatics and the thermodynamic entropy are the same. This puzzles
 me as I believe that the same physical values should have the same
 numerical values. Hence my wish to understand what you mean.
 Unfortunately you do not want to disclose it, you do not want to
 apply your theory to examples that I present.
 
 Evgenii

Given the above paragraph, I would say we're closer than you've
previously intimated.

Of course there is information in informatics, and there is
information in physics, just as there's information in biology and so
on. These are all the same concept (logarithm of a
probability). Numerically, they differ, because the context differs in
each situation.

Entropy is related in a very simple way to information. S=S_max -
I. So provided an S_max exists (which it will for any finite system), so
does entropy. In the example of a hard drive, the informatics S_max is
the capacity of the drive eg 100GB for a 100GB drive. If you store
10GB of data on it, the entropy of the drive is 90GB. That's it.
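
The bookkeeping here is one line (a sketch using the post's own numbers; "GB"
is treated loosely as an information unit, as in the example):

    # Russell's relation: entropy is capacity minus information, S = S_max - I.
    def entropy(s_max: float, information: float) -> float:
        return s_max - information

    print(entropy(100.0, 10.0))  # 90.0 -- a 100 GB drive holding 10 GB of data
    # Context dependence in action: narrow the system to a 50 GB partition
    # (a different S_max) and the entropy changes with it.
    print(entropy(50.0, 10.0))   # 40.0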

Just as information is context dependent, then so must entropy.

Thermodynamics is just one use (one context) of entropy and
information. Usually, the context is one of homogenous bulk
materials. If you decide to account for surface effects, you change
the context, and entropy should change accordingly.

PS

Your comment that Jaynes noted the similarity between Gibbs entropy
and Shannon entropy, which therefore motivated him to develop the
information theoretic foundation of statistical mechanics may well be
historically accurate. But this is not how the subject is presented in
a modern way, such as how Denbigh and Denbigh present it (their book
being fresh off the press the last time I really looked at this subject).

One could also note that historically, Shannon wrestled with calling
his information quantity entropy. At that time, it was pure
analogical thinking - the precise connection between his concept and
the thermodynamic one was not elucidated until at least two decades later.

-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au


-- 
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-08 Thread 1Z




On Feb 7, 7:04 pm, Evgenii Rudnyi use...@rudnyi.ru wrote:

 Let us take a closed vessel with oxygen and hydrogen at room
 temperature. Then we open a platinum catalyst in the vessel and the
 reaction starts. Will then the information in the vessel be conserved?

 Evgenii

What's the difference between in-principle and for-all-practical 
purposes?

-- 
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-08 Thread meekerdb

On 2/8/2012 1:44 PM, Russell Standish wrote:

On Wed, Feb 08, 2012 at 08:32:16PM +0100, Evgenii Rudnyi wrote:

...


It sounds to me like you are arguing for a shift back to how
thermodynamics was before Boltzmann's theoretical understanding.
A back-to-roots movement, as it were.

I would like rather to understand the meaning of your words.

By the way, in Boltzmann's time the concept of information was not there. So 
why before Boltzmann?


Yes, in Boltzmann's time, the concept of information was not
understood. But probability was (at least to some extent). Now, we
know that information is essentially the logarithm of a probability. I
don't know whether information or probability is logically prior - it's
probably a matter of taste.


What I observe personally is that there is information in
informatics and information in physics (if we say that the
thermodynamic entropy is the information). If you would agree, that
these two informations are different, it would be fine with me, I am
flexible with definitions.

Yet, if I understand you correctly you mean that the information in
informatics and the thermodynamic entropy are the same. This puzzles
me as I believe that the same physical values should have the same
numerical values. Hence my wish to understand what you mean.
Unfortunately you do not want to disclose it, you do not want to
apply your theory to examples that I present.

Evgenii

Given the above paragraph, I would say we're closer than you've
previously intimated.

Of course there is information in informatics, and there is
information in physics, just as there's information in biology and so
on. These are all the same concept (logarithm of a
probability). Numerically, they differ, because the context differs in
each situation.

Entropy is related in a very simple way to information. S=S_max -
I. So provided an S_max exists (which it will for any finite system), so
does entropy. In the example of a hard drive, the informatics S_max is
the capacity of the drive eg 100GB for a 100GB drive. If you store
10GB of data on it, the entropy of the drive is 90GB. That's it.

Just as information is context dependent, then so must entropy.

Thermodynamics is just one use (one context) of entropy and
information. Usually, the context is one of homogenous bulk
materials. If you decide to account for surface effects, you change
the context, and entropy should change accordingly.

PS

Your comment that Jaynes noted the similarity between Gibbs entropy
and Shannon entropy, which therefore motivated him to develop the
information theoretic foundation of statistical mechanics may well be
historically accurate. But this is not how the subject is presented in
a modern way, such as how Denbigh and Denbigh present it (their book
being fresh off the press the last time I really looked at this subject).

One could also note that historically, Shannon wrestled with calling
his information quantity entropy. At that time, it was pure
analogical thinking - the precise connection between his concept and
the thermodynamic one was not elucidated until at least two decades later.



There's an interesting paper by Bennett that I ran across, which discusses the relation of 
Shannon entropy, thermodynamic entropy, and algorithmic entropy in the context of DNA and 
RNA replication:


http://qi.ethz.ch/edu/qisemFS10/papers/81_Bennett_Thermodynamics_of_computation.pdf

Brent

--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-07 Thread Bruno Marchal


On 06 Feb 2012, at 20:42, meekerdb wrote:


On 2/6/2012 9:03 AM, 1Z wrote:



 There is also a conservation of information.  It is
 apparently indestructible.

Is there? If there is, it is a physical law, and AFAIK it is hotly
debated.



It's the same as the question of wave-function collapse.  QM without  
collapse is time-reversible and so conserves information.  With  
collapse it doesn't.  But even without collapse information may  
become unavailable to us due to statistical diffusion into the  
environment or crossing an event horizon.



That's why if QM (without collapse) is 100% correct, black holes must  
reversibly evaporate.
Amazingly the presence of (p - []  p) in the material hypostases  
could explain why the core of the apparently primitive physics has to  
be given by a group or a very symmetrical group-like object.
It might be related to the modular form in the general math of the  
diophantine equation (like in Fermat's last theorem).
In term of Smullyan singing birds (= combinators), there are no  
Kestrels (eliminators), nor Starlings (duplicators) in the core  
physical forest.


Kestrel = K. Their law is Kxy = x
Starling = S. Their law is Sxyz = xz(yz)

Then, if that is confirmed, we have the nice feature that the breaking  
of symmetries is only due to first person indeterminacy and the law  
of large numbers.
Note that such a core physics would not be Turing complete. A forest  
(system of combinators) without both K and S (or equivalent  
eliminators and duplicators) cannot be Turing universal, although K  
can be simulated in some local way.
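
The two bird laws translate directly into executable form (a minimal sketch in
Python lambdas; only the laws Kxy = x and Sxyz = xz(yz) come from the post, the
rest is standard combinatory logic). With both K and S available one recovers
Turing universality; for instance the identity combinator arises as I = SKK:

    # K eliminates: K x y = x (the second argument is discarded).
    K = lambda x: lambda y: x
    # S duplicates: S x y z = x z (y z) (the third argument is copied).
    S = lambda x: lambda y: lambda z: x(z)(y(z))

    I = S(K)(K)  # S K K z = K z (K z) = z, so I behaves as the identity
    assert I(42) == 42
    assert K("kept")("dropped") == "kept"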





Brent

--
You received this message because you are subscribed to the Google  
Groups Everything List group.

To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to everything-list+unsubscr...@googlegroups.com 
.
For more options, visit this group at http://groups.google.com/group/everything-list?hl=en 
.


http://iridia.ulb.ac.be/~marchal/



--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-07 Thread Evgenii Rudnyi

On 06.02.2012 20:42 meekerdb said the following:

On 2/6/2012 9:03 AM, 1Z wrote:

There is also a conservation of information. It is

apparently indestructible.

Is there? If there is, it is a physical law, and AFAIK it is
hotly debated.



It's the same as the question of wave-function collapse. QM without
collapse is time-reversible and so conserves information. With
collapse it doesn't. But even without collapse information may become
unavailable to us due to statistical diffusion into the environment
or crossing an event horizon.

Brent



Let us take a closed vessel with oxygen and hydrogen at room 
temperature. Then we open a platinum catalyst in the vessel and the 
reaction starts. Will then the information in the vessel be conserved?


Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-07 Thread Evgenii Rudnyi

On 06.02.2012 22:19 Russell Standish said the following:

On Mon, Feb 06, 2012 at 08:20:53PM +0100, Evgenii Rudnyi wrote:

On 05.02.2012 22:46 Russell Standish said the following:

On Fri, Feb 03, 2012 at 08:56:10PM +0100, Evgenii Rudnyi wrote:


In this respect your question is actually nice, as now, I
believe, we see that it is possible to have a case when the
information capacity will be more than the number of physical
states.

Evgenii


How so?



Take a coin and cool it to zero Kelvin. Here is my question
that you have not answered yet. Do you assume that the text on the
coin will be destroyed during cooling?



No. Previously, I mistakenly assumed that S=0 at T=0, which implies
the text being destroyed. But as I said - I withdraw that comment,
and any comment based on that mistaken assumption.




So, what happens to the entropy of the coin when the temperature goes 
to zero? Even though you have withdrawn your comment, the question remains.


Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-07 Thread Evgenii Rudnyi

Russell,

 This is circular - temperature is usually defined in terms of
 entropy:

 T^{-1} = dS/dE

This is wrong. The temperature is defined according to the Zeroth Law. 
The Second Law just allows us to define the absolute temperature, but 
the temperature as such is defined independently of the entropy.


 dependent. As far as I remember, you have used this term in
 respect to informational capacity of some modern information
 carrier and its number of physical states. I would suggest to stay
 with this example as the definition of context dependent.
 Otherwise, it does not make much sense.

 It makes just as much sense with Boltzmann-Gibbs entropy. Unless
 you're saying that it is not connected with thermodynamic entropy...

Unfortunately I do not get your point. In the example with the 
information carrier, we have different numerical values for the 
information capacity of the carrier according to the producer and the 
values derived from the thermodynamic entropy.


I still do not understand what surface effects on the carrier have to do 
with this difference. Do you mean that if you consider surface effects 
you can derive an exact equation that will connect the information capacity 
of the carrier with the thermodynamic entropy? If yes, could you please 
give such an equation?


Evgenii


On 06.02.2012 22:17 Russell Standish said the following:

On Mon, Feb 06, 2012 at 08:36:44PM +0100, Evgenii Rudnyi wrote:

On 05.02.2012 23:05 Russell Standish said the following:


The context is there - you will just have to look for it. I
rather suspect that use of these tables refers to homogenous bulk
samples of the material, in thermal equilibrium with a heat bath
at some given temperature.


I do not get your point. Do you mean that sometimes the surface
effects could be important? Every thermodynamicist knows this.
However I do not understand your problem. The thermodynamics of
surface phenomena is well established and to work with it you need
to extend the JANAF Tables with other tables. What is the problem?


The entropy will depend on what surface effect you consider
significant. Is it significant that the surface's boundary bumps and
dimples are so arranged as to spell out a message in English? What if
you happen to not speak English, but only Chinese? Or might they not
be significant at all? All of these are different contexts.

Ignoring surface effects altogether is a perfectly viable model of
the physical system. Whether this is useful or not is going to
depend, well, on the context.



It would be good if you defined better what you mean by context
dependent. As far as I remember, you have used this term in
respect of the informational capacity of some modern information
carrier and its number of physical states. I would suggest staying
with this example as the definition of context dependent.
Otherwise, it does not make much sense.


It makes just as much sense with Boltzmann-Gibbs entropy. Unless
you're saying that it is not connected with thermodynamic entropy...




If we were to take you at face value, we would have to conclude
that entropy is ill-defined in nonequilibrium systems.


The entropy is well-defined for a nonequilibrium system as soon as
one can use local temperature. There are some rare occasions where
local temperature is ambiguous, for example in plasma where one
defines different temperatures for electrons and molecules. Yet,
the two temperatures being defined, the entropy becomes again
well-defined.


This is circular - temperature is usually defined in terms of
entropy:

T^{-1} = dS/dE




More to the point - consider milling whatever material you have
chosen into small particles. Then consider what happens to a
container of the stuff in the Earth's gravity well, compared with
the microgravity situation on the ISS. In the former, the stuff
forms a pile on the bottom of the container - in the latter, the
stuff will be more or less uniformly distributed throughout the
container's volume. In the former case, shaking the container will
flatten the pile - but at all stages the material is in thermal
equilibrium.

In your thermodynamic context, the entropy is the same
throughout.


No, it is not. As I have mentioned, in this case one simply must
consider surface effects.


Hence the context.




It only depends on bulk material properties, and temperature.
But most physicists would say that the milled material is in a
higher entropy state in microgravity, and that shaking the pile
in Earth's gravity raises the entropy.



Furthermore, let's assume that the particles are milled in the
form of tiny Penrose replicators (named after Lionel Penrose,
Roger's dad). When shaken, these particles stick together,
forming quite specific structures that replicate, entraining all
the replicators in the material.
(http://docs.huihoo.com/reprap/Revolutionary.pdf).

Most physicists would say that shaking a container of Penrose
replicators actually reduces the system's entropy. Yet, the
thermodynamic entropy of the 

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-07 Thread meekerdb

On 2/7/2012 11:04 AM, Evgenii Rudnyi wrote:

On 06.02.2012 20:42 meekerdb said the following:

On 2/6/2012 9:03 AM, 1Z wrote:

There is also a conservation of information. It is
apparently indestructible.

Is there? If there is, it is a physical law, and AFAIK it is
hotly debated.



It's the same as the question of wave-function collapse. QM without
collapse is time-reversible and so conserves information. With
collapse it doesn't. But even without collapse information may become
unavailable to us due to statistical diffusion into the environment
or crossing an event horizon.

Brent



Let us take a closed vessel with oxygen and hydrogen at room temperature. Then we introduce a 
platinum catalyst into the vessel and the reaction starts. Will the information in 
the vessel then be conserved?


Evgenii



No, because the vessel can't be isolated at the microscopic level.

Brent




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-07 Thread Russell Standish
On Tue, Feb 07, 2012 at 08:15:10PM +0100, Evgenii Rudnyi wrote:
 Russell,
 
  This is circular - temperature is usually defined in terms of
  entropy:
 
  T^{-1} = dS/dE
 
 This is wrong. The temperature is defined according to the Zeroth
 Law. The Second Law just allows us to define the absolute
 temperature, but the temperature as such is defined independently
 of the entropy.
 

This is hardly a consensus view. See
http://en.wikipedia.org/wiki/Temperature for a discussion. I don't
personally have a stake in this, having left thermodynamics as a field
more than 20 years ago.

But I will point out that the zeroth law definition is limited to
equilibrium situations only, which is probably the main reason why
entropy is taken to be more fundamental in modern formulations of
statistical mechanics.

  dependent. As far as I remember, you have used this term in
  respect to informational capacity of some modern information
  carrier and its number of physical states. I would suggest to stay
  with this example as the definition of context dependent.
  Otherwise, it does not make much sense.
 
  It makes just as much sense with Boltzmann-Gibbs entropy. Unless
  you're saying that it is not connected with thermodynamic entropy...
 
 Unfortunately I do not get your point. In the example, with the
 information carrier we have different numerical values for the
 information capacity on the carrier according to the producer and
 the values derived from the thermodynamic entropy.
 

It sounds to me like you are arguing for a shift back to how
thermodynamics was before Boltzmann's theoretical understanding. A
back-to-roots movement, as it were.

 I still do not understand what surface effects on the carrier has to
 do with this difference. Do you mean that if you consider surface
 effects you derive an exact equation that will connect the
 information capacity of the carrier with the thermodynamic entropy?
 If yes, could you please give such an equation?
 
 Evgenii
 

Why do you ask for such an equation when a) the situation being
physically described has not been fully described, and b) it may well
be pragmatically impossible to write down, even though it may exist in
principle?

This seems like a cheap rhetorical trick.

-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-06 Thread Jason Resch
Informational laws and physical laws are, in my mind, closely  
related.  Laws related to information seem to supersede physical law.   
For example, the impossibility of encoding information in fewer  
symbols than its entropy requires, or of sending more over a channel  
in a given time period than its capacity allows.  There is also a  
conservation of information.  It is apparently indestructible.  There  
is a minimum physical energy expenditure associated with irreversible  
computation, e.g. setting a memory register from 1 to 0.  Other  
informational laws prevent any compression algorithm from having any  
net decrease in size when considered over the set of all possible  
inputs.  You can also do really cool things with information, such as  
forward error correction: a file of size 1 MB can be encoded to 1.5 MB.  
Then this encoded file can be split into 15 equally sized pieces.  The  
cool part is that any 10 of these pieces (corresponding to 1 MB of  
information) may be used to recover the entire original file.  Anything  
less than 1 MB worth of pieces is insufficient.
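
A minimal sketch of this any-10-of-15 property (my own construction, Reed-Solomon-style polynomial evaluation over the prime field GF(257); all names here are illustrative, and real systems typically work over GF(256)):

P = 257  # prime modulus; every byte value 0..255 fits below it

def lagrange_eval(points, x):
    """Evaluate the unique polynomial through `points` at `x`, mod P."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

def encode(data, n=15):
    """data: k byte values; returns n shares (x, y); shares 1..k equal the data."""
    pts = list(enumerate(data, 1))           # data = evaluations at x = 1..k
    return [(x, lagrange_eval(pts, x)) for x in range(1, n + 1)]

def decode(shares, k=10):
    """Recover the k data symbols from any k of the shares."""
    return [lagrange_eval(shares[:k], x) for x in range(1, k + 1)]

data = [104, 101, 108, 108, 111, 32, 119, 111, 114, 108]   # ten bytes
shares = encode(data)
assert decode(shares[5:]) == data   # any 10 of the 15 shares suffice

Any 10 points determine the degree-9 polynomial uniquely, which is exactly why any 10 pieces recover the whole file while 9 pieces do not.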


Jason

On Feb 5, 2012, at 3:46 PM, Russell Standish li...@hpcoders.com.au  
wrote:



On Fri, Feb 03, 2012 at 08:56:10PM +0100, Evgenii Rudnyi wrote:


First, we must not forget the Third Law, which states that the
change in entropy in any reaction, as well as its derivatives, goes to
zero as the temperature goes to zero Kelvin.

In this respect your question is actually nice, as now, I believe,
we see that it is possible to have a case when the information
capacity will be more than the number of physical states.

Evgenii


How so?

--

----------------------------------------------------------------------------
Prof Russell Standish                  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics      hpco...@hpcoders.com.au
University of New South Wales          http://www.hpcoders.com.au
----------------------------------------------------------------------------









Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-06 Thread 1Z


On Feb 6, 4:55 pm, Jason Resch jasonre...@gmail.com wrote:
 Informational laws and physical laws are, in my mind, closely
 related.  Laws related to information seem to supersede physical law.
 For example, the impossibility of encoding information in fewer
 symbols than its entropy requires, or of sending more over a channel
 in a given time period than its capacity allows.

Those transcend physics inasmuch as they are mathematical.

 There is also a conservation of information.  It is
 apparently indestructible.

Is there? If there is, it is a physical law, and AFAIK it is hotly
debated.

 There is a minimum physical energy
 expenditure associated with irreversible computation, e.g. setting a
 memory register from 1 to 0.  Other informational laws prevent any
 compression algorithm from having any net decrease in size when
 considered over the set of all possible inputs.  You can also do
 really cool things with information, such as forward error correction:
 a file of size 1 MB can be encoded to 1.5 MB.  Then this encoded file
 can be split into 15 equally sized pieces.  The cool part is that any
 10 of these pieces (corresponding to 1 MB of information) may be used
 to recover the entire original file.  Anything less than 1 MB worth of
 pieces is insufficient.

 Jason

Your information laws seem to have mixed origins.

 On Feb 5, 2012, at 3:46 PM, Russell Standish li...@hpcoders.com.au
 wrote:







  On Fri, Feb 03, 2012 at 08:56:10PM +0100, Evgenii Rudnyi wrote:

  First, we must not forget the Third Law, which states that the
  change in entropy in any reaction, as well as its derivatives, goes to
  zero as the temperature goes to zero Kelvin.

  In this respect your question is actually nice, as now, I believe,
  we see that it is possible to have a case when the information
  capacity will be more than the number of physical states.

  Evgenii

  How so?

  --

  ----------------------------------------------------------------------------
  Prof Russell Standish                  Phone 0425 253119 (mobile)
  Principal, High Performance Coders
  Visiting Professor of Mathematics      hpco...@hpcoders.com.au
  University of New South Wales          http://www.hpcoders.com.au
  ----------------------------------------------------------------------------





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-06 Thread Evgenii Rudnyi

On 05.02.2012 23:05 Russell Standish said the following:

On Fri, Feb 03, 2012 at 08:50:40PM +0100, Evgenii Rudnyi wrote:


I guess that you have never done a lab in experimental
thermodynamics. There are classical experiments where people
measure heat of combustion, heat capacity, equilibrium pressure,
equilibrium constants and then determine the entropy. If you do it,
you see that you can measure the entropy the same way as other
properties, there is no difference. A good example to this end is
the JANAF Thermochemical Tables (Joint Army-Navy-Air Force
Thermochemical Tables). You will find a pdf here

http://www.nist.gov/data/PDFfiles/jpcrdM9.pdf

It is about 230 Mb but I guess it is doable to download it. Please
open it and explain what the difference is between the tabulated
entropy and other properties there. How will your personal viewpoint on
a thermodynamic system influence the numerical values of the
entropy tabulated in JANAF? What is the difference with mass or
length? I do not see it.

You see, the JANAF Tables were started by the military. They needed
them to compute, for example, the combustion process in rockets, and
they have been successful. What part of a rocket is then context
dependent?

This is the main problem with the books on entropy and
information. They do not consider thermodynamic tables, they do not
work out simple thermodynamic examples. For example, let us consider
the following problem:

--- Problem. Given
temperature, pressure, and initial number of moles of NH3, N2 and
H2, compute the equilibrium composition.

To solve the problem one should find thermodynamic properties of
NH3, N2 and H2 for example in the JANAF Tables and then compute
the equilibrium constant.

From thermodynamics tables (all values are molar values for the
standard pressure 1 bar, I have omitted the symbol o for simplicity
but it is very important not to forget it):

Del_f_H_298(NH3), S_298(NH3), Cp(NH3), Del_f_H_298(N2), S_298(N2),
Cp(N2), Del_f_H_298(H2), S_298(H2), Cp(H2)

2NH3 = N2 + 3H2

Del_H_r_298 = Del_f_H_298(N2) + 3 Del_f_H_298(H2) - 2
Del_f_H_298(NH3)

Del_S_r_298 = S_298(N2) + 3 S_298(H2) - 2 S_298(NH3)

Del_Cp_r = Cp(N2) + 3 Cp(H2) - 2 Cp(NH3)

To make life simple, I will assume below that Del_Cp_r = 0, but it
is not a big deal to extend the equations to include heat
capacities as well.

Del_G_r_T = Del_H_r_298 - T Del_S_r_298

Del_G_r_T = - R T ln Kp

When Kp, total pressure and the initial number of moles are given,
it is rather straightforward to compute equilibrium composition.
If you need help, please just let me know.
---
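
A small numeric sketch of this recipe (added here for illustration; the standard-state values below are approximate JANAF-style numbers, not taken from the post, with Del_Cp_r = 0 as assumed above):

from math import exp

R = 8.314              # J/(mol K)
T = 298.15             # K
Ptot, P0 = 1.0, 1.0    # total and standard pressure, bar

# Approximate molar standard values at 298.15 K (illustrative):
Del_f_H = {"NH3": -45.90e3, "N2": 0.0, "H2": 0.0}        # J/mol
S_298   = {"NH3": 192.77,   "N2": 191.61, "H2": 130.68}  # J/(mol K)

Del_H_r = Del_f_H["N2"] + 3*Del_f_H["H2"] - 2*Del_f_H["NH3"]
Del_S_r = S_298["N2"]   + 3*S_298["H2"]   - 2*S_298["NH3"]
Del_G_r = Del_H_r - T*Del_S_r
Kp = exp(-Del_G_r / (R*T))

# Equilibrium composition for 2 mol NH3 initially, via the extent xi:
# n_NH3 = 2-2xi, n_N2 = xi, n_H2 = 3xi, n_tot = 2+2xi.
def residual(xi):
    n_tot = 2 + 2*xi
    x_NH3, x_N2, x_H2 = (2 - 2*xi)/n_tot, xi/n_tot, 3*xi/n_tot
    return x_N2 * x_H2**3 / x_NH3**2 * (Ptot/P0)**2 - Kp

lo, hi = 1e-12, 1 - 1e-12        # bisection on xi in (0, 1)
for _ in range(100):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if residual(mid) < 0 else (lo, mid)

print("Kp =", Kp, " equilibrium extent xi =", lo)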

So, the entropy is there. What is context dependent here? Where is
the difference with mass and length?

Evgenii



The context is there - you will just have to look for it. I rather
suspect that use of these tables refers to homogenous bulk samples
of the material, in thermal equilibrium with a heat bath at some
given temperature.


I do not get your point. Do you mean that sometimes the surface effects 
could be important? Every thermodynamicist knows this. However I do not 
understand your problem. The thermodynamics of surface phenomena is well 
established and to work with it you need to extend the JANAF Tables with 
other tables. What is the problem?


It would be good if you defined better what you mean by context 
dependent. As far as I remember, you have used this term in respect of 
the informational capacity of some modern information carrier and its number 
of physical states. I would suggest staying with this example as the 
definition of context dependent. Otherwise, it does not make much sense.



If we were to take you at face value, we would have to conclude that
entropy is ill-defined in nonequilibrium systems.


The entropy is well-defined for a nonequilibrium system as soon as one 
can use local temperature. There are some rare occasions where local 
temperature is ambiguous, for example in plasma where one defines 
different temperatures for electrons and molecules. Yet, the two 
temperatures being defined, the entropy becomes again well-defined.



More to the point - consider milling whatever material you have
chosen into small particles. Then consider what happens to a
container of the stuff in the Earth's gravity well, compared with the
microgravity situation on the ISS. In the former, the stuff forms a
pile on the bottom of the container - in the latter, the stuff will
be more or less uniformly distributed throughout the container's
volume. In the former case, shaking the container will flatten the
pile - but at all stages the material is in thermal equilibrium.

In your thermodynamic context, the entropy is the same throughout.


No, it is not. As I have mentioned, in this case one simply must consider 
surface effects.



It only depends on bulk material properties, and temperature. But
most physicists would say that the milled material is in a higher
entropy state in 

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-06 Thread Russell Standish
On Mon, Feb 06, 2012 at 08:36:44PM +0100, Evgenii Rudnyi wrote:
 On 05.02.2012 23:05 Russell Standish said the following:
 
 The context is there - you will just have to look for it. I rather
 suspect that use of these tables refers to homogenous bulk samples
 of the material, in thermal equilibrium with a heat bath at some
 given temperature.
 
 I do not get your point. Do you mean that sometimes the surface
 effects could be important? Every thermodynamicist knows this.
 However I do not understand your problem. The thermodynamics of
 surface phenomena is well established and to work with it you need
 to extend the JANAF Tables with other tables. What is the problem?

The entropy will depend on what surface effect you consider
significant. Is it significant that the surface's boundary bumps and
dimples are so arranged as to spell out a message in English? What if you
happen to not speak English, but only Chinese? Or might they not be
significant at all? All of these are different contexts.

Ignoring surface effects altogether is a perfectly viable model of the
physical system. Whether this is useful or not is going to depend,
well, on the context.

 
 It would be good if you defined better what you mean by context
 dependent. As far as I remember, you have used this term in respect
 of the informational capacity of some modern information carrier and its
 number of physical states. I would suggest staying with this example
 as the definition of context dependent. Otherwise, it does not make
 much sense.

It makes just as much sense with Boltzmann-Gibbs entropy. Unless
you're saying that it is not connected with thermodynamic entropy...

 
 If we were to take you at face value, we would have to conclude that
 entropy is ill-defined in nonequilibrium systems.
 
 The entropy is well-defined for a nonequilibrium system as soon as
 one can use local temperature. There are some rare occasions where
 local temperature is ambiguous, for example in plasma where one
 defines different temperatures for electrons and molecules. Yet, the
 two temperatures being defined, the entropy becomes again
 well-defined.

This is circular - temperature is usually defined in terms of entropy:

T^{-1} = dS/dE
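
For what it is worth, a toy numeric sketch of this statistical definition (an illustration with hypothetical parameters, not anyone's measurement) for a system of N two-level units:

from math import comb, log

k_B = 1.380649e-23    # J/K
eps = 1e-21           # J per excitation (hypothetical)
N = 10_000

def S(n):
    """Boltzmann entropy S = k_B ln W for n excited units out of N."""
    return k_B * log(comb(N, n))

n = 2_000
inv_T = (S(n + 1) - S(n)) / eps    # finite-difference 1/T = dS/dE with dE = eps
print("T =", 1/inv_T, "K")         # positive below half-filling (n < N/2)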

 
 More to the point - consider milling whatever material you have
 chosen into small particles. Then consider what happens to a
 container of the stuff in the Earth's gravity well, compared with the
 microgravity situation on the ISS. In the former, the stuff forms a
 pile on the bottom of the container - in the latter, the stuff will
 be more or less uniformly distributed throughout the container's
 volume. In the former case, shaking the container will flatten the
 pile - but at all stages the material is in thermal equilibrium.
 
 In your thermodynamic context, the entropy is the same throughout.
 
 No, it is not. As I have mentioned, in this case one simply must
 consider surface effects.

Hence the context.

 
 It only depends on bulk material properties, and temperature. But
 most physicists would say that the milled material is in a higher
 entropy state in microgravity, and that shaking the pile in Earth's
 gravity raises the entropy.
 
 Furthermore, let's assume that the particles are milled in the form
 of tiny Penrose replicators (named after Lionel Penrose, Roger's
 dad). When shaken, these particles stick together, forming quite
 specific structures that replicate, entraining all the replicators
 in the material. (http://docs.huihoo.com/reprap/Revolutionary.pdf).
 
 Most physicists would say that shaking a container of Penrose
 replicators actually reduces the system's entropy. Yet, the
 thermodynamic entropy of the JANAF context does not change, as that
 only depends on bulk material properties.
 
 We are again at the definition of context dependent. What are saying
 now is that when you have new physical effects, it is necessary to
 take them into account. What it has to do with your example when
 information on an information carrier was context dependent?
 

Who decides what physical effects to take into account? This is not a
question of pure relativism - I'm well aware that some models are much
better than others at describing the situation, but even in the case
of Penrose replicators described above, their ability to adhere and
fragment may or may not be relevant to the situation you are trying to
model.


-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au


Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-06 Thread Russell Standish
On Mon, Feb 06, 2012 at 08:20:53PM +0100, Evgenii Rudnyi wrote:
 On 05.02.2012 22:46 Russell Standish said the following:
 On Fri, Feb 03, 2012 at 08:56:10PM +0100, Evgenii Rudnyi wrote:
 
 In this respect your question is actually nice, as now, I believe,
 we see that it is possible to have a case when the information
 capacity will be more than the number of physical states.
 
 Evgenii
 
 How so?
 
 
 Take a coin and cool it to zero Kelvin. Here is my question that
 you have not answered yet. Do you assume that the text on the coin
 will be destroyed during cooling?
 

No. Previously, I mistakenly assumed that S=0 at T=0, which implies the
text being destroyed. But as I said - I withdraw that comment, and any
comment based on that mistaken assumption.


-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-05 Thread Russell Standish
On Fri, Feb 03, 2012 at 08:56:10PM +0100, Evgenii Rudnyi wrote:
 
 First, we must not forget the Third Law, which states that the
 change in entropy in any reaction, as well as its derivatives, goes to
 zero as the temperature goes to zero Kelvin.
 
 In this respect your question is actually nice, as now, I believe,
 we see that it is possible to have a case when the information
 capacity will be more than the number of physical states.
 
 Evgenii

How so?

-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-05 Thread Russell Standish
On Fri, Feb 03, 2012 at 08:50:40PM +0100, Evgenii Rudnyi wrote:
 
 I guess that you have never done a lab in experimental
 thermodynamics. There are classical experiments where people measure
 heat of combustion, heat capacity, equilibrium pressure, equilibrium
 constants and then determine the entropy. If you do it, you see that
 you can measure the entropy the same way as other properties, there
 is no difference. A good example to this end is JANAF Thermochemical
 Tables (Joint Army-Navy-Air Force Thermochemical Tables). You will
 find a pdf here
 
 http://www.nist.gov/data/PDFfiles/jpcrdM9.pdf
 
 It is about 230 Mb but I guess it is doable to download it. Please
 open it and explain what is the difference between the tabulated
 entropy and other properties there. How your personal viewpoint on a
 thermodynamic system will influence numerical values of the entropy
 tabulated in JANAF? What is the difference with the mass or length?
 I do not see it.
 
 You see, the JANAF Tables were started by the military. They needed them to
 compute, for example, the combustion process in rockets, and they have
 been successful. What part of a rocket is then context dependent?
 
 This is the main problem with the books on entropy and information.
 They do not consider thermodynamic tables, they do not work out
 simple thermodynamic examples. For example, let us consider the
 following problem:
 
 ---
 Problem. Given temperature, pressure, and initial number of moles of
 NH3, N2 and H2, compute the equilibrium composition.
 
 To solve the problem one should find thermodynamic properties of
 NH3, N2 and H2 for example in the JANAF Tables and then compute the
 equilibrium constant.
 
 From thermodynamics tables (all values are molar values for the
 standard pressure 1 bar, I have omitted the symbol o for simplicity but
 it is very important not to forget it):
 
 Del_f_H_298(NH3), S_298(NH3), Cp(NH3), Del_f_H_298(N2), S_298(N2),
 Cp(N2), Del_f_H_298(H2), S_298(H2), Cp(H2)
 
 2NH3 = N2 + 3H2
 
 Del_H_r_298 = Del_f_H_298(N2) + 3 Del_f_H_298(H2) - 2 Del_f_H_298(NH3)
 
 Del_S_r_298 = S_298(N2) + 3 S_298(H2) - 2 S_298(NH3)
 
 Del_Cp_r = Cp(N2) + 3 Cp(H2) - 2 Cp(NH3)
 
 To make life simple, I will assume below that Del_Cp_r = 0, but it is
 not a big deal to extend the equations to include heat capacities as well.
 
 Del_G_r_T = Del_H_r_298 - T Del_S_r_298
 
 Del_G_r_T = - R T ln Kp
 
 When Kp, total pressure and the initial number of moles are given,
 it is rather straightforward to compute equilibrium composition. If
 you need help, please just let me know.
 ---
 
 So, the entropy is there. What is context dependent here? Where is
 the difference with mass and length?
 
 Evgenii
 

The context is there - you will just have to look for it. I rather
suspect that use of these tables refers to homogenous bulk samples of
the material, in thermal equilibrium with a heat bath at some given
temperature.

If we were to take you at face value, we would have to conclude that
entropy is ill-defined in nonequilibrium systems.

More to the point - consider milling whatever material you have chosen
into small particles. Then consider what happens to a container of the
stuff in the Earth's gravity well, compared with the microgravity
situation on the ISS. In the former, the stuff forms a pile on the
bottom of the container - in the latter, the stuff will be more or
less uniformly distributed throughout the container's volume. In the
former case, shaking the container will flatten the pile - but at all
stages the material is in thermal equilibrium.

In your thermodynamic context, the entropy is the same
throughout. It only depends on bulk material properties, and
temperature. But most physicists would say that the milled material is
in a higher entropy state in microgravity, and that shaking the pile
in Earth's gravity raises the entropy.

Furthermore, let's assume that the particles are milled in the form of
tiny Penrose replicators (named after Lionel Penrose, Roger's
dad). When shaken, these particles stick together, forming quite
specific structures that replicate, entraining all the replicators in
the material. (http://docs.huihoo.com/reprap/Revolutionary.pdf). 

Most physicists would say that shaking a container of Penrose
replicators actually reduces the system's entropy. Yet, the
thermodynamic entropy of the JANAF context does not change, as that
only depends on bulk material properties.

We can follow your line of thinking, and have a word entropy that is
only useful in certain contexts, but then we'll need to make up a
different word for other contexts.  Alternatively, we can have a word
that applies over all macroscopic contexts, and explicitly qualify
what that context is. The underlying concept is the same in all cases
though. It appears to me, that standard scientific usage has become to
use the same word for that concept, rather than coin different words
to 

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-03 Thread Evgenii Rudnyi

On 03.02.2012 00:14 Jason Resch said the following:

On Sun, Jan 22, 2012 at 3:04 AM, Evgenii Rudnyi use...@rudnyi.ru
wrote:


On 21.01.2012 22:03 Evgenii Rudnyi said the following:

On 21.01.2012 21:01 meekerdb said the following:



On 1/21/2012 11:23 AM, Evgenii Rudnyi wrote:


On 21.01.2012 20:00 meekerdb said the following:


On 1/21/2012 4:25 AM, Evgenii Rudnyi wrote:





...

2) If physicists say that information is the entropy, they

must take it literally and then apply experimental
thermodynamics to measure information. This however
seems not to happen.



It does happen. The number of states, i.e. the
information, available from a black hole is calculated from
its thermodynamic properties as calculated by Hawking. At
a more conventional level, counting the states available to
molecules in a gas can be used to determine the specific
heat of the gas and vice versa. The reason the
thermodynamic measures and the information measures are
treated separately in engineering problems is that the
information that is important to engineering is
infinitesimal compared to the information stored in the
microscopic states. So the latter is considered only in
terms of a few macroscopic averages, like temperature and
pressure.

Brent



Doesn't this mean that by information engineers means
something different as physicists?



I don't think so. A lot of the work on information theory was
done by communication engineers who were concerned with the
effect of thermal noise on bandwidth. Of course engineers
specialize more narrowly than physicists, so within different
fields of engineering there are different terminologies and
different measurement methods for things that are unified in
basic physics, e.g. there are engineers who specialize in
magnetism and who seldom need to reflect that it is part of EM,
there are others who specialize in RF and don't worry about
static fields.



Do you mean that engineers use experimental thermodynamics to
determine information?


Evgenii


To be concrete, this is for example a paper from control engineering:

J.C. Willems and H.L. Trentelman, "H_inf control in a behavioral
context: The full information case", IEEE Transactions on Automatic
Control, Volume 44, pages 521-536, 1999.
http://homes.esat.kuleuven.be/~jwillems/Articles/JournalArticles/1999.4.pdf




The term information is there but the entropy is not. Could you please

explain why? Or alternatively, could you please point me to papers
where engineers use the concept of the equivalence between the
entropy and information?




Evgenii,

Sure, I could give a few examples as this somewhat intersects with my
line of work.

The NIST 800-90 recommendation (
http://csrc.nist.gov/publications/nistpubs/800-90A/SP800-90A.pdf )
for random number generators is a document for engineers implementing
secure pseudo-random number generators.  An example of where it is
important is when considering entropy sources for seeding a random
number generator.  If you use something completely random, like a
fair coin toss, each toss provides 1 bit of entropy.  The formula is
-log2(predictability).  With a coin flip, you have at best a .5
chance of correctly guessing it, and -log2(.5) = 1.  If you used a
die roll, then each die roll would provide -log2(1/6) = 2.58 bits of
entropy.  The ability to measure unpredictability is necessary to
ensure, for example, that predicting the random inputs that went into
generating a cryptographic key is at least as difficult as
brute-forcing the key itself.

In addition to security, entropy is also an important concept in the
field of data compression.  The amount of entropy in a given bit
string represents the theoretical minimum number of bits it takes to
represent the information.  If 100 bits contain 100 bits of entropy,
then there is no compression algorithm that can represent those 100
bits with fewer than 100 bits.  However, if a 100 bit string contains
only 50 bits of entropy, you could compress it to 50 bits.  For
example, let's say you had 100 coin flips from an unfair coin.  This
unfair coin comes up heads 90% of the time.  Each flip carries
-(0.9 log2(0.9) + 0.1 log2(0.1)) = 0.469 bits of entropy.  Thus, a
sequence of 100 flips of this biased coin could on average be
represented with about 47 bits.  There are only about 47 bits of
information / entropy contained in that 100 bit long sequence.
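
A quick numeric check of the two measures in play here (illustrative code of mine, not from the NIST document): the per-sample min-entropy -log2(max p) used for rating entropy sources, and the Shannon entropy that sets the compression limit:

from math import log2

def min_entropy(probs):
    # worst-case unpredictability: -log2 of the most likely outcome
    return -log2(max(probs))

def shannon_entropy(probs):
    # average information per sample; the compression limit
    return -sum(p * log2(p) for p in probs if p > 0)

print(min_entropy([0.5, 0.5]))        # fair coin: 1.0 bit per toss
print(shannon_entropy([1/6] * 6))     # die: ~2.585 bits per roll
print(shannon_entropy([0.9, 0.1]))    # biased coin: ~0.469 bits per flip,
                                      # so ~47 bits for 100 flips on average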

Jason



Jason,

Sorry for being unclear. In my statement I meant the thermodynamic 
entropy. No doubt, in information theory engineers, starting from 
Shannon, use the information entropy. Yet, I wanted to point out that I 
have not seen engineering works where engineers employ the equivalence 
between the thermodynamic entropy and the informational entropy.


Evgenii


Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-03 Thread Evgenii Rudnyi

On 02.02.2012 22:18 Russell Standish said the following:

On Thu, Feb 02, 2012 at 07:45:53PM +0100, Evgenii Rudnyi wrote:

On 01.02.2012 21:51 Stephen P. King said the following:

On 2/1/2012 3:10 PM, Evgenii Rudnyi wrote:

First, the thermodynamic entropy is not context dependent. This
must mean that if it is the same as information, then the
latter must not be context dependent as well. Could you please
give me an example of a physical property that is context
dependent?



Temperature is context dependent. If we consider physics at the
level of atoms there is no such quantity as temperature.
Additionally, thermodynamic entropy does require Boltzmann's
constant to be defined, which is a form of context dependency, as it
specifies the level at which we are to take micro-states as
macroscopically indistinguishable.


Boltzmann's constant, as far as I understand, is defined
uniquely. If you talk about some other universe (or Platonia)
where one could imagine something else, then it could be. Yet, in
the world that we know according to empirical scientific studies,
the Boltzmann constant is a fundamental constant. Hence I do not
understand you in this respect.


Boltzmann's constant is a unit conversion constant like c and Planck's
constant, nothing more. It has no fundamental significance.



Indeed, temperature is not available directly at the level of
particles obeying classical or quantum laws. However, it could be,
for example, not a problem with the temperature but rather with the
description at the particle level.

Anyway, I would suggest sticking to the empirical scientific knowledge
that we have. Then I do not understand what you mean by
temperature being context dependent either.



Temperature is an averaged quantity, so whilst technically an
example of emergence, it is the weakest form of emergence.

Evgenii is stating an oft-repeated meme that entropy is not
context-dependent.

It is context dependent because it (possibly implicitly) depends on
what we mean by a thermodynamic state. In thermodynamics, we usually
mean a state defined by temperature, pressure, volume, number of
particles, and so on. The 'and so on' is the context dependent part.
There are actually an enormous number of possible independent
thermodynamic variables that may be relevant in different situations.
In an electrical device, the arrangement of charges might be another
such thermodynamic variable.

Also, even in classic schoolbook thermodynamics, not all of
temperature, pressure, volume and particle number are relevant.
Dropping various of these terms leads to different ensembles
(microcanonical, canonical and grand canonical).

Of course, context dependence does not mean subjective. If two
observers agree on the context, the entropy is quite objective. But
it is a little more complex than something like mass or length.

This is explained very well in Denbigh & Denbigh.




I guess that you have never done a lab in experimental thermodynamics. 
There are classical experiments where people measure heat of combustion, 
heat capacity, equilibrium pressure, equilibrium constants and then 
determine the entropy. If you do it, you see that you can measure the 
entropy the same way as other properties, there is no difference. A good 
example to this end is the JANAF Thermochemical Tables (Joint Army-Navy-Air 
Force Thermochemical Tables). You will find a pdf here


http://www.nist.gov/data/PDFfiles/jpcrdM9.pdf

It is about 230 Mb but I guess it is doable to download it. Please open 
it and explain what the difference is between the tabulated entropy and 
other properties there. How will your personal viewpoint on a thermodynamic 
system influence the numerical values of the entropy tabulated in 
JANAF? What is the difference with mass or length? I do not see it.


You see, the JANAF Tables were started by the military. They needed them to 
compute, for example, the combustion process in rockets, and they have been 
successful. What part of a rocket is then context dependent?


This is the main problem with the books on entropy and information. They 
do not consider thermodynamic tables, they do not work out simple 
thermodynamic examples. For example, let us consider the following problem:


---
Problem. Given temperature, pressure, and initial number of moles of 
NH3, N2 and H2, compute the equilibrium composition.


To solve the problem one should find thermodynamic properties of NH3, N2 
and H2 for example in the JANAF Tables and then compute the equilibrium 
constant.


From thermodynamics tables (all values are molar values for the
standard pressure 1 bar, I have omitted the symbol o for simplicity but
it is very important not to forget it):

Del_f_H_298(NH3), S_298(NH3), Cp(NH3), Del_f_H_298(N2), S_298(N2),
Cp(N2), Del_f_H_298(H2), S_298(H2), Cp(H2)

2NH3 = N2 + 3H2

Del_H_r_298 = Del_f_H_298(N2) + 3 Del_f_H_298(H2) - 2 Del_f_H_298(NH3)

Del_S_r_298 = S_298(N2) + 3 S_298(H2) - 2 S_298(NH3)

Del_Cp_r = Cp(N2

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-03 Thread Evgenii Rudnyi

On 02.02.2012 22:35 Russell Standish said the following:

On Wed, Feb 01, 2012 at 09:17:41PM +0100, Evgenii Rudnyi wrote:

On 29.01.2012 23:06 Russell Standish said the following:


Absolutely! But at zero kelvin, the information storage capacity
of the device is precisely zero, so cooling only works to a
certain point.



I believe that you have mentioned once that information is
negentropy. If yes, could you please comment on that? What
negentropy would mean?


Schrödinger first pointed out that living systems must export
entropy, and coined the term negative entropy to refer to this.
Brillouin shortened this to negentropy.

The basic formula is S_max = S + I.

S_max is the maximum possible value for entropy to take - the value
of entropy at thermodynamic equilibrium for a microcanonical
ensemble. S is the usual entropy, which for non-equilibrium systems
will be typically lower than S_max, and even for equilibrium systems
can be held lower by physical constraints. I is the difference, and
this is what Brillouin called negentropy. It is an information - the
information encoded in that state.

Try looking up http://en.wikipedia.org/wiki/Negentropy
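
A toy illustration of the formula (added here, not Russell's own example) for a discrete system with W microstates, where S_max = log2(W) is the equilibrium (uniform) entropy and I = S_max - S is the negentropy stored in any constraint on the state:

from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

W = 8
S_max = log2(W)                      # 3 bits at equilibrium
uniform     = [1/W] * W              # S = 3 bits -> I = 0
constrained = [0.5, 0.5] + [0] * 6   # confined to 2 of the 8 microstates

for dist in (uniform, constrained):
    S = entropy(dist)
    print("S =", S, "bits;  I = S_max - S =", S_max - S, "bits")
# the constrained state has S = 1 bit and I = 2 bits of negentropy/information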


Could you please explain how the negentropy is related to experimental 
thermodynamics? You will find in the previous message the link to the 
JANAF tables and a basic thermodynamic problem. Could you please 
demonstrate how the negentropy will help there?






In general, I do not understand what it means that information
at zero Kelvin is zero. Let us take a coin and cool it down. Do
you mean that the text on the coin will disappear? Or do you mean that
no device can read this text at zero Kelvin?



I vaguely remembered that S_max=0 at absolute zero. If it were, then
both S and I must be zero, because these are all nonnegative
quantities. But http://en.wikipedia.org/wiki/Absolute_zero states
only that entropy is at a minimum, not strictly zero. In which case,
I withdraw that comment.

Cheers


First, we must not forget the Third Law, which states that the change 
in entropy in any reaction, as well as its derivatives, goes to zero as the 
temperature goes to zero Kelvin.


In this respect your question is actually nice, as now, I believe, we 
see that it is possible to have a case when the information capacity 
will be more than the number of physical states.


Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-02 Thread John Clark
On Wed, Feb 1, 2012 at 3:10 PM, Evgenii Rudnyi use...@rudnyi.ru wrote:

 Could you please give me an example of a physical property that is
 context dependent?


Off the top of my head, mass, velocity, duration and length.

 John K Clark




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-02 Thread Evgenii Rudnyi

On 01.02.2012 22:51 John Mikes said the following:

Evgenii, I am not sure if it is your text, or Russell's:

***In general, I do not understand what it means that
information at zero Kelvin is zero. Let us take a coin and cool it
down. Do you mean that the text on the coin will disappear? Or do you
mean that no device can read this text at zero Kelvin?*


This was my question to Russell.


 ** I
doubt that the text embossed on a coin is its *information*. It
is part of the physical structure as e.g. the roundness, size, or
material(?) characteristics - all, what nobody can imagine how to
change for  the condition of 0-Kelvin. The abs. zero temp. conditions


Yes, but when we speak about an information carrier (a book, a hard drive, 
DVD, flash memory) it is exactly the same. And it has nothing to do with 
the total number of physical states in the device, as this example with 
zero temperature nicely shows.


Evgenii


are extrapolated the best way we could muster. A matter of (sci.)
faith. Maybe the so called 'interstitial' spaces also collapse? I am
not for a 'physicalistic' worldview - rather an agnostic about
'explanations' of diverse epochs based on then recent 'findings'
(mostly mathematically justified??? - realizing that we may be up to
lots of novelties we have no idea about today, not even of the
directions they may shove our views into. I say that in comparison to
our 'conventional scientific' - even everyday's - views of the world
in the past, before and after fundamental knowledge-domains were
added to our inventory. I do not condone evidences that must be,
because THERE IS NO OTHER WAY - in our existing ignorance of course.
Atoms? well, if there *is* 'matter'? (MASS??) even my
(macro)molecules I invented are suspect. So 'entropy' is a nice term
in (classical?) thermodynamics what I coined in 1942 as *the science
that tells us how things would proceed wouldn't they proceed as they
do indeed* thinking of Carnot and the isotherm/reversible
equilibria, etc. - way before the irreversible kind was taught in
college courses. Information is another rather difficult term, I like
to use 'relation' and leave it open what so far unknown relations may
affect our processes we assign to 'causes' known within the model of
the world we think we are in. The rest (including our misunderstood
model - domain) is what I may call an 'infinite complexity' of which
we are part - mostly ignorant about the 'beyond model' everything.

We 'fabricate' our context, try to explain by the portion we know of
- as if it was the totality - and live in our happy conventional
scientific terms. Human ingenuity constructed a miraculous science
and technology that is ALMOST good (some mistakes notwithstanding
occurring), then comes M. Curie, Watson-Crick, Fleming, Copernicus,
Volta, etc. and we re-write the schoolbooks.

John M


On Wed, Feb 1, 2012 at 3:10 PM, Evgenii Rudnyi use...@rudnyi.ru
wrote:


On 29.01.2012 22:49 Russell Standish said the following:


On Sun, Jan 29, 2012 at 04:23:12PM +0100, Evgenii Rudnyi wrote:


On 28.01.2012 23:26 meekerdb said the following:


On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:



A good suggestion. It well might be that I express my thoughts
unclear, sorry for that. Yet, I think that my examples show
that

1) There is information


and entropy

that engineers employ.





Some engineers employ information, some the thermodynamic entropy.
I have not seen though an engineering paper where both information
and the thermodynamic entropy have been used as synonyms.

2) There is the thermodynamic entropy.




+ thermodynamic information



3) Numerical values in 1) and 2) are not related to each
other.



Fixed that for you. Why should you expect the different types of
information that come from different contexts to have the same
numerical value? The whole point of On complexity and emergence
is that notions of information and entropy are completely context
sensitive (that is not to say they're subjective as such - people
agreeing on the context will agree on the numerical values).




First, the thermodynamic entropy is not context dependent. This must
mean that if it is the same as information, then the latter must
not be context dependent as well. Could you please give me an
example of a physical property that is context dependent?

Second, when I have different numerical values, this could mean
that the units are different. Yet, if this is not the case, then in
my view we are talking about two different entities.

Could you please explain then what is common between 1) and 2)?

Evgenii






Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-02 Thread Evgenii Rudnyi

On 01.02.2012 21:51 Stephen P. King said the following:

On 2/1/2012 3:10 PM, Evgenii Rudnyi wrote:

On 29.01.2012 22:49 Russell Standish said the following:

On Sun, Jan 29, 2012 at 04:23:12PM +0100, Evgenii Rudnyi wrote:

On 28.01.2012 23:26 meekerdb said the following:

On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:


A good suggestion. It well might be that I express my thoughts
unclear, sorry for that. Yet, I think that my examples show
that

1) There is information

and entropy


that engineers employ.


Some engineers employ information, some the thermodynamic entropy.
I have not seen though an engineering paper where both information
and the thermodynamic entropy have been used as synonyms.


2) There is the thermodynamic entropy.


+ thermodynamic information



3) Numerical values in 1) and 2) are not related to each
other.



Fixed that for you. Why should you expect the different types of
information that come from different contexts to have the same
numerical value? The whole point of On complexity and emergence
is that notions of information and entropy are complete context
sensitive (that is not to say their subjective as such - people
agreeing on the context will agree on the numerical values).



First, the thermodynamic entropy is not context dependent. This must
 mean that if it is the same as information, then the latter must
not be context dependent as well. Could you please give me an
example of a physical property that is context dependent?



Temperature is context dependent. If we consider physics at the level
of atoms there is no such quantity as temperature. Additionally,
thermodynamic entropy does require Boltzmann's constant to be defined,
which is a form of context dependency, as it specifies the level at
which we are to take micro-states as macroscopically
indistinguishable.


Boltzmann's constant, as far as I understand, is defined uniquely. 
If you talk about some other universe (or Platonia) where one could 
imagine something else, then it could be. Yet, in the world that we know 
according to empirical scientific studies, the Boltzmann constant is a 
fundamental constant. Hence I do not understand you in this respect.


Indeed, temperature is not available directly at the level of particles 
obeying classical or quantum laws. However, it could be, for example, not a 
problem with the temperature but rather with the description at the 
particle level.


Anyway, I would suggest sticking to the empirical scientific knowledge that 
we have. Then I do not understand what you mean by temperature being 
context dependent either.


We can imagine very different worlds indeed. Yet, right now we discuss 
the question (I will repeat from the email to John) as follows:


When Russell says that information is context dependent, we talk about 
for example a DVD. Then information capacity as defined by the company 
and the number of physical states are completely different. Hence the 
notion from Russell that information is context dependent.


If you mean that the temperature and the Boltzmann constant are context 
dependent in the same way, could you please give practical examples?


Evgenii



Onward!

Stephen



Second, when I have different numerical values, this could mean
that the units are different. Yet, if this is not the case, then in
my view we are talking about two different entities.

Could you please explain then what is common between 1) and 2)?

Evgenii












Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-02 Thread meekerdb

On 2/2/2012 10:35 AM, Evgenii Rudnyi wrote:
Yes, but when we speak about information carrier (book, a hard drive, DVD, flash memory) 
it is exactly the same. And it has nothing to do with the total number of physical 
states in the device, as this example with zero temperature nicely shows.


That's not true.  The arrangement of ink on the page, the embossed face of the coin, do 
contribute to the physical states.  It's just that the information encoded by them is 
infinitesimal compared to the information required to define the microscopic states, e.g. 
the vibrational mode of every atom.  So when we're concerned with heat energy that changes 
the vibrational modes we neglect the pattern of ink and the embossing.  And when we're 
reading we are only interested in the information conveyed by a well defined channel, and 
we ignore what information might be encoded in the microscopic states.  But the two are 
both present.


Brent.

--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-02 Thread Evgenii Rudnyi

On 02.02.2012 20:00 meekerdb said the following:

On 2/2/2012 10:35 AM, Evgenii Rudnyi wrote:

Yes, but when we speak about information carrier (book, a hard
drive, DVD, flash memory) it is exactly the same. And it has
nothing to do with the total number of physical states in the
device, as this example with zero temperature nicely shows.


That's not true. The arrangement of ink on the page, the embossed
face of the coin, do contribute to the physical states. It's just
that the information encoded by them is infinitesimal compared to
the information required to define the microscopic states, e.g. the
vibrational mode of every atom. So when we're concerned with heat
energy that changes the vibrational modes we neglect the pattern of
ink and the embossing. And when we're reading we are only interested
in the information conveyed by a well defined channel, and we ignore
what information might be encoded in the microscopic states. But the
two are both present.

Brent.



Yes, I agree with this, but I think it changes nothing about the term 
information. We have a number of physical states in a carrier (influenced, 
indeed, by for example the arrangement of ink on the page) and we have 
the information capacity as defined by the company that sells 
the carrier.


By the way, the example with zero temperature (or, strictly speaking, 
with temperature going to zero Kelvin) seems to show that the 
information capacity could even exceed the number of physical 
states.


Evgenii

--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-02 Thread Russell Standish
On Thu, Feb 02, 2012 at 07:45:53PM +0100, Evgenii Rudnyi wrote:
 On 01.02.2012 21:51 Stephen P. King said the following:
 On 2/1/2012 3:10 PM, Evgenii Rudnyi wrote:
 First, the thermodynamic entropy is not context dependent. This must
  mean that if it is the same as information, then the latter must
 not be context dependent as well. Could you please give me an
 example of a physical property that is context dependent?
 
 
 Temperature is context dependent. If we consider physics at the level
 of atoms there is no such quantity as temperature. Additionally,
 thermodynamic entropy does require Boltzmann's constant to be defined,
 which is a form of context dependency as it specifies the level at
 which we are to take micro-states as macroscopically
 indistinguishable.
 
 Boltzmann's constant, as far as I understand, is defined
 uniquely. If you talk about some other universe (or Platonia) where
 one could imagine something else, then perhaps. Yet, in the
 world that we know from empirical scientific studies,
 Boltzmann's constant is a fundamental constant. Hence I do not
 understand you in this respect.

Boltzmann's constant is a unit conversion constant like c and Planck's
constant, nothing more. It has no fundamental significance.
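
As a quick numerical sketch of this point (the 2^100-state system below
is a made-up toy, not any physical device):

    import math

    # Boltzmann's constant in J/K: it rescales a dimensionless state
    # count into thermodynamic units, nothing more.
    k_B = 1.380649e-23

    omega = 2 ** 100                # toy system with 2^100 microstates

    S_nats = math.log(omega)        # dimensionless entropy: ~69.3 nats
    S_bits = math.log2(omega)       # the same number in bits: 100.0
    S_joule_per_K = k_B * S_nats    # thermodynamic entropy: ~9.57e-22 J/K

    print(S_nats, S_bits, S_joule_per_K)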

 
 Indeed, temperature is not available directly at the level of
 particles obeying classical or quantum laws. However for example it
 could be not a problem with the temperature but rather with the
 description at the particle level.
 
 Anyway, I would suggest to stick to empirical scientific knowledge
 that we have. Then I do not understand what do you mean that
 temperature is context dependent either.
 

Temperature is an averaged quantity, so whilst technically an example
of emergence, it is the weakest form of emergence.

Evgenii is stating an oft-repeated meme that entropy is not
context-dependent. 

It is context dependent because it (possibly implicitly) depends on
what we mean by a thermodynamic state. In thermodynamics, we usually
mean a state defined by temperature, pressure, volume, number of
particles, and so on. The "and so on" is the context dependent
part. There are actually an enormous number of possible independent
thermodynamic variables that may be relevant in different
situations. In an electrical device, the arrangement of charges might
be another such thermodynamic variable.

Also, even in classic schoolbook thermodynamics, not all of
temperature, pressure, volume and particle number are
relevant. Dropping various of these terms leads to different ensembles
(microcanonical, canonical and grand canonical).

Of course, context dependence does not mean subjective. If two
observers agree on the context, the entropy is quite objective. But it
is a little more complex than something like mass or length.

This is explained very well in Denbigh & Denbigh.
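
As a toy sketch of this context dependence (my own example, not from
Denbigh & Denbigh): the very same record yields different entropies
depending on what we agree to count as a state.

    import math
    from collections import Counter

    def entropy_bits(counts):
        """Shannon entropy (bits) of an empirical distribution of states."""
        total = sum(counts.values())
        return sum(-(c / total) * math.log2(c / total)
                   for c in counts.values() if c < total)

    record = "AB" * 50   # one and the same record: ABABAB...

    # Context 1: a state is a single symbol.
    print(entropy_bits(Counter(record)))     # 1.0 bit per symbol

    # Context 2: a state is an adjacent pair of symbols.
    pairs = Counter(record[i:i+2] for i in range(0, len(record), 2))
    print(entropy_bits(pairs))               # 0 bits per pair: fully ordered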


-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au


-- 
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-02 Thread Russell Standish
On Wed, Feb 01, 2012 at 09:17:41PM +0100, Evgenii Rudnyi wrote:
 On 29.01.2012 23:06 Russell Standish said the following:
 
 Absolutely! But at zero kelvin, the information storage capacity of
 the device is precisely zero, so cooling only works to a certain
 point.
 
 
 I believe that you have mentioned once that information is
 negentropy. If yes, could you please comment on that? What
 negentropy would mean?

Schrödinger first pointed out that living systems must export entropy,
and coined the term negative entropy to refer to this. Brillouin
shortened this to negentropy.

The basic formula is S_max = S + I.

S_max is the maximum possible value for entropy to take - the value of
entropy at thermodynamic equilibrium for a microcanonical ensemble. S
is the usual entropy, which for non-equilibrium systems will be
typically lower than S_max, and even for equilibrium systems can be
held lower by physical constraints. I is the difference, and this is what
Brillouin called negentropy. It is information - the information
encoded in that state.

Try looking up http://en.wikipedia.org/wiki/Negentropy
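
A minimal toy calculation of S_max = S + I, assuming a register of N
independent two-state cells (the register and its numbers are
illustrative only):

    import math

    N = 8          # binary cells in the toy register
    S_max = N      # maximum entropy in bits: every cell fully random

    def cell_entropy(p):
        """Shannon entropy (bits) of one cell that reads 0 with probability p."""
        if p in (0.0, 1.0):
            return 0.0
        return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

    # At equilibrium each cell is random: S = S_max, so I = 0.
    S = N * cell_entropy(0.5)
    print(S_max - S)      # I = 0.0: no negentropy, nothing encoded

    # A definite bit pattern written into the register: S = 0, I = S_max.
    S = N * cell_entropy(1.0)
    print(S_max - S)      # I = 8.0: the full negentropy is information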

 
 In general, I do not understand what it means that information
 at zero Kelvin is zero. Let us take a coin and cool it down. Do you
 mean that the text on the coin will disappear? Or do you mean that no
 device can read this text at zero Kelvin?
 

I vaguely remembered that S_max=0 at absolute zero. If it were, then
both S and I would have to be zero, because these are all nonnegative
quantities. But http://en.wikipedia.org/wiki/Absolute_zero states only
that entropy is at a minimum, not strictly zero. In which case, I
withdraw that comment.

Cheers
-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au


-- 
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-02 Thread Jason Resch
On Sun, Jan 22, 2012 at 3:04 AM, Evgenii Rudnyi use...@rudnyi.ru wrote:

 On 21.01.2012 22:03 Evgenii Rudnyi said the following:

  On 21.01.2012 21:01 meekerdb said the following:

 On 1/21/2012 11:23 AM, Evgenii Rudnyi wrote:

 On 21.01.2012 20:00 meekerdb said the following:

 On 1/21/2012 4:25 AM, Evgenii Rudnyi wrote:



 ...

  2) If physicists say that information is the entropy, they
 must take it literally and then apply experimental
 thermodynamics to measure information. This however seems
 not to happen.


 It does happen. The number of states, i.e. the information,
 available from a black hole is calculated from its
 thermodynamic properties as calculated by Hawking. At a more
 conventional level, counting the states available to molecules
 in a gas can be used to determine the specific heat of the gas
 and vice versa. The reason the thermodynamic measures and the
 information measures are treated separately in engineering
 problems is that the information that is important to
 engineering is infinitesimal compared to the information stored
 in the microscopic states. So the latter is considered only in
 terms of a few macroscopic averages, like temperature and
 pressure.

 Brent


 Doesn't this mean that by information engineers mean something
 different than physicists?


 I don't think so. A lot of the work on information theory was done
 by communication engineers who were concerned with the effect of
 thermal noise on bandwidth. Of course engineers specialize more
 narrowly than physicists, so within different fields of engineering
 there are different terminologies and different measurement
 methods for things that are unified in basic physics, e.g. there
 are engineers who specialize in magnetism and who seldom need to
 reflect that it is part of EM, there are others who specialize in
 RF and don't worry about static fields.


 Do you mean that engineers use experimental thermodynamics to
 determine information?

 
  Evgenii

 To be concrete. This is for example a paper from control

 J.C. Willems and H.L. Trentelman
 H_inf control in a behavioral context: The full information case
 IEEE Transactions on Automatic Control
 Volume 44, pages 521-536, 1999
 http://homes.esat.kuleuven.be/~jwillems/Articles/JournalArticles/1999.4.pdf

 The term information is there but not the entropy. Could you please
 explain why? Or alternatively could you please point out to papers where
 engineers use the concept of the equivalence between the entropy and
 information?



Evgenii,

Sure, I could give a few examples as this somewhat intersects with my line
of work.

 The NIST 800-90 recommendation (
http://csrc.nist.gov/publications/nistpubs/800-90A/SP800-90A.pdf ) for
random number generators is a document for engineers implementing secure
pseudo-random number generators.  An example of where it is important is
when considering entropy sources for seeding a random number generator.  If
you use something completely random, like a fair coin toss, each toss
provides 1 bit of entropy.  The formula is -log2(predictability).  With a
coin flip, you have at best a .5 chance of correctly guessing it, and
-log2(.5) = 1.  If you used a die roll, then each die roll would provide
-log2(1/6) = 2.58 bits of entropy.  The ability to measure unpredictability
is necessary to ensure, for example, that the random inputs that went into
generating a cryptographic key are at least as difficult to predict as the
key would be to brute force.

In addition to security, entropy is also an important concept in the field
of data compression.  The amount of entropy in a given bit string
represents the theoretical minimum number of bits it takes to represent the
information.  If 100 bits contain 100 bits of entropy, then there is no
compression algorithm that can represent those 100 bits with fewer than 100
bits.  However, if a 100 bit string contains only 50 bits of entropy, you
could compress it to 50 bits.  For example, let's say you had 100 coin
flips from an unfair coin.  This unfair coin comes up heads 90% of the
time.  A heads outcome carries only -log2(.9) = 0.152 bits, a tails outcome
-log2(.1) = 3.32 bits, so on average each flip represents -(.9*log2(.9) +
.1*log2(.1)) = 0.469 bits of entropy.  Thus, a sequence of 100 coin flips
with this biased coin could be represented with about 47 bits.  There are
only about 46.9 bits of information / entropy contained in that 100 bit
long sequence.
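
A short script reproducing these numbers (surprisal for the coin and
die, Shannon entropy for the biased coin):

    import math

    def surprisal_bits(p):
        """Bits of information gained by observing an outcome of probability p."""
        return -math.log2(p)

    print(surprisal_bits(0.5))     # fair coin toss: 1.0 bit
    print(surprisal_bits(1 / 6))   # die roll: ~2.585 bits

    def entropy_bits(probs):
        """Shannon entropy (bits): the surprisal averaged over the outcomes."""
        return sum(p * surprisal_bits(p) for p in probs if p > 0)

    h = entropy_bits([0.9, 0.1])   # biased coin: ~0.469 bits per flip
    print(h, 100 * h)              # ~46.9 bits of entropy in 100 flips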

Jason

-- 
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-01 Thread Evgenii Rudnyi

On 29.01.2012 22:49 Russell Standish said the following:

On Sun, Jan 29, 2012 at 04:23:12PM +0100, Evgenii Rudnyi wrote:

On 28.01.2012 23:26 meekerdb said the following:

On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:


A good suggestion. It may well be that I express my thoughts
unclearly, sorry for that. Yet, I think that my examples show that

1) There is information

and entropy


that engineers employ.


Some engineers employ information, some the thermodynamic entropy. I 
have not, though, seen an engineering paper where information and the 
thermodynamic entropy are used as synonyms.



2) There is the thermodynamic entropy.


+ thermodynamic information



3) Numerical values in 1) and 2) are not related to each other.



Fixed that for you. Why should you expect the different types of
information that come from different contexts to have the same
numerical value? The whole point of "On complexity and emergence" is
that notions of information and entropy are completely context
sensitive (that is not to say they're subjective as such - people
agreeing on the context will agree on the numerical values).



First, the thermodynamic entropy is not context dependent. This must mean 
that if it is the same as information, then the latter must not be 
context dependent as well. Could you please give me an example of a 
physical property that is context dependent?


Second, when I have different numerical values, this could mean that the 
units are different. Yet, if this is not the case, then in my view we 
are talking about two different entities.


Could you please explain then what is common between 1) and 2)?

Evgenii





--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-01 Thread Evgenii Rudnyi

On 29.01.2012 23:00 Russell Standish said the following:

On Sun, Jan 29, 2012 at 04:30:38PM +0100, Evgenii Rudnyi wrote:


The problem that I see is that the entropy changes when the
temperature changes. Or do you claim that the entropy of the
memory stick/DVD/hard disc remains the same when its temperature
changes for example from 15 to 25 degrees?


The entropy changes.



Anyway, I do not see how one can obtain the information capacity
of the storage devices from the thermodynamic entropy and this is
my point.



Who was ever claiming that? The theoretical maximum possible
information storage is related, though.


Do you claim that the information capacity of a memory stick/DVD/hard
disk, for which we pay money, is equivalent to the thermodynamic
entropy of the device?



Never. The best you have is I=S_max-S, where I is the theoretical


What are S_max and S in this equation?

Evgenii


maximum possible information storage. The value C (capacity of the
storage device) must satisfy

C <= I.

Usually C << I, for technological reasons. Also, it is undesirable
to have C vary with temperature, whereas I does vary in general
(particularly across phase transitions).

The information content of a drive is another number D <= C, usually
much less, and very dependent on the user of that drive. If the
drive is encrypted, and the user has lost the key, the information
content is close to zero.

The quantities I, C and D are all numerical quantities having the
name information.
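
To make the hierarchy concrete, a sketch with invented numbers (none of
the values below come from a real device):

    # I = S_max - S : theoretical maximum storage of the matter
    # C             : engineered capacity of the device, C <= I
    # D             : content meaningful to the user, D <= C

    S_max = 1.000e23   # bits: entropy of the platter at equilibrium (made up)
    S     = 0.999e23   # bits: actual entropy, held below S_max by structure

    I = S_max - S      # ~1e20 bits theoretically available
    C = 8e12           # an engineered 1 TB drive: ~8e12 bits, far below I
    D = 4e12           # half full of user data; with a lost encryption key
                       # D would drop to nearly zero

    assert D <= C <= I
    print(I, C, D)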

Cheers


--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-01 Thread Evgenii Rudnyi

On 29.01.2012 23:06 Russell Standish said the following:

On Sat, Jan 28, 2012 at 09:41:27PM -0800, meekerdb wrote:

On 1/28/2012 3:42 PM, Russell Standish wrote:

On the other hand, if you just gave me the metallic platter from
the hard disk, and did not restrict in any way the technology
used to read and write the data, then in principle, the higher
the temperature, the more information is capable of being encoded
on the disk.


I don't think this is quite right. A higher temperature means that
there are more energy states available.  But the concept of
'temperature' implies that these are occupied in a random way
(according to the micro-canonical ensemble). For us to read and
write data requires that the act of reading or writing a bit moves
the distribution of states in phase space enough that it is
distinguishable from the random fluctuations due to temperature.
So if the medium is hotter, you need to use more energy to read
and write a bit.  This of course runs into the problems you note
below.


Hence the requirement that technology not be fixed. It is a
theoretician's answer :).


So in practice it is often colder systems that allow us to store
more data because then we can use small energy differences to
encode bits.


Absolutely! But at zero kelvin, the information storage capacity of
the device is precisely zero, so cooling only works to a certain
point.



I believe that you once mentioned that information is negentropy. 
If yes, could you please comment on that? What would negentropy mean?


In general, I do not understand what it means that information at 
zero Kelvin is zero. Let us take a coin and cool it down. Do you mean 
that the text on the coin will disappear? Or do you mean that no device 
can read this text at zero Kelvin?


Evgenii

--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-01 Thread Stephen P. King

On 2/1/2012 3:10 PM, Evgenii Rudnyi wrote:

On 29.01.2012 22:49 Russell Standish said the following:

On Sun, Jan 29, 2012 at 04:23:12PM +0100, Evgenii Rudnyi wrote:

On 28.01.2012 23:26 meekerdb said the following:

On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:


A good suggestion. It may well be that I express my thoughts
unclearly, sorry for that. Yet, I think that my examples show that

1) There is information

and entropy


that engineers employ.


Some engineers employ information, some the thermodynamic entropy. I 
have not, though, seen an engineering paper where information and 
the thermodynamic entropy are used as synonyms.



2) There is the thermodynamic entropy.


+ thermodynamic information



3) Numerical values in 1) and 2) are not related to each other.



Fixed that for you. Why should you expect the different types of
information that come from different contexts to have the same
numerical value? The whole point of "On complexity and emergence" is
that notions of information and entropy are completely context
sensitive (that is not to say they're subjective as such - people
agreeing on the context will agree on the numerical values).



First, the thermodynamic entropy is not context dependent. This must 
mean that if it is the same as information, then the latter must not 
be context dependent as well. Could you please give me an example of a 
physical property that is context dependent?




Temperature is context dependent. If we consider physics at the 
level of atoms there is no such quantity as temperature. Additionally, 
thermodynamic entropy does require Boltzmann's constant to be defined, 
which is a form of context dependency as it specifies the level at which 
we are to take micro-states as macroscopically indistinguishable.


Onward!

Stephen


Second, when I have different numerical values, this could mean that 
the units are different. Yet, if this is not the case, then in my view 
we are talking about two different entities.


Could you please explain then what is common between 1) and 2)?

Evgenii







--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-02-01 Thread John Mikes
Evgenii, I am not sure if it is your text, or Russell's:

   "In general, I do not understand what it means that information
at zero Kelvin is zero. Let us take a coin and cool it down. Do you mean
that the text on the coin will disappear? Or do you mean that no device
can read this text at zero Kelvin?"
I doubt that the text embossed on a coin is its *information*. It is
part of the physical structure, like e.g. the roundness, size, or
material(?) characteristics - all of which nobody can imagine how to change
for the condition of 0 Kelvin. The abs. zero temp. conditions are
extrapolated the best way we could muster. A matter of (sci.) faith. Maybe
the so called 'interstitial' spaces also collapse? I am not for a
'physicalistic' worldview - rather an agnostic about 'explanations' of
diverse epochs based on then recent 'findings' (mostly mathematically
justified???) -
realizing that we may be up to lots of novelties we have no idea about
today, not even of the directions they may shove our views into. I say that
in comparison to our 'conventional scientific' - even everyday's - views of
the world in the past, before and after fundamental knowledge-domains were
added to our inventory.
I do not condone evidences that must be, because THERE IS NO OTHER WAY -
in our existing ignorance of course. Atoms? well, if there *is* 'matter'?
(MASS??) even my (macro)molecules I invented are suspect.
So 'entropy' is a nice term in (classical?) thermodynamics, which I described in
1942 as *the science that tells us how things would proceed wouldn't they
proceed as they do indeed* thinking of Carnot and the isotherm/reversible
equilibria, etc. - way before the irreversible kind was taught in college
courses. Information is another rather difficult term, I like to use
'relation' and leave it open what so far unknown relations may affect our
processes we assign to 'causes' known within the model of the world we
think we are in. The rest (including our misunderstood model - domain) is
what I may call an 'infinite complexity' of which we are part - mostly
ignorant about the 'beyond model' everything.

We 'fabricate' our context, try to explain by the portion we know of - as
if it was the totality - and live in our happy conventional scientific
terms.
Human ingenuity constructed a miraculous science and technology that is
ALMOST good (some mistakes notwithstanding occurring), then comes M. Curie,
Watson-Crick, Fleming, Copernicus, Volta, etc. and we re-write the
schoolbooks.

John M


On Wed, Feb 1, 2012 at 3:10 PM, Evgenii Rudnyi use...@rudnyi.ru wrote:

 On 29.01.2012 22:49 Russell Standish said the following:

 On Sun, Jan 29, 2012 at 04:23:12PM +0100, Evgenii Rudnyi wrote:

 On 28.01.2012 23:26 meekerdb said the following:

 On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:


 A good suggestion. It may well be that I express my thoughts
 unclearly, sorry for that. Yet, I think that my examples show that

 1) There is information

 and entropy

 that engineers employ.


 Some engineers employ information, some the thermodynamic entropy. I have
 not, though, seen an engineering paper where information and the
 thermodynamic entropy are used as synonyms.

  2) There is the thermodynamic entropy.


 + thermodynamic information


 3) Numerical values in 1) and 2) are not related to each other.


 Fixed that for you. Why should you expect the different types of
 information that come from different contexts to have the same
 numerical value? The whole point of "On complexity and emergence" is
 that notions of information and entropy are completely context
 sensitive (that is not to say they're subjective as such - people
 agreeing on the context will agree on the numerical values).



 First, the thermodynamic entropy is not context dependent. This must mean
 that if it is the same as information, then the latter must not be context
 dependent as well. Could you please give me an example of a physical
 property that is context dependent?

 Second, when I have different numerical values, this could mean that the
 units are different. Yet, if this is not the case, then in my view we are
 talking about two different entities.

 Could you please explain then what is common between 1) and 2)?

 Evgenii



 --
 You received this message because you are subscribed to the Google Groups
 Everything List group.
 To post to this group, send email to everything-list@googlegroups.com.
 To unsubscribe from this group, send email to
 everything-list+unsubscr...@googlegroups.com.
 For more options, visit this group at
 http://groups.google.com/group/everything-list?hl=en.



-- 
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-31 Thread John Clark
On Mon, Jan 30, 2012  Craig Weinberg whatsons...@gmail.com wrote:

 I just explained


3 days after learning that the subject even existed here we sit at your
feet while you explain all about it to us.


  that Shannon information has nothing to do with anything except data
 compression.


Except for data compression? Except for identifying the core, must-have
part of any message. Except for telling us exactly what's important and
what is not. Except for showing how to build things like the internet.

Except for that Mrs. Lincoln how did you like the play?

Shannon can tell you how many books can be sent over a noisy wire in a
given amount of time without error, and if you're willing to tolerate a few
errors Shannon can tell you how to send even more. If the contents of books
is not information what do you call the contents of books?
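
The result behind that claim is the Shannon-Hartley capacity
C = B log2(1 + S/N); here is a small sketch with a hypothetical 3 kHz
telephone-grade wire (the bandwidth and noise figures are invented):

    import math

    def capacity_bps(bandwidth_hz, snr_linear):
        """Shannon-Hartley capacity of a Gaussian noisy channel, bits/second."""
        return bandwidth_hz * math.log2(1 + snr_linear)

    snr = 10 ** (30 / 10)              # a 30 dB signal-to-noise ratio
    c = capacity_bps(3000, snr)        # 3 kHz wire: ~29.9 kbit/s
    print(c)

    # An 8e6-bit (1 MB) book can then cross this wire error-free, in the
    # limit of ideal coding, in no less than:
    print(8e6 / c, "seconds")          # ~268 s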

 Nothing can become a 'file' without irreversible loss.


Ah, well, that explains why I can't make heads or tails out of your ideas,
all I've seen is your mail files, now if I'd seen your original glorious
Email just as it was as you typed it on your computer screen with no
irreversible loss I would have long ago become convinced you were right and
were in fact the second coming of Isaac Newton. So when you respond to this
post please don't send me a file full of irreversible loss, send me your
ORIGINAL, send me the real deal.


  The terms signal and noise refer to information (signal) and entropy
 (noise). Get it straight.


One man's signal is another man's noise, to a fan of hisses and clicks and
pops the music is the noise.  First you decide what you want to call the
signal and then Shannon can tell you what the signal to noise ratio is and
he can show you ways to improve it.

 And your way of dealing with it is to say it (bits electrons information
 logic etc) does not exist. I would never have guessed that coming up with a
 theory of everything could be so easy.


 If you understand my hypothesis then you will see there is no reason to
 think they exist.


Then I dearly hope my mind never goes so soft that I understand your
hypothesis.

 Just as you think free will has no reason to exist.


No no a thousand times no! Free will would have to improve dramatically
before it could have the lofty property of nonexistence; free will is an
idea so bad it's not even wrong.


  I thought Foucault's Discipline and Punish was one of the most
 interesting books I've ever read.


I don't consider social criticism a part of philosophy even if I agree with
it because it always includes matters of taste. Professional philosophers
might write interesting books about history or about what society should or
should not do, but none of them have contributed to our understanding of
the nature of reality in centuries. That's not to say philosophy hasn't
made progress, it just wasn't made by philosophers.

 Feynman I think would have been intrigued by my ideas


Delusions of grandeur.

 John K Clark

-- 
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-30 Thread Craig Weinberg
On Jan 30, 12:03 am, John Clark johnkcl...@gmail.com wrote:
 On Sun, Jan 29, 2012 Craig Weinberg whatsons...@gmail.com wrote:

  I'm not talking about fluid flow,

 OK

  I'm talking about simulating everything - potential and actual chemical
  reactions, etc.

 OK

  Water can be described by multiplying the known interactions of H2O,

 But many, probably most, of water's interactions are unknown to this
 day. Virtually all of organic chemistry (including DNA reactions!)
 involves water somewhere in the chain of reaction, but organic chemistry
 is very far from a closed subject, there is still much to learn.

cool. I didn't know that. What about DNA though? Why would it be any
less mysterious?

 Another
 example, up to now nobody has derived the temperature that water freezes
 at from first principles because the resulting quantum mechanical
 equations are so mathematically complicated that nobody has yet figured
 out how to solve them.

Water is strange stuff. Its blue color comes from inside of it too.
Intramolecular collisions rather than reflection.


  DNA would need many more variables.

 BULLSHIT!


Why?

  Non-Shannon information would be anything that is not directly involved
  in the compression of a digitally sampled description into another digital
  description.

 In other words non-Shannon information is gaseous philosophical flatulence.

Uhh, what? I just explained that Shannon information has nothing to do
with anything except data compression. It's like I just explained what
a catalytic converter is and you said 'in other words non-catalytic
converters are gaseous philosophical flatulence.'


         Shannon information is not information in general, it is [...]



 Shannon published his work in 1948 but you never even heard about it
 until 3 days ago, and now you're a great world authority on the subject
 telling us all exactly what it does and does not mean.

I'm only the expert compared to you, since your explanation, which you
argued with all the authority of a seasoned expert, was dead wrong.
Precisely wrong.

 I don't mind
 ignorance, I'm ignorant about a lot of stuff myself, but there is a
 certain kind of arrogant aggressive ignorance that I find very distasteful.

That sentence embodies it perfectly.


 In contrast Richard Feynman displayed humble ignorance, he did as much
 as anyone to develop Quantum Mechanics but he said I think it's safe to
 say that nobody understands Quantum Mechanics, in describing the work
 that won him the Nobel Prize he said he found a way to sweep
 mathematical difficulties under the rug. He also said I know how hard
 it is to really know something; how careful you have to be about
 checking the experiments; how easy it is to make mistakes and fool
 yourself. I know what it means to know something.

Yes, I'm familiar with Feynman.


         Compression and encryption are deformations.



 If you can get the exact same file out after compression or encryption
 then obviously nothing has been lost and all deformations or shrinkage
 are reversible.

Nothing can become a 'file' without irreversible loss. Once it's a
file, sure you can do all kinds of transformations to it, but you'll
never get the original live band playing a song off of an mp3.


         I understand what you mean completely



 Apparently not

No, I have understood you from the start. I knew you were wrong about
information and entropy and you were. You don't understand my position
though, so you assume it's senseless and throw things in my general
direction.


         White noise is called noise for a reason.



And it's called white for a reason, an evil occidental mindset
 conspiracy created by round eyed white devils.

I would imagine it's called white because it is additive interference.
My point still stands. The terms signal and noise refer to information
(signal) and entropy (noise). Get it straight. Or don't.


  How do you expect mathematics to deal with anything as subjective as
  quality? A novel that's high quality to you may be junk to me.

  I don't expect mathematics to deal with it. I expect a theory of
  everything to deal with it.

 And your way of dealing with it is to say it (bits electrons information
 logic etc) does not exist. I would never have guessed that coming up
 with a theory of everything could be so easy.

If you understand my hypothesis then you will see there is no reason
to think they exist. Just as you think free will has no reason to
exist.


  I'm not a big philosophy or religion fan myself but Wittgenstein,
  Heidegger, Sarte, Foucault, Kierkegaard were recent and had some
  impressive things to say.

 As I've said before nearly everything they and all other recent
 philosophers say can be put into one of four categories:

 1) False.
 2) True but obvious, a truism disguised in pretentious language.
 3) True and deep but discovered first and explained better by a
 mathematician or scientist or someone else who didn't write
 philosopher in the box 

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-29 Thread Evgenii Rudnyi

On 28.01.2012 23:26 meekerdb said the following:

On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:


...


You disagree that engineers do not use thermodynamic entropy



Yes. I disagreed that information has nothing to do with
thermodynamic entropy, as you wrote above. You keep switching
formulations. You write X and ask if I agree. I disagree. Then you
claim I've disagreed with Y. Please pay attention to your own
writing. There's a difference between "X is used in place of Y" and
"X has nothing to do with Y".


A good suggestion. It may well be that I express my thoughts unclearly, 
sorry for that. Yet, I think that my examples show that


1) There is information that engineers employ.

2) There is the thermodynamic entropy.

3) Numerical values in 1) and 2) are not related to each other.

Otherwise I would appreciate it if you would express the relationship between 
information that engineers use and the thermodynamic entropy in your own 
words, as this is the question that I would like to understand.


I understand you when you talk about the number of microstates. I do not 
understand, though, how they are related to the information employed by
engineers. I would be glad to hear your comment on that.


Evgenii


but you have not yet shown how information in engineering is
related to the thermodynamic entropy. From the Millipede example


http://en.wikipedia.org/wiki/Millipede_memory


The earliest generation millipede devices used probes 10
nanometers in diameter and 70 nanometers in length, producing pits
about 40 nm in diameter on fields 92 µm x 92 µm. Arranged in a 32 x
32 grid, the resulting 3 mm x 3 mm chip stores 500 megabits of data
or 62.5 MB, resulting in an areal density, the number of bits per
square inch, on the order of 200 Gbit/in².

It would be much easier to understand you if you said what
thermodynamic entropy corresponds to the value of 62.5 MB in
Millipede.



The Shannon information capacity is 5e8 bits. The thermodynamic
entropy depends on the energy used to switch a memory element. I'd
guess it must correspond to at least a few tens of thousands of
electrons at 9 V, so

S ~ (5e8 * 9e4 eV) / (8.6e-5 eV/K * 300 K) ~ 1.7e15

So the total entropy is about 1.7e15 + 5e8, and the information portion
is numerically (but not functionally) negligible compared to the
thermodynamic.
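
Re-running this order-of-magnitude arithmetic (the electron count and
voltage are the guesses stated above; the entropy comes out in
dimensionless units of k_B):

    bits = 5e8                      # Millipede capacity: 500 megabits
    energy_per_bit_eV = 1e4 * 9     # ~1e4 electrons switched at 9 V
    k_B_eV = 8.6e-5                 # Boltzmann's constant, eV/K
    T = 300                         # room temperature, K

    S = bits * energy_per_bit_eV / (k_B_eV * T)
    print(S)                        # ~1.7e15, dwarfing the 5e8-bit capacity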

Brent



The only example of Thermodynamic Entropy == Information so far
from you was the work on a black hole. However, as far as I know,
there is not yet a theory to describe a black hole, since on one side
you need gravitation and on the other side quantum effects. A
theory that unites them seems not to exist.

Evgenii




My example would be Millipede

http://en.wikipedia.org/wiki/Millipede_memory

I am pretty sure that when IBM engineers develop it, they do
not employ the thermodynamic entropy to estimate its
information capabilities. Also, the increase of temperature
would destroy the saved information there.

Well, I might be deliberately obtuse indeed, yet only with the goal
of reaching a clear definition of what information is.
Right now I would say that there is information in engineering
and in physics and they are different. The first I roughly
understand and the second not.

Evgenii




Brent











--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-29 Thread Evgenii Rudnyi

On 29.01.2012 00:42 Russell Standish said the following:

On Sat, Jan 28, 2012 at 12:05:57PM +0100, Evgenii Rudnyi wrote:


...


In general we are surrounded by devices that store information (hard
discs, memory sticks, DVD, etc.). The information that these
devices can store, I believe, is known with accuracy to one bit.


Because they're engineered that way. It would be rather inconvenient
if one's information storage varied with temperature.


Can you suggest a thermodynamic state whose entropy gives us
exactly that amount of information?

Here again there would be a question about temperature. If I operate my
memory stick in some reasonable range of temperatures, the
information it contains does not change. Yet, the entropy in my
view changes.


Sure - because they're engineered that way, and they operate a long
way from the theoretical maximum storage capability of that matter.
What's the problem with that?


The problem that I see is that the entropy changes when the temperature 
changes. Or do you claim that the entropy of the memory stick/DVD/hard 
disc remains the same when its temperature changes for example from 15 
to 25 degrees?


Anyway, I do not see how one can obtain the information capacity of the 
storage devices from the thermodynamic entropy and this is my point.


Do you claim that the information capacity of a memory stick/DVD/hard 
disk, for which we pay money, is equivalent to the thermodynamic entropy of 
the device?


Evgenii



So these are my doubts for which I do not see an answer.

Evgenii





--
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-29 Thread Russell Standish
On Sun, Jan 29, 2012 at 04:23:12PM +0100, Evgenii Rudnyi wrote:
 On 28.01.2012 23:26 meekerdb said the following:
 On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:
 
 A good suggestion. It may well be that I express my thoughts
 unclearly, sorry for that. Yet, I think that my examples show that
 
 1) There is information 
and entropy 

 that engineers employ.
 
 2) There is the thermodynamic entropy.

+ thermodynamic information

 
 3) Numerical values in 1) and 2) are not related to each other.
 

Fixed that for you. Why should you expect the different types of
information that come from different contexts to have the same
numerical value? The whole point of "On complexity and emergence" is
that notions of information and entropy are completely context sensitive
(that is not to say they're subjective as such - people agreeing on the
context will agree on the numerical values).


-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au


-- 
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-29 Thread Russell Standish
On Sun, Jan 29, 2012 at 04:30:38PM +0100, Evgenii Rudnyi wrote:
 
 The problem that I see is that the entropy changes when the
 temperature changes. Or do you claim that the entropy of the memory
 stick/DVD/hard disc remains the same when its temperature changes
 for example from 15 to 25 degrees?

The entropy changes.

 
 Anyway, I do not see how one can obtain the information capacity of
 the storage devices from the thermodynamic entropy and this is my
 point.
 

Who was ever claiming that? The theoretical maximum possible
information storage is related, though.

 Do you claim, that the information capacity for which we pay money
 of a memory stick/DVD/hard disk is equivalent to the thermodynamic
 entropy of the device?
 

Never. The best you have is I=S_max-S, where I is the theoretical
maximum possible information storage. The value C (capacity of the
storage device) must satisfy

C <= I.

Usually C << I, for technological reasons. Also, it is undesirable to
have C vary with temperature, whereas I does vary in general
(particularly across phase transitions).

The information content of a drive is another number D <= C, usually
much less, and very dependent on the user of that drive. If the drive
is encrypted, and the user has lost the key, the information content
is close to zero.

The quantities I, C and D are all numerical quantities having the name
information. 

Cheers
-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au


-- 
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-29 Thread Russell Standish
On Sat, Jan 28, 2012 at 09:41:27PM -0800, meekerdb wrote:
 On 1/28/2012 3:42 PM, Russell Standish wrote:
 On the other hand, if you just gave me the metallic platter from the
 hard disk, and did not restrict in any way the technology used to read
 and write the data, then in principle, the higher the temperature, the
 more information is capable of being encoded on the disk.
 
 I don't think this is quite right. A higher temperature means that
 there are more energy states available.  But the concept of
 'temperature' implies that these are occupied in a random way
 (according to the micro-canonical ensemble). For us to read and
 write data requires that the act of reading or writing a bit moves
 the distribution of states in phase space enough that it is
 distinguishable from the random fluctuations due to temperature. 
  So
 if the medium is hotter, you need to use more energy to read and
 write a bit.  This of course runs into the problems you note below.

Hence the requirement that technology not be fixed. It is a
theoretician's answer :).

 So in practice it is often colder systems that allow us to store
 more data because then we can use small energy differences to encode
 bits.

Absolutely! But at zero kelvin, the information storage capacity of the
device is precisely zero, so cooling only works to a certain point.

 
 Brent
 
 
 In practice, various phase transitions will make this more difficult
 to achieve as temperature is increased. Passing the curie point, for
 instance, will mean we can no longer rely on magnetism, although
 presumably even below the curie point we can increase the information
 storage in some other way (eg moving atoms around by an STM) and
 ignoring the ferromagnetic behaviour. By the same token, passing the
 freezing and boiling points will make it even harder - but still
 doable with sufficiently advanced technology.
 
  From an engineering viewpoint it looks a bit strange.
 How so?
 
 If engineers took the statement "the maximum possible value
 for information increases with temperature" literally, they should
 operate a hard disk at higher temperatures (the higher the better
 according to such a statement). Yet this does not happen. Do you
 know why?
 
 In general we are surrounded by devices that store information (hard
 discs, memory sticks, DVD, etc.). The information that these devices
 can store, I believe, is known with accuracy to one bit.
 Because they're engineered that way. It would be rather inconvenient if
 one's information storage varied with temperature.
 
 Can you
 suggest a thermodynamic state whose entropy gives us exactly that
 amount of information?
 
 Here again there would be a question about temperature. If I operate my
 memory stick in some reasonable range of temperatures, the
 information it contains does not change. Yet, the entropy in my view
 changes.
 Sure - because they're engineered that way, and they operate a long
 way from the theoretical maximum storage capability of that
 matter. What's the problem with that?
 
 So these are my doubts for which I do not see an answer.
 
 Evgenii
 
 
 -- 
 You received this message because you are subscribed to the Google Groups 
 Everything List group.
 To post to this group, send email to everything-list@googlegroups.com.
 To unsubscribe from this group, send email to 
 everything-list+unsubscr...@googlegroups.com.
 For more options, visit this group at 
 http://groups.google.com/group/everything-list?hl=en.

-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au


-- 
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-29 Thread John Clark
On Sun, Jan 29, 2012 Craig Weinberg whatsons...@gmail.com wrote:

 I'm not talking about fluid flow,


OK

 I'm talking about simulating everything - potential and actual chemical
 reactions, etc.


OK

 Water can be described by multiplying the known interactions of H2O,


But many, probably most, of water's interactions are unknown to this
day. Virtually all of organic chemistry (including DNA reactions!)
involves water somewhere in the chain of reaction, but organic chemistry
is very far from a closed subject, there is still much to learn. Another
example, up to now nobody has derived the temperature that water freezes
at from first principles because the resulting quantum mechanical
equations are so mathematically complicated that nobody has yet figured
out how to solve them.

 DNA would need many more variables.


BULLSHIT!

 Non-Shannon information would be anything that is not directly involved
 in the compression of a digitally sampled description into another digital
 description.


In other words non-Shannon information is gaseous philosophical flatulence.

Shannon information is not information in general, it is [...]


Shannon published his work in 1948 but you never even heard about it
until 3 days ago, and now you're a great world authority on the subject
telling us all exactly what it does and does not mean. I don't mind
ignorance, I'm ignorant about a lot of stuff myself, but there is a
certain kind of arrogant aggressive ignorance that I find very distasteful.

In contrast Richard Feynman displayed humble ignorance, he did as much
as anyone to develop Quantum Mechanics but he said I think it's safe to
say that nobody understands Quantum Mechanics, in describing the work
that won him the Nobel Prize he said he found a way to sweep
mathematical difficulties under the rug. He also said I know how hard
it is to really know something; how careful you have to be about
checking the experiments; how easy it is to make mistakes and fool
yourself. I know what it means to know something.

Compression and encryption are deformations.


If you can get the exact same file out after compression or encryption
then obviously nothing has been lost and all deformations or shrinkage
are reversible.

I understand what you mean completely


Apparently not

White noise is called noise for a reason.


And it's called white for a reason, an evil occidental mindset
conspiracy created by round eyed white devils.


 How do you expect mathematics to deal with anything as subjective as
 quality? A novel that's high quality to you may be junk to me.


 I don't expect mathematics to deal with it. I expect a theory of
 everything to deal with it.


And your way of dealing with it is to say it (bits electrons information
logic etc) does not exist. I would never have guessed that coming up
with a theory of everything could be so easy.

 I'm not a big philosophy or religion fan myself but Wittgenstein,
 Heidegger, Sarte, Foucault, Kierkegaard were recent and had some
 impressive things to say.


As I've said before nearly everything they and all other recent
philosophers say can be put into one of four categories:

1) False.
2) True but obvious, a truism disguised in pretentious language.
3) True and deep but discovered first and explained better by a
mathematician or scientist or someone else who didn't write
philosopher in the box labeled occupation on his tax form.
4) So bad it's not even wrong.

 Here's some sample articles on the subject:


I know how to look up things on Google too, and I wonder how many of the
authors of those articles graduated from high school.

 Science begins when you distrust experts. - Richard Feynman. You're
 right, I'll trust Feynman.


If you think Feynman would treat your ideas with anything other than
contempt you're nuts. And you should look at the short one minute video
by Feynman called You don't like it? Go somewhere else!:

http://www.youtube.com/watch?v=iMDTcMD6pOw


 John K Clark

-- 
You received this message because you are subscribed to the Google Groups 
Everything List group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-28 Thread Evgenii Rudnyi

On 28.01.2012 00:24 Craig Weinberg said the following:

On Jan 27, 1:31 pm, John Clarkjohnkcl...@gmail.com  wrote:

On Thu, Jan 26, 2012 at 8:03 PM, Craig
Weinbergwhatsons...@gmail.comwrote:


With the second law of thermodynamics, it seems like heat could
only dissipate by heating something else up.


The second law says that energy will tend to get diluted in space
over time, and heat conducting to other matter is one way for this
to happen but it is not the only way. Photons radiating outward in
all directions from a hot object is another way energy can get
diluted. But among many other things, you don't think photons, or
logic, exist so I doubt this answer will satisfy you.


It would satisfy me if you had some examples, but I don't think
that you know the answer for sure. A vacuum is a good insulator
(like a vacuum thermos) and a perfect vacuum, as far as I have been
able to read online, is a perfect insulator. Electricity and heat
pass from object to object, not from space to space. Please point out
any source you can find to the contrary. What little I find agrees
with vacuums being insulators of heat and electricity.


Craig,

Radiation does happen. If you need more detail, there is a nice free 
book from MIT


A Heat Transfer Textbook,  4th edition
John H. Lienhard IV, Professor, University of Houston
John H. Lienhard V, Professor, Massachusetts Institute of Technology

http://web.mit.edu/lienhard/www/ahtt.html

One disadvantage is that it is thick, but you can go directly to Part IV, 
Thermal Radiation Heat Transfer. Vacuum is a good insulator but thermal 
radiation gets through.


It is pretty important, for example, to include radiation in the case of 
free convection, as it may account for up to 40% of the heat transfer.
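
To make the point concrete, here is a minimal sketch of radiative cooling 
via the Stefan-Boltzmann law (the sphere size, emissivity and temperature 
are illustrative assumptions, not values from the thread):

import math

# Stefan-Boltzmann law: P = eps * sigma * A * T^4 is the power a hot body
# radiates into empty (~0 K) surroundings -- no contacting matter needed.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiated_power(T, area, emissivity=1.0):
    """Power in watts radiated by a surface at temperature T (kelvin)."""
    return emissivity * SIGMA * area * T**4

# Illustrative: a 1 m radius sphere of molten nickel at its melting point.
area = 4 * math.pi * 1.0**2
print(radiated_power(1728.0, area))  # ~6.4e6 W: it cools even in a perfect vacuum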


Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-28 Thread Russell Standish
On Sat, Jan 28, 2012 at 08:58:54AM +0100, Evgenii Rudnyi wrote:
 On 27.01.2012 23:46 Russell Standish said the following:
 For one thing, it indicates that storing just two bits of information
 on these physical substrates is grossly inefficient!
 
 Well, you could contact governments then and try to convince them
 that coins in use are highly inefficient. It would be a good chance
 to have funding.

Chuckle. Maybe we can persuade them to get behind bitcoin :).

 
 By the way, at what temperature will it be possible to save more
 information, at a higher or at a lower one?

What does this mean?

 Brent and John are talking
 about the entropy, and the higher the temperature, the higher the entropy.

True. But information has no such relationship with temperature, other
than that the maximum possible value for information increases with temperature.

Remember the equation S+I = S_max. S_max obviously increases with
temperature. So usually does S, but S can be decreased by organisation
of the matter - by the input of information.
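
A toy illustration of that bookkeeping (a hedged sketch, not anything from 
the thread itself: N two-state sites, with S_max the entropy of the fully 
random state and I the reduction gained by organising some of the sites):

import math

K_B = 1.380649e-23   # Boltzmann constant, J/K

# Toy system: N two-state sites. Fully random: S = S_max = N * k ln 2.
N = 1_000_000
S_max = N * K_B * math.log(2)

pinned = 10_000                         # sites organised to store data
S = (N - pinned) * K_B * math.log(2)    # disorder remaining in the rest
I = S_max - S                           # information, in entropy units (J/K)

print(I / (K_B * math.log(2)))          # ~10000 bits; S + I = S_max holds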

 From an engineering viewpoint it looks a bit strange.

How so?

 
 Evgenii
 
 Cheers
 
 

-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-28 Thread Evgenii Rudnyi

On 28.01.2012 11:20 Russell Standish said the following:

On Sat, Jan 28, 2012 at 08:58:54AM +0100, Evgenii Rudnyi wrote:

On 27.01.2012 23:46 Russell Standish said the following:

For one thing, it indicates that storing just two bits of
information on these physical substrates is grossly inefficient!


Well, you could contact governments then and try to convince them
that coins in use are highly inefficient. It would be a good
chance to have funding.


Chuckle. Maybe we can persuade them to get behind bitcoin :).



By the way, at what temperature will it be possible to save
more information, at a higher or at a lower one?


What does this mean?


Let us take a hard disk. Can I save more information on it at higher or 
lower temperatures?





Brent and John are talking about the entropy, and the higher the
temperature, the higher the entropy.


True. But information has no such relationship with temperature,
other than that the maximum possible value for information increases
with temperature.

Remember the equation S+I = S_max. S_max obviously increases with
temperature. So usually does S, but S can be decreased by
organisation of the matter - by the input of information.


From an engineering viewpoint it looks a bit strange.


 How so?


If engineers took the statement "the maximum possible value for 
information increases with temperature" literally, they would operate 
hard disks at higher temperatures (the higher the better according to 
such a statement). Yet this does not happen. Do you know why?


In general we are surrounded by devices that store information (hard 
discs, memory sticks, DVDs, etc.). The information that these devices can 
store is, I believe, known to an accuracy of one bit. Can you suggest a 
thermodynamic state whose entropy gives us exactly that amount of 
information?


Here would be again a question about temperature. If I operate my memory 
stick in some reasonable range of temperatures, the information it 
contains does not change. Yet, the entropy in my view changes.


So these are my doubts for which I do not see an answer.

Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-28 Thread meekerdb

On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:

On 27.01.2012 23:03 meekerdb said the following:

On 1/27/2012 12:43 PM, Evgenii Rudnyi wrote:

On 27.01.2012 21:22 meekerdb said the following:

On 1/27/2012 11:21 AM, Evgenii Rudnyi wrote:

On 25.01.2012 21:25 meekerdb said the following:

On 1/25/2012 11:47 AM, Evgenii Rudnyi wrote:

...

Let me suggest a very simple case to understand better what
you are saying. Let us consider a string "10" for
simplicity. Let us consider the next cases. I will cite
first the thermodynamic properties of Ag and Al from CODATA
tables (we will need them)

S°(298.15 K), J K-1 mol-1

Ag cr 42.55 ± 0.20   Al cr 28.30 ± 0.10

In J K-1 cm-3 it will be

Ag cr 42.55/107.87*10.49 = 4.14   Al cr 28.30/26.98*2.7 = 2.83

1) An abstract string "10", as the abstract book above.

2) Let us make now an aluminum plate (a page) with "10"
hammered on it (as on a coin) of the total volume 10 cm^3.
The thermodynamic entropy is then 28.3 J/K.

3) Let us make now a silver plate (a page) with "10"
hammered on it (as on a coin) of the total volume 10 cm^3.
The thermodynamic entropy is then 41.4 J/K.

4) We can easily make another aluminum plate (scaling all
dimensions from 2) to the total volume of 100 cm^3. Then
the thermodynamic entropy is 283 J/K.

Now we have four different combinations to represent a
string "10" and the thermodynamic entropy is different. If
we take the statement literally then the information must
be different in all four cases and defined uniquely as the
thermodynamic entropy is already there. Yet in my view this
makes little sense.

Could you please comment on these four cases?
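
As a quick check of the arithmetic in the four cases, here is a minimal 
sketch, assuming only the CODATA entropies, molar masses and densities 
quoted above:

# Volumetric entropy: S_molar [J/(K mol)] / M [g/mol] * rho [g/cm^3]
def entropy_per_cm3(s_molar, molar_mass, density):
    return s_molar / molar_mass * density

s_ag = entropy_per_cm3(42.55, 107.87, 10.49)  # silver:   ~4.14 J/(K cm^3)
s_al = entropy_per_cm3(28.30, 26.98, 2.70)    # aluminum: ~2.83 J/(K cm^3)

print(s_al * 10)   # case 2: ~28.3 J/K for the 10 cm^3 aluminum plate
print(s_ag * 10)   # case 3: ~41.4 J/K for the 10 cm^3 silver plate
print(s_al * 100)  # case 4: ~283 J/K for the 100 cm^3 aluminum plate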


The thermodynamic entropy is a measure of the information
required to locate the possible states of the plates in the
phase space of atomic configurations constituting them. Note
that the thermodynamic entropy you quote is really the
*change* in entropy per degree at the given temperature. It's
a measure of how much more phase space becomes available to
the atomic states when the internal energy is increased. More
available phase space means more uncertainty of the exact
actual state and hence more information entropy. This
information is enormous compared to the "01" stamped on the
plate, the shape of the plate or any other aspects that we
would normally use to convey information. It would only be if
we cooled the plate to near absolute zero and then tried
to encode information in its microscopic vibrational states
that the thermodynamic and the encoded information entropy
would become similar.
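
The "enormous" is easy to put a number on (a hedged sketch, using the 
standard conversion of one bit per k_B ln 2 of entropy):

import math

K_B = 1.380649e-23   # Boltzmann constant, J/K

def entropy_to_bits(S):
    """Express a thermodynamic entropy S (J/K) as Shannon bits."""
    return S / (K_B * math.log(2))

print(entropy_to_bits(28.3))   # ~3e24 bits for the 10 cm^3 aluminum plate,
                               # versus the 2 bits stamped on its face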



I would say that from your answer it follows that engineering
information has nothing to do with the thermodynamic entropy.
Don't you agree?


Obviously not since I wrote above that the thermodynamic entropy
is a measure of how much information it would take to locate the
exact state within the phase space allowed by the thermodynamic
parameters.


Is this what engineers use when they develop communication
devices?





It would certainly be interesting to consider what happens when
we decrease the temperature (in the limit to zero Kelvin).
According to the Third Law the entropy will be zero then. What
do you think, can we save less information on a copper plate at
low temperatures as compared with higher temperatures? Or
more?


Are you being deliberately obtuse? Information encoded in the
shape of the plate is not accounted for in the thermodynamic
tables - they are just based on ideal bulk material (ignoring
boundaries).


I am just trying to understand the meaning of the term "information"
that you use. I would say that there is the thermodynamic entropy
and then the Shannon information entropy. Shannon developed
a theory to help engineers deal with communication (I believe
that you have also made a similar statement recently). Yet, in my view
when we talk about communication devices and mechatronics, the
information that engineers are interested in has nothing to do with
the thermodynamic entropy. Do you agree or disagree with that? If
you disagree, could you please give an example from engineering
where engineers do employ the thermodynamic entropy as the estimate
of information.


I already said I disagreed. You are confusing two different things.
Because structural engineers don't employ the theory of interatomic
forces it doesn't follow that interatomic forces have nothing to do
with structural properties.

Brent


You disagree that engineers do not use thermodynamic entropy



Yes. I disagreed that information has nothing to do with thermodynamic entropy, as you 
wrote above. You keep switching formulations.  You write X and ask if I agree. I 
disagree.  Then you claim I've disagreed with Y. Please pay attention to your own 
writing.  There's a difference between "X is used in place of Y" and "X has nothing 
to do with Y".


but you have not shown yet how information in engineering is related to the 
thermodynamic entropy. From the Millipede example


 http://en.wikipedia.org/wiki/Millipede_memory


Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-28 Thread meekerdb

On 1/27/2012 11:58 PM, Evgenii Rudnyi wrote:

On 27.01.2012 23:46 Russell Standish said the following:

On Fri, Jan 27, 2012 at 08:27:31PM +0100, Evgenii Rudnyi wrote:

On 26.01.2012 12:00 Russell Standish said the following:

If you included these two bits, the thermodynamic entropy is two
bits less, = 4.15 x 10^{-24} J/K less

This is so many orders of magnitude less than the entropy due to
the material, it's probably not worth including, but it is there.
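
As a sanity check on the scale (a sketch using the usual k_B ln 2 per bit 
conversion; the exact prefactor depends on which logarithm base is taken, 
so it differs somewhat from the figure quoted above):

import math

K_B = 1.380649e-23           # Boltzmann constant, J/K

per_bit = K_B * math.log(2)  # ~9.57e-24 J/K per bit (natural-log convention)
print(2 * per_bit)           # ~1.91e-23 J/K for two bits -- some 24 orders of
                             # magnitude below the ~28 J/K of the plate itself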


I do not believe that effects below the experimental noise are
important for empirical science. You probably mean some other
science then; it would be good if you defined what science you mean.

Evgenii


For one thing, it indicates that storing just two bits of information
on these physical substrates is grossly inefficient!


Well, you could contact governments then and try to convince them that coins in use are 
highly inefficient. It would be a good chance to have funding.


By the way, at what temperature will it be possible to save more information, at a 
higher or at a lower one? Brent and John are talking about the entropy, and the higher 
the temperature, the higher the entropy. From an engineering viewpoint it looks a bit strange.


At a higher temperature there are more microstates accessible and hence more uncertainty 
as to which state is actually realized.  But if you're storing information, which you want 
to retrieve, this uncertainty is noise and you have to use larger increments of energy to 
reliably switch states.  So for storage it is more efficient (takes less energy per bit) 
to be colder.
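
A rough way to quantify "larger increments of energy" (a hedged sketch, 
treating a thermal bit-flip as a Boltzmann-factor event, so holding the 
error probability at p requires roughly dE >= k_B T ln(1/p); the target 
error rate is an illustrative assumption):

import math

K_B = 1.380649e-23   # Boltzmann constant, J/K

def switching_energy(T, p_error):
    """Rough energy increment (J) per bit so that thermal noise flips it
    with probability ~p_error, via exp(-dE / (k T)) = p_error."""
    return K_B * T * math.log(1.0 / p_error)

for T in (4.0, 77.0, 300.0):              # liquid He, liquid N2, room temp
    print(T, switching_energy(T, 1e-15))  # colder -> less energy per bit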


Brent



Evgenii


Cheers








Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-28 Thread Russell Standish
On Sat, Jan 28, 2012 at 12:05:57PM +0100, Evgenii Rudnyi wrote:
 
 Let us take a hard disk. Can I save more information on it at higher
 or lower temperatures?

This is, strictly speaking, an ambiguous question. If we take the usual meaning of
"hard disk" as including a particular apparatus (heads, controller
logic, SATA interface and so on) to read and write the data, then
there will be a limited range of temperatures over which that
apparatus will operate. Outside of that range (both higher and lower),
the information storage will fall to zero. That is a purely
engineering question.

On the other hand, if you just gave me the metallic platter from the
hard disk, and did not restrict in any way the technology used to read
and write the data, then in principle, the higher the temperature, the
more information is capable of being encoded on the disk. 

In practice, various phase transitions will make this more difficult
to achieve as temperature is increased. Passing the curie point, for
instance, will mean we can no longer rely on magnetism, although
presumably even below the curie point we can increase the information
storage in some other way (eg moving atoms around by an STM) and
ignoring the ferromagnetic behaviour. By the same token, passing the
freezing and boiling points will make it even harder - but still
doable with sufficiently advanced technology.

 
 From an engineering viewpoint it looks a bit strange.
 
  How so?
 
 
 If engineers took the statement "the maximum possible value
 for information increases with temperature" literally, they would
 operate hard disks at higher temperatures (the higher the better
 according to such a statement). Yet this does not happen. Do you
 know why?
 
 In general we are surrounded by devices that store information (hard
 discs, memory sticks, DVDs, etc.). The information that these devices
 can store is, I believe, known to an accuracy of one bit. 

Because they're engineered that way. It would be rather inconvenient if
one's information storage varied with temperature. 

 Can you
 suggest a thermodynamic state whose entropy gives us exactly that
 amount of information?
 
 Here would be again a question about temperature. If I operate my
 memory stick in some reasonable range of temperatures, the
 information it contains does not change. Yet, the entropy in my view
 changes.

Sure - because they're engineered that way, and they operate a long
way from the theoretical maximum storage capability of that
matter. What's the problem with that?

 
 So these are my doubts for which I do not see an answer.
 
 Evgenii
 

-- 


Prof Russell Standish  Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Professor of Mathematics  hpco...@hpcoders.com.au
University of New South Wales  http://www.hpcoders.com.au





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-28 Thread meekerdb

On 1/28/2012 3:42 PM, Russell Standish wrote:

On Sat, Jan 28, 2012 at 12:05:57PM +0100, Evgenii Rudnyi wrote:

Let us take a hard disk. Can I save more information on it at higher
or lower temperatures?

This is, strictly speaking, an ambiguous question. If we take the usual meaning of
"hard disk" as including a particular apparatus (heads, controller
logic, SATA interface and so on) to read and write the data, then
there will be a limited range of temperatures over which that
apparatus will operate. Outside of that range (both higher and lower),
the information storage will fall to zero. That is a purely
engineering question.

On the other hand, if you just gave me the metallic platter from the
hard disk, and did not restrict in any way the technology used to read
and write the data, then in principle, the higher the temperature, the
more information is capable of being encoded on the disk.


I don't think this is quite right. A higher temperature means that there are more energy 
states available.  But the concept of 'temperature' implies that these are occupied in a 
random way (according to the canonical ensemble). For us to read and write data 
requires that the act of reading or writing a bit moves the distribution of states in 
phase space enough that it is distinguishable from the random fluctuations due to 
temperature.  So if the medium is hotter, you need to use more energy to read and write a 
bit.  This of course runs into the problems you note below.  So in practice it is often 
colder systems that allow us to store more data because then we can use small energy 
differences to encode bits.


Brent



In practice, various phase transitions will make this more difficult
to achieve as temperature is increased. Passing the Curie point, for
instance, will mean we can no longer rely on magnetism, although
presumably even below the Curie point we can increase the information
storage in some other way (e.g. moving atoms around by an STM) and
ignoring the ferromagnetic behaviour. By the same token, passing the
freezing and boiling points will make it even harder - but still
doable with sufficiently advanced technology.


 From an engineering viewpoint it looks a bit strange.
How so?


If engineers took the statement "the maximum possible value
for information increases with temperature" literally, they would
operate hard disks at higher temperatures (the higher the better
according to such a statement). Yet this does not happen. Do you
know why?

In general we are surrounded by devices that store information (hard
discs, memory sticks, DVDs, etc.). The information that these devices
can store is, I believe, known to an accuracy of one bit.

Because they're engineered that way. It would be rather inconvenient if
one's information storage varied with temperature.


Can you
suggest a thermodynamic state whose entropy gives us exactly that
amount of information?

Here would be again a question about temperature. If I operate my
memory stick in some reasonable range of temperatures, the
information it contains does not change. Yet, the entropy in my view
changes.

Sure - because they're engineered that way, and they operate a long
way from the theoretical maximum storage capability of that
matter. What's the problem with that?


So these are my doubts for which I do not see an answer.

Evgenii






Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-28 Thread Craig Weinberg
On Jan 28, 1:48 pm, John Clark johnkcl...@gmail.com wrote:
 On Fri, Jan 27, 2012 at 5:51 PM, Craig Weinberg whatsons...@gmail.com wrote:

  You could much more easily write a probabilistic equation to simulate any
  given volume of water than the same volume of DNA, especially

 The motion of both can be well described by Navier-Stokes equations, which
 describe fluid flow using Newton's laws, and DNA being more viscous than
 water, the resulting equations would be simpler than the ones for water.

I'm not talking about fluid flow, I'm talking about simulating
everything - potential and actual chemical reactions, etc. Water can
be described by multiplying the known interactions of H2O; DNA would
need many more variables.


   when you get into secondary and tertiary structure.

 You've got to play fair, if you talk about micro states for DNA I get to
 talk about micro states for water.

  I had not heard of Shannon information.

 Somehow I'm not surprised, and it's Shannon Information Theory.

No, I've heard of Shannon Information Theory. I didn't realize that it
was such an instrumental special case theory though.


  The key phrase for me here is "the thermodynamic entropy is interpreted as
  being proportional to the amount of further Shannon information needed to
  define the detailed microscopic state of the
  system."

 OK, although I don't see what purpose the word "further" serves in the
 above, and although I know all about Claude Shannon the term "Shannon
 information" is nonstandard. What would "Non-Shannon information" be?

Non-Shannon information would be anything that is not directly
involved in the compression of a digitally sampled description into
another digital description. "Further" means that if you add x calories
of heat, you need x more units of Shannon information to define the
effect of the added heat/motion.


   This confirms what I have been saying and is the opposite of what you
  are saying.

 What on Earth are you talking about?? The more entropy a system has the
 more information needed to describe it.

Yes. It is information that lets you describe patterns more easily.
The more pattern there is, the more you can say 'yes, I get it, add
500 0s and then another 1'. When there is less information, less
pattern, more energy, it takes more information to describe it. There
are no patterns to give your compression a shortcut.


   This means that DNA, having low entropy compared with pure water, has
  high pattern content, high information, and less Shannon information

 I see, it has high information and less information. No I take that back, I
 don't see, although it is consistent with your usual logical standards.

Shannon information is not information in general, it is a specific
kind of information about information which is really inversely
proportional to information in any other sense. Its uninformability
is what it is. Drag. Entropy. Resistance to the process (not
thermodynamic resistance).


  Easier to compress does *not* mean less information

 It means a message has been inflated with useless gas and a compression
 program can remove that gas and recover the small kernel of information
 undamaged.

Hahaha. The useless gas is what separates coherence and sanity from
garbage. It's useless to a computer, sure, but without the gas it's
useless to us. Next time you want to look at a picture, try viewing it
in its compressed form in a hex editor. Get rid of all that useless
gas.

 White noise has no gas in it for a compression program to
 deflate, that's why if you don't know the specific compression program used
 the resulting file (like a zip or gif file) would look like random white
 noise, and yet it's full of useful information if you know how to get it.
 The same thing is true of encrypted files: if the encryption is good then
 the file will look completely random, just white noise, to anyone who does
 not have the secret key.

I understand what you mean completely, and that is indeed how
computers treat data, but it is the opposite of what it means to
inform in general terms. Compression and encryption are deformations.
Decryption is how we get any information out of it. White noise is
called noise for a reason. The opposite of noise is signal. Signals
are signifying and informing, thus information.


  The compressibility of a novel or picture does not relate to the quality
  of information

 How do you expect mathematics to deal with anything as subjective as
 quality? A novel that's high quality to you may be junk to me.

I don't expect mathematics to deal with it. I expect a theory of
everything to deal with it.


  Knowledge and wisdom are already owned by philosophy and religion,

 I've never heard of religion saying anything wise, philosophy does contain
 wisdom but none of it came from professional philosophers, at least not in
 the last 300 years.

I'm not a big philosophy or religion fan myself but Wittgenstein,
Heidegger, Sartre, Foucault, Kierkegaard were recent and had some
impressive things to say.

Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread Craig Weinberg
On Jan 26, 11:11 pm, meekerdb meeke...@verizon.net wrote:
 On 1/26/2012 5:03 PM, Craig Weinberg wrote:

  Ok, so how does it affect the entropy of the structures? The red
  house, the white house, and the mixed house (even if an interesting
  pattern is made in the bricks), all behave in a physically identical
  way, do they not?
  No they don't.  They reflect photons differently, which is why you could 
  use the pattern
  to send a message.
  True, although it's only relevant if you have photons to reflect. If I
  turn out the lights (completely) does that change the entropy of the
  red house? What if I turn the lights back on, has entropy been
  suddenly reduced? Would a brighter light put more information or less
  entropy onto the white house than the red house, ie, does the pattern
  cost something in photons?

 Yes.

That doesn't make sense to me. I think if two houses had two different
patterns with the same numbers of each brick, neither one could
possibly have a different cost in photons than the other. In a house
of four bricks, Red Red White White cannot have a different photon
absorption than Red White White Red.




  I'm just curious, not trying to argue with you about it. On a similar
  note, I was wondering about heat loss in a vacuum today. With the
  second law of thermodynamics, it seems like heat could only dissipate
  by heating something else up. If there was nothing in the universe
  except a blob of molten nickel, would it cool off over time in an
  infinite vacuum? It seems like it wouldn't. It seems like you would
  need some other matter at a different temperature to seek a common
  equilibrium with. Or is the heat just lost over time no matter what?

 The heat would be lost by infrared radiation.

Lost to where? Energy is neither created nor...lost.

Craig




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread John Clark
On Thu, Jan 26, 2012  Craig Weinberg whatsons...@gmail.com wrote:

 If a bucket of water has more of it than DNA, then the word information
 is meaningless.


You would need to send more, far far more, dots and dashes down a wire to
inform an intelligent entity what the position and velocity of every
molecule in a bucket of water is than to inform it exactly what the human
genome is. Now, what word didn't you understand?


  A symphony then would have less information and more entropy than random
 noise.


No, a symphony would have less information but LESS entropy than random
white noise. That's why lossless computer image and sound compression
programs don't work with white noise, there is no redundancy to remove
because white noise has no redundancy.  It would take many more dots and
dashes sent down a wire to describe every pop and click in a piece of white
noise than to describe a symphony of equal length.
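
The asymmetry is easy to demonstrate (a hedged sketch using zlib; the 
repetitive byte stream stands in for a "symphony" with structure, the 
random one for white noise):

import os
import zlib

patterned = bytes(range(256)) * 4096   # 1 MiB of highly redundant data
noise = os.urandom(len(patterned))     # 1 MiB of white noise

print(len(zlib.compress(patterned)))   # a few KB: the redundancy is removed
print(len(zlib.compress(noise)))       # ~1 MiB: no redundancy to remove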

 If the word information is to have any meaning, quantity and
 compressibility of data must be distinguished from quality of its
 interpretation.



If you want to clearly distinguish these things, and I agree that is a very
good idea, then you need separate words for the separate ideas. Quality is
subjective so mathematics can not deal with it, mathematics can work with
quantity however, so if quality comes into play you can not use the word
"information" because mathematics already owns that word; but there are
plenty of other words that you can use, words like "knowledge" or
"wisdom".


 Let's say your definition were true though. What does it have to do with
 information being directly proportionate to entropy?


The larger the entropy something has the more information it has.

 If entropy were equal or proportionate to information, then you are saying
 that the more information something contains, the less it matters.


Whether it matters or not is subjective so you should not use the word
information in the above. A bucket of water contains far more information
than the human genome but the human genome has far more knowledge, at least
I think so, although a bucket of water might disagree with me.

 John K Clark




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread John Clark
On Thu, Jan 26, 2012 at 8:03 PM, Craig Weinberg whatsons...@gmail.com wrote:

 With the second law of thermodynamics, it seems like heat could only
 dissipate by heating something else up.


The second law says that energy will tend to get diluted in space over
time, and heat conducting to other matter is one way for this to happen but
it is not the only way. Photons radiating outward in all directions from a
hot object is another way energy can get diluted. But among many other
things, you don't think photons, or logic, exist so I doubt this answer
will satisfy you.

 John K Clark




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread Evgenii Rudnyi

On 25.01.2012 21:25 meekerdb said the following:

On 1/25/2012 11:47 AM, Evgenii Rudnyi wrote:

...

Let me suggest a very simple case to understand better what you are
saying. Let us consider a string "10" for simplicity. Let us
consider the next cases. I will cite first the thermodynamic
properties of Ag and Al from CODATA tables (we will need them)

S°(298.15 K), J K-1 mol-1

Ag cr 42.55 ± 0.20   Al cr 28.30 ± 0.10

In J K-1 cm-3 it will be

Ag cr 42.55/107.87*10.49 = 4.14   Al cr 28.30/26.98*2.7 = 2.83

1) An abstract string "10", as the abstract book above.

2) Let us make now an aluminum plate (a page) with "10" hammered on
it (as on a coin) of the total volume 10 cm^3. The thermodynamic
entropy is then 28.3 J/K.

3) Let us make now a silver plate (a page) with "10" hammered on it
 (as on a coin) of the total volume 10 cm^3. The thermodynamic
entropy is then 41.4 J/K.

4) We can easily make another aluminum plate (scaling all
dimensions from 2) to the total volume of 100 cm^3. Then the
thermodynamic entropy is 283 J/K.

Now we have four different combinations to represent a string "10"
and the thermodynamic entropy is different. If we take the
statement literally then the information must be different in all
four cases and defined uniquely as the thermodynamic entropy is
already there. Yet in my view this makes little sense.

Could you please comment on these four cases?


The thermodynamic entropy is a measure of the information required to
 locate the possible states of the plates in the phase space of
atomic configurations constituting them. Note that the thermodynamic
entropy you quote is really the *change* in entropy per degree at the
given temperature. It's a measure of how much more phase space
becomes available to the atomic states when the internal energy is
increased. More available phase space means more uncertainty of the
exact actual state and hence more information entropy. This
information is enormous compared to the "01" stamped on the plate,
the shape of the plate or any other aspects that we would normally
use to convey information. It would only be if we cooled the
plate to near absolute zero and then tried to encode information in
its microscopic vibrational states that the thermodynamic and the
encoded information entropy would become similar.



I would say that from your answer it follows that engineering 
information has nothing to do with the thermodynamic entropy. Don't you 
agree?


It would certainly be interesting to consider what happens when we decrease 
the temperature (in the limit to zero Kelvin). According to the Third 
Law the entropy will be zero then. What do you think, can we save less 
information on a copper plate at low temperatures as compared with 
higher temperatures? Or more?


Evgenii




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread meekerdb

On 1/27/2012 3:56 AM, Craig Weinberg wrote:

On Jan 26, 11:11 pm, meekerdb meeke...@verizon.net wrote:

On 1/26/2012 5:03 PM, Craig Weinberg wrote:


Ok, so how does it affect the entropy of the structures? The red
house, the white house, and the mixed house (even if an interesting
pattern is made in the bricks), all behave in a physically identical
way, do they not?

No they don't.  They reflect photons differently, which is why you could use 
the pattern
to send a message.

True, although it's only relevant if you have photons to reflect. If I
turn out the lights (completely) does that change the entropy of the
red house? What if I turn the lights back on, has entropy been
suddenly reduced? Would a brighter light put more information or less
entropy onto the white house than the red house, ie, does the pattern
cost something in photons?

Yes.

That doesn't make sense to me. I think if two houses had two different
patterns with the same numbers of each brick, neither one could
possibly have a different cost in photons than the other. In a house
of four bricks, Red Red White White cannot have a different photon
absorption than Red White White Red.





I'm just curious, not trying to argue with you about it. On a similar
note, I was wondering about heat loss in a vacuum today. With the
second law of thermodynamics, it seems like heat could only dissipate
by heating something else up. If there was nothing in the universe
except a blob of molten nickel, would it cool off over time in an
infinite vacuum? It seems like it wouldn't. It seems like you would
need some other matter at a different temperature to seek a common
equilibrium with. Or is the heat just lost over time no matter what?

The heat would be lost by infrared radiation.

Lost to where? Energy is neither created nor...lost.


The reason I seldom respond to your posts is that you seem unwilling to put any effort 
into understanding what is written to you.


Lost to the photons.

Brent




Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread Evgenii Rudnyi

On 26.01.2012 12:00 Russell Standish said the following:

On Wed, Jan 25, 2012 at 08:47:03PM +0100, Evgenii Rudnyi wrote:


Let me suggest a very simple case to understand better what you
are saying. Let us consider a string "10" for simplicity. Let us
consider the next cases. I will cite first the thermodynamic
properties of Ag and Al from CODATA tables (we will need them)

S°(298.15 K), J K-1 mol-1

Ag  cr  42.55 ± 0.20   Al  cr  28.30 ± 0.10

In J K-1 cm-3 it will be

Ag  cr  42.55/107.87*10.49 = 4.14   Al  cr  28.30/26.98*2.7 = 2.83

1) An abstract string "10", as the abstract book above.

2) Let us make now an aluminum plate (a page) with "10" hammered
on it (as on a coin) of the total volume 10 cm^3. The
thermodynamic entropy is then 28.3 J/K.

3) Let us make now a silver plate (a page) with "10" hammered on
it (as on a coin) of the total volume 10 cm^3. The thermodynamic
entropy is then 41.4 J/K.

4) We can easily make another aluminum plate (scaling all
dimensions from 2) to the total volume of 100 cm^3. Then the
thermodynamic entropy is 283 J/K.

Now we have four different combinations to represent a string "10"
and the thermodynamic entropy is different. If we take the
statement literally then the information must be different in all
four cases and defined uniquely as the thermodynamic entropy is
already there. Yet in my view this makes little sense.

Could you please comment on these four cases?



Brent commented quite aptly on these cases in another post. The fact
that you calculate the thermodynamic entropy the way you do implies
you are disregarding the information contained in the symbols
embossed on the coin.


Well, I do disregard the surface effects. However, the statement was 
that the informational entropy is the same as thermodynamic entropy, so 
we must consider the total entropy.



If you included these two bits, the thermodynamic entropy is two
bits less, = 4.15 x 10^{-24} J/K less

This is so many orders of magnitude less than the entropy due to the
material, its probably not worth including, but it is there.


I do not believe that effects below the experimental noise are important 
for empirical science. You probably mean some other science then; it would 
be good if you defined what science you mean.


Evgenii





Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread Evgenii Rudnyi

On 26.01.2012 19:01 John Clark said the following:

On Thu, Jan 19, 2012 at 5:28 PM, Craig
Weinberg whatsons...@gmail.com wrote:


...


If I have red legos and white legos, and I build two opposite
monochrome houses and one of mixed blocks, how in the world does that
affect the entropy of the plastic bricks in any way?



It does not affect the entropy of the plastic bricks but it does
change the entropy of the structures built with those plastic bricks.


This change in the entropy is below the experimental noise. Just estimate 
what difference it makes and in which digit of the total entropy the 
difference appears. Hence the talk about the thermodynamic entropy as 
the information source in this case is just meaningless, as you cannot 
experimentally measure what you are talking about.


Evgenii


For a single part in isolation entropy is not defined; a single water
molecule has no entropy, but a trillion trillion of them in a drop of
water does.

John K Clark






Re: Information: a basic physical quantity or rather emergence/supervenience phenomenon

2012-01-27 Thread Evgenii Rudnyi

On 27.01.2012 05:11 meekerdb said the following:

On 1/26/2012 5:03 PM, Craig Weinberg wrote:


...



I'm just curious, not trying to argue with you about it. On a
similar note, I was wondering about heat loss in a vacuum today.
With the second law of thermodynamics, it seems like heat could
only dissipate by heating something else up. If there was nothing
in the universe except a blob of molten nickel, would it cool off
over time in an infinite vacuum? It seems like it wouldn't. It
seems like you would need some other matter at a different
temperature to seek a common equilibrium with. Or is the heat just
lost over time no matter what?


The heat would be lost by infrared radiation.



Brent,

if we consider a heated block in an infinite universe, does its 
temperature then go to zero Kelvin?


Evgenii



