Re: [Fis] It From Bit video

2015-05-27 Thread Srinandan Dasmahapatra
Dear John,

That makes it clearer, thanks.  

The notion of symmetry lies at the basis of the definition of probabilities:
exchangeability (de Finetti) operationalises symmetry for variables that lie
in the orbit of some group action, but whose transformed values have no
observable consequences such as changes of energy. Any state defined by the
values taken by variables, such that it deviates from the equiprobable
distribution required for exchangeability, necessarily scores differently on
any measure of distance between distributions. In that sense, one can
correlate information with lack of symmetry, when the symmetric state is
taken as the reference.
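A minimal numerical sketch of that last point (a toy illustration only, not
part of de Finetti's formalism): take the uniform distribution as the
symmetric reference; the Kullback-Leibler divergence from it equals the
maximum entropy minus the state's entropy, so any departure from
exchangeability registers as positive information.

import numpy as np

def kl_from_uniform(p):
    """Kullback-Leibler divergence D(p || u) from the uniform reference u.

    With a uniform reference over n outcomes this equals log2(n) - H(p):
    the information carried by the departure from the exchangeable state.
    """
    p = np.asarray(p, dtype=float)
    p = p / p.sum()                       # normalise
    n = len(p)
    nz = p > 0                            # 0 * log 0 is taken as 0
    return float(np.sum(p[nz] * np.log2(p[nz] * n)))

print(kl_from_uniform([0.25, 0.25, 0.25, 0.25]))  # symmetric reference: 0.0
print(kl_from_uniform([0.7, 0.1, 0.1, 0.1]))      # broken symmetry: ~0.64 bits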

However, there is more to symmetry than merely providing a reference state. The
way the world is described by physics is via symmetry, indeed via local
symmetry. The freedom to apply group transformations to variables locally must
be coupled with compensatory transformations elsewhere. This is how
interactions are generated, and why we have light and the other bosonic force
mediators. Further, what is facilitated by appeals to symmetry as a primitive
principle is not only ideas that rely on invariance, where the observables
under scrutiny are unaffected by the symmetry transformations, but also
covariance, where observables are transformed in a particular manner that
respects their algebraic/geometric status.
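To make the "compensatory transformation" concrete, the textbook abelian case
(standard electrodynamics, nothing specific to this thread): demanding
invariance under a local phase change of a matter field forces the
introduction of a gauge field that shifts so as to cancel the unwanted
derivative term,

\[
\psi(x) \to e^{i\alpha(x)}\,\psi(x), \qquad
A_\mu(x) \to A_\mu(x) - \frac{1}{e}\,\partial_\mu\alpha(x), \qquad
D_\mu\psi = \bigl(\partial_\mu + i e A_\mu\bigr)\psi,
\]

so that \(D_\mu\psi \to e^{i\alpha(x)}\,D_\mu\psi\) and any Lagrangian built
from \(D_\mu\psi\) stays invariant; the compensating field \(A_\mu\) is the
photon, one of the bosonic force mediators just mentioned.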


Cheers,
Srinandan


 On 26 May 2015, at 22:19, John Collier colli...@ukzn.ac.za wrote:
 
 Dear Srinandan,
  
 The relation of geometry to information theory (and also of particle theory in 
 the Standard Model) is by way of group theory. Groups describe symmetries, 
 which are reversible. What is left over are the asymmetries, which are the 
 differences that can be identified as information. This is worked out in some 
 detail by my former student, Scott Muller, in Asymmetry: The Foundation of 
 Information. Springer: Berlin. 2007. Seth Lloyd relates the information 
 concept to quantum mechanics via group theory and other means in his 
 Programming the Universe: A Quantum Computer Scientist Takes on the Cosmos. 
 More direct connections can be made via the entropy concept where the 
 information is the difference between the entropy of a system and its entropy 
 with all internal constraints relaxed, but it comes to the same thing in the 
 end. There are several convergent ways to relate information to form, then, 
 in contemporary physics. But basically it is in the asymmetries.
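A small numerical sketch of the entropy-difference reading (a toy example, not
taken from Muller's or Lloyd's books): treat a correlation between two parts
of a system as an internal constraint; relaxing it means replacing the joint
distribution by the product of its marginals, and the entropy gained is
exactly the information the constraint carried.

import numpy as np

def entropy(p):
    """Shannon entropy in bits of a (possibly multi-dimensional) distribution."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Two binary variables with an internal constraint: they tend to agree.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

# Relax the constraint: keep the marginals, drop the correlation.
relaxed = np.outer(joint.sum(axis=1), joint.sum(axis=0))

info = entropy(relaxed) - entropy(joint)   # entropy gained on relaxation
print(round(info, 3))                      # ~0.278 bits held in the constraint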
  
 As far as the relation between the asymmetries and symmetries goes, I think 
 this is still a bit open, since the symmetries represent the laws. Some 
 physicists like Paul Davies talk as if the symmetries add nothing once you 
 have all the asymmetries, so the laws are a result of information as well. I 
 don’t see through this adequately myself as yet, though.
  
 John
  
   
 From: Srinandan Dasmahapatra [mailto:s...@ecs.soton.ac.uk] 
 Sent: May 26, 2015 10:20 PM
 To: u...@umces.edu; John Collier
 Cc: fis
 Subject: Re: [Fis] It From Bit video
  
 Re: boundary conditions, etc.
  
 I struggle to understand many/most of the posts on this list, and the 
 references to boundary conditions, geometry and information leave me quite 
 befuddled as well. Is it being claimed that geometry is the same as information? 
 That the requirement of predictions makes the focus on physical laws 
 irrelevant unless the boundary conditions are specified? Or even that the 
 continuum is at odds with the speed of light, given that classical 
 electromagnetism is a well-defined continuum field theory? As for galactic 
 distances, the only scientific basis upon which we conceive of the large 
 scale structure of the universe is via the field equations of gravity, which 
 bring with them a coherent package of causal thinking. I did understand 
 the bit on Noether, as energy conservation is indeed a consequence of time 
 translation invariance, but that comes embedded in a continuum description, 
 typically.
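For the record, the one-line version of the Noether point (standard classical
mechanics): along solutions of the equations of motion,

\[
\frac{dH}{dt} \;=\; \frac{d}{dt}\Bigl(\sum_i p_i\,\dot q_i - L\Bigr) \;=\; -\,\frac{\partial L}{\partial t},
\]

so if the Lagrangian has no explicit time dependence (time-translation
invariance), the energy H is conserved; and, as noted, the derivation
presupposes a continuous time parameter.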
  
 In biological systems, energy input makes the picture specific to the system 
 one cordons off for study, and often it is hard to adequately describe 
 phenomena by scalar potentials alone due to the currents in the system. And 
 Noether cannot deliver reversibility. 
  
 The message of Sean Carroll in the YouTube video, that an equivalent 
 redescription of physics (or biology) in terms of information is not enough, 
 strikes me as sane.  
  
 Cheers, 
 Srinandan
 
 
  Original message 
 From: Robert E. Ulanowicz 
 Date:26/05/2015 16:16 (GMT+00:00) 
 To: John Collier 
 Cc: fis 
 Subject: Re: [Fis] It From Bit video 
 
 I would like to strongly reinforce John's comments about boundary
 conditions. We tend to obsess over the laws and ignore the boundary
 statements. (Sort of a shell game, IMHO.) If boundary conditions cannot be
 stated in closed form, the physical 

Re: [Fis] It From Bit video

2015-05-27 Thread Francesco Rizzo
Dear John and dear colleagues,
In 1975 Stephen Hawking held that black holes swallowed everything that came
near them, within a region called the event horizon. It became evident early
on that this property leads to a paradox: if black holes swallow everything,
then they should also swallow and destroy information, losing every trace of
what they ingest. According to quantum mechanics, however, the information
contained in matter cannot be lost entirely. About thirty years later Hawking
stated that he had been wrong about black holes. Revising his theory, he
maintains that black holes do not merely lose mass through a radiation of
energy, but evaporate, or release, information. They con-tain information
about the matter of which they are made that makes it possible to pre-dict
their future. In this way black holes do not evaporate or radiate an
invisible or enigmatic energy devoid of information, as if they were elusive
and indecipherable cosmic entities, and they do not escape (my) super-law of
the creative (even if sometimes astonishing) combination of energy and
in-formation. Black holes can therefore be regarded as special black boxes,
or as magical, productive and prospective processes of trans-in-formation
(whose inputs and outputs are matter, energy and information).
This means that, as an economist, I have:
- formulated a law that holds for astronomy and for the whole of physics as well;
- anticipated by about twenty years what Hawking discovered in 1998
(Gravitational entropy) and in 2005 (Information loss in black holes,
Physical Review D 72).
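As a quantitative point of reference for the "Gravitational entropy" and
"Information loss in black holes" papers cited above (standard textbook
results, not part of the argument here): the Bekenstein-Hawking entropy of a
horizon of area A and the Hawking temperature of a black hole of mass M are

\[
S_{BH} = \frac{k_B\, c^3 A}{4 G \hbar}, \qquad
T_H = \frac{\hbar\, c^3}{8 \pi G\, k_B M},
\]

so the entropy grows with the horizon area, while the temperature, and with
it the evaporation, increases as the mass shrinks.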
Inside black holes, then, there would be a lower entropy (or a higher
neg-entropy) with respect to the higher entropy (or lower neg-entropy)
OUTSIDE. The formation of greater entropy OUTSIDE (corresponding to less
information) would necessarily have to be balanced by greater information
INSIDE (corresponding to lower entropy). On the basis of this reasoning, or
balancing, consistent with the logic of the New Economics, black holes should
produce and emit net information, just like any productive process. This
OUTSIDE-INSIDE asymmetry makes a difference which is precisely the
information. I have devoted more than a few essays to the creative capacity
of asymmetry in any process of scientific advance (see above all Incontro
d'amore tra il cuore della fede e l'intelligenza della scienza, Aracne, Rome,
2014).
What I have described schematically and synthetically, for which I apologize,
de-monstrates the admirable and marvellous harmony that governs the world.
Thank you.
Francesco Rizzo.



Re: [Fis] It From Bit video

2015-05-27 Thread John Collier
That is most interesting, Francesco. It agrees with my understanding, but
there are people reluctant to call it 'information'. I don't know what else
to call it.
Cheers,
John


Re: [Fis] It From Bit video. Collier and Muller

2015-05-27 Thread Joseph Brenner
Dear Srinandan, Dear John and All,

At the Vienna Information Summit, I will present a paper in Gyuri Darvas's
Symmetry Section entitled Symmetry and Information; Brothers in Arms. I
wished by this title to convey the idea that symmetry and information somehow
emerged together from a prior state of some kind. I do not state explicitly
that asymmetry IS information, and I was not aware of John's work on
symmetry, even if I had seen a reference to it earlier. But then, it is not
possible to become aware of John's work 'all at once'; it requires several
iterations. I have purchased Muller's book to get myself to the next stage of
knowledge here.

The point and possible value of the Logic in Reality approach, what it brings
to the table, can, I believe, still be seen in some of the implications of
John's note: some people talk only of the laws/symmetries, others only of the
asymmetries. Darvas clearly shows that one cannot be considered without the
other, and LIR states that it is logical, hence scientific, that both the
energetic, partly symmetrical substrate of information and its ontological
and epistemological properties influence one another (interact). Laws are
both information and the final cause of the regularities in the information,
and Logic in Reality addresses and tries, with difficulty, to express in what
way words like 'both' and 'at the same time' capture how reality 'really'
evolves.

I would be glad to forward a copy of my extended abstract for Vienna to anyone 
who is interested.

Thank you and best wishes,

Joseph