Re: [Fis] Answer to the comments made by Joseph

2015-07-27 Thread John Collier
Folks,

Doing dimensional analysis, entropy is heat difference divided by temperature. 
Heat is energy, and temperature is energy per degree of freedom. Dividing, we 
get units of inverse degrees of freedom. I submit that information has the same 
fundamental measure (this is a consequence of Scott Muller’s asymmetry 
principle of information). So fundamentally we are talking about the same basic 
thing with information and entropy.

I agree, though, that it is viewed from different perspectives, and that these 
have differing conventions for measurement.

I agree with Loet’s other points.

John

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Loet Leydesdorff
Sent: July 26, 2015 8:50 AM
To: 'Joseph Brenner'; 'Fernando Flores'; fis@listas.unizar.es
Subject: Re: [Fis] Answer to the comments made by Joseph

Dear Joe,


a) information is more than order; there is information in absence 
(Deacon), in disorder, in incoherence as well as coherence;

The absent options provide the redundancy; that is, the complement of the 
information relative to the maximal information [H(max)].

See also my recent communication (in Vienna) or at 
http://arxiv.org/abs/1507.05251
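This complement can be made concrete; a minimal sketch in Python (the four-state distribution is invented for illustration):

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Four possible states, of which only two are actually realized.
p = [0.5, 0.5, 0.0, 0.0]

h = shannon_entropy(p)       # information actually present: 1 bit
h_max = math.log2(len(p))    # maximal information H(max): 2 bits
redundancy = h_max - h       # the absent options provide 1 bit of redundancy

print(h, h_max, redundancy)  # 1.0 2.0 1.0
```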


b) information is not the same as matter-energy, but it is inseparable from 
it and reflects its dualistic properties;

Information is dimensionless. It is coupled to the physics of matter-energy 
because S = k(B) * H.
k(B) provides the dimensionality (Joule/Kelvin) and thus the physics. In other 
domains of application (e.g., economics), this coupling [via k(B)] is not 
meaningful.
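The coupling can be sketched numerically; here H is computed in nats so that S = k(B)·H holds directly (the two-state distribution is invented for illustration):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact since the 2019 SI redefinition)

def entropy_nats(p):
    """Shannon entropy in nats (natural logarithm) -- dimensionless."""
    return -sum(x * math.log(x) for x in p if x > 0)

# Dimensionless information of a fair two-state system:
h = entropy_nats([0.5, 0.5])  # ln 2, about 0.693 nats

# The same quantity acquires physical dimension (J/K) only via k(B):
s = K_B * h
print(s)  # about 9.57e-24 J/K
```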


c) information is both energy and a carrier of meaning, which is not, in my 
humble opinion, a hard physicalist approach;



Meaning provides more options to the information and thus increases the 
redundancy. In the case of reflexivity and further codification of meanings, 
the generation of redundancy can auto-catalytically be reinforced (Ulanowicz).



Best,

Loet


d) it remains to be shown that digitalism or computationalism is or could be 
the natural language for the description of the non-digital world, that is, of 
the complexity of the world that is of interest. Rafael Capurro has talked 
about the 'digital casting' of the world that we (or most of us) use in our 
daily lives, but this philosophical concept, with which I agree, is not a 
scientific description of the physics of informational processes as such. The 
best synthesis here of which I am aware is the Informational-Computationalism 
of Gordana Dodig-Crnkovic and even that is a framework, not an ontology.
e) it is possible to use probabilities to describe the evolution of real 
processes, as well as a mathematical language for describing acts;
f) your presentation of a parameter designated as 'freedom' is indeed original, 
but it is a classificatory system, based on bits. It will miss the 
non-algorithmic aspects of values. I am suspicious of things that have infinite 
levels and represent 'pure' anything;
g) I do not feel you have added value to human acts by designating them as 
∞-free. This may not be intended as doctrine, but it looks like it.
h) your conclusions about informational value are correct from what I will call 
a hard neo-capitalist ;-) standpoint, but I am sure you agree there are other 
ones.

In trying to learn through association with this FIS group, I have come to 
believe that Informational Science is unique in that it can capture some of the 
complexity of nature, culture and society. It is not a 'hard simplification' as 
you suggest some sciences are.  The concept of (its) foundations is very broad, 
and it can and should include careful binary analyses such as the one you have 
made. However, I am pleading for a more directed positioning of your approach 
with respect to others. Is this an acceptable basis for you for continuing the 
debate?

Thank you again,

Joseph
- Original Message -
From: Fernando Flores <fernando.flo...@kultur.lu.se>
To: fis@listas.unizar.es
Sent: Thursday, July 23, 2015 3:58 PM
Subject: [Fis] Answer to the comments made by Joseph

Hello everybody:


I will answer the comments made by Joseph, and Luis will answer the 
comments made by Moisés.

Dear Joseph:

Thank you for your comments. We are not sure about the usefulness of 
identifying “information” (order) with “matter”. In this sense we are very 
careful to avoid any hard physicalist approach, and we believe with Norbert 
Wiener:
The mechanical brain does not secrete thought “as the liver does bile”, as the 
earlier materialists claimed, nor does it put it out in the form of energy, as 
the muscle puts out its activity. Information is information, not matter nor 
energy. No materialism which does not admit this can survive at the present 
day.
An informational description of the world must stand as a new branch of science 
in which “digitalism” will be the natural language. Of course, as any other 
science, it is a simplification of the complexity of nature/society/culture. I 
believe we have shown that we are very conscious of the risks of a hard 
simplification, and that is why we introduced the idea of freedom in a chain of 
acts and use probability as mathematical language. We considered the vital acts 
as ∞-free.

Re: [Fis] Answer to the comments made by Joseph

2015-07-27 Thread Loet Leydesdorff
Dear John and colleagues,

 

So fundamentally we are talking about the same basic thing with information
and entropy.

 

The problem, fundamentally, is that the two are the same except for a constant.
Most authors attribute the dimensionality to this constant (kB). 

 

From the perspective of probability calculus, they are the same. 

 

Best,

Loet

 

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Answer to the comments made by Joseph

2015-07-27 Thread John Collier
Loet,

I think that is consistent with what I said: different ways of measuring, and 
different perspectives. I prefer to see the unity that comes out of the 
dimensional-analysis approach, but I was always taught that if you wanted to 
really understand something, you should absorb that first. My background is in 
applied physics: research, but on applied issues in business and government. 
The advantage is that you see through to the basic physical values (or 
parameters in general), and then you can apply them to the results of 
measurements. That always worked for me. One tricky problem I solved required a 
model for how the values I was getting were possible. It turned out that not 
enough dimensions were being taken into account in the textbook solutions, so 
relevant information was being ignored. It might seem that dimensionality is 
given for physics, but not when you use generalized coordinate systems. The 
Boltzmann equation doesn't hold very well in cases like that: Boltzmann 
explicitly assumes a 6N-dimensional system in his derivations, which is not 
always true.

I will shut up now. These are the first posts I have had in weeks.

John

From: l...@leydesdorff.net [mailto:leydesdo...@gmail.com] On Behalf Of Loet 
Leydesdorff
Sent: July 27, 2015 7:10 PM
To: John Collier; 'Joseph Brenner'; 'Fernando Flores'; fis@listas.unizar.es
Subject: RE: [Fis] Answer to the comments made by Joseph

Dear John and colleagues,

So fundamentally we are talking about the same basic thing with information and 
entropy.

The problem, fundamentally, is that the two are the same except for a constant. 
Most authors attribute the dimensionality to this constant (kB).

From the perspective of probability calculus, they are the same.

Best,
Loet



Re: [Fis] Answer to the comments made by Joseph

2015-07-27 Thread Robert E. Ulanowicz
Folks,

I know there is a long legacy of equating information with entropy, and
dimensionally, they are the same. Qualitatively, however, they are
antithetical. From the point of view of statistical mechanics, information
is a *decrease* in entropy, i.e., they are negatives of each other.

This all devolves back upon the fact that *both* entropy and
information require a reference state (cf. the third law of thermodynamics).
Once a reference distribution has been identified, one can then quantify
both entropy and information. It actually turns out that against any
reference state, entropy can be parsed into two components, mutual
information and conditional (or residual) entropy. Change the reference
state and the decomposition changes.
http://people.clas.ufl.edu/ulan/files/FISPAP.pdf (See also Chapter 5 in
http://people.clas.ufl.edu/ulan/publications/ecosystems/gand/.)
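A numerical sketch of that decomposition, with an invented joint distribution over two binary variables (entropies in bits): H(X) parses into the mutual information I(X;Y) plus the conditional entropy H(X|Y).

```python
import math

def h(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Invented joint distribution p(x, y); x and y are correlated.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal (reference) distributions:
px = [sum(v for (x, _), v in joint.items() if x == i) for i in (0, 1)]
py = [sum(v for (_, y), v in joint.items() if y == j) for j in (0, 1)]

# Mutual information, computed directly from its definition:
mi = sum(v * math.log2(v / (px[x] * py[y]))
         for (x, y), v in joint.items() if v > 0)

# Conditional (residual) entropy H(X|Y) = H(X,Y) - H(Y):
h_cond = h(joint.values()) - h(py)

# The parsing described above: H(X) = I(X;Y) + H(X|Y)
assert abs(h(px) - (mi + h_cond)) < 1e-9
print(mi, h_cond)
```

Changing the reference (marginal) distributions changes how the total splits between the two components, as the papers linked above discuss.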

Cheers to all,
Bob

 Folks,

 Doing dimensional analysis entropy is heat difference divided by
 temperature. Heat is energy, and temperature is energy per degree of
 freedom. Dividing, we get units of inverse degrees of freedom. I submit
 that information has the same fundamental measure (this is a consequence
 of Scott Muller’s asymmetry principle of information). So fundamentally we
 are talking about the same basic thing with information and entropy.

 I agree, though, that it is viewed from different perspectives and they
 have differing conventions for measurement.

 I agree with Loet’s other points.

 John




Re: [Fis] Answer to the comments made by Joseph

2015-07-25 Thread Joseph Brenner
Dear Fernando,

This is becoming very interesting. I understand your critique, but I do not 
believe it applies exactly to what I am trying to say. I start from the position 
that the apodictic statement by Wiener is not, or in any case is no longer, 
valid. In my view, the following should be taken into account:
a) information is more than order; there is information in absence (Deacon), in 
disorder, in incoherence as well as coherence;
b) information is not the same as matter-energy, but it is inseparable from it 
and reflects its dualistic properties; 
c) information is both energy and a carrier of meaning, which is not, in my 
humble opinion, a hard physicalist approach; 
d) it remains to be shown that digitalism or computationalism is or could be 
the natural language for the description of the non-digital world, that is, of 
the complexity of the world that is of interest. Rafael Capurro has talked 
about the 'digital casting' of the world that we (or most of us) use in our 
daily lives, but this philosophical concept, with which I agree, is not a 
scientific description of the physics of informational processes as such. The 
best synthesis here of which I am aware is the Informational-Computationalism 
of Gordana Dodig-Crnkovic and even that is a framework, not an ontology.
e) it is possible to use probabilities to describe the evolution of real 
processes, as well as a mathematical language for describing acts;
f) your presentation of a parameter designated as 'freedom' is indeed original, 
but it is a classificatory system, based on bits. It will miss the 
non-algorithmic aspects of values. I am suspicious of things that have infinite 
levels and represent 'pure' anything; 
g) I do not feel you have added value to human acts by designating them as 
∞-free. This may not be intended as doctrine, but it looks like it.
h) your conclusions about informational value are correct from what I will call 
a hard neo-capitalist ;-) standpoint, but I am sure you agree there are other 
ones.

In trying to learn through association with this FIS group, I have come to 
believe that Informational Science is unique in that it can capture some of the 
complexity of nature, culture and society. It is not a 'hard simplification' as 
you suggest some sciences are.  The concept of (its) foundations is very broad, 
and it can and should include careful binary analyses such as the one you have 
made. However, I am pleading for a more directed positioning of your approach 
with respect to others. Is this an acceptable basis for you for continuing the 
debate?

Thank you again,

Joseph
  - Original Message - 
  From: Fernando Flores 
  To: fis@listas.unizar.es 
  Sent: Thursday, July 23, 2015 3:58 PM
  Subject: [Fis] Answer to the comments made by Joseph


  Hello everybody:

   

I will answer the comments made by Joseph, and Luis will answer the 
comments made by Moisés. 

  Dear Joseph:

   

  Thank you for your comments. We are not sure about the usefulness of 
identifying “information” (order) with “matter”. In this sense we are very 
careful to avoid any hard physicalist approach, and we believe with Norbert 
Wiener: 

  The mechanical brain does not secrete thought “as the liver does bile”, as 
the earlier materialists claimed, nor does it put it out in the form of energy, 
as the muscle puts out its activity. Information is information, not matter nor 
energy. No materialism which does not admit this can survive at the present 
day.

  An informational description of the world must stand as a new branch of 
science in which “digitalism” will be the natural language. Of course, as any 
other science, it is a simplification of the complexity of 
nature/society/culture. I believe we have shown that we are very conscious 
of the risks of a hard simplification, and that is why we introduced the idea 
of freedom in a chain of acts and use probability as mathematical language. We 
considered the vital acts as ∞-free.

   

   

   

  Fernando Flores PhD

  Associate Professor

  History of Ideas and Sciences

  Lund University

   


