Re: [Fis] About FIS 2005

2013-04-16 Thread Loet Leydesdorff
Dear colleagues, 

It seems to me that a difference that makes a difference (or a distinction)
generates another option in the system of reference and thus adds to the
redundancy rather than to the Shannon-type information. 

The information is not in the DNA strings, but in the distribution of the
bases in the DNA strings. 
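A minimal sketch (my own illustration, not part of the original message) of this point: the Shannon measure depends only on the frequency distribution of the bases A, C, G, T, not on the particular string in which they occur.

```python
from collections import Counter
from math import log2

def base_entropy(dna):
    """Bits per base from the distribution of A, C, G, T in a string."""
    counts = Counter(dna)
    n = len(dna)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Two different strings with the same base distribution carry the same
# Shannon measure: the information is in the distribution, not the string.
print(base_entropy("ACGTACGT"))  # 2.0 bits (uniform over four bases)
print(base_entropy("AACCGGTT"))  # 2.0 bits as well
print(base_entropy("AAAAAAAT"))  # skewed distribution, fewer bits
```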

The confusion arises because "informing us" implicitly introduces us as a
system of reference. However, we provide meaning to the information and
thus generate redundancies (other, and possibly new, options). The channels
are then changed, but not the information. The information is contained in a
series of differences or, in other words, in a probability distribution. 

 

If one considers a difference which makes a difference directly as
information instead of as a redundancy, one can no longer measure in terms
of bits of information, and one thus loses the operationalization and the
possibility of measurement that information theory provides. In other words,
information theory then becomes only philosophy. 
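The measurement point can be made concrete. In a minimal sketch (my own illustration, assuming the common definition of redundancy as the gap between maximum and realized entropy), adding another option raises the maximum entropy log2(n), and hence the redundancy, even when the realized Shannon information is unchanged:

```python
from math import log2

def shannon_entropy(probs):
    """Shannon information in bits: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probs if p > 0)

def redundancy(probs, n_options):
    """Redundancy as the gap between maximum and realized entropy."""
    return log2(n_options) - shannon_entropy(probs)

# Two equiprobable options: one bit of information, no redundancy.
p = [0.5, 0.5]
print(shannon_entropy(p))  # 1.0
print(redundancy(p, 2))    # 0.0

# A "difference that makes a difference" adds a third option that is
# not (yet) realized: the measured entropy is unchanged, but the
# maximum rises to log2(3), so the redundancy grows.
print(redundancy([0.5, 0.5, 0.0], 3))  # log2(3) - 1 ≈ 0.585
```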

 

Best,

Loet

 

  _  

Loet Leydesdorff 

Professor, University of Amsterdam
Amsterdam School of Communications Research (ASCoR)

Kloveniersburgwal 48, 1012 CX Amsterdam
l...@leydesdorff.net; http://www.leydesdorff.net/ 
Honorary Professor, SPRU, University of Sussex (http://www.sussex.ac.uk/spru/);
Visiting Professor, ISTIC, Beijing (http://www.istic.ac.cn/Eng/brief_en.html);
http://scholar.google.com/citations?user=ych9gNYJ&hl=en 



 

From: fis-boun...@listas.unizar.es [mailto:fis-boun...@listas.unizar.es] On
Behalf Of John Collier
Sent: Monday, April 15, 2013 4:37 PM
To: Bob Logan; y...@pku.edu.cn
Cc: fis
Subject: Re: [Fis] About FIS 2005

 

Bob, Xueshan, others,

This is an issue that I think is more terminological than anything else, and I
think that there is no correct answer. The problem is rather to find the
relations between the different uses of information that are current in science
(Kinds of Information in Scientific Use, 2011, cognition, communication,
co-operation, Vol 9, No 2:
http://www.triple-c.at/index.php/tripleC/article/view/278/269; issue:
http://www.triple-c.at/index.php/tripleC/issue/view/22 ). For example, in
astrophysics and cosmology it is useful to speak of information as a
conserved quantity that is related to energy but is not the same (not two
sides of the same coin, as some would have it). 

Tom Schneider has done a lot of work on molecular machines (
http://schneider.ncifcrf.gov/ ) in which he sees a computational model using
information to keep track of computations as useful. Sure, it all is grounded
in energy, but this is not the most perspicacious way to view what happens in
these macromolecular interactions. I have argued in Information in Biological
Systems
(http://web.ncf.ca/collier/papers/Information%20in%20Biological%20Systems.pdf;
Handbook of Philosophy of Science, vol 8, Philosophy of Information,
http://www.elsevier.com/wps/find/bookdescription.cws_home/716648/description#description,
2008, Chapter 5f) that we should distinguish between the instrumental use of
information in biology and a substantive use, in which information is treated
as such by the system. This is a stronger requirement than in the
astrophysical and cosmological uses of information (which are substantive in
a different way), and also stronger than Schneider's use. This is a useful
distinction in biology, or so I argue. However, in an earlier paper, Intrinsic
Information (http://web.ncf.ca/collier/papers/intrinfo.pdf, 1990), I argued
that in order to understand what it means for us to get information about the
world, we must understand what it is that makes the world capable of
providing us with information. This leads to a natural description of the
world as containing information (see also Dretske, Knowledge and the Flow of
Information, and Barwise and Perry, Situations and Attitudes, and their
subsequent work) that flows into our minds, given the right coordination. See
also Barwise and Seligman, Information Flow, for a general account that is
not mind-dependent. 

What you want to treat as information depends very much on what you are
considering and how. I would argue that a unified theory of information
should recognize all of these usages and put them in their place relative
to each other. Some usages, I believe, are dispensable in some contexts, and
some may be dispensable in all contexts. But I doubt that information talk
can be dispensed with entirely in favour of energy talk when boundary
conditions are important to system behaviour. This happens especially with
complex systems, but physicists have found it useful in talking about
boundary conditions of black holes, among other things, which are not
obviously complexly organized.

John

At 02:43 PM 2013/04/15, Bob Logan wrote:



Dear Xueshan - re Nalewajski's conjecture that molecular systems have 
information, I am skeptical. The word information originated with the idea of 
forming the mind, according to the OED. Information, as far as I am concerned, 
requires a sentient being to receive and understand it. Molecules and atoms 
react to forces, not information. They have no idea of the forces acting on 
them. They are not informed, as they have no sentience that can be informed. 
Information requires an interpretant for which the signal has meaning. 
Shannon's information theory is merely signal theory, as all he is concerned 
with is how well a set of symbols or a signal is transmitted from the sender 
to the receiver. The ability of the receiver to decipher or interpret the 
signal has no bearing on the reception of Shannon information. Shannon 
information has nothing to do with meaning. A set of random numbers has the 
maximum amount of Shannon information and yet has no meaning. If my set of 
symbols has meaning for you, whether or not you agree with the premise they 
represent, then they are information. As for a molecule, or even a flower or a 
penguin, they are not information. In other words, information has to inform, 
as a grammatical analysis of the word information implies. A representation 
represents, a contradiction contradicts, a saturation saturates, and in general 
an Xtion Xes; therefore information informs, or at least has the capability 
of informing. So while a text in the Basque or Albanian language might not 
inform me, because of my inability with these languages, it is capable of 
informing those familiar with Basque and Albanian respectively, and is 
therefore information. A random set of letters cannot inform anyone, yet it 
has maximum Shannon information. Information is a tricky thing. 
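Bob Logan's claim about random symbols can be checked numerically. A small sketch (my own illustration; the repeated pangram is just a stand-in for meaningful English text): uniform random letters approach the maximum of log2(26) ≈ 4.70 bits per letter, while skewed, meaningful text falls below it, even though only the latter informs anyone.

```python
import random
from collections import Counter
from math import log2

def entropy_per_symbol(text):
    """Empirical Shannon entropy of a string, in bits per symbol."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

random.seed(0)
alphabet = "abcdefghijklmnopqrstuvwxyz"

# Uniform random letters: entropy approaches the maximum log2(26) ≈ 4.70.
random_text = "".join(random.choice(alphabet) for _ in range(100_000))

# A stand-in for meaningful text: heavily skewed letter frequencies,
# hence lower entropy per symbol -- yet only this text can inform.
english_like = "the quick brown fox jumps over the lazy dog " * 2000

print(entropy_per_symbol(random_text))   # close to 4.70
print(entropy_per_symbol(english_like))  # lower than the random text
```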

This line of thought raises the question of whether or not DNA is information. 
DNA does not inform a sentient being, yet it does catalyze, and hence instruct, 
how RNA is produced, which in turn catalyzes and instructs how proteins are 
created, which in turn give rise to bodily functions. Therefore we suggested 
that DNA represents a different form of information from Shannon information, 
which we called biotic or instructional information. The argument can be found 
in the paper Propagation of Organization: An Inquiry by Stuart Kauffman, 
Robert K. Logan, Robert Este, Randy Goebel, David Hobill and Ilya Smulevich, 
published in 2007 in Biology and Philosophy 23: 27-45. I am happy to share this 
paper with anyone requesting it. 

Bob Logan

On 2013-04-14, at 9:59 PM, Xueshan Yan wrote:

 
 Dear Michel,
 
 Thank you!
 
 I have long been familiar with your FIS 2005 website.
 
 Have you read the Polish chemist Nalewajski's book
 Information Theory of Molecular Systems (Elsevier, 2006)? I
 really want to know if there are INFORMATON that play a role
 between two atoms, two molecules, or two supramolecules,
 as Jean-Marie Lehn said.
 
 As to FIS 2005, I need reviews of all four FIS
 conferences held in Madrid, Vienna, Paris, and Beijing, but
 so far no one has given a general review of FIS 2005.
 
 Best regards,
 
 Xueshan
 9:59, April 15, 2013  Peking University
 
 
 -Original Message-
 From: Michel Petitjean [mailto:petitjean.chi...@gmail.com]
 
 Sent: Sunday, April 14, 2013 6:19 PM
 To: Yan Xueshan
 Subject: Re: About FIS 2005
 
 Dear Xueshan,
 As far as I know, there is no longer report, but I am at
 your disposal if you wish to get more: please feel free to
 ask me.
 Also you may have a look at the programme, the proceedings,
 and all that is available from the main welcome page:
 http://www.mdpi.org/fis2005/
 Best, Michel.
 
 
 2013/4/14 Xueshan Yan y...@pku.edu.cn:
 
 Dear Michel,
 
 May I ask you a favor?
 
 Do you have any more detailed review of FIS 2005, beyond
 your brief FIS 2005 conference report published at
 http://www.mdpi.org/entropy/htm/e7030188.htm?
 
 Best regards,
 
 Xueshan
 17:47, April 14, 2013
 
 
 

__

Robert K. Logan
Chief Scientist - sLab at OCAD
Prof. Emeritus - Physics - U. of Toronto 
www.physics.utoronto.ca/Members/logan




___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis

