Re: [Fis] Causation is transfer of information

2017-03-30 Thread Terrence W. DEACON
Dear Hector,

Whenever I read an email or hear a response that begins with the phrase
"With all due respect" I fear that what follows will indeed be
disrespectful and self-promoting. Scholarly respect is particularly
important when the contributors' backgrounds are so diverse and their
levels of erudition across these different fields vary so widely. It is
best to begin with the assumption that all are well-read, expert scholars,
rather than complaining about others' supposed ignorance of the work you
refer to—an assumption of ignorance that is often mistaken.

In our short email notes one cannot expect each author to provide a list of
all current mathematical and non-mathematical formal definitions of
information, or to provide an evidentiary list of their own papers on the
topic as a proof of competence, in order to make a point. Since we are
inevitably forced to use shorthand terms to qualify our particular usages,
my only suggestion is that we need to find mutually understandable
qualifiers for these different uses, to avoid pointless bickering about
what 'information' is or how it should be used.

The term "information" is not "fixed" to a particular technical definition
currently standard to only one or two fields like mathematics, physics, or
computation theory. Nor can we assume that technical approaches in one
field will be relevant to problems outside that field. I would hope that we
are collectively attempting to expand our mutual understanding of this
concept, recognizing its diversity, and the value of the many very
different approaches in different fields. I would like us to stop making
claims that one or another approach has exclusive priority and remain open
to dialogue and constructive argument. So although we should credit Wiener,
Fano, Solomonoff, Kolmogorov, Chaitin, Bennett, Landauer, and many many
others with greatly extending the field beyond Shannon's initial
contribution, even a full bibliography of mathematical and physical
contributions to the understanding of this concept would only scratch the
surface. Information concepts are critical to molecular and evolutionary
biology, cognitive neuroscience, semiotics and linguistics, and social
theory—to name but a few more divergent fields. Each of these fields has
its own list of luminaries and important discoveries.

The challenge is always to find a common set of terms and assumptions to
ground such ambitious multidisciplinary explorations.
To those who are convinced that the past 65 years of research HAS dealt
with all the relevant issues, I beg your patience with those of us who
remain less convinced.

— Terry




On Thu, Mar 30, 2017 at 11:12 AM, John Collier  wrote:

> Dear Hector,
>
>
>
> Personally I agree that algorithmic information theory and the related
> concepts of randomness and Bennett’s logical depth are the best way to go.
> I have used them in many of my own works. When I met Chaitin a few years
> back we talked mostly about how unrewarding and controversial our work on
> information theory has been. When I did an article on information for the
> Stanford Encyclopaedia of Philosophy it was rejected in part because of
> fierce divisions between supporters of Chaitin and supporters of
> Kolmogorov!  The stuff I put in on Spencer Brown was criticized because “he
> was some sort of Buddhist, wasn’t he?” It sounds like you have run into
> similar problems.
>
>
>
> That is why I suggested a realignment of what this group should be aiming
> for. I think the end result would justify our thinking, and your work
> certainly furthers it. But it does need to be worked out. Personally, I
> don’t have the patience for it.
>
>
>
> John Collier
>
> Emeritus Professor and Senior Research Associate
>
> Philosophy, University of KwaZulu-Natal
>
> http://web.ncf.ca/collier
>
>
>
> *From:* Hector Zenil [mailto:hzen...@gmail.com]
> *Sent:* Thursday, 30 March 2017 10:48 AM
> *To:* John Collier ; fis 
> *Subject:* Re: [Fis] Causation is transfer of information
>
>
>
> Dear John et al. Some comments below:
>
> On Thu, Mar 30, 2017 at 9:47 AM, John Collier  wrote:
>
> I think we should try to categorize and relate information concepts rather
> than trying to decide which is the “right one”. I have tried to do this by
> looking at various uses of information in science, and argue that the main
> uses show progressive containment: Kinds of Information in Scientific Use
> <http://www.triple-c.at/index.php/tripleC/article/view/278/269>.
> 2011. cognition, communication, co-operation. Vol 9, No 2
> <http://www.triple-c.at/index.php/tripleC/issue/view/22>
>
>
>
> There are various mathematical formulations of information as well, and I
> think the same strategy is required here. Sometimes they are equivalent,
> sometimes clo

Re: [Fis] Causation is transfer of information

2017-03-30 Thread John Collier
Dear Hector,

Personally I agree that algorithmic information theory and the related concepts 
of randomness and Bennett’s logical depth are the best way to go. I have used 
them in many of my own works. When I met Chaitin a few years back we talked 
mostly about how unrewarding and controversial our work on information theory 
has been. When I did an article on information for the Stanford Encyclopaedia 
of Philosophy it was rejected in part because of fierce divisions between 
supporters of Chaitin and supporters of Kolmogorov!  The stuff I put in on 
Spencer Brown was criticized because “he was some sort of Buddhist, wasn’t he?” 
It sounds like you have run into similar problems.

That is why I suggested a realignment of what this group should be aiming for. 
I think the end result would justify our thinking, and your work certainly 
furthers it. But it does need to be worked out. Personally, I don’t have the 
patience for it.

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Hector Zenil [mailto:hzen...@gmail.com]
Sent: Thursday, 30 March 2017 10:48 AM
To: John Collier ; fis 
Subject: Re: [Fis] Causation is transfer of information

Dear John et al. Some comments below:
On Thu, Mar 30, 2017 at 9:47 AM, John Collier 
mailto:colli...@ukzn.ac.za>> wrote:
I think we should try to categorize and relate information concepts rather than 
trying to decide which is the “right one”. I have tried to do this by looking 
at various uses of information in science, and argue that the main uses show 
progressive containment: Kinds of Information in Scientific 
Use<http://www.triple-c.at/index.php/tripleC/article/view/278/269>. 2011. 
cognition, communication, co-operation. Vol 9, No 
2<http://www.triple-c.at/index.php/tripleC/issue/view/22>

There are various mathematical formulations of information as well, and I think 
the same strategy is required here. Sometimes they are equivalent, sometimes 
close to equivalent, and sometimes quite different in form and motivation. Work 
on the foundations of information science needs to make these relations clear. 
A few years back (more than a decade) a mathematician on a list (newsgroup) 
argued that there were dozens of different mathematical definitions of 
information. I thought this was a bit excessive, and argued with him about 
convergences, but he was right that they were mathematically different. We need 
to look at information theory structures and their models to see where they are 
equivalent and where (and if) they overlap. Different mathematical forms can 
have models in common, sometimes all of them.

The agreement among professional mathematicians is that the correct definition 
of randomness, as opposed to information, is the Martin-Löf definition for the 
infinite (asymptotic) case, and the Kolmogorov-Chaitin definition for the 
finite case. Algorithmic probability (Solomonoff, Levin) is the theory of 
optimal induction and thus gives a formal, universal meaning to the value of 
information. There is also general agreement that Bennett's logical depth 
separates the concept of randomness from that of information structure. There 
is not much controversy about the nature of classical information as 
algorithmic information. Notice that 'algorithmic information' is not just one 
more definition of information; it IS the definition of mathematical 
information (again, by way of defining algorithmic randomness). So adding 
'algorithmic' to information is not to mark out a special case that philosophy 
of information can then ignore.
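
To make the contrast concrete, here is a minimal sketch (an illustration only, 
not code from any of the works cited above): Kolmogorov-Chaitin complexity is 
uncomputable, but a standard lossless compressor gives a computable upper bound 
on it, and that bound separates strings that per-symbol Shannon entropy cannot:

    import math
    import random
    import zlib
    from collections import Counter

    def entropy_bits_per_symbol(s: str) -> float:
        # Shannon entropy of the empirical symbol distribution of s
        counts = Counter(s)
        n = len(s)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    def compressed_bits(s: str) -> int:
        # Size in bits of the zlib-compressed string: a crude,
        # compressor-dependent upper bound (up to an additive constant)
        # on algorithmic complexity
        return 8 * len(zlib.compress(s.encode(), 9))

    random.seed(0)
    structured = "01" * 500                                   # highly regular
    random_like = "".join(random.choice("01") for _ in range(1000))

    for name, s in [("structured", structured), ("random-like", random_like)]:
        print(name, round(entropy_bits_per_symbol(s), 3), compressed_bits(s))
    # Both strings have ~1 bit of entropy per symbol, but only the
    # random-like one resists compression.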

All the above builds on (and goes well beyond) Shannon entropy, which is itself 
not even properly discussed in the philosophy of information beyond its most 
basic definition (we rarely, if ever, see discussions of mutual information, 
conditional information, Judea Pearl's interventionist approach and 
counterfactuals, etc.), let alone any of the more advanced areas mentioned 
above, or the now well-established area of quantum information, which is also 
completely ignored.
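
To ground the basic quantities just mentioned: mutual information and 
conditional entropy fall straight out of a joint distribution. Here is a 
self-contained toy calculation (not tied to any particular work cited in this 
thread):

    import math

    # Toy joint distribution p(x, y) over two binary variables
    p_xy = {("0", "0"): 0.4, ("0", "1"): 0.1,
            ("1", "0"): 0.1, ("1", "1"): 0.4}

    def H(dist):
        # Shannon entropy in bits of a probability dictionary
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    # Marginals p(x) and p(y)
    p_x, p_y = {}, {}
    for (x, y), p in p_xy.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p

    H_xy = H(p_xy)
    I_xy = H(p_x) + H(p_y) - H_xy     # mutual information I(X;Y)
    H_y_given_x = H_xy - H(p_x)       # conditional entropy H(Y|X)
    print(round(I_xy, 3), round(H_y_given_x, 3))   # about 0.278 and 0.722 bits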

This is like trying to do philosophy of cosmology by discussing Gamow and 
Hubble but ignoring relativity, or trying to do philosophy of language today by 
discussing Locke and Hume but not Chomsky, or doing philosophy of mind by 
discussing the findings of Ramón y Cajal and claiming that his theories are not 
enough to explain the brain. It is a sort of straw-man fallacy: constructing an 
opponent living in the 1940s in order to claim, in 2017, that it fails to 
explain everything about information. Shannon entropy is a symbol-counting 
function with interesting applications, as Shannon himself knew. It makes no 
sense to expect a symbol-counting function to tell us anything interesting 
about information after 60 years. I refer again to my Entropy-deceiving paper: 
https://arxiv.org/abs/1608.05972
I do not blame philosophers on this one; physicists see

Re: [Fis] Causation is transfer of information

2017-03-30 Thread Hector Zenil
> ...than merely as a representation) Information Originates in Symmetry Breaking
> <http://web.ncf.ca/collier/papers/infsym.pdf> (*Symmetry* 1996).
>

Very nice paper. I agree on symmetry breaking; I have similar ideas on how
symmetric rules can produce asymmetric information:
https://arxiv.org/abs/1210.1572
(published in the journal Natural Computing)

Best,

Hector Zenil
http://www.hectorzenil.net/


> I adopt what I call dynamical realism, that anything that is real is
> either dynamical or interpretable in dynamical terms. Not everyone will
> agree.
>
>
>
> John Collier
>
> Emeritus Professor and Senior Research Associate
>
> Philosophy, University of KwaZulu-Natal
>
> http://web.ncf.ca/collier
>
>
>
> *From:* Guy A Hoelzer [mailto:hoel...@unr.edu]
> *Sent:* Wednesday, 29 March 2017 1:44 AM
> *To:* Sungchul Ji ; Terry Deacon <
> dea...@berkeley.edu>; John Collier ; Foundations of
> Information Science Information Science 
>
> *Subject:* Re: [Fis] Causation is transfer of information
>
>
>
> Greetings all,
>
>
>
> It seems that the indigestion from competing definitions of ‘information’
> is hard to resolve, and I agree with Terry and others that a broad
> definition is preferable.  I also think it is not a problem to allow
> multiple definitions that can be operationally adopted in appropriate
> contexts.  In some respects, apparently competing definitions are actually
> reinforcing.  For example, I prefer to use ‘information’ to describe any
> difference (a distinction or contrast), and it is also true that a subset
> of all differences are ones that ‘make a difference’ to an observer.  When
> we restrict ‘information’ to differences that make a difference it becomes
> inherently subjective.  That is certainly not a problem if you are
> interested in subjectivity, but it would eliminate the rationality of
> studying objective ‘information’, which I think holds great promise for
> understanding dynamical systems.  I don’t see any conflict between
> ‘information’ as negentropy and ‘information’ as a basis for decision
> making.  On the other hand, semantics and semiotics involve the attachment
> of meaning to information, which strikes me as a separate and complementary
> idea.  Therefore, I think it is important to sustain this distinction
> explicitly in what we write.  Maybe there is a context in which
> ‘information’ and ‘meaning’ are so intertwined that they cannot be
> isolated, but I can’t think of one.  I’m sure there are plenty of contexts
> in which the important thing is ‘meaning’, and where the (more general,
> IMHO) term ‘information’ is used instead.  I think it is fair to say that
> you can have information without meaning, but you can’t have meaning
> without information.  Can anybody think of a way in which it might be
> misleading if this distinction was generally accepted?
>
>
>
> Regards,
>
>
>
> Guy
>
>
>
>
>
> On Mar 28, 2017, at 3:26 PM, Sungchul Ji  wrote:
>
>
>
> Hi Fisers,
>
>
>
> I agree with Terry that "information" has three irreducible aspects ---
> *amount*, *meaning*, and *value*.  These somehow may be related to
> another triadic relation called the ITR as depicted below, although I don't
> know the exact rule of mapping between the two triads.  Perhaps, 'amount' =
> f, 'meaning' = g, and 'value' = h ? .
>
>
>
>                  f                      g
>       Object -----------> Sign -----------> Interpretant
>          |                                        ^
>          |                                        |
>          |________________________________________|
>                              h
>
>
>
> *Figure 1.*  The *Irreducible Triadic Relation* (ITR) of semiosis (also
> called sign process or communication) first clearly articulated by Peirce
> to the best of my knowledge. *Warning*: Peirce often replaces Sign with
> Representamen and represents the whole triad, i.e., Figure 1
> itself (although he did not use such a figure in his writings) as the Sign.
> Not distinguishing between these two very different uses of the same word
> "Sign" can lead to semiotic confusions.   The three processes are defined
> as follows: f = sign production, g = sign interpretation, h = information
> flow (other ways of labeling the arrows are not excluded).   Each process
> or arrow reads "determines", "leads", "is presupposed by", etc., and
> the three arrows constitute a *commutative

Re: [Fis] Causation is transfer of information

2017-03-30 Thread John Collier
I think we should try to categorize and relate information concepts rather than 
trying to decide which is the “right one”. I have tried to do this by looking 
at various uses of information in science, and argue that the main uses show 
progressive containment: Kinds of Information in Scientific 
Use<http://www.triple-c.at/index.php/tripleC/article/view/278/269>. 2011. 
cognition, communication, co-operation. Vol 9, No 
2<http://www.triple-c.at/index.php/tripleC/issue/view/22>

There are various mathematical formulations of information as well, and I think 
the same strategy is required here. Sometimes they are equivalent, sometimes 
close to equivalent, and sometimes quite different in form and motivation. Work 
on the foundations of information science needs to make these relations clear. 
A few years back (more than a decade) a mathematician on a list (newsgroup) 
argued that there were dozens of different mathematical definitions of 
information. I thought this was a bit excessive, and argued with him about 
convergences, but he was right that they were mathematically different. We need 
to look at information theory structures and their models to see where they are 
equivalent and where (and if) they overlap. Different mathematical forms can 
have models in common, sometimes all of them.

I have argued that information originates in symmetry breaking (making a 
difference, if you like, but I see it as a dynamic process rather than merely 
as a representation) Information Originates in Symmetry 
Breaking<http://web.ncf.ca/collier/papers/infsym.pdf> (Symmetry 1996). I adopt 
what I call dynamical realism, that anything that is real is either dynamical 
or interpretable in dynamical terms. Not everyone will agree.

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Guy A Hoelzer [mailto:hoel...@unr.edu]
Sent: Wednesday, 29 March 2017 1:44 AM
To: Sungchul Ji ; Terry Deacon 
; John Collier ; Foundations of 
Information Science Information Science 
Subject: Re: [Fis] Causation is transfer of information

Greetings all,

It seems that the indigestion from competing definitions of ‘information’ is 
hard to resolve, and I agree with Terry and others that a broad definition is 
preferable.  I also think it is not a problem to allow multiple definitions 
that can be operationally adopted in appropriate contexts.  In some respects, 
apparently competing definitions are actually reinforcing.  For example, I 
prefer to use ‘information’ to describe any difference (a distinction or 
contrast), and it is also true that a subset of all differences are ones that 
‘make a difference’ to an observer.  When we restrict ‘information’ to 
differences that make a difference it becomes inherently subjective.  That is 
certainly not a problem if you are interested in subjectivity, but it would 
eliminate the rationality of studying objective ‘information’, which I think 
holds great promise for understanding dynamical systems.  I don’t see any 
conflict between ‘information’ as negentropy and ‘information’ as a basis for 
decision making.  On the other hand, semantics and semiotics involve the 
attachment of meaning to information, which strikes me as a separate and 
complementary idea.  Therefore, I think it is important to sustain this 
distinction explicitly in what we write.  Maybe there is a context in which 
‘information’ and ‘meaning’ are so intertwined that they cannot be isolated, 
but I can’t think of one.  I’m sure there are plenty of contexts in which the 
important thing is ‘meaning’, and where the (more general, IMHO) term 
‘information’ is used instead.  I think it is fair to say that you can have 
information without meaning, but you can’t have meaning without information.  
Can anybody think of a way in which it might be misleading if this distinction 
was generally accepted?

Regards,

Guy


On Mar 28, 2017, at 3:26 PM, Sungchul Ji 
mailto:s...@pharmacy.rutgers.edu>> wrote:

Hi Fisers,

I agree with Terry that "information" has three irreducible aspects --- amount, 
meaning, and value.  These somehow may be related to another triadic relation 
called the ITR as depicted below, although I don't know the exact rule of 
mapping between the two triads.  Perhaps, 'amount' = f, 'meaning' = g, and 
'value' = h ? .


Re: [Fis] Causation is transfer of information

2017-03-30 Thread John Collier
Interesting papers. I have a few remarks, but no time right now. I heartily 
agree with your general point.

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Hector Zenil [mailto:hzen...@gmail.com]
Sent: Wednesday, 29 March 2017 11:00 AM
To: Terrence W. DEACON 
Cc: fis 
Subject: Re: [Fis] Causation is transfer of information

With all due respect, I am still amazed at how much the science and mathematics 
of information developed in the last 50-60 years is ignored and neglected here! 
Most people cite, at best, only Shannon entropy, while completely neglecting 
algorithmic complexity, logical depth, quantum information and so on. Your 
philosophical discussions are quite empty if most people ignore the progress 
that computer science and mathematics have made in the last 60 years! Please 
take this constructively. This should be a source of shame for the whole field 
of Philosophy of Information and FIS.

Perhaps I can help alleviate this a little, even though it feels wrong to point 
you to my own papers on subjects relevant to the philosophical discussion:

http://www.hectorzenil.net/publications.html

They do care about the meaning and value of information beyond Shannon Entropy. 
For example, paper J21:

- Natural Scene Statistics Mediate the Perception of Image Complexity 
(available online at 
http://www.tandfonline.com/doi/abs/10.1080/13506285.2014.950365 also available 
pdf preprint in the arxiv)

and

- Rare Speed-up in Automatic Theorem Proving Reveals Tradeoff Between 
Computational Time and Information Value (https://arxiv.org/abs/1506.04349).

And we even show how Entropy fails at the most basic level:

Low Algorithmic Complexity Entropy-deceiving Graphs 
(https://arxiv.org/abs/1608.05972)

Best Regards,

Hector Zenil

---
This email and any files transmitted with it are confidential and intended 
solely for the use of the individual or entity to whom they are addressed. If 
you have received this email in error please notify the sender and delete the 
message.

On Tue, Mar 28, 2017 at 10:14 PM, Terrence W. DEACON 
mailto:dea...@berkeley.edu>> wrote:
>
> Dear FIS colleagues,
>
> I agree with John Collier that we should not assume to restrict the concept 
> of information to only one subset of its potential applications. But to work 
> with this breadth of usage we need to recognize that 'information' can refer 
> to intrinsic statistical properties of a physical medium, extrinsic 
> referential properties of that medium (i.e. content), and the significance or 
> use value of that content, depending on the context.  A problem arises when 
> we demand that only one of these uses should be given legitimacy. As I have 
> repeatedly suggested on this listserve, it will be a source of constant 
> useless argument to make the assertion that someone is wrong in their 
> understanding of information if they use it in one of these non-formal ways. 
> But to fail to mark which conception of information is being considered, or 
> worse, to use equivocal conceptions of the term in the same argument, will 
> ultimately undermine our efforts to understand one another and develop a 
> complete general theory of information.
>
> This nominalization of 'inform' has been in use for hundreds of years in 
> legal and literary contexts, in all of these variant forms. But there has 
> been a slowly increasing tendency to use it to refer to the 
> information-beqaring medium itself, in substantial terms. This reached its 
> greatest extreme with the restricted technical usage formalized by Claude 
> Shannon. Remember, however, that this was only introduced a little over a 
> half century ago. When one of his mentors (Hartley) initially introduced a 
> logarithmic measure of signal capacity he called it 'intelligence' — as in 
> the gathering of intelligence by a spy organization. So had Shannon chosen to 
> stay with that usage the confusions could have been worse (think about how 
> confusing it would have been to talk about the entropy of intelligence). Even 
> so, Shannon himself was to later caution against assuming that his use of the 
> term 'information' applied beyond its technical domain.
>
> So despite the precision and breadth of appliction that was achieved by 
> setting aside the extrinsic relational features that characterize the more 
> colloquial uses of the term, this does not mean that these other uses are in 
> some sense non-scientific. And I am not alone in the belief that these 
> non-intrinsic properties can also (eventually) be strictly formalized and 
> thereby contribute insights to such technical fields as molecular biology and 
> cognitive neuroscience.
>
> As a result I think that it is legitimate to argue that information (in the 

Re: [Fis] Causation is transfer of information

2017-03-29 Thread Sungchul Ji
Hi Soeren and FISers,


(1) Tychism is intrinsic to the Planckian information, since it is defined as 
the binary logarithm of the ratio of the area under the curve (AUC) of the 
Planckian distribution (PDE)  over the AUC of the Gaussian-like Equation (GLE):


  I_P  =  log_2 ( AUC(PDE) / AUC(GLE) )


Tychism is implied in GLE.
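
For concreteness, here is a minimal numerical sketch of the definition above. 
The functional forms and parameters chosen below for the PDE and GLE are 
placeholders for illustration only, not the fitted curves from the published 
work:

    import math

    def trapezoid(f, a, b, n=10_000):
        # Simple trapezoidal rule for the area under f on [a, b]
        h = (b - a) / n
        return h * (0.5 * f(a) + 0.5 * f(b) + sum(f(a + i * h) for i in range(1, n)))

    # Placeholder long-tailed Planck-like curve and Gaussian-like curve
    def pde(x, a=1.0, b=1.0):
        return a / (x**5 * (math.exp(b / x) - 1.0))

    def gle(x, mu=1.0, sigma=0.5, amp=1.0):
        return amp * math.exp(-((x - mu) ** 2) / (2.0 * sigma**2))

    auc_pde = trapezoid(pde, 0.05, 10.0)
    auc_gle = trapezoid(gle, 0.05, 10.0)

    # Planckian information: binary logarithm of the ratio of the two areas
    i_p = math.log2(auc_pde / auc_gle)
    print(round(i_p, 3))

With real fitted curves, the integration range and parameters would of course 
change the numerical value of I_P.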


(2)  The Planckian processes are defined as those physicochemical or formal 
processes that generate long-tailed histograms (or their superpositions) 
fitting the PDE (or its superpositions).  The Planckian process seems 
irreducibly triadic in the Peircean sense:



                          f                                   g
    Random processes -----------> Long-tailed histograms -----------> PDE
      (Firstness)                     (Secondness)                (Thirdness)
          |                                                             ^
          |                                                             |
          |_____________________________________________________________|
                                        h


Figure 2.  The Irreducible Triadic Relation (ITR) embodied in the Planckian 
processes.  f = selection process either natural or artificial; g =  
mathematical modeling; h = grounding, correspondence, or information flow.


(3)  (to be continued)


All the best.


Sung






From: Søren Brier 
Sent: Wednesday, March 29, 2017 7:06 PM
To: Sungchul Ji; Terrence W. DEACON; John Collier
Cc: fis
Subject: RE: [Fis] Causation is transfer of information


Dear Sung



It is difficult for me to say as you do not make your metaphysical framework 
explicit.  This was the great work Peirce did. I am pretty sure you do not have 
a dynamic triadic process concept of semiosis based on a tychastic theory of 
Firstness as potential qualia or forms of feeling of which information is only 
an aspect.



Best

   Søren



From: Sungchul Ji [mailto:s...@pharmacy.rutgers.edu]
Sent: 29. marts 2017 20:35
To: Søren Brier; Terrence W. DEACON; John Collier
Cc: fis
Subject: Re: [Fis] Causation is transfer of information



Hi Soeren,



Can you be more specific about which aspects of the proposal described in my 
previous emails you think are my own and have nothing to do with (or are even 
based on my misinterpretation of) Peirce?



Thanks in advance.



Sung











From: Søren Brier mailto:sbr@cbs.dk>>
Sent: Wednesday, March 29, 2017 2:10 PM
To: Sungchul Ji; Terrence W. DEACON; John Collier
Cc: fis
Subject: RE: [Fis] Causation is transfer of information



Dear Sung



I suggest you call this your own theory and make your own definitions of terms, 
because you confuse things by attempting to draw on Peirce: behind Peirce's 
concepts lies a whole process philosophy, with synechism, tychism, agapism and 
Scholastic realism, plus a phenomenological and mathematically based triadic 
metaphysics, which is the fruit of his life's work. I do not think you are 
ready to carry that load. It takes many years to understand fully. The 'sign' 
is a triadic process of representamen, object and interpretant working in the 
realm of Firstness, Secondness and Thirdness, in a society at large or a 
society of researchers devoted to the search for truth, producing the meaning 
of signs, which, when developed into propositional arguments, can be tested in 
the fallible scientific process of generating more rationality in culture as 
well as nature.



Best

   Søren



From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Sungchul Ji
Sent: 29. marts 2017 00:27
To: Terrence W. DEACON; John Collier
Cc: fis
Subject: Re: [Fis] Causation is transfer of information



Hi Fisers,



I agree with Terry that "information" has three irreducible aspects --- amount, 
meaning, and value.  These somehow may be related to another triadic relation 
called the ITR as depicted below, although I don't know the exact rule of 
mapping between the two triads.  Perhaps, 'amount' = f, 'meaning' = g, and 
'value' = h ? .




Re: [Fis] Causation is transfer of information

2017-03-29 Thread Sungchul Ji
Hi Soeren,


Can you be more specific about which aspects of the proposal described in my 
previous emails you think are my own and have nothing to do with (or are even 
based on my misinterpretation of) Peirce?


Thanks in advance.


Sung






From: Søren Brier 
Sent: Wednesday, March 29, 2017 2:10 PM
To: Sungchul Ji; Terrence W. DEACON; John Collier
Cc: fis
Subject: RE: [Fis] Causation is transfer of information


Dear Sung



I suggest you call this your own theory and make your own definitions of terms, 
because you confuse things by attempting to draw on Peirce: behind Peirce's 
concepts lies a whole process philosophy, with synechism, tychism, agapism and 
Scholastic realism, plus a phenomenological and mathematically based triadic 
metaphysics, which is the fruit of his life's work. I do not think you are 
ready to carry that load. It takes many years to understand fully. The 'sign' 
is a triadic process of representamen, object and interpretant working in the 
realm of Firstness, Secondness and Thirdness, in a society at large or a 
society of researchers devoted to the search for truth, producing the meaning 
of signs, which, when developed into propositional arguments, can be tested in 
the fallible scientific process of generating more rationality in culture as 
well as nature.



Best

   Søren



From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Sungchul Ji
Sent: 29. marts 2017 00:27
To: Terrence W. DEACON; John Collier
Cc: fis
Subject: Re: [Fis] Causation is transfer of information



Hi Fisers,



I agree with Terry that "information" has three irreducible aspects --- amount, 
meaning, and value.  These somehow may be related to another triadic relation 
called the ITR as depicted below, although I don't know the exact rule of 
mapping between the two triads.  Perhaps, 'amount' = f, 'meaning' = g, and 
'value' = h ? .



                 f                      g
      Object -----------> Sign -----------> Interpretant
         |                                        ^
         |                                        |
         |________________________________________|
                             h



Figure 1.  The Irreducible Triadic Relation (ITR) of semiosis (also called sign 
process or communication) first clearly articulated by Peirce to the best of my 
knowledge. Warning: Peirce often replaces Sign with Representamen and 
represents the whole triad, i.e., Figure 1 itself (although he did not use such 
a figure in his writings) as the Sign. Not distinguishing between these two 
very different uses of the same word "Sign" can lead to semiotic confusions.   
The three processes are defined as follows: f = sign production, g = sign 
interpretation, h = information flow (other ways of labeling the arrows are not 
excluded).   Each process or arrow reads "determines", "leads", "is presupposed 
by", etc., and the three arrows constitute a commutative triangle of category 
theory, i.e., f x g = h, meaning f followed by g leads to the same result as h.
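
As a toy illustration of the commutativity claim, 'f followed by g gives the 
same result as h' can be checked directly in code (a sketch with made-up 
stand-in functions; the labels are placeholders only, not a claim about 
Peirce's own formalism):

    def f(obj: str) -> str:           # "sign production": Object -> Sign
        return "sign(" + obj + ")"

    def g(sign: str) -> str:          # "sign interpretation": Sign -> Interpretant
        return "interpretant(" + sign + ")"

    def h(obj: str) -> str:           # "information flow": Object -> Interpretant
        return g(f(obj))              # defined so the triangle commutes by construction

    assert h("object") == g(f("object"))
    print(h("object"))                # -> interpretant(sign(object))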



I started using  the so-called  ITR template, Figure 1,  about 5 years ago, and 
the main reason I am bringing it up here is to ask your critical opinion on my 
suggestion published in 2012 (Molecular Theory of the Living  Cell: Concepts, 
Molecular Mechanisms, and Biomedical Applications, Springer New York, p ~100 ?) 
that there are two kinds of causality -- (i) the energy-dependent causality 
(identified with Processes f and g in Figure 1) and (ii) the information (and 
hence code)-dependent causality (identified with Process h).  For convenience, 
I coined the term 'codality' to refer to the latter to contrast it with the 
traditional term causality.



I wonder if we can  view John's idea of the relation between 'information' and 
'cause' as being  an alternative way of expressing the same ideas as the 
"energy-dependent causality" or the "codality" defined in Figure 1.



All the best.



Sung







From: Fis mailto:fis-boun...@listas.unizar.es>> 
on behalf of Terrence W. DEACON 
mailto:dea...@berkeley.edu>>
Sent: Tuesday, March 28, 2017 4:23:14 PM
To: John Collier
Cc: fis
Subject: Re: [Fis] Causation is transfer of information



Corrected typos (in case the intrinsic redundancy didn't compensate for these 
minor corruptions of the text):



 information-beqaring medium =  information-bearing medi

Re: [Fis] Causation is transfer of information

2017-03-29 Thread Hector Zenil
With all due respect, I am still amazed at how much the science and mathematics
of information developed in the last 50-60 years is ignored and neglected here!
Most people cite, at best, only Shannon entropy, while completely neglecting
algorithmic complexity, logical depth, quantum information and so on. Your
philosophical discussions are quite empty if most people ignore the progress
that computer science and mathematics have made in the last 60 years! Please
take this constructively. This should be a source of shame for the whole field
of Philosophy of Information and FIS.

Perhaps I can help alleviate this a little, even though it feels wrong to point
you to my own papers on subjects relevant to the philosophical discussion:

http://www.hectorzenil.net/publications.html

They do care about the meaning and value of information beyond Shannon
Entropy. For example, paper J21:

- Natural Scene Statistics Mediate the Perception of Image Complexity
(available online at
http://www.tandfonline.com/doi/abs/10.1080/13506285.2014.950365 also
available pdf preprint in the arxiv)

and

- Rare Speed-up in Automatic Theorem Proving Reveals Tradeoff Between
Computational Time and Information Value (https://arxiv.org/abs/1506.04349).

And we even show how Entropy fails at the most basic level:

Low Algorithmic Complexity Entropy-deceiving Graphs (
https://arxiv.org/abs/1608.05972)

Best Regards,

Hector Zenil

---
This email and any files transmitted with it are confidential and intended
solely for the use of the individual or entity to whom they are addressed.
If you have received this email in error please notify the sender and
delete the message.

On Tue, Mar 28, 2017 at 10:14 PM, Terrence W. DEACON 
wrote:
>
> Dear FIS colleagues,
>
> I agree with John Collier that we should not assume to restrict the
concept of information to only one subset of its potential applications.
But to work with this breadth of usage we need to recognize that
'information' can refer to intrinsic statistical properties of a physical
medium, extrinsic referential properties of that medium (i.e. content), and
the significance or use value of that content, depending on the context.  A
problem arises when we demand that only one of these uses should be given
legitimacy. As I have repeatedly suggested on this listserve, it will be a
source of constant useless argument to make the assertion that someone is
wrong in their understanding of information if they use it in one of these
non-formal ways. But to fail to mark which conception of information is
being considered, or worse, to use equivocal conceptions of the term in the
same argument, will ultimately undermine our efforts to understand one
another and develop a complete general theory of information.
>
> This nominalization of 'inform' has been in use for hundreds of years in
legal and literary contexts, in all of these variant forms. But there has
been a slowly increasing tendency to use it to refer to the
information-beqaring medium itself, in substantial terms. This reached its
greatest extreme with the restricted technical usage formalized by Claude
Shannon. Remember, however, that this was only introduced a little over a
half century ago. When one of his mentors (Hartley) initially introduced a
logarithmic measure of signal capacity he called it 'intelligence' — as in
the gathering of intelligence by a spy organization. So had Shannon chosen
to stay with that usage the confusions could have been worse (think about
how confusing it would have been to talk about the entropy of
intelligence). Even so, Shannon himself was to later caution against
assuming that his use of the term 'information' applied beyond its
technical domain.
>
> So despite the precision and breadth of appliction that was achieved by
setting aside the extrinsic relational features that characterize the more
colloquial uses of the term, this does not mean that these other uses are
in some sense non-scientific. And I am not alone in the belief that these
non-intrinsic properties can also (eventually) be strictly formalized and
thereby contribute insights to such technical fields as molecular biology
and cognitive neuroscience.
>
> As a result I think that it is legitimate to argue that information (in
the referential sense) is only in use among living forms, that an alert
signal sent by the computer in an automobile engine is information (in both
senses, depending on whether we include a human interpreter in the loop),
or that information (in the intrinsic sense of a medium property) is lost
within a black hole or that it can be used  to provide a more precise
conceptiont of physical cause (as in Collier's sense). These different uses
aren't unrelated to each other. They are just asymmetrically dependent on
one another, such that medium-intrinsic properties can be investigated
without considering referential properties, but not vice versa.
>
It's time we move beyond terminological chauvinism so that we can further

Re: [Fis] Causation is transfer of information

2017-03-28 Thread Guy A Hoelzer
Greetings all,

It seems that the indigestion from competing definitions of ‘information’ is 
hard to resolve, and I agree with Terry and others that a broad definition is 
preferable.  I also think it is not a problem to allow multiple definitions 
that can be operationally adopted in appropriate contexts.  In some respects, 
apparently competing definitions are actually reinforcing.  For example, I 
prefer to use ‘information’ to describe any difference (a distinction or 
contrast), and it is also true that a subset of all differences are ones that 
‘make a difference’ to an observer.  When we restrict ‘information’ to 
differences that make a difference it becomes inherently subjective.  That is 
certainly not a problem if you are interested in subjectivity, but it would 
eliminate the rationality of studying objective ‘information’, which I think 
holds great promise for understanding dynamical systems.  I don’t see any 
conflict between ‘information’ as negentropy and ‘information’ as a basis for 
decision making.  On the other hand, semantics and semiotics involve the 
attachment of meaning to information, which strikes me as a separate and 
complementary idea.  Therefore, I think it is important to sustain this 
distinction explicitly in what we write.  Maybe there is a context in which 
‘information’ and ‘meaning’ are so intertwined that they cannot be isolated, 
but I can’t think of one.  I’m sure there are plenty of contexts in which the 
important thing is ‘meaning’, and where the (more general, IMHO) term 
‘information’ is used instead.  I think it is fair to say that you can have 
information without meaning, but you can’t have meaning without information.  
Can anybody think of a way in which it might be misleading if this distinction 
was generally accepted?

Regards,

Guy


On Mar 28, 2017, at 3:26 PM, Sungchul Ji 
mailto:s...@pharmacy.rutgers.edu>> wrote:

Hi Fisers,

I agree with Terry that "information" has three irreducible aspects --- amount, 
meaning, and value.  These somehow may be related to another triadic relation 
called the ITR as depicted below, although I don't know the exact rule of 
mapping between the two triads.  Perhaps, 'amount' = f, 'meaning' = g, and 
'value' = h ? .

                 f                      g
      Object -----------> Sign -----------> Interpretant
         |                                        ^
         |                                        |
         |________________________________________|
                             h

Figure 1.  The Irreducible Triadic Relation (ITR) of semiosis (also called sign 
process or communication) first clearly articulated by Peirce to the best of my 
knowledge. Warning: Peirce often replaces Sign with Representamen and 
represents the whole triad, i.e., Figure 1 itself (although he did not use such 
a figure in his writings) as the Sign. Not distinguishing between these two 
very different uses of the same word "Sign" can lead to semiotic confusions.   
The three processes are defined as follows: f = sign production, g = sign 
interpretation, h = information flow (other ways of labeling the arrows are not 
excluded).   Each process or arrow reads "determines", "leads", "is presupposed 
by", etc., and the three arrows constitute a commutative triangle of category 
theory, i.e., f x g = h, meaning f followed by g leads to the same result as h.

I started using  the so-called  ITR template, Figure 1,  about 5 years ago, and 
the main reason I am bringing it up here is to ask your critical opinion on my 
suggestion published in 2012 (Molecular Theory of the Living  Cell: Concepts, 
Molecular Mechanisms, and Biomedical Applications, Springer New York, p ~100 ?) 
that there are two kinds of causality -- (i) the energy-dependent causality 
(identified with Processes f and g in Figure 1) and (ii) the information (and 
hence code)-dependent causality (identified with Process h).  For convenience, 
I coined the term 'codality' to refer to the latter to contrast it with the 
traditional term causality.

I wonder if we can  view John's idea of the relation between 'information' and 
'cause' as being  an alternative way of expressing the same ideas as the 
"energy-dependent causality" or the "codality" defined in Figure 1.

All the best.

Sung



From: Fis mailto:fis-boun...@listas.unizar.es>> 
on behalf of Terrence W. DEACON 
mailto:dea...@berkeley.edu>>
Sent: Tuesday, 

Re: [Fis] Causation is transfer of information

2017-03-28 Thread Sungchul Ji
Hi Fisers,


I agree with Terry that "information" has three irreducible aspects --- amount, 
meaning, and value.  These somehow may be related to another triadic relation 
called the ITR as depicted below, although I don't know the exact rule of 
mapping between the two triads.  Perhaps, 'amount' = f, 'meaning' = g, and 
'value' = h ? .


                 f                      g
      Object -----------> Sign -----------> Interpretant
         |                                        ^
         |                                        |
         |________________________________________|
                             h


Figure 1.  The Irreducible Triadic Relation (ITR) of semiosis (also called sign 
process or communication) first clearly articulated by Peirce to the best of my 
knowledge. Warning: Peirce often replaces Sign with Representamen and 
represents the whole triad, i.e., Figure 1 itself (although he did not use such 
a figure in his writings) as the Sign. Not distinguishing between these two 
very different uses of the same word "Sign" can lead to semiotic confusions.   
The three processes are defined as follows: f = sign production, g = sign 
interpretation, h = information flow (other ways of labeling the arrows are not 
excluded).   Each process or arrow reads "determines", "leads", "is presupposed 
by", etc., and the three arrows constitute a commutative triangle of category 
theory, i.e., f x g = h, meaning f followed by g leads to the same result as h.


I started using  the so-called  ITR template, Figure 1,  about 5 years ago, and 
the main reason I am bringing it up here is to ask your critical opinion on my 
suggestion published in 2012 (Molecular Theory of the Living  Cell: Concepts, 
Molecular Mechanisms, and Biomedical Applications, Springer New York, p ~100 ?) 
that there are two kinds of causality -- (i) the energy-dependent causality 
(identified with Processes f and g in Figure 1) and (ii) the information (and 
hence code)-dependent causality (identified with Process h).  For convenience, 
I coined the term 'codality' to refer to the latter to contrast it with the 
traditional term causality.


I wonder if we can  view John's idea of the relation between 'information' and 
'cause' as being  an alternative way of expressing the same ideas as the 
"energy-dependent causality" or the "codality" defined in Figure 1.


All the best.


Sung



____________
From: Fis  on behalf of Terrence W. DEACON 

Sent: Tuesday, March 28, 2017 4:23:14 PM
To: John Collier
Cc: fis
Subject: Re: [Fis] Causation is transfer of information

Corrected typos (in case the intrinsic redundancy didn't compensate for these 
minor corruptions of the text):

 information-beqaring medium =  information-bearing medium

appliction = application

 conceptiont =  conception

On Tue, Mar 28, 2017 at 10:14 PM, Terrence W. DEACON 
mailto:dea...@berkeley.edu>> wrote:
Dear FIS colleagues,

I agree with John Collier that we should not assume to restrict the concept of 
information to only one subset of its potential applications. But to work with 
this breadth of usage we need to recognize that 'information' can refer to 
intrinsic statistical properties of a physical medium, extrinsic referential 
properties of that medium (i.e. content), and the significance or use value of 
that content, depending on the context.  A problem arises when we demand that 
only one of these uses should be given legitimacy. As I have repeatedly 
suggested on this listserve, it will be a source of constant useless argument 
to make the assertion that someone is wrong in their understanding of 
information if they use it in one of these non-formal ways. But to fail to mark 
which conception of information is being considered, or worse, to use equivocal 
conceptions of the term in the same argument, will ultimately undermine our 
efforts to understand one another and develop a complete general theory of 
information.

This nominalization of 'inform' has been in use for hundreds of years in legal 
and literary contexts, in all of these variant forms. But there has been a 
slowly increasing tendency to use it to refer to the information-beqaring 
medium itself, in substantial terms. This reached its greatest extreme with the 
restricted technical usage formalized by Claude Shannon. Remember, however, 
that this was only introduced a little over a half century ago. When one of his 
mentors (Hartl

Re: [Fis] Causation is transfer of information

2017-03-28 Thread Robert E. Ulanowicz
In order:

John,

I agree. For example, if one identifies information with constraint, the
notion of information as causation becomes tautologous. It also feeds into
the notion of "It from bit"!

Terry,

I agree, best to remain as catholic as possible in our conception of the
notion.

Otto:

Spot-on! Feedbacks among non-living components provided the cradle for the
early emergence and proliferation of information. (See p147ff in
.)

Cheers to all,
Bob U.

> Dear all,
> Just to comment on the discussion after Terrence's apt cautionary words...
>
> The various notions of information are partially a linguistic confusion,
> partially a relic of multiple conceptual histories colliding, and
> partially
> an ongoing negotiation (or even a war, to state it less creditably and
> with
> less civility), about the future of the term as a (more or less unified)
> scientific concept.
>
> To latch onto that negotiation, let me propose that an evolutionary
> approach to information can capture and explain some of that ambiguous
> multiplicity in terminology, by showing how pre-biotic natural processes
> developed feedback loops and material encoding techniques - which was a
> type of localised informational emergence - and how life, in developing
> cellular communication, DNA, sentience, memory, and selfhood, rarified
> this
> process further, producing informational processing such as had never
> existed before. Was it the same information? Or was it something new?
>
> Human consciousness and cultural semiosis are a yet higher level
> adaptation
> of information, and computer A.I. is something else entirely, for - at
> least for now - it lacks feelings and self-awareness and thus "meaning" in
> the human sense. But it computes, stores and processes. It might even
> develop suprasentience whose structure we cannot fathom based on our
> limited human perspective.  Is it still the same type of information? Or
> something different? Is evolution in quality (emergence) or only in
> quantity (continuous development)?
>
> I generally take the Peircean view that signification (informative
> relationality) evolves, and information, as an offshoot of that, is thus a
> multi-stage process - EVEN if it has a simple and predictable elemental
> substructure (composed of say, 1s and 0s, or quarks and bosons).
>
> Information might thus not only have a complex history of emergence, but
> also an unknown future, composed of various leaps in cosmic organization.
>
> In ignorant wonder, all the best,
>
> Otto Lehto,
>
> philosopher, political economist,
> PhD student at King's College London,
> webpage: www.ottolehto.com,
> cellphone: +358-407514748
>
> On Mar 28, 2017 23:24, "Terrence W. DEACON"  wrote:
>
>> Corrected typos (in case the intrinsic redundancy didn't compensate for
>> these minor corruptions of the text):
>>
>>  information-beqaring medium =  information-bearing medium
>>
>> appliction = application
>>
>>  conceptiont =  conception
>>
>> On Tue, Mar 28, 2017 at 10:14 PM, Terrence W. DEACON
>> 
>> wrote:
>>
>>> Dear FIS colleagues,
>>>
>>> I agree with John Collier that we should not assume to restrict the
>>> concept of information to only one subset of its potential
>>> applications.
>>> But to work with this breadth of usage we need to recognize that
>>> 'information' can refer to intrinsic statistical properties of a
>>> physical
>>> medium, extrinsic referential properties of that medium (i.e. content),
>>> and
>>> the significance or use value of that content, depending on the
>>> context.  A
>>> problem arises when we demand that only one of these uses should be
>>> given
>>> legitimacy. As I have repeatedly suggested on this listserve, it will
>>> be a
>>> source of constant useless argument to make the assertion that someone
>>> is
>>> wrong in their understanding of information if they use it in one of
>>> these
>>> non-formal ways. But to fail to mark which conception of information is
>>> being considered, or worse, to use equivocal conceptions of the term in
>>> the
>>> same argument, will ultimately undermine our efforts to understand one
>>> another and develop a complete general theory of information.
>>>
>>> This nominalization of 'inform' has been in use for hundreds of years
>>> in
>>> legal and literary contexts, in all of these variant forms. But there
>>> has
>>> been a slowly increasing tendency to use it to refer to the
>>> information-beqaring medium itself, in substantial terms. This reached
>>> its
>>> greatest extreme with the restricted technical usage formalized by
>>> Claude
>>> Shannon. Remember, however, that this was only introduced a little over
>>> a
>>> half century ago. When one of his mentors (Hartley) initially
>>> introduced a
>>> logarithmic measure of signal capacity he called it 'intelligence' —
>>> as in
>>> the gathering of intelligence by a spy organization. So had Shannon
>>> chosen
>>> to stay with that usag

Re: [Fis] Causation is transfer of information

2017-03-28 Thread Otto Lehto
Dear all,
Just to comment on the discussion after Terrence's apt cautionary words...

The various notions of information are partially a linguistic confusion,
partially a relic of multiple conceptual histories colliding, and partially
an ongoing negotiation (or even a war, to state it less creditably and with
less civility), about the future of the term as a (more or less unified)
scientific concept.

To latch onto that negotiation, let me propose that an evolutionary
approach to information can capture and explain some of that ambiguous
multiplicity in terminology, by showing how pre-biotic natural processes
developed feedback loops and material encoding techniques - which was a
type of localised informational emergence - and how life, in developing
cellular communication, DNA, sentience, memory, and selfhood, rarified this
process further, producing informational processing such as had never
existed before. Was it the same information? Or was it something new?

Human consciousness and cultural semiosis are a yet higher level adaptation
of information, and computer A.I. is something else entirely, for - at
least for now - it lacks feelings and self-awareness and thus "meaning" in
the human sense. But it computes, stores and processes. It might even
develop suprasentience whose structure we cannot fathom based on our
limited human perspective.  Is it still the same type of information? Or
something different? Is evolution in quality (emergence) or only in
quantity (continuous development)?

I generally take the Peircean view that signification (informative
relationality) evolves, and information, as an offshoot of that, is thus a
multi-stage process - EVEN if it has a simple and predictable elemental
substructure (composed of say, 1s and 0s, or quarks and bosons).

Information might thus not only have a complex history of emergence, but
also an unknown future, composed of various leaps in cosmic organization.

In ignorant wonder, all the best,

Otto Lehto,

philosopher, political economist,
PhD student at King's College London,
webpage: www.ottolehto.com,
cellphone: +358-407514748

On Mar 28, 2017 23:24, "Terrence W. DEACON"  wrote:

> Corrected typos (in case the intrinsic redundancy didn't compensate for
> these minor corruptions of the text):
>
>  information-beqaring medium =  information-bearing medium
>
> appliction = application
>
>  conceptiont =  conception
>
> On Tue, Mar 28, 2017 at 10:14 PM, Terrence W. DEACON 
> wrote:
>
>> Dear FIS colleagues,
>>
>> I agree with John Collier that we should not assume to restrict the
>> concept of information to only one subset of its potential applications.
>> But to work with this breadth of usage we need to recognize that
>> 'information' can refer to intrinsic statistical properties of a physical
>> medium, extrinsic referential properties of that medium (i.e. content), and
>> the significance or use value of that content, depending on the context.  A
>> problem arises when we demand that only one of these uses should be given
>> legitimacy. As I have repeatedly suggested on this listserve, it will be a
>> source of constant useless argument to make the assertion that someone is
>> wrong in their understanding of information if they use it in one of these
>> non-formal ways. But to fail to mark which conception of information is
>> being considered, or worse, to use equivocal conceptions of the term in the
>> same argument, will ultimately undermine our efforts to understand one
>> another and develop a complete general theory of information.
>>
>> This nominalization of 'inform' has been in use for hundreds of years in
>> legal and literary contexts, in all of these variant forms. But there has
>> been a slowly increasing tendency to use it to refer to the
>> information-beqaring medium itself, in substantial terms. This reached its
>> greatest extreme with the restricted technical usage formalized by Claude
>> Shannon. Remember, however, that this was only introduced a little over a
>> half century ago. When one of his mentors (Hartley) initially introduced a
>> logarithmic measure of signal capacity he called it 'intelligence' — as in
>> the gathering of intelligence by a spy organization. So had Shannon chosen
>> to stay with that usage the confusions could have been worse (think about
>> how confusing it would have been to talk about the entropy of
>> intelligence). Even so, Shannon himself was to later caution against
>> assuming that his use of the term 'information' applied beyond its
>> technical domain.
>>
>> So despite the precision and breadth of appliction that was achieved by
>> setting aside the extrinsic relational features that characterize the more
>> colloquial uses of the term, this does not mean that these other uses are
>> in some sense non-scientific. And I am not alone in the belief that these
>> non-intrinsic properties can also (eventually) be strictly formalized and
>> thereby contribute insights to such technical fields as mole

Re: [Fis] Causation is transfer of information

2017-03-28 Thread Terrence W. DEACON
Corrected typos (in case the intrinsic redundancy didn't compensate for
these minor corruptions of the text):

 information-beqaring medium =  information-bearing medium

appliction = application

 conceptiont =  conception

On Tue, Mar 28, 2017 at 10:14 PM, Terrence W. DEACON 
wrote:

> Dear FIS colleagues,
>
> I agree with John Collier that we should not assume to restrict the
> concept of information to only one subset of its potential applications.
> But to work with this breadth of usage we need to recognize that
> 'information' can refer to intrinsic statistical properties of a physical
> medium, extrinsic referential properties of that medium (i.e. content), and
> the significance or use value of that content, depending on the context.  A
> problem arises when we demand that only one of these uses should be given
> legitimacy. As I have repeatedly suggested on this listserve, it will be a
> source of constant useless argument to make the assertion that someone is
> wrong in their understanding of information if they use it in one of these
> non-formal ways. But to fail to mark which conception of information is
> being considered, or worse, to use equivocal conceptions of the term in the
> same argument, will ultimately undermine our efforts to understand one
> another and develop a complete general theory of information.
>
> This nominalization of 'inform' has been in use for hundreds of years in
> legal and literary contexts, in all of these variant forms. But there has
> been a slowly increasing tendency to use it to refer to the
> information-bearing medium itself, in substantial terms. This reached its
> greatest extreme with the restricted technical usage formalized by Claude
> Shannon. Remember, however, that this was only introduced a little over a
> half century ago. When one of his mentors (Hartley) initially introduced a
> logarithmic measure of signal capacity he called it 'intelligence' — as in
> the gathering of intelligence by a spy organization. So had Shannon chosen
> to stay with that usage, the confusions could have been worse (think about
> how confusing it would have been to talk about the entropy of
> intelligence). Even so, Shannon himself was to later caution against
> assuming that his use of the term 'information' applied beyond its
> technical domain.
>
> So despite the precision and breadth of application that was achieved by
> setting aside the extrinsic relational features that characterize the more
> colloquial uses of the term, this does not mean that these other uses are
> in some sense non-scientific. And I am not alone in the belief that these
> non-intrinsic properties can also (eventually) be strictly formalized and
> thereby contribute insights to such technical fields as molecular biology
> and cognitive neuroscience.
>
> As a result I think that it is legitimate to argue that information (in
> the referential sense) is only in use among living forms, that an alert
> signal sent by the computer in an automobile engine is information (in both
> senses, depending on whether we include a human interpreter in the loop),
> or that information (in the intrinsic sense of a medium property) is lost
> within a black hole, or that it can be used to provide a more precise
> conception of physical cause (as in Collier's sense). These different uses
> aren't unrelated to each other. They are just asymmetrically dependent on
> one another, such that medium-intrinsic properties can be investigated
> without considering referential properties, but not vice versa.
>
> It's time we move beyond terminological chauvinism so that we can further
> our dialogue about the entire domain in which the concept of information is
> important. To succeed at this, we only need to be clear about which
> conception of information we are using in any given context.
>
> — Terry
>
>
>
>
>
> On Tue, Mar 28, 2017 at 8:32 PM, John Collier  wrote:
>
>> I wrote a paper some time ago arguing that causal processes are the
>> transfer of information. Therefore I think that physical processes can and
>> do convey information. Cause can be dispensed with.
>>
>>
>>
>>    - There is a copy at "Causation is the Transfer of Information", in
>>      Howard Sankey (ed.), *Causation, Natural Laws and Explanation*
>>      (Dordrecht: Kluwer, 1999)
>>
>>
>>
>> Information is a very powerful concept. It is a shame to restrict oneself
>> to only a part of its possible applications.
>>
>>
>>
>> John Collier
>>
>> Emeritus Professor and Senior Research Associate
>>
>> Philosophy, University of KwaZulu-Natal
>>
>> http://web.ncf.ca/collier
>>
>>
>>
>>
>>
>
>
> --
> Professor Terrence W. Deacon
> University of California, Berkeley
>



-- 
Professor Terrence W. Deacon
University of California, Berkeley

Re: [Fis] Causation is transfer of information

2017-03-28 Thread Terrence W. DEACON
Dear FIS colleagues,

I agree with John Collier that we should not presume to restrict the concept
of information to only one subset of its potential applications. But to
work with this breadth of usage we need to recognize that 'information' can
refer to intrinsic statistical properties of a physical medium, extrinsic
referential properties of that medium (i.e. content), and the significance
or use value of that content, depending on the context.  A problem arises
when we demand that only one of these uses should be given legitimacy. As I
have repeatedly suggested on this listserve, it will be a source of
constant useless argument to make the assertion that someone is wrong in
their understanding of information if they use it in one of these
non-formal ways. But to fail to mark which conception of information is
being considered, or worse, to use equivocal conceptions of the term in the
same argument, will ultimately undermine our efforts to understand one
another and develop a complete general theory of information.

This nominalization of 'inform' has been in use for hundreds of years in
legal and literary contexts, in all of these variant forms. But there has
been a slowly increasing tendency to use it to refer to the
information-bearing medium itself, in substantial terms. This reached its
greatest extreme with the restricted technical usage formalized by Claude
Shannon. Remember, however, that this was only introduced a little over a
half century ago. When one of his mentors (Hartley) initially introduced a
logarithmic measure of signal capacity he called it 'intelligence' — as in
the gathering of intelligence by a spy organization. So had Shannon chosen
to stay with that usage, the confusions could have been worse (think about
how confusing it would have been to talk about the entropy of
intelligence). Even so, Shannon himself was to later caution against
assuming that his use of the term 'information' applied beyond its
technical domain.
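
For concreteness, a minimal sketch in Python of the two measures alluded to
above (Hartley's logarithm of the number of equally likely alternatives versus
Shannon's probability-weighted entropy); the function names here are
illustrative only, not anyone's standard API:

    import math

    def hartley_information(n_alternatives):
        # Hartley's measure: base-2 logarithm of the number of equally
        # likely alternatives (the capacity of the signal, in bits).
        return math.log2(n_alternatives)

    def shannon_entropy(probabilities):
        # Shannon's entropy: probability-weighted average surprisal, in bits.
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # Eight equally likely signals: the two measures coincide at 3 bits.
    print(hartley_information(8))                  # 3.0
    print(shannon_entropy([1/8] * 8))              # 3.0

    # A biased four-way source falls below its 2-bit Hartley capacity.
    print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))   # roughly 1.36

The contrast is that Hartley's 'intelligence' measured only the size of the
space of alternatives, whereas Shannon's entropy also depends on how unevenly
those alternatives are used.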

So despite the precision and breadth of application that was achieved by
setting aside the extrinsic relational features that characterize the more
colloquial uses of the term, this does not mean that these other uses are
in some sense non-scientific. And I am not alone in the belief that these
non-intrinsic properties can also (eventually) be strictly formalized and
thereby contribute insights to such technical fields as molecular biology
and cognitive neuroscience.

As a result I think that it is legitimate to argue that information (in the
referential sense) is only in use among living forms, that an alert signal
sent by the computer in an automobile engine is information (in both
senses, depending on whether we include a human interpreter in the loop),
or that information (in the intrinsic sense of a medium property) is lost
within a black hole, or that it can be used to provide a more precise
conception of physical cause (as in Collier's sense). These different uses
aren't unrelated to each other. They are just asymmetrically dependent on
one another, such that medium-intrinsic properties can be investigated
without considering referential properties, but not vice versa.

It's time we move beyond terminological chauvinism so that we can further
our dialogue about the entire domain in which the concept of information is
important. To succeed at this, we only need to be clear about which
conception of information we are using in any given context.

— Terry





On Tue, Mar 28, 2017 at 8:32 PM, John Collier  wrote:

> I wrote a paper some time ago arguing that causal processes are the
> transfer of information. Therefore I think that physical processes can and
> do convey information. Cause can be dispensed with.
>
>
>
>    - There is a copy at "Causation is the Transfer of Information", in
>      Howard Sankey (ed.), *Causation, Natural Laws and Explanation*
>      (Dordrecht: Kluwer, 1999)
>
>
>
> Information is a very powerful concept. It is a shame to restrict oneself
> to only a part of its possible applications.
>
>
>
> John Collier
>
> Emeritus Professor and Senior Research Associate
>
> Philosophy, University of KwaZulu-Natal
>
> http://web.ncf.ca/collier
>
>
>
>
>


-- 
Professor Terrence W. Deacon
University of California, Berkeley


[Fis] Causation is transfer of information

2017-03-28 Thread John Collier
I wrote a paper some time ago arguing that causal processes are the transfer of 
information. Therefore I think that physical processes can and do convey 
information. Cause can be dispensed with.


  *   There is a copy at "Causation is the Transfer of Information", in Howard
      Sankey (ed.), Causation, Natural Laws and Explanation (Dordrecht: Kluwer,
      1999)

Information is a very powerful concept. It is a shame to restrict oneself to 
only a part of its possible applications.

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier
