Re: [Fis] Causation is transfer of information

2017-03-30 Thread Hector Zenil
> […] Information Originates in Symmetry Breaking
> <http://web.ncf.ca/collier/papers/infsym.pdf> (*Symmetry* 1996).
>

Very nice paper. I agree on symmetry breaking; I have similar ideas on how
symmetric rules can produce asymmetric information:
https://arxiv.org/abs/1210.1572
(published in the journal Natural Computing).

Best,

Hector Zenil
http://www.hectorzenil.net/


> I adopt what I call dynamical realism, that anything that is real is
> either dynamical or interpretable in dynamical terms. Not everyone will
> agree.
>
>
>
> John Collier
>
> Emeritus Professor and Senior Research Associate
>
> Philosophy, University of KwaZulu-Natal
>
> http://web.ncf.ca/collier
>
>
>
> *From:* Guy A Hoelzer [mailto:hoel...@unr.edu]
> *Sent:* Wednesday, 29 March 2017 1:44 AM
> *To:* Sungchul Ji <s...@pharmacy.rutgers.edu>; Terry Deacon <
> dea...@berkeley.edu>; John Collier <colli...@ukzn.ac.za>; Foundations of
> Information Science Information Science <fis@listas.unizar.es>
>
> *Subject:* Re: [Fis] Causation is transfer of information
>
>
>
> Greetings all,
>
>
>
> It seems that the indigestion from competing definitions of ‘information’
> is hard to resolve, and I agree with Terry and others that a broad
> definition is preferable.  I also think it is not a problem to allow
> multiple definitions that can be operationally adopted in appropriate
> contexts.  In some respects, apparently competing definitions are actually
> reinforcing.  For example, I prefer to use ‘information’ to describe any
> difference (a distinction or contrast), while a subset of all differences
> are ones that ‘make a difference’ to an observer.  When we restrict
> ‘information’ to differences that make a difference, it becomes inherently
> subjective.  That is certainly not a problem if you are interested in
> subjectivity, but it would rule out the rational study of objective
> ‘information’, which I think holds great promise for understanding
> dynamical systems.  I don’t see any conflict between ‘information’ as
> negentropy and ‘information’ as a basis for decision making.  On the other
> hand, semantics and semiotics involve the attachment of meaning to
> information, which strikes me as a separate and complementary idea.
> Therefore, I think it is important to sustain this distinction explicitly
> in what we write.  Maybe there is a context in which ‘information’ and
> ‘meaning’ are so intertwined that they cannot be isolated, but I can’t
> think of one.  I’m sure there are plenty of contexts in which the
> important thing is ‘meaning’ and yet the (more general, IMHO) term
> ‘information’ is used instead.  I think it is fair to say that you can
> have information without meaning, but you can’t have meaning without
> information.  Can anybody think of a way in which it might be misleading
> if this distinction were generally accepted?
>
>
>
> Regards,
>
>
>
> Guy
>
>
>
>
>
> On Mar 28, 2017, at 3:26 PM, Sungchul Ji <s...@pharmacy.rutgers.edu> wrote:
>
>
>
> Hi Fisers,
>
>
>
> I agree with Terry that "information" has three irreducible aspects ---
> *amount*, *meaning*, and *value*.  These may somehow be related to
> another triadic relation called the ITR as depicted below, although I don't
> know the exact rule of mapping between the two triads.  Perhaps 'amount' =
> f, 'meaning' = g, and 'value' = h?
>
>
>
>                    f                  g
>       Object ------------> Sign ------------> Interpretant
>          |                                          ^
>          |                                          |
>          |__________________________________________|
>                               h
>
>
>
> *Figure 1.*  The *Irreducible Triadic Relation* (ITR) of semiosis (also
> called sign process or communication), first clearly articulated by Peirce
> to the best of my knowledge. *Warning*: Peirce often replaces Sign with
> Representamen and represents the whole triad, i.e., Figure 1
> itself (although he did not use such a figure in his writings), as the Sign.
> Not distinguishing between these two very different uses of the same word
> "Sign" can lead to semiotic confusions.  The three processes are defined
> as follows: f = sign production, g = sign interpretation, h = information
> flow (other ways of labeling the arrows are not excluded).  Each process
> or arrow reads "determines", "leads to", "is presupposed by", etc.

Re: [Fis] Causation is transfer of information

2017-03-30 Thread John Collier
Interesting papers. I have a few remarks, but no time right now. I heartily 
agree with your general point.

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Hector Zenil [mailto:hzen...@gmail.com]
Sent: Wednesday, 29 March 2017 11:00 AM
To: Terrence W. DEACON <dea...@berkeley.edu>
Cc: fis <fis@listas.unizar.es>
Subject: Re: [Fis] Causation is transfer of information

With all due respect, I am still amazed at how thoroughly the science and math 
around information developed in the last 50-60 years are ignored and neglected! 
Most people here cite, at best, only Shannon entropy, while completely 
neglecting algorithmic complexity, logical depth, quantum information and so 
on. Your philosophical discussions are quite empty if most people ignore the 
progress that computer science and math have made in the last 60 years! Please 
take this constructively. This should be a source of shame for the whole field 
of Philosophy of Information and FIS.

Perhaps I can help alleviate this a little, even if it feels wrong to point you 
to my own papers on subjects relevant to this philosophical discussion:

http://www.hectorzenil.net/publications.html

They do address the meaning and value of information beyond Shannon entropy. 
For example, paper J21:

- Natural Scene Statistics Mediate the Perception of Image Complexity 
(available online at 
http://www.tandfonline.com/doi/abs/10.1080/13506285.2014.950365; a PDF preprint 
is also available on the arXiv)

and

- Rare Speed-up in Automatic Theorem Proving Reveals Tradeoff Between 
Computational Time and Information Value (https://arxiv.org/abs/1506.04349).

And we even show how Shannon entropy fails at the most basic level:

Low Algorithmic Complexity Entropy-deceiving Graphs 
(https://arxiv.org/abs/1608.05972)

Best Regards,

Hector Zenil


On Tue, Mar 28, 2017 at 10:14 PM, Terrence W. DEACON 
<dea...@berkeley.edu<mailto:dea...@berkeley.edu>> wrote:
>
> Dear FIS colleagues,
>
> I agree with John Collier that we should not presume to restrict the concept 
> of information to only one subset of its potential applications. But to work 
> with this breadth of usage we need to recognize that 'information' can refer 
> to intrinsic statistical properties of a physical medium, extrinsic 
> referential properties of that medium (i.e. content), and the significance or 
> use value of that content, depending on the context.  A problem arises when 
> we demand that only one of these uses should be given legitimacy. As I have 
> repeatedly suggested on this listserve, it will be a source of constant 
> useless argument to make the assertion that someone is wrong in their 
> understanding of information if they use it in one of these non-formal ways. 
> But to fail to mark which conception of information is being considered, or 
> worse, to use equivocal conceptions of the term in the same argument, will 
> ultimately undermine our efforts to understand one another and develop a 
> complete general theory of information.
>
> This nominalization of 'inform' has been in use for hundreds of years in 
> legal and literary contexts, in all of these variant forms. But there has 
> been a slowly increasing tendency to use it to refer to the 
> information-bearing medium itself, in substantial terms. This reached its 
> greatest extreme with the restricted technical usage formalized by Claude 
> Shannon. Remember, however, that this was only introduced a little over a 
> half century ago. When one of his mentors (Hartley) initially introduced a 
> logarithmic measure of signal capacity he called it 'intelligence' — as in 
> the gathering of intelligence by a spy organization. So had Shannon chosen to 
> stay with that usage, the confusions could have been worse (think about how 
> confusing it would have been to talk about the entropy of intelligence). Even 
> so, Shannon himself was to later caution against assuming that his use of the 
> term 'information' applied beyond its technical domain.
>
> So despite the precision and breadth of application that was achieved by 
> setting aside the extrinsic relational features that characterize the more 
> colloquial uses of the term, this does not mean that these other uses are in 
> some sense non-scientific. And I am not alone in the belief that these 
> non-intrinsic properties can also (eventually) be strictly formalized and 
> thereby contribute insights to such technical fields as molecular biology and 
> cognitive neuroscience.
>
> As a result I think that it is

Re: [Fis] Causation is transfer of information

2017-03-30 Thread Sungchul Ji
Hi Soeren and FISers,


(1) Tychism is intrinsic to the Planckian information, since the latter is 
defined as the binary logarithm of the ratio of the area under the curve (AUC) 
of the Planckian Distribution Equation (PDE) over the AUC of the Gaussian-like 
Equation (GLE):


      I_P  =  log_2 [ AUC(PDE) / AUC(GLE) ]


Tychism is implied in the GLE.
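
A minimal numerical sketch of this definition in Python (the functional forms 
below are illustrative stand-ins assumed for the sketch, not the actual PDE and 
GLE):

    import numpy as np

    # Illustrative stand-ins (assumptions for this sketch): a blackbody-like
    # long-tailed curve for the PDE and a Gaussian for the GLE.
    def pde(x, a=1.0, b=3.0):
        return a / (x**5 * (np.exp(b / x) - 1.0))

    def gle(x, mu=1.0, sigma=0.5):
        return np.exp(-((x - mu) ** 2) / (2 * sigma**2))

    x = np.linspace(0.05, 10.0, 10_000)

    def auc(y):
        # trapezoidal area under the curve on the grid x
        return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

    # Planckian information: binary log of the ratio of the two areas.
    I_P = np.log2(auc(pde(x)) / auc(gle(x)))
    print(f"I_P = {I_P:.3f} bits")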


(2)  The Planckian processes are defined as those physicochemical or formal 
processes that generate long-tailed histograms (or their superpositions) 
fitting the PDE (or its superpositions).  The Planckian process seems 
irreducibly triadic in the Peircean sense:



                     f                              g
   Random processes ------> Long-tailed histograms ------> PDE
     (Firstness)                (Secondness)           (Thirdness)
          |                                                  ^
          |                                                  |
          |__________________________________________________|
                                  h


Figure 2.  The Irreducible Triadic Relation (ITR) embodied in the Planckian 
processes.  f = selection process, either natural or artificial; g = 
mathematical modeling; h = grounding, correspondence, or information flow.
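
A toy example of a formal process generating such a long-tailed histogram (my 
own minimal sketch; the multiplicative process below is merely one illustrative 
generator, not a claim about the mechanisms behind the PDE):

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy "formal process": products of many positive random factors,
    # a classic route to long-tailed (lognormal-like) histograms.
    samples = np.prod(rng.uniform(0.5, 1.5, size=(100_000, 20)), axis=1)

    counts, _ = np.histogram(samples, bins=60)
    # The mode sits in the first few of 60 bins while the tail stretches far
    # to the right -- the kind of histogram one would then try to fit with
    # the PDE.
    print("modal bin:", int(np.argmax(counts)), "of 60")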


(3)  (to be continued)


All the best.


Sung






From: Søren Brier <sbr@cbs.dk>
Sent: Wednesday, March 29, 2017 7:06 PM
To: Sungchul Ji; Terrence W. DEACON; John Collier
Cc: fis
Subject: RE: [Fis] Causation is transfer of information


Dear Sung



It is difficult for me to say, as you do not make your metaphysical framework 
explicit.  This was the great work Peirce did. I am pretty sure you do not have 
a dynamic triadic process concept of semiosis based on a tychastic theory of 
Firstness as potential qualia or forms of feeling, of which information is only 
an aspect.



Best

   Søren



From: Sungchul Ji [mailto:s...@pharmacy.rutgers.edu]
Sent: 29. marts 2017 20:35
To: Søren Brier; Terrence W. DEACON; John Collier
Cc: fis
Subject: Re: [Fis] Causation is transfer of information



Hi Soeren,



Can you be more specific about which aspects of the proposal described in my 
previous emails you think are my own and have nothing to do with (or are even 
based on my misinterpretation of) Peirce?



Thanks in advance.



Sung











From: Søren Brier <sbr@cbs.dk<mailto:sbr@cbs.dk>>
Sent: Wednesday, March 29, 2017 2:10 PM
To: Sungchul Ji; Terrence W. DEACON; John Collier
Cc: fis
Subject: RE: [Fis] Causation is transfer of information



Dear Sung



I suggest you call this your own theory and make your own definitions of terms, 
because you confuse things by attempting to draw on Peirce. There is a whole 
process philosophy with synechism, tychism, agapism and Scholastic realism, 
plus a phenomenological and mathematically based triadic metaphysics, at the 
basis of Peirce’s concepts, which is the fruit of his life’s work. I do not 
think you are ready to carry that load; it takes many years to understand 
fully. The ‘sign’ is a triadic process of representamen, object and 
interpretant working in the realm of Firstness, Secondness and Thirdness, in a 
society at large or a society of researchers devoted to the search for truth, 
producing the meaning of signs, which, when developed into propositional 
arguments, can be tested in the fallible scientific process of generating more 
rationality in culture as well as nature.



Best

   Søren



From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Sungchul Ji
Sent: 29. marts 2017 00:27
To: Terrence W. DEACON; John Collier
Cc: fis
Subject: Re: [Fis] Causation is transfer of information



Hi Fisers,



I agree with Terry that "information" has three irreducible aspects --- amount, 
meaning, and value.  These may somehow be related to another triadic relation 
called the ITR as depicted below, although I don't know the exact rule of 
mapping between the two triads.  Perhaps 'amount' = f, 'meaning' = g, and 
'value' = h?



                   f                  g
      Object ------------> Sign ------------> Interpretant

Re: [Fis] Causation is transfer of information

2017-03-29 Thread Sungchul Ji
Hi Soeren,


Can you be more specific about which aspects of the proposal described in my 
previous emails you think are my own and have nothing to do with (or are even 
based on my misinterpretation of) Peirce?


Thanks in advance.


Sung






From: Søren Brier <sbr@cbs.dk>
Sent: Wednesday, March 29, 2017 2:10 PM
To: Sungchul Ji; Terrence W. DEACON; John Collier
Cc: fis
Subject: RE: [Fis] Causation is transfer of information


Dear Sung



I suggest you call this your own theory and make your own definitions of terms, 
because you confuse things by attempting to draw on Peirce. There is a whole 
process philosophy with synechism, tychism, agapism and Scholastic realism, 
plus a phenomenological and mathematically based triadic metaphysics, at the 
basis of Peirce’s concepts, which is the fruit of his life’s work. I do not 
think you are ready to carry that load; it takes many years to understand 
fully. The ‘sign’ is a triadic process of representamen, object and 
interpretant working in the realm of Firstness, Secondness and Thirdness, in a 
society at large or a society of researchers devoted to the search for truth, 
producing the meaning of signs, which, when developed into propositional 
arguments, can be tested in the fallible scientific process of generating more 
rationality in culture as well as nature.



Best

   Søren



From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Sungchul Ji
Sent: 29. marts 2017 00:27
To: Terrence W. DEACON; John Collier
Cc: fis
Subject: Re: [Fis] Causation is transfer of information



Hi Fisers,



I agree with Terry that "information" has three irreducible aspects --- amount, 
meaning, and value.  These may somehow be related to another triadic relation 
called the ITR as depicted below, although I don't know the exact rule of 
mapping between the two triads.  Perhaps 'amount' = f, 'meaning' = g, and 
'value' = h?



                   f                  g
      Object ------------> Sign ------------> Interpretant
         |                                          ^
         |                                          |
         |__________________________________________|
                              h



Figure 1.  The Irreducible Triadic Relation (ITR) of semiosis (also called sign 
process or communication), first clearly articulated by Peirce to the best of 
my knowledge. Warning: Peirce often replaces Sign with Representamen and 
represents the whole triad, i.e., Figure 1 itself (although he did not use such 
a figure in his writings), as the Sign. Not distinguishing between these two 
very different uses of the same word "Sign" can lead to semiotic confusions.  
The three processes are defined as follows: f = sign production, g = sign 
interpretation, h = information flow (other ways of labeling the arrows are not 
excluded).  Each process or arrow reads "determines", "leads to", "is 
presupposed by", etc., and the three arrows constitute a commutative triangle 
of category theory, i.e., f x g = h, meaning f followed by g leads to the same 
result as h (in standard notation, h = g ∘ f).
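
As a quick illustration of this commutativity in code (the three maps below 
are hypothetical placeholders, not anything drawn from Peirce):

    # Toy check that the ITR triangle commutes: f followed by g equals h.
    def f(obj):            # sign production: Object -> Sign
        return "sign({})".format(obj)

    def g(sign):           # sign interpretation: Sign -> Interpretant
        return "interpretant({})".format(sign)

    def h(obj):            # information flow: Object -> Interpretant, directly;
        return g(f(obj))   # defined so the triangle commutes by construction

    assert h("object") == g(f("object"))   # f x g = h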



I started using the so-called ITR template, Figure 1, about 5 years ago, and 
the main reason I am bringing it up here is to ask your critical opinion on my 
suggestion published in 2012 (Molecular Theory of the Living Cell: Concepts, 
Molecular Mechanisms, and Biomedical Applications, Springer New York, p ~100 ?) 
that there are two kinds of causality -- (i) the energy-dependent causality 
(identified with Processes f and g in Figure 1) and (ii) the information (and 
hence code)-dependent causality (identified with Process h).  For convenience, 
I coined the term 'codality' to refer to the latter, to contrast it with the 
traditional term causality.



I wonder if we can view John's idea of the relation between 'information' and 
'cause' as an alternative way of expressing the same ideas as the 
"energy-dependent causality" or the "codality" defined in Figure 1.



All the best.



Sung







From: Fis <fis-boun...@listas.unizar.es<mailto:fis-boun...@listas.unizar.es>> 
on behalf of Terrence W. DEACON 
<dea...@berkeley.edu<mailto:dea...@berkeley.edu>>
Sent: Tuesday, March 28, 2017 4:23:14 PM
To: John Collier
Cc: fis
Subject: Re: [Fis] Causation is transfer of information



Corrected typos (in case the intrinsic redundancy didn't compensate for these 
minor corruptions of the text):



 information-beqaring medium =  information-bearing medium

Re: [Fis] Causation is transfer of information

2017-03-29 Thread Hector Zenil
With all due respect, I am still amazed at how thoroughly the science and math
around information developed in the last 50-60 years are ignored and neglected!
Most people here cite, at best, only Shannon entropy, while completely
neglecting algorithmic complexity, logical depth, quantum information and so
on. Your philosophical discussions are quite empty if most people ignore the
progress that computer science and math have made in the last 60 years! Please
take this constructively. This should be a source of shame for the whole field
of Philosophy of Information and FIS.

Perhaps I can help alleviate this a little, even if it feels wrong to point
you to my own papers on subjects relevant to this philosophical discussion:

http://www.hectorzenil.net/publications.html

They do address the meaning and value of information beyond Shannon entropy.
For example, paper J21:

- Natural Scene Statistics Mediate the Perception of Image Complexity
(available online at
http://www.tandfonline.com/doi/abs/10.1080/13506285.2014.950365; a PDF
preprint is also available on the arXiv)

and

- Rare Speed-up in Automatic Theorem Proving Reveals Tradeoff Between
Computational Time and Information Value (https://arxiv.org/abs/1506.04349).

And we even show how Shannon entropy fails at the most basic level:

Low Algorithmic Complexity Entropy-deceiving Graphs (
https://arxiv.org/abs/1608.05972)
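
A quick self-contained illustration of that gap (my own toy sketch here, not 
the graph construction of the paper): a string generated by a two-line program 
looks near-random to symbolwise Shannon entropy, even though the generating 
program itself is a complete, tiny description of it:

    import zlib
    from collections import Counter
    from math import log2

    # Algorithmically trivial string: binary expansions of 0,1,2,...
    # concatenated (the two lines above and below fully describe it).
    s = "".join(format(i, "b") for i in range(50_000))
    n = len(s)

    # Symbolwise Shannon entropy: close to 1 bit/symbol, near-maximal for a
    # binary alphabet -- the string looks almost random to frequency counts.
    H = -sum(c / n * log2(c / n) for c in Counter(s).values())

    # A generic compressor gives a crude, very loose computable upper bound
    # on description length; the true bound here is the tiny program itself,
    # i.e. the algorithmic complexity is on the order of log n, not n.
    k_bits = 8 * len(zlib.compress(s.encode(), 9))

    print(f"Shannon entropy: {H:.4f} bits/symbol over {n:,} symbols")
    print(f"zlib description length: {k_bits:,} bits")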

Best Regards,

Hector Zenil


On Tue, Mar 28, 2017 at 10:14 PM, Terrence W. DEACON 
wrote:
>
> Dear FIS colleagues,
>
> I agree with John Collier that we should not presume to restrict the
concept of information to only one subset of its potential applications.
But to work with this breadth of usage we need to recognize that
'information' can refer to intrinsic statistical properties of a physical
medium, extrinsic referential properties of that medium (i.e. content), and
the significance or use value of that content, depending on the context.  A
problem arises when we demand that only one of these uses should be given
legitimacy. As I have repeatedly suggested on this listserve, it will be a
source of constant useless argument to make the assertion that someone is
wrong in their understanding of information if they use it in one of these
non-formal ways. But to fail to mark which conception of information is
being considered, or worse, to use equivocal conceptions of the term in the
same argument, will ultimately undermine our efforts to understand one
another and develop a complete general theory of information.
>
> This nominalization of 'inform' has been in use for hundreds of years in
legal and literary contexts, in all of these variant forms. But there has
been a slowly increasing tendency to use it to refer to the
information-bearing medium itself, in substantial terms. This reached its
greatest extreme with the restricted technical usage formalized by Claude
Shannon. Remember, however, that this was only introduced a little over a
half century ago. When one of his mentors (Hartley) initially introduced a
logarithmic measure of signal capacity he called it 'intelligence' — as in
the gathering of intelligence by a spy organization. So had Shannon chosen
to stay with that usage, the confusions could have been worse (think about
how confusing it would have been to talk about the entropy of
intelligence). Even so, Shannon himself was to later caution against
assuming that his use of the term 'information' applied beyond its
technical domain.
>
> So despite the precision and breadth of application that was achieved by
setting aside the extrinsic relational features that characterize the more
colloquial uses of the term, this does not mean that these other uses are
in some sense non-scientific. And I am not alone in the belief that these
non-intrinsic properties can also (eventually) be strictly formalized and
thereby contribute insights to such technical fields as molecular biology
and cognitive neuroscience.
>
> As a result I think that it is legitimate to argue that information (in
the referential sense) is only in use among living forms, that an alert
signal sent by the computer in an automobile engine is information (in both
senses, depending on whether we include a human interpreter in the loop),
or that information (in the intrinsic sense of a medium property) is lost
within a black hole, or that it can be used to provide a more precise
conception of physical cause (as in Collier's sense). These different uses
aren't unrelated to each other. They are just asymmetrically dependent on
one another, such that medium-intrinsic properties can be investigated
without considering referential properties, but not vice versa.
>
> It's time we move beyond terminological chauvinism 

Re: [Fis] Causation is transfer of information

2017-03-28 Thread Robert E. Ulanowicz
In order:

John,

I agree. For example, if one identifies information with constraint, the
notion of information as causation becomes tautologous. It also feeds into
the notion of "It from bit"!

Terry,

I agree, best to remain as catholic as possible in our conception of the
notion.

Otto:

Spot-on! Feedbacks among non-living components provided the cradle for the
early emergence and proliferation of information. (See p. 147ff.)

Cheers to all,
Bob U.

> Dear all,
> Just to comment on the discussion after Terrence's apt cautionary words...
>
> The various notions of information are partially a linguistic confusion,
> partially a relic of multiple conceptual histories colliding, and partially
> an ongoing negotiation (or even a war, to state it less creditably and with
> less civility) about the future of the term as a (more or less unified)
> scientific concept.
>
> To latch onto that negotiation, let me propose that an evolutionary
> approach to information can capture and explain some of that ambiguous
> multiplicity in terminology, by showing how pre-biotic natural processes
> developed feedback loops and material encoding techniques - which was a
> type of localised informational emergence - and how life, in developing
> cellular communication, DNA, sentience, memory, and selfhood, rarefied this
> process further, producing informational processing such as had never
> existed before. Was it the same information? Or was it something new?
>
> Human consciousness and cultural semiosis are a yet higher-level adaptation
> of information, and computer A.I. is something else entirely, for - at
> least for now - it lacks feelings and self-awareness and thus "meaning" in
> the human sense. But it computes, stores and processes. It might even
> develop suprasentience whose structure we cannot fathom based on our
> limited human perspective.  Is it still the same type of information? Or
> something different? Is evolution in quality (emergence) or only in
> quantity (continuous development)?
>
> I generally take the Peircean view that signification (informative
> relationality) evolves, and information, as an offshoot of that, is thus a
> multi-stage process - EVEN if it has a simple and predictable elemental
> substructure (composed of say, 1s and 0s, or quarks and bosons).
>
> Information might thus not only have a complex history of emergence, but
> also an unknown future, composed of various leaps in cosmic organization.
>
> In ignorant wonder, all the best,
>
> Otto Lehto,
>
> philosopher, political economist,
> PhD student at King's College London,
> webpage: www.ottolehto.com,
> cellphone: +358-407514748
>
> On Mar 28, 2017 23:24, "Terrence W. DEACON"  wrote:
>
>> Corrected typos (in case the intrinsic redundancy didn't compensate for
>> these minor corruptions of the text):
>>
>>  information-beqaring medium =  information-bearing medium
>>
>> appliction = application
>>
>>  conceptiont =  conception
>>
>> On Tue, Mar 28, 2017 at 10:14 PM, Terrence W. DEACON
>> 
>> wrote:
>>
>>> Dear FIS colleagues,
>>>
>>> I agree with John Collier that we should not presume to restrict the
>>> concept of information to only one subset of its potential
>>> applications.
>>> But to work with this breadth of usage we need to recognize that
>>> 'information' can refer to intrinsic statistical properties of a
>>> physical
>>> medium, extrinsic referential properties of that medium (i.e. content),
>>> and
>>> the significance or use value of that content, depending on the
>>> context.  A
>>> problem arises when we demand that only one of these uses should be
>>> given
>>> legitimacy. As I have repeatedly suggested on this listserve, it will
>>> be a
>>> source of constant useless argument to make the assertion that someone
>>> is
>>> wrong in their understanding of information if they use it in one of
>>> these
>>> non-formal ways. But to fail to mark which conception of information is
>>> being considered, or worse, to use equivocal conceptions of the term in
>>> the
>>> same argument, will ultimately undermine our efforts to understand one
>>> another and develop a complete general theory of information.
>>>
>>> This nominalization of 'inform' has been in use for hundreds of years
>>> in
>>> legal and literary contexts, in all of these variant forms. But there
>>> has
>>> been a slowly increasing tendency to use it to refer to the
>>> information-bearing medium itself, in substantial terms. This reached
>>> its
>>> greatest extreme with the restricted technical usage formalized by
>>> Claude
>>> Shannon. Remember, however, that this was only introduced a little over
>>> a
>>> half century ago. When one of his mentors (Hartley) initially
>>> introduced a
>>> logarithmic measure of signal capacity he called it 'intelligence' —
>>> as in
>>> the gathering of intelligence by a spy organization. So had