Re: [Fis] Is information physical? 'Signs rust.'

2018-04-27 Thread Karl Javorszky
This is a literary-level exposition of a view, in the category of the
Confessiones. The confidence of a philosopher, like that of a poet, that
his words can be understood, even though they come from a subjective,
individual perspective, is well rewarded if the worldview can indeed be
understood.

Two aspects on which I'd like to comment:

1. If this general, almighty, versatile, ubiquitous something is such a
wonder thing - what then distinguishes this construct from the concepts of
theology? Bruno has been advancing the idea that, insofar as the problems we
discuss here are of a deep nature, our forefathers will have discussed them
already, each in their own generation, using the concepts available
in their respective times, and these were drawn from theological lexica. Therefore,
as I understand Bruno to say, we should not be alienated by the reappearance
of theological ideas. And here we experience a globality of potentials
ascribed to an idea, in the beautiful sonnet above by Joseph, which does
come near to ancient beliefs. The approach is welcome, because we are trying
to catch a metamorphosing beast, which we call information.

2. No day shall pass without mentioning the cycles.  Could we interpret the
"patterns of energy flow" as some kinds of filaments, paths, levels,
densities, probabilities, predictabilities? If we un-anchor our concepts of
"how much determines where", then we have a continuous rearrangement, with
many patterns in it.

The numbers show unequivocal, solid, rational support for what Joseph
described above as the main characteristics of the idea of information.

Karl



joe.bren...@bluewin.ch wrote on Thu., 26 Apr.
2018 16:33:

> Information refers to changes in patterns of energy flow, some slow
> (frozen), some fast, some quantitative and measurable, some qualitative and
> non-measurable, some meaningful and some meaningless, partly causally
> effective and partly inert, partly present and partly absent, all at the
> same time.
>
> Best wishes,
>
> Joseph
>

Re: [Fis] Is information physical? 'Signs rust.'

2018-04-27 Thread Guy A Hoelzer
Joseph,

Thank you for this concise statement. It very closely matches my own
perspective. I would only add the notion that meaningfulness or
meaninglessness is not an inherent property of information. It is entirely
contingent upon the effect, or the absence of effect, that encountered
information has on an agent.

Regards,

Guy




Re: [Fis] Is information physical? 'Signs rust.'

2018-04-26 Thread Mark Johnson
Dear Joseph,

Thank you for this beautiful summary.

That describes the world, doesn't it? (It also describes music, which is a
good sign.)

I want to say why information matters to me, not to argue about what it is. 

Information matters because it enables these conversations which dissolve 
barriers between disciplines, and ultimately has the capacity to dissolve 
barriers between each of us.

Information is such a powerful concept because everyone thinks they know what 
it is. Really, the conversation is the important thing. We may think we argue, 
but we are all in this dance together. It's always a privilege to have one's 
certainties shattered - who'd have thought the information in email messages 
could be so powerful?!

Best wishes,

Mark


Re: [Fis] Is information physical? 'Signs rust.'

2018-04-26 Thread joe.bren...@bluewin.ch
Information refers to changes in patterns of energy flow, some slow (frozen), 
some fast, some quantitative and measurable, some qualitative and 
non-measurable, some meaningful and some meaningless, partly causally effective 
and partly inert, partly present and partly absent, all at the same time.

Best wishes,

Joseph

>Message d'origine
>De : u...@umces.edu
>Date : 25/04/2018 - 08:14 (PDT)
>À : mbur...@math.ucla.edu
>Cc : fis@listas.unizar.es
>Objet : Re: [Fis] Is information physical?
>
>Dear Mark,
>
>I share your inclination, albeit from a different perspective.
>
>Consider the two statements:
>
>1. Information is impossible without a physical carrier.
>
>2. Information is impossible without the influence of that which does not 
>exist.
>
>There is significant truth in both statements.
>
>I know that Claude Shannon is not a popular personality on FIS, but I
>admire how he first approached the subject. He began by quantifying,
>not information in the intuitive, positivist sense, but rather the
>*lack* of information, or "uncertainty", as he put it. Positivist
>information thereby becomes a double negative -- any decrease in
>uncertainty.
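Bob's "double negative" can be sketched numerically. A minimal illustration (the distributions and function names below are invented for the example, not taken from the original exchange):

```python
# Sketch: "positivist" information as a decrease in Shannon uncertainty.
# The distributions are made up for illustration.
from math import log2

def entropy(p):
    """Shannon uncertainty H(p) in bits."""
    return -sum(x * log2(x) for x in p if x > 0)

prior = [0.25, 0.25, 0.25, 0.25]   # four equally likely alternatives
posterior = [0.5, 0.5, 0.0, 0.0]   # an observation rules out two of them

# Information gained = prior uncertainty minus remaining uncertainty.
info_gained = entropy(prior) - entropy(posterior)
print(info_gained)  # 2.0 - 1.0 = 1.0 bit
```

The quantity measured is the *lack* that was removed, not any positive substance in the message itself.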
>
>In short, the quantification of information begins by quantifying
>something that does not exist, but nonetheless is related to that
>which does. Terry calls this lack the "absential", I call it the
>"apophatic" and it is a major player in living systems!
>
>Karl Popper finished his last book with the exhortation that we need
>to develop a "calculus of conditional probabilities". Well, that
>effort was already underway in information theory. Using conditional
>probabilities allows one to parse Shannon's formula for diversity into
>two terms -- one being positivist information (average mutual
>information) and the other apophasis (conditional entropy).
><https://people.clas.ufl.edu/ulan/files/FISPAP.pdf>
>
>This duality in nature is evident but often unnoticed in the study of
>networks. Most look at networks and immediately see the constraints
>between nodes. And so it is. But there is also indeterminacy in almost
>all real networks, and this often is disregarded. The proportions
>between constraint and indeterminacy can readily be calculated.
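The parsing Bob describes rests on the identity H(X,Y) = I(X;Y) + H(X|Y) + H(Y|X): total diversity splits into a constraint term (average mutual information) and an indeterminacy term (conditional entropy). A sketch with a hypothetical two-node flow matrix (the numbers are invented):

```python
# Sketch: splitting the Shannon diversity of a joint distribution into
# constraint (average mutual information) and indeterminacy
# (conditional entropy). The flow matrix is hypothetical.
from math import log2

p = [[0.30, 0.10],        # p[i][j]: joint probability of flow
     [0.05, 0.55]]        # from source i to sink j

px = [sum(row) for row in p]            # source marginals
py = [sum(col) for col in zip(*p)]      # sink marginals

H = -sum(q * log2(q) for row in p for q in row if q > 0)   # total diversity
I = sum(q * log2(q / (px[i] * py[j]))                      # mutual information
        for i, row in enumerate(p)
        for j, q in enumerate(row) if q > 0)
Hc = H - I                # conditional entropy: the apophatic remainder

print(f"constraint share:    {I / H:.3f}")
print(f"indeterminacy share: {Hc / H:.3f}")
```

Both shares are strictly positive here, which is the point: real networks carry constraint and indeterminacy at the same time, and their proportions can be read off directly.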
>
>What is important in living systems (and I usually think of the more
>indeterminate ecosystems, rather than organisms [but the point applies
>there as well]) is that some degree of conditional entropy is
>absolutely necessary for systems sustainability, as it provides the
>flexibility required to construct new responses to novel challenges.
>
>While system constraint usually abets system performance, systems that
>become too efficient do so by decreasing their (mutually exclusive)
>flexibility and become progressively vulnerable to collapse.
>
>The lesson for evolutionary theory is clear. Survival is not always a
>min/max (fitt*est*) issue. It is about a balance between adaptation
>and adaptability. Ecosystems do not attain maximum efficiency. To do
>so would doom them.
><https://people.clas.ufl.edu/ulan/files/ECOCOMP2.pdf> The balance also
>puts the lie to a major maxim of economics, which is that nothing
>should hinder the efficiency of the market. That's a recipe for "boom
>and bust". <https://people.clas.ufl.edu/ulan/files/Crisis.pdf>
>
>Mark, I do disagree with your opinion that information cannot be
>measured. The wider application of information theory extends beyond
>communication and covers the information inherent in structure, or
>what John Collier calls "enformation". Measurement is extremely
>important there. Perhaps you are disquieted by the relative nature of
>information measurements. Such relativity is inevitable. Information
>can only be measured with respect to some (arbitrary) reference
>distribution (which is also known in the wider realm of thermodynamics
>as "the third law".)
>
>Remember how Bateson pointed to the overwhelmingly positivist nature
>of physics. Classical physics is deficient in its lack of recognition
>of the apophatic. Information theory cures that.
>
>Yes, information requires a material carrier. It also is intimately
>affected by and requires nonmaterial apophasis.
>
>Best wishes,
>Bob
>
>On 4/24/18, Burgin, Mark  wrote:
>> Dear Colleagues,
>>
>> I would like to suggest the new topic for discussion
>>
>>Is information physical?
>>
>> My opinion is presented below:
>>
>> Why some people erroneously think that information is physical
>>
>> The main reason to think that information is physical is the strong
>> belief of many people, especially scientists, that there is only
>> physical reality, which is studied by science. At the same time, people
>> encounter something that they call information.
>>
>> When people receive a letter, they comprehend that it is information
>> because with the letter they receive information. The letter is
>> physical, i.e., a physical object. As a result, people start thinking
>> that information is physical