Re: [Fis] Fw: "Mechanical Information" in DNA

2016-06-10 Thread Francesco Rizzo
Dear Stan and dear All,
the subsumption hierarchy has been the epistemological and logical foundation of my
"Nuova economia" (New Economics) for 35 years. This is a nice synchronicity between
Stan's inter-action, or reciprocal causality, or methodological com-penetration,
and my ortho-praxis.
Thank you for the confirmation and the comfort that I receive from it.
Francesco


Re: [Fis] Fw: "Mechanical Information" in DNA

2016-06-10 Thread Karl Javorszky
Dear FIS,

now there is a voice discussing the concepts and methods of counting. This
is highly encouraging.

Taken together with the overall theme of the discussion, "Mechanical Information
in DNA", it seems that at least some members of FIS are beginning to address the
question of HOW the transfer of information from a sequence (the DNA) into an
organism (a non-sequenced, commutative multitude) can take place, and does take
place.

Those members of FIS who have been with this chat group for longer than a few
months will have noticed that I insist there exists a very nice and neat
algorithm connecting unidimensional descriptions (like the DNA) with
pluridimensional assemblies (like the organism).

I have prepared an explanation which includes drawings with red and blue arrows
and makes it impossible not to understand how the transfer of genetic
information takes place.

The treatise has 55 pages and is easy to understand. You can obtain it through
the publisher (Morawa, Wien), but it came out only this week, so at present I
have only the proof copies. These I can send to interested persons.

Please contact me for details. If you are interested in Information Theory,
this is the work that simplifies the question(s) into interpretations of
a+b=c.

The first 100 buyers of the work will get a personally hand-signed copy.
There is a money-back guarantee: if the treatise you buy is not a
state-of-the-art exercise in the philosophy of the logical language, opening up
algorithms that connect descriptions of linear sequences with descriptions
of pluridimensional assemblies, with easy examples and easy-to-follow
deictic definitions, you will be refunded on sending back the copy.

Let me express once again the hope that there are some among the
subscribers to the FIS list who are interested in how information
processing in biology actually takes place.

Karl

Re: [Fis] Fw: "Mechanical Information" in DNA

2016-06-09 Thread Stanley N Salthe
Regarding your last posting, I agree, and would formulate the following
subsumption hierarchy:

{thermodynamic energy flows {Shannon information theory {Peircean semiotics}}}

STAN


Re: [Fis] Fw: "Mechanical Information" in DNA

2016-06-09 Thread Mark Johnson
Dear all,

Is this a question about counting? I'm thinking that Ashby noted that Shannon 
information is basically counting. What do we do when we count something?
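To make that point concrete (a minimal sketch, not part of the original message; the formula is the standard Shannon measure): for N equally likely alternatives, the information reduces to log2(N), that is, to counting the alternatives.

    import math

    def shannon_entropy_bits(probabilities):
        # Shannon entropy H = -sum(p * log2(p)) of a discrete distribution, in bits.
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # Eight equally likely alternatives: the entropy is just the log of the count.
    n = 8
    uniform = [1.0 / n] * n
    print(shannon_entropy_bits(uniform))  # 3.0 bits
    print(math.log2(n))                   # 3.0 bits -- counting and measuring coincide here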

Analogy is fundamental - how things are seen to be the same may be more 
important than how they are seen to be different. 

It seems that this example of DNA is a case where knowledge advances because 
what was once thought to be the same (for example, perceived empirical 
regularities in genetic analysis) is later identified to be different in 
identifiable ways.

Science has tended to assume that by observing regularities, causes can be 
discursively constructed. But maybe another way of looking at it is to say what 
is discursively constructed are the countable analogies between events. 
Determining analogies constrains perception of what is countable, and by 
extension what we can say about nature; new knowledge changes that perception.

Information theory (Shannon) demands that analogies are made explicit - the
indices have to be agreed. What do we count? Why x? Why not y? Otherwise the
measurements make no sense. I think this is an insight that Ashby had, and why
he championed Information Theory as analogous to his Law of Requisite Variety
(incidentally, Keynes's Treatise on Probability contains a similar idea about
analogy and knowledge). Is there any reason why the "relations of production"
in a mechanism shouldn't be counted? Determining the analogies is the key
thing, isn't it?

One further point is that determining analogies in theory is different from 
measuring them in practice. Ashby's concept of cybernetics-as-method was: "the 
cyberneticist observes what might have happened but did not". There is a point 
where idealised analogies cannot map onto experience. Then we learn something 
new.

Best wishes,

Mark



Re: [Fis] Fw: "Mechanical Information" in DNA

2016-06-09 Thread Loet Leydesdorff
Dear colleagues, 

 

It seems to me that a definition of information should be compatible with the 
possibility to measure information in bits of information. Bits of information 
are dimensionless and “yet meaningless.” The meaning can be provided by the 
substantive system that is thus measured. For example, semantics can be 
measured using a semantic map; changes in the map can be measured as changes in 
the distributions, for example, of words. One can, for example, study whether 
change in one semantic domain is larger and/or faster than in another. The 
results (expressed in bits, dits or nits of information) can be provided with 
meaning by the substantive theorizing about the domain(s) under study. One may 
wish to call this “meaningful information”. 
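As one possible illustration of such a measurement (a sketch only: the word counts are invented, and the use of Kullback-Leibler divergence is an assumption of the example, not a method stated in this posting), the change between two snapshots of a word distribution can be expressed in bits:

    import math
    from collections import Counter

    def relative_entropy_bits(p_counts, q_counts):
        # Kullback-Leibler divergence D(p||q) in bits between two word-frequency Counters.
        # Add-one smoothing keeps words absent from one snapshot from producing log(0).
        vocab = set(p_counts) | set(q_counts)
        p_total = sum(p_counts[w] + 1 for w in vocab)
        q_total = sum(q_counts[w] + 1 for w in vocab)
        d = 0.0
        for w in vocab:
            p = (p_counts[w] + 1) / p_total
            q = (q_counts[w] + 1) / q_total
            d += p * math.log2(p / q)
        return d

    # Two invented snapshots of a semantic domain, represented as word frequencies.
    snapshot_2015 = Counter({"gene": 10, "code": 5, "information": 3})
    snapshot_2016 = Counter({"gene": 6, "code": 4, "information": 9, "constraint": 2})
    print(relative_entropy_bits(snapshot_2016, snapshot_2015))  # change expressed in bits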

 

I am aware that several authors have defined information as a difference that 
makes a difference (MacKay, 1969; Bateson, 1973). It seems to me that this is
“meaningful information”. Information is contained in just a series of 
differences or a distribution. Whether the differences make a difference seems 
to me a matter of statistical testing. Are the differences significant or not? 
If they are significant, they teach us about the (substantive!) systems under 
study, and can thus be provided with meaning in the terms of  studying these 
systems. 

 

Kauffman et al. (2008, at p. 28) define information as “natural selection 
assembling the very constraints on the release of energy that then constitutes 
work and the propagation of organization.” How can one measure this 
information? Can the difference that the differences in it make, be tested for 
their significance? 

 

Varela (1979, p. 266) argued that since the word “information” is derived from 
“in-formare,” the semantics call for the specification of a system of reference 
to be informed. The system of reference provides the information with meaning, 
but the meaning is not in the information which is “yet meaningless”. 
Otherwise, there are as many “informations” as there are systems of reference 
and the use of the word itself becomes a source of confusion.

 

In summary, it seems to me that the achievement of defining information more 
abstractly as measurement in bits (H = - Σ p log(p)) and the availability of 
statistics should not be ignored. From this perspective, information theory can 
be considered as another form of statistics (entropy statistics). A substantive 
definition of information itself is no longer meaningful (and perhaps even 
obscure): the expected information content of a distribution, or the information
contained in the message that an event has happened, can be expressed in bits
or other measures of information.
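A brief sketch of that formula with an invented distribution, showing how the base of the logarithm selects the unit (bits, dits or nits) mentioned above:

    import math

    def entropy(probabilities, base=2):
        # H = -sum(p * log(p)); base 2 gives bits, base 10 gives dits, base e gives nits.
        return -sum(p * math.log(p, base) for p in probabilities if p > 0)

    distribution = [0.5, 0.25, 0.125, 0.125]      # invented example distribution
    print(entropy(distribution, base=2))          # 1.75 bits
    print(entropy(distribution, base=10))         # ~0.53 dits
    print(entropy(distribution, base=math.e))     # ~1.21 nits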

 

Best,

Loet

 

Loet Leydesdorff
Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)
l...@leydesdorff.net ; http://www.leydesdorff.net/
Associate Faculty, SPRU, University of Sussex (http://www.sussex.ac.uk/spru/);
Guest Professor, Zhejiang Univ., Hangzhou (http://www.zju.edu.cn/english/);
Visiting Professor, ISTIC, Beijing (http://www.istic.ac.cn/Eng/brief_en.html);
Visiting Professor, Birkbeck, University of London (http://www.bbk.ac.uk/);
http://scholar.google.com/citations?user=ych9gNYJ=en

 


Re: [Fis] Fw: "Mechanical Information" in DNA

2016-06-09 Thread John Collier
I am inclined to agree with Joseph. That is why I put “mechanical information” 
in shudder quotes in my Subject line.

On the other hand, one of the benefits of an information approach is that one 
can add together information (taking care to subtract effects of common 
information – also describable as correlations). So I don’t think that the 
reductionist perspective follows immediately from describing the target 
information in the paper as “mechanical”. “Mechanical”, “mechanism” and similar 
terms can be used (and have been used) to refer to processes that are not 
reducible. “Mechanicism” and “mechanicist” can be used to capture reducible 
dynamics that we get from any conservative system (what I call Hamiltonian 
systems in my papers on the dynamics of emergence – such systems don’t show 
emergent properties except in a trivial sense of being unanticipated). I think 
it is doubtful at best that the mechanical information referred to is 
mechanicist.

John Collier
Professor Emeritus and Senior Research Associate
University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Joseph Brenner
Sent: Thursday, 09 June 2016 11:10 AM
To: fis <fis@listas.unizar.es>
Subject: [Fis] Fw: "Mechanical Information" in DNA

Dear Folks,

In my humble opinion, "Mechanical Information" is a contradiction in terms when 
applied to biological processes as described, among others, by Bob L. and his 
colleagues. When applied to isolated DNA, it gives at best a reductionist 
perspective. In the reference cited by Hector, the word 'mechanical' could be 
dropped or replaced by 'spatial' without affecting the meaning.

Best,

Joseph

- Original Message -
From: Bob Logan <lo...@physics.utoronto.ca>
To: Moisés André Nisenbaum <moises.nisenb...@ifrj.edu.br>
Cc: fis <fis@listas.unizar.es>
Sent: Thursday, June 09, 2016 4:04 AM
Subject: Re: [Fis] "Mechanical Information" in DNA

Thanks to Moises for the mention of my paper with Stuart Kauffman. If anyone is
interested in reading it, it can be found at the following Web site:

https://www.academia.edu/783503/Propagating_organization_an_enquiry

Here is the abstract:

Propagating Organization: An Inquiry.

Stuart Kauffman, Robert K. Logan, Robert Este, Randy Goebel, David Hobill and 
Ilya Shmulevich.

2008. Biology and Philosophy 23: 27-45.
Abstract
Our aim in this article is to attempt to discuss propagating organization of 
process, a poorly articulated union of matter, energy, work, constraints and 
that vexed concept, “information”, which unite in far from equilibrium living 
physical systems. Our hope is to stimulate discussions by philosophers of 
biology and biologists to further clarify the concepts we discuss here. We 
place our discussion in the broad context of a “general biology”, properties 
that might well be found in life anywhere in the cosmos, freed from the 
specific examples of terrestrial life after 3.8 billion years of evolution. By 
placing the discussion in this wider, if still hypothetical, context, we also 
try to place in context some of the extant discussion of information as 
intimately related to DNA, RNA and protein transcription and translation 
processes. While characteristic of current terrestrial life, there are no 
compelling grounds to suppose the same mechanisms would be involved in any life 
form able to evolve by heritable variation and natural selection. In turn, this 
allows us to discuss at least briefly, the focus of much of the philosophy of 
biology on population genetics, which, of course, assumes DNA, RNA, proteins, 
and other features of terrestrial life. Presumably, evolution by natural 
selection – and perhaps self-organization - could occur on many worlds via 
different causal mechanisms.
Here we seek a non-reductionist explanation for the synthesis, accumulation, 
and propagation of information, work, and constraint, which we hope will 
provide some insight into both the biotic and abiotic universe, in terms of 
both molecular self reproduction and the basic work energy cycle where work is 
the constrained release of energy into a few degrees of freedom. The typical 
requirement for work itself is to construct those very constraints on the 
release of energy that then constitute further work. Information creation, we 
argue, arises in two ways: first information as natural selection assembling 
the very constraints on the release of energy that then constitutes work and 
the propagation of organization. Second, information in a more extended sense 
is “semiotic”, that is about the world or internal state of the organism and 
requires appropriate response. The idea is to combine ideas from biology, 
physics, and computer science, to formulate explanatory hypotheses on how 
information can be captured and rendered in the expected physical 
manifestation, which can then participate in the propagation of the 
organization.