[Fis] _ Re: Fis Digest, Vol 27, Issue 15

2016-06-09 Thread Don Favareau
Thank you, Plamen, not only for placing my announcement about the
"intentionality" questionnaire, but for all that you have done in bringing
together and putting into discussion so many different thinkers from so
many disciplinary backgrounds. That, too, is precisely what the survey on
how best to conceptualize the phenomenon/a of "intentionality" for a 21st
century science seeks to do!

So here, by way of explanation, I have cut-and-pasted a slightly more
readable version of the text I sent you, as well as an attachment
explaining some of the rationale behind the project. I should stress, too,
that while the project itself was born within the community of
biosemioticians, it can probably only succeed with input from the
fields of philosophy, information science, molecular and cellular biology,
ethology, complex systems theory, communication studies, and many, many
more. All these research agendas - and certainly everything that has been
discussed here and in the pages of the Special Issue - will have to tackle
the phenomenon of systems so arranged, at least in part, as to be "about"
something outside or other than themselves. So I sincerely hope that as
many members of this mailing list as possible will share their own important
and unique perspectives on: (a) how this could be so, and (b) how to go about
productively conceptualizing and researching it.

Thank you all once again for your stimulating discussions! I'm sure that I
speak for many others, as well as myself, in saying that even those of us
who did not write much to the discussion benefited considerably from reading it!

With all best wishes always!

Don



Hi Plamen!



Thanks for giving me the opportunity to draw upon the collective insight
and expertise of this group!



By way of explanation: One concern that joins the FIS with the Biosemiotics
group is the need to come up with a biological but not anthropomorphic
understanding of the notion of *intentionality* – or, as Terrence Deacon
suggests, replacing this perhaps already overly mentalistic term with
*“ententionality”*, which he defines as:



“a generic adjective to describe all phenomena that are intrinsically
incomplete in the sense of being in relationship to, constituted by, or
*organized to achieve* something non-intrinsic… [such] *ententional*
phenomena include: *functions* that have satisfaction conditions,
*adaptations* that have environmental correlates, *thoughts* that have
contents, *purposes* that have goals, *subjective experiences* that have a
self/other perspective, and *values* that have a ‘self’ that is benefited
or harmed” (Deacon 2012: 27; italics added).



Such an understanding, again, is one that is needed both in Biology and in
Information Science, and so it seems to me that the questionnaire now
circulating in Biosemiotics circles, concerning how best to go about
conceiving and researching this phenomenon for those purposes, would be
very much of interest to those on the FIS list-serve also.



So with your kind permission, I would like to ask you to make the following
two online survey links available to this group for their input and
consideration:



PART 1 of this survey consists of 5 simple short-answer QUESTIONS regarding
the notion of *intentionality* as you think it might be conceptualized for
the purposes of 21st century science, and may be accessed by clicking here:
https://www.surveymonkey.com/r/MKHPT97



PART 2 of the survey asks its respondents to consider how the term
“intentionality” has been conceptualized in a small number of previously
published QUOTES, and to click on the response that best reflects their
opinion of each quote's suitability for use in 21st century science. This
part of the survey can be accessed by clicking here:
https://www.surveymonkey.com/r/T66XDMH



Respondents can pick and choose the questions that they wish to respond
to (the system will not require that they respond to them all), and can
also choose to remain anonymous, if they wish, when the results of this
questionnaire are published later in the year in the journal *Biosemiotics.*



I do hope that the members of the listserve who were involved in the
Special Issue on Integral Biomathics of the *PBMB* will take the
opportunity to join us in this project, as we work to expand our
understanding of this neglected organizing principle in both Biology and
Information Science.



All best wishes and thanks again!



Don Favareau

National University of Singapore


About The Biosemiotic Glossary Project.pdf
Description: Adobe PDF document
___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Fwd: Re: Cancer Cure? (Plamen S.)

2016-06-09 Thread Dr. Plamen L. Simeonov
Dear All,

we are indeed approaching the end of this series of sessions on life
science, phenomenology and mathematics. Your note sent 2 weeks ago with the
reference to your new book did not remain unnoticed, Francesco. Therefore I
will try to respond to it and make some final comments on what we have done
so far and what remains for the future. We can hardly be exhaustive on
all the issues raised in relation to the central problems in science. It
is clear to most of us that some of them, in particular the antagonistic
ones, are due to the increased specialisation of the disciplines, which
makes the establishment of a metalogue (to cite Bateson) difficult. The
last example was that of George Mutter, with the results of the medical
expert consultation on cancer heterogeneity leading to an additional split
of cancers into precancers and cancers. Such domain differentiations happen
all the time. Without clear definitions and focused problems science cannot
advance. And at the same time we are criticising reductionism as dominating
modern science. In a follow-up posting I told George that we are actually
interested in both types of heterogeneity: the (histological) one of
precancers in groups of patients, and the microbiological-genetic one of
cancers in individual patients, on both temporal and spatial scales. But
can we embrace all the different aspects of studying and understanding
cancer within a single methodologically sound theoretical and experimental
framework? Based on the discussions I had with many of you in the past 7
years, I believe that we have such a predisposition.

My summary of Francesco’s note is that we cannot ignore the stimulating
role of other, at first sight remote, disciplines when trying to understand
life. In particular, the metaphors about its “currency” and good/bad
“economy” are very powerful means to address matter, energy and information
transfer and transformation at all their levels of organisation. The
self-organised criticality (SOC) theme, which we continued in this last
session on 3-phi integrative medicine after the one on physics, looks like
an enhanced model of Varela's and Maturana’s autopoiesis. We can improve
and recombine (as Pedro suggested) in the same manner Robert Rosen’s
reaction-diffusion systems, Alan Turing’s biochemical morphogenesis and
oracle machines, von Neumann’s cellular automata and even Penrose-Hameroff’s
Orchestrated OR theory. All of them, and many others, represent some valid
aspect of life.

Our effort here in the past 4 months was to try investigating the role
which philosophical phenomenology could play in enriching these models of
life, and how mathematics and computation can formalise them in an adequate
manner, although we know that not everything in life is formalisable. We
touched upon some exciting questions and puzzles, even on not so well
defined concepts, such as the question of whether the understanding that
quantum properties of matter emerge from geometry can be mistakenly
interpreted as a relation between potentiality and actuality – an issue
raised by Joe Brenner in a personal correspondence. I hope that most of you
remain satisfied with the scope and depth of this online discussion,
intended as a continuation of and feedback to the authors of the selected
field contributions of our

2015 JPBMB Special Issue on Integral Biomathics: Life Sciences, Mathematics
and Phenomenological Philosophy

(note: free access to all articles until July 19th, 2016)

and successor of

2013 JPBMB Special Issue on Integral Biomathics: Can Biology Create a
Profoundly New Mathematics and Computation?


It is time to announce our *third special issue on Integral Biomathics,
planned for 2017* and *dedicated to the scientific and philosophical
exchange between East and West*. I’ll be pleased if some of you decide to
contribute to it with an original article or a sequel to a previous one
from the earlier publications in this series. *Abstracts are due by August
31st, 2016.*
Official announcements with a detailed CFP will be disseminated by the end
of June.

Finally, please allow me to place an announcement by Don Favareau, who
would be pleased to obtain your feedback on one of the topics in this
online discussion: *biosemiotics*.

With my best wishes for a spectacular UEFA soccer championship in France
(starting tomorrow), summer Olympics in Brazil, and of course a
(re-)creative and inspiring research summer.

Yours,

Plamen

___


Re: [Fis] Fw: "Mechanical Information" in DNA

2016-06-09 Thread Stanley N Salthe
Regarding your last posting, I agree, and would formulate the following
subsumption hierarchy:

{thermodynamic energy flows {Shannon information theory {Peircean
semiotics}}}

STAN

On Thu, Jun 9, 2016 at 10:31 AM, Mark Johnson  wrote:


Re: [Fis] Fw: "Mechanical Information" in DNA

2016-06-09 Thread Mark Johnson
Dear all,

Is this a question about counting? I'm thinking that Ashby noted that Shannon 
information is basically counting. What do we do when we count something?

Analogy is fundamental - how things are seen to be the same may be more 
important than how they are seen to be different. 

It seems that this example of DNA is a case where knowledge advances because 
what was once thought to be the same (for example, perceived empirical 
regularities in genetic analysis) is later identified to be different in 
identifiable ways.

Science has tended to assume that by observing regularities, causes can be 
discursively constructed. But maybe another way of looking at it is to say what 
is discursively constructed are the countable analogies between events. 
Determining analogies constrains perception of what is countable, and by 
extension what we can say about nature; new knowledge changes that perception.

Information theory (Shannon) demands that analogies are made explicit - the 
indices have to be agreed. What do we count? Why x? Why not y? Otherwise the 
measurements make no sense. I think this is an insight that Ashby had, and why 
he championed Information Theory as analogous to his Law of Requisite Variety 
(incidentally, Keynes's Treatise on Probability contains a similar idea about 
analogy and knowledge). Is there any reason why the "relations of production" 
in a mechanism shouldn't be counted? Determining the analogies is the key 
thing, isn't it?
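The point that Shannon information is counting under an agreed index can be sketched concretely. A minimal example (the events and the two competing indices below are invented for illustration): the same eight happenings yield different entropies depending on which analogy decides what counts as "the same".

```python
import math
from collections import Counter

def entropy_bits(observations):
    """Shannon entropy H = -sum(p * log2(p)) of the empirical distribution."""
    counts = Counter(observations)
    n = len(observations)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Eight observed events (invented for illustration).
events = ["aa", "ab", "ba", "bb", "aa", "ab", "aa", "bb"]

# Index 1: whole pairs are the categories (four kinds of event).
h_pairs = entropy_bits(events)                   # about 1.91 bits

# Index 2: a coarser analogy -- only the first letter counts (two kinds).
h_first = entropy_bits([e[0] for e in events])   # about 0.95 bits
```

The measurements only make sense once the index is fixed; changing the analogy changes what is countable, and with it the number of bits.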

One further point is that determining analogies in theory is different from 
measuring them in practice. Ashby's concept of cybernetics-as-method was: "the 
cyberneticist observes what might have happened but did not". There is a point 
where idealised analogies cannot map onto experience. Then we learn something 
new.

Best wishes,

Mark


-Original Message-
From: "Loet Leydesdorff" 
Sent: ‎09/‎06/‎2016 12:52
To: "'John Collier'" ; "'Joseph Brenner'" 
; "'fis'" 
Subject: Re: [Fis] Fw:  "Mechanical Information" in DNA


Re: [Fis] Fw: "Mechanical Information" in DNA

2016-06-09 Thread Loet Leydesdorff
Dear colleagues, 

 

It seems to me that a definition of information should be compatible with the 
possibility to measure information in bits of information. Bits of information 
are dimensionless and “yet meaningless.” The meaning can be provided by the 
substantive system that is thus measured. For example, semantics can be 
measured using a semantic map; changes in the map can be measured as changes in 
the distributions, for example, of words. One can, for example, study whether 
change in one semantic domain is larger and/or faster than in another. The 
results (expressed in bits, dits or nits of information) can be provided with 
meaning by the substantive theorizing about the domain(s) under study. One may 
wish to call this “meaningful information”. 

 

I am aware that several authors have defined information as a difference that 
makes a difference (MacKay, 1969; Bateson, 1973). It seems to me that this is 
“meaningful information”. Information is contained in just a series of 
differences or a distribution. Whether the differences make a difference seems 
to me a matter of statistical testing. Are the differences significant or not? 
If they are significant, they teach us about the (substantive!) systems under 
study, and can thus be provided with meaning in terms of studying these 
systems. 

 

Kauffman et al. (2008, at p. 28) define information as “natural selection 
assembling the very constraints on the release of energy that then constitutes 
work and the propagation of organization.” How can one measure this 
information? Can the difference that the differences in it make, be tested for 
their significance? 

 

Varela (1979, p. 266) argued that since the word “information” is derived from 
“in-formare,” the semantics call for the specification of a system of reference 
to be informed. The system of reference provides the information with meaning, 
but the meaning is not in the information which is “yet meaningless”. 
Otherwise, there are as many “informations” as there are systems of reference 
and the use of the word itself becomes a source of confusion.

 

In summary, it seems to me that the achievement of defining information more 
abstractly as measurement in bits (H = - Σ p log(p)) and the availability of 
statistics should not be ignored. From this perspective, information theory can 
be considered as another form of statistics (entropy statistics). A substantive 
definition of information itself is no longer meaningful (and perhaps even 
obscure): the expected information content of a distribution or the information 
contained in the message that an event has happened, can be expressed in bits 
or other measures of information.
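A minimal sketch of this entropy-statistics perspective (the word distributions below are invented for illustration): both the expected information content of a distribution and the change between two distributions can be expressed in bits, which remain dimensionless and "yet meaningless" until the substantive system that was counted provides the interpretation.

```python
import math

def entropy_bits(p):
    """Expected information content H = -sum(p * log2(p)), in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_bits(p, q):
    """Relative entropy D(p||q) in bits; assumes q > 0 wherever p > 0."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

# Relative word frequencies in one semantic domain at two moments
# (distributions invented for illustration).
before = [0.5, 0.3, 0.2]
after = [0.4, 0.4, 0.2]

h = entropy_bits(before)          # about 1.49 bits of expected information
change = kl_bits(after, before)   # about 0.04 bits of change in the domain
```

Comparing `change` across domains would then be a statistical question; whether those bits are "meaningful" depends on the theory of the domain, not on the measure itself.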

 

Best,

Loet

 

_

Loet Leydesdorff

Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)

l...@leydesdorff.net ; http://www.leydesdorff.net/
Associate Faculty, SPRU, University of Sussex;
Guest Professor, Zhejiang Univ., Hangzhou;
Visiting Professor, ISTIC, Beijing;
Visiting Professor, Birkbeck, University of London;

http://scholar.google.com/citations?user=ych9gNYJ=en

 

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of John Collier
Sent: Thursday, June 09, 2016 12:04 PM
To: Joseph Brenner; fis
Subject: Re: [Fis] Fw: "Mechanical Information" in DNA

 

Re: [Fis] Fw: "Mechanical Information" in DNA

2016-06-09 Thread John Collier
I am inclined to agree with Joseph. That is why I put “mechanical information” 
in shudder quotes in my Subject line.

On the other hand, one of the benefits of an information approach is that one 
can add together information (taking care to subtract effects of common 
information – also describable as correlations). So I don’t think that the 
reductionist perspective follows immediately from describing the target 
information in the paper as “mechanical”. “Mechanical”, “mechanism” and similar 
terms can be used (and have been used) to refer to processes that are not 
reducible. “Mechanicism” and “mechanicist” can be used to capture reducible 
dynamics that we get from any conservative system (what I call Hamiltonian 
systems in my papers on the dynamics of emergence – such systems don’t show 
emergent properties except in a trivial sense of being unanticipated). I think 
it is doubtful at best that the mechanical information referred to is 
mechanicist.

John Collier
Professor Emeritus and Senior Research Associate
University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Joseph Brenner
Sent: Thursday, 09 June 2016 11:10 AM
To: fis 
Subject: [Fis] Fw: "Mechanical Information" in DNA

Dear Folks,

In my humble opinion, "Mechanical Information" is a contradiction in terms when 
applied to biological processes as described, among others, by Bob L. and his 
colleagues. When applied to isolated DNA, it gives at best a reductionist 
perspective. In the reference cited by Hector, the word 'mechanical' could be 
dropped, or replaced by 'spatial', without affecting the meaning.

Best,

Joseph

- Original Message -
From: Bob Logan
To: Moisés André Nisenbaum
Cc: fis
Sent: Thursday, June 09, 2016 4:04 AM
Subject: Re: [Fis] "Mechanical Information" in DNA

Thanks to Moises for the mention of my paper with Stuart Kauffman. If anyone is 
interested in reading it one can find it at the following Web site:

https://www.academia.edu/783503/Propagating_organization_an_enquiry

Here is the abstract:

Propagating Organization: An Inquiry.

Stuart Kauffman, Robert K. Logan, Robert Este, Randy Goebel, David Hobill and 
Ilya Smulevich.

2008. Biology and Philosophy 23: 27-45.
Abstract
Our aim in this article is to attempt to discuss propagating organization of 
process, a poorly articulated union of matter, energy, work, constraints and 
that vexed concept, “information”, which unite in far from equilibrium living 
physical systems. Our hope is to stimulate discussions by philosophers of 
biology and biologists to further clarify the concepts we discuss here. We 
place our discussion in the broad context of a “general biology”, properties 
that might well be found in life anywhere in the cosmos, freed from the 
specific examples of terrestrial life after 3.8 billion years of evolution. By 
placing the discussion in this wider, if still hypothetical, context, we also 
try to place in context some of the extant discussion of information as 
intimately related to DNA, RNA and protein transcription and translation 
processes. While characteristic of current terrestrial life, there are no 
compelling grounds to suppose the same mechanisms would be involved in any life 
form able to evolve by heritable variation and natural selection. In turn, this 
allows us to discuss at least briefly, the focus of much of the philosophy of 
biology on population genetics, which, of course, assumes DNA, RNA, proteins, 
and other features of terrestrial life. Presumably, evolution by natural 
selection – and perhaps self-organization - could occur on many worlds via 
different causal mechanisms.
Here we seek a non-reductionist explanation for the synthesis, accumulation, 
and propagation of information, work, and constraint, which we hope will 
provide some insight into both the biotic and abiotic universe, in terms of 
both molecular self reproduction and the basic work energy cycle where work is 
the constrained release of energy into a few degrees of freedom. The typical 
requirement for work itself is to construct those very constraints on the 
release of energy that then constitute further work. Information creation, we 
argue, arises in two ways: first information as natural selection assembling 
the very constraints on the release of energy that then constitutes work and 
the propagation of organization. Second, information in a more extended sense 
is “semiotic”, that is about the world or internal state of the organism and 
requires appropriate response. The idea is to combine ideas from biology, 
physics, and computer science, to formulate explanatory hypotheses on how 
information can be captured and rendered in the expected physical 
manifestation, which can then participate in the propagation of the 
organization of process in the expected biological work