### Re: [Fis] If always n>0 why we need log

```Dear Sung et al.,

I appreciate human bias in terms of numerical scale, but I don’t think that is
what we actually achieve by using logarithms.  If the universe of possibility
is fractal, using a logarithm does not eliminate the problem of large numbers.
I think the primary outcome achieved by using logarithms is that units come to
represent proportions rather than absolute (fixed scale) amounts.  It reveals
an aspect of scale-free form.

On Jun 3, 2018, at 10:42 AM, Sungchul Ji
<s...@pharmacy.rutgers.edu> wrote:

Hi Krassimir,

I think the main reason that we express 'information' as a logarithmic
function of the number of choices available, n, may be because the human brain
finds it easier to remember (and communicate and reason with) 10 than 10^10,
or 100 than 10^100, etc.

All the best.

Sung

From: Krassimir Markov <mar...@foibg.com>
Sent: Sunday, June 3, 2018 12:06 PM
To: Foundation of Information Science
Cc: Sungchul Ji
Subject: If always n>0 why we need log

Dear Sung,

A simple question:

I = -log_2(m/n) = -log_2(m) + log_2(n)   (1)

Friendly greetings

Krassimir

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis

```
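Krassimir's identity (1) can be checked numerically, and it also illustrates Guy's point that logarithmic units come to represent proportions rather than absolute amounts. A minimal sketch in Python (an editorial illustration, not part of the thread; the function name is mine):

```python
import math

# Krassimir's identity (1): I = -log_2(m/n) = -log_2(m) + log_2(n).
# With m favorable outcomes out of n equally likely ones, the
# information depends only on the proportion m/n, not the scale.
def surprisal_bits(m, n):
    return -math.log2(m / n)

print(surprisal_bits(1, 8))        # 3.0 bits
print(surprisal_bits(1000, 8000))  # 3.0 bits: same proportion, same information

# Logarithms turn ratios into differences, so the unit (the bit)
# measures proportional, scale-free change.
assert math.isclose(surprisal_bits(1, 8), math.log2(8) - math.log2(1))
```

Doubling both m and n leaves the result unchanged, which is exactly the scale-free behaviour Guy describes.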

### Re: [Fis] Is information physical? 'Signs rust.'

```Joseph,

Thank you for this concise statement.  It very closely matches my own
perspective.  I would only add the notion that meaningfulness or
meaninglessness is not an inherent property of information.  It is entirely
contingent upon the effect, or the absence of effect, of encountered
information on an agent.

Regards,

Guy

On Apr 26, 2018, at 7:31 AM,
joe.bren...@bluewin.ch wrote:

Information refers to changes in patterns of energy flow, some slow (frozen),
some fast, some quantitative and measurable, some qualitative and
non-measurable, some meaningful and some meaningless, partly causally effective
and partly inert, partly present and partly absent, all at the same time.

Best wishes,

Joseph

Original message
From: u...@umces.edu
Date: 25/04/2018 - 08:14 (PDT)
To: mbur...@math.ucla.edu
Cc: fis@listas.unizar.es
Subject: Re: [Fis] Is information physical?

Dear Mark,

I share your inclination, albeit from a different perspective.

Consider the two statements:

1. Information is impossible without a physical carrier.

2. Information is impossible without the influence of that which does not exist.

There is significant truth in both statements.

I know that Claude Shannon is not a popular personality on FIS, but I
admire how he first approached the subject. He began by quantifying,
not information in the intuitive, positivist sense, but rather the
*lack* of information, or "uncertainty", as he put it. Positivist
information thereby becomes a double negative -- any decrease in
uncertainty.

In short, the quantification of information begins by quantifying
something that does not exist, but nonetheless is related to that
which does. Terry calls this lack the "absential", I call it the
"apophatic" and it is a major player in living systems!

Karl Popper finished his last book with the exhortation that we need
to develop a "calculus of conditional probabilities". Well, that
effort was already underway in information theory. Using conditional
probabilities allows one to parse Shannon's formula for diversity into
two terms -- one being positivist information (average mutual
information) and the other apophasis (conditional entropy).

This duality in nature is evident but often unnoticed in the study of
networks. Most look at networks and immediately see the constraints
between nodes. And so it is. But there is also indeterminacy in almost
all real networks, and this often is disregarded. The proportions
between constraint and indeterminacy can readily be calculated.

What is important in living systems (and I usually think of the more
indeterminate ecosystems, rather than organisms [but the point applies
there as well]) is that some degree of conditional entropy is
absolutely necessary for systems sustainability, as it provides the
flexibility required to construct new responses to novel challenges.

While system constraint usually abets system performance, systems that
become too efficient do so by decreasing their (mutually exclusive)
flexibility and become progressively vulnerable to collapse.

The lesson for evolutionary theory is clear. Survival is not always a
matter of maximal efficiency; it requires a balance between efficiency
and adaptability. Ecosystems do not attain maximum efficiency. To do
so would doom them.

The balance also
puts the lie to a major maxim of economics, which is that nothing
should hinder the efficiency of the market. That's a recipe for "boom
and bust".

Mark, I do disagree with your opinion that information cannot be
measured. The wider application of information theory extends beyond
communication and covers the information inherent in structure, or
what John Collier calls "enformation". Measurement is extremely
important there. Perhaps you are disquieted by the relative nature of
information measurements. Such relativity is inevitable. Information
can only be measured with respect to some (arbitrary) reference
distribution (which is also known in the wider realm of thermodynamics
as "the third law".)

Remember how Bateson pointed to the overwhelmingly ```
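The parsing of Shannon's diversity into average mutual information (the constrained, "positivist" part) plus conditional entropy (the indeterminate, "apophatic" part) described above is easy to verify numerically. A sketch in Python; the joint distribution is made up for illustration, standing in for flows in a small four-node network:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability list."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over 4 source and 4 target
# nodes (illustrative numbers, normalized to sum to 1).
joint = [[0.125,   0.0625,  0.03125, 0.03125],
         [0.0625,  0.125,   0.03125, 0.03125],
         [0.03125, 0.03125, 0.125,   0.0625 ],
         [0.03125, 0.03125, 0.0625,  0.125  ]]

px = [sum(row) for row in joint]        # marginal p(x)
py = [sum(col) for col in zip(*joint)]  # marginal p(y)

# Average mutual information: the part of the diversity that reflects
# constraint between nodes.
ami = sum(p * math.log2(p / (px[i] * py[j]))
          for i, row in enumerate(joint)
          for j, p in enumerate(row) if p > 0)

# Conditional entropy H(Y|X): the residual indeterminacy (apophasis).
h_cond = sum(px[i] * entropy([p / px[i] for p in row])
             for i, row in enumerate(joint))

# Shannon's diversity of outputs decomposes exactly into the two terms.
print(f"H(Y) = {entropy(py):.3f} bits = {ami:.3f} (AMI) + {h_cond:.3f} (H(Y|X))")
```

The proportions between constraint (AMI) and flexibility (conditional entropy) can be read directly off the two terms, which is the calculation Ulanowicz alludes to for real networks.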

### Re: [Fis] "Mental model" ???

```Hi Krassimir,

Thanks for asking this important question.  I’m curious to see how others might
answer it.  For me, the “mental model” is a centralized system for information
processing that receives inputs from multiple sensory mechanisms and can induce
action as a consequence.  In biology, this can be a centralized nervous system
and whole organism behavior as the form of an output.  We illustrate this
process as we read and write FIS posts.

Are you looking for more detail about the “mental model”, or is this the sort
of thing you have in mind?

Cheers,

Guy

Guy Hoelzer, Associate Professor
Department of Biology

Phone:  775-784-4860
Fax:  775-784-1302

On Feb 26, 2018, at 2:48 PM, Krassimir Markov
<mar...@foibg.com> wrote:

Dear colleagues,

I understand that it is not so easy to answer the simple question.

But the mental models are very important for understanding the information and
communication phenomena.

So, again the same simple question: What is the “mental model”?

Friendly greetings
Krassimir

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis

```

### Re: [Fis] The unification of the theories of information based on the category theory

```Hi Christophe,

I completely agree that there is an important distinction between the
communication between attached beams and the semantic communication between
agents.  Like you, I have long been interested in the evolution of biological
signaling systems.  My dissertation research included explorations of the
potential meaning attached to courtship displays by a damselfish species.  In
the decades since I did that work, my view of communication has broadened.
Semantics is context dependent and is manifested internally by the agents
(perception), which makes it very hard to study empirically.  The best I could
do, and I think this may be a general limitation, was to model the hypothetical
semantic content of a signal, naively predict how the perceiver ‘should’
respond to the hypothetically encoded meaning, and judge whether the empirical
data fit my model of the system.  Note that I did my dissertation research at a
time when the leading idea was that  all signals were deceptive devices for
maximizing personal fitness (e.g., Krebs and Davies, 1984).  My observational
and experimental work on this system led me to think more deeply about the
evolution of signaling systems, and I proposed the following:

*   individuals assess all of the information they perceive, some of which
represents signals expressed by other individuals and ‘packaged’ in a signal
*   individuals signal to other individuals in unconventional ways, in
addition to conventional ways (evolved signaling systems)
*   evolved, codified kinds of signals generally started as one of those
unconventional kinds of signals that conferred fitness gains for both the
signaler and the perceiver, on average
*   such useful signals tend to persist and they have an opportunity for
adaptive fine-tuning, morphological integration (e.g., a color patch used for
display), and amplification
*   I think these become the classical animal signaling systems we are so
familiar with

So, for me, codified semantic signals are embedded in, and deeply connected to,
a sea of information about other individuals.  Such signals may be anywhere
along a spectrum from simple information transfer (similar to the beams) to
semantically-based language.  Semantics is a fascinating and important target
of study, but I think limiting our terminology to that domain misleadingly
suggests that it is more disconnected from less formalized modes of
communication than it really is.  I also think it suggests that semantic
communication is more disconnected from the universal physico-chemical laws
than it really is.  I prefer to think of semantic communication as a subset of
all communication, and I see value in understanding the information transfer
between connected beams as sharing some fundamental similarities to semantic
communication.

Regards,

Guy

On Feb 14, 2018, at 3:05 PM, Christophe Menant
<christophe.men...@hotmail.fr> wrote:

Yes Guy,
Unconsciously I take communications as related to meaning generation.
But, as you say, we could use the word for the two beams attached to each other
with bolts and that ‘communicate’ relatively to the strength of the building.
The difference may be in the purpose of the communication, in the constraint
justifying its being.
The ‘communication’ between the two beams is about maintaining them together,
satisfying physical laws (that exist everywhere). It comes from the decision of
the architect who is constrained to get a building that stands up. The
constraint is with the architect, not with the beams that only obey physical
laws.
In the case of living entities the constraints are locally present in the
organisms (‘stay alive’). The constraint is not in the environment of the
organism. And the constraint addresses more than physico-chemical laws.
If there is meaning generation for constraint satisfaction in the case of
organisms, it is difficult to say the same for the two beams.
This introduces the locality of constraints as a key subject in the evolution
of our universe. It is an event that emerged from the a-biotic universe
populated with physico-chemical laws valid everywhere.
Another subject of interest to many of us.
All the best
Christophe

________
From: Guy A Hoelzer <hoel...@unr.edu>
Sent: Tuesday, February 13, 2018 18:18
To: Foundations of Information Science
Cc: Terry Deacon; Christophe Menant
Subject: Re: [Fis] The unification of the theories of information based on the
category theory

Hi All,

I want to pick on Christophe’s post to make a general plea about FIS posting.
This is not a comment on meaning generation by agents.  Christophe  wrote:

"Keeping in mind that communications exist only because agents need to manage
meanings for given purposes”.

Thi```

### Re: [Fis] The unification of the theories of information based on the category theory

```Hi All,

I want to pick on Christophe’s post to make a general plea about FIS posting.
This is not a comment on meaning generation by agents.  Christophe  wrote:

"Keeping in mind that communications exist only because agents need to manage
meanings for given purposes”.

This seems to imply that we have such confidence that this premise is correct
that it is safe to assume it is true.  However, the word “communication” is
sometimes used in ways that do not comport with this premise.  For example, it
can be said that in the building of a structure, two beams that are attached to
each other with bolts are “communicating” with each other.  This certainly fits
my notion of communication, although there are no “agents” or “meanings” here.
Energy (e.g., movement) can be transferred from one beam to the next, which
represents “communication” to me.  I would personally define communication as
the transfer of information, and I prefer to define “information” without any
reference to “meaning”.  If the claim above had been written as a contingency
(e.g., “If we assume that communications exist…”), then I could embrace the
rest of Christophe’s post.

I think the effectiveness of our FIS posts is diminished by presuming everybody
shares our particular perspectives on these concepts.  It leads us to talk past
each other to a degree; so I hope we can remain open to the correctness or
utility of alternative perspectives that have been frequently voiced within FIS
and use contingent language to establish the premises of our FIS posts.

Regards,

Guy

On Feb 13, 2018, at 5:19 AM, Christophe Menant
> wrote:

Dear Terry and FISers,
It looks indeed reasonable to position the term 'language' as ‘simply referring
to the necessity of a shared medium of communication’. Keeping in mind that
communications exist only because agents need to manage meanings for given
purposes.
And the concept of agent can be an entry point for a ‘general theory of
information’ as it does not make distinctions.
The Peircean triadic approach is also an available framework (but with, alas, a
limited development of the Interpreter).
I choose to use agents capable of meaning generation, having some compatibility
with the Peircean approach and with the Biosemiotics
Umwelt (https://philpapers.org/rec/MENCSA-2).

All the best
Christophe
___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis

```

### Re: [Fis] If "data = information", why we need both concepts?

```Jose,

I agree that the semantic and physical notions of ‘information’ are
intertwined, and I think we can be more explicit about how they are related.  I
claim that physical information is general, while semantic information is
merely a subset of physical information.  Semantic information is composed of
kinds of physical contrasts to which symbolic meaning has been attached.
Meaningfulness cannot exist in the absence of physical contrast, but physical
information can exist independently of sensation, perception, cognition, and
contextual theory.

Regards,

Guy

On Oct 3, 2017, at 12:53 PM, Jose Javier Blanco Rivero
<javierwe...@gmail.com> wrote:

Dear all,

What if, in order to understand information and its relationship with data and
meaning, we distinguish the kind of system we are talking about in each case?

We may distinguish systems by their type of operation and the form of their
self-organization. There are living systems, mind systems, social systems and
artificial systems.

What information is depends on the type of system we are talking about. Maybe
distinguishing between information and meaning in living systems and artificial
systems might not make much sense, but it is crucial for social systems. Bits
of information codify possibilities of experience and action (following
somewhat loosely Luhmann's social systems theory), and meaning crystallizes when
a possibility is fulfilled for a particular subsystem (interaction systems,
organizations...). The role of language in social systems is another reason to
distinguish information from meaning.
In artificial systems it might make sense to distinguish between data and
information, data being everything a computer needs to make calculations and
information the results of those calculations that enable it to do more
calculations or to render an output of whatever kind. So what is information at
one stage of the process becomes data at another.

It is obvious that all of these systems operate closely intertwined. They
couple and decouple, retaining their specificity.

Best regards,

On Oct 3, 2017 4:28 PM, "Guy A Hoelzer"
<hoel...@unr.edu> wrote:
Dear Krassimir et al.,

Your post provides an example of the importance that semantics plays in our
discussions.  I have suggested on several occasions that statements about
‘information’ should explicitly distinguish between a purely heuristic
definition, such as those involving ‘meaning’, and definitions focused on a
physical phenomena.  I personally prefer to adopt the latter definition, which
would make your post false.  For example, when I type the symbol ‘Q’ I have
created information because there is a contrast between white and black regions
of its local space.  Meaning is utterly irrelevant to the attribute of
‘information’ from this perspective.  I can create an instance of information
by writing ‘Q’, and you can receive that information by viewing it, even if it
means nothing to either of us.  The symbol ‘Q’ might be attached to some
meaning for one or both of us, but for me that is irrelevant to the question of
information content, which can be measured in a variety of ways in this
example.  If we agree on a symbolic meaning of ‘Q’, then the information
transfer can also carry the transfer of ‘meaning’.

In other words, I would argue that data is indeed information, unless it is
perfectly uniform.  Meaning is attached to data by putting the data in the
context of a theory, but this is an analytical option.  For example, you could
always display the data on graphs without a theoretical context, and such an
analysis might make trends or patterns more evident, even without meaning
attached.  Descriptive or observational data are often presented this way in
young scientific disciplines that have yet to develop a rich theoretical
context in which to interpret the meaning of data.

On the other hand, if you start by explicitly stating that you are using the
semantic notion of information, I would agree wholeheartedly with you.

Best Wishes,

Guy

> On Oct 3, 2017, at 4:16 AM, Krassimir Markov
> <mar...@foibg.com<mailto:mar...@foibg.com>> wrote:
>
> Dear John and FIS Colleagues,
>
> I am Computer Science specialist and I never take data to be information.
>
> For non-specialists it may be normal that "data is often taken to be
> information", but this is not scientific reasoning.
>
> Simple question: if "data = information", why do we need both concepts?
>
>
> Friendly greetings
>
> Krassimir
>
>
> Dear list,
>
>
> As Floridi points out in his Information (Oxford: Oxford University Press,
> 2010), a volume in the Very Short Introduction series, data is often taken
> to be information. If so, then the below distinction is somewhat
> arbitra```
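Guy's 'Q' example, and his caveat that data is information "unless it is perfectly uniform", can be given a toy quantitative form: measure the Shannon entropy of the black/white contrast in a patch of cells. A hedged sketch (the crude glyph and the entropy-per-cell measure are editorial choices, one of the "variety of ways" he mentions, not anything specified in the thread):

```python
import math
from collections import Counter

def field_entropy_bits(field):
    """Shannon entropy (bits per cell) of the distribution of cell values."""
    counts = Counter(field)
    n = len(field)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# A crude 5x5 'Q' as black (1) / white (0) cells -- any contrast will do.
q_glyph = [0, 1, 1, 1, 0,
           1, 0, 0, 0, 1,
           1, 0, 0, 0, 1,
           1, 0, 0, 1, 1,
           0, 1, 1, 1, 1]
uniform = [0] * 25

print(field_entropy_bits(q_glyph))  # > 0: contrast, hence information
print(field_entropy_bits(uniform))  # 0.0: perfectly uniform, no contrast
```

The measure registers information in the physical contrast alone; whether 'Q' means anything to anyone never enters the calculation.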

### Re: [Fis] If "data = information", why we need both concepts?

```Dear Krassimir et al.,

Your post provides an example of the importance that semantics plays in our
discussions.  I have suggested on several occasions that statements about
‘information’ should explicitly distinguish between a purely heuristic
definition, such as those involving ‘meaning’, and definitions focused on
physical phenomena.  I personally prefer to adopt the latter definition, which
would make your post false.  For example, when I type the symbol ‘Q’ I have
created information because there is a contrast between white and black regions
of its local space.  Meaning is utterly irrelevant to the attribute of
‘information’ from this perspective.  I can create an instance of information
by writing ‘Q’, and you can receive that information by viewing it, even if it
means nothing to either of us.  The symbol ‘Q’ might be attached to some
meaning for one or both of us, but for me that is irrelevant to the question of
information content, which can be measured in a variety of ways in this
example.  If we agree on a symbolic meaning of ‘Q’, then the information
transfer can also carry the transfer of ‘meaning’.

In other words, I would argue that data is indeed information, unless it is
perfectly uniform.  Meaning is attached to data by putting the data in the
context of a theory, but this is an analytical option.  For example, you could
always display the data on graphs without a theoretical context, and such an
analysis might make trends or patterns more evident, even without meaning
attached.  Descriptive or observational data are often presented this way in
young scientific disciplines that have yet to develop a rich theoretical
context in which to interpret the meaning of data.

On the other hand, if you start by explicitly stating that you are using the
semantic notion of information, I would agree wholeheartedly with you.

Best Wishes,

Guy

> On Oct 3, 2017, at 4:16 AM, Krassimir Markov  wrote:
>
> Dear John and FIS Colleagues,
>
> I am Computer Science specialist and I never take data to be information.
>
> For non-specialists it may be normal that "data is often taken to be
> information", but this is not scientific reasoning.
>
> Simple question: if "data = information", why do we need both concepts?
>
>
> Friendly greetings
>
> Krassimir
>
>
> Dear list,
>
>
> As Floridi points out in his Information (Oxford: Oxford University Press,
> 2010), a volume in the Very Short Introduction series, data is often taken
> to be information. If so, then the below distinction is somewhat
> arbitrary. It may be useful or not. I think that for some circumstances it
> is useful, but for others it is misleading, especially if we are trying to
> come to grips with what meaning is. I am not sure there is ever data
> without interpretation (it seems to me that it is always assumed to be
> about something). There are, however, various degrees and depths of
> interpretation, and we may have data at a more abstract level that is
> interpreted as meaning something less abstract, such as pointer readings
> of a barometer and air pressure. The pointer readings are signs of air
> pressure. Following C.S. Peirce, all signs have an interpretant. We can
> ignore this (abstraction) and deal with just pointer readings of a
> particular design of gauge, and take this to be the data, but even the
> pointer readings have an important contextual element, being of a
> particular kind of gauge, and that also determines an interpretant. Just
> pointer readings alone are not data, they are merely numbers (which also,
> of course, have an interpretant that is even more abstract).
>
> So I think the data/information distinction needs to be made clear in each
> case, if it is to be used.
>
> Note that I believe that there is information that is independent of mind,
> but the above points still hold once we start into issues of observation.
> My belief is based on an explanatory inference that must be tested (and
> also be useful in this context). I believe that the idea of mind
> independent information has been tested, and is useful, but I am not going
> to go into that further here.
>
>
> Regards,
>
> John
>
> PS, please note that my university email was inadvertently wiped out, so I
> am currently using the above email, also the alias coll...@ncf.ca If
> anyone has wondered why their mail to me has been returned, this is why.
>
>
>
>
> On 2017/09/30 11:20 AM, Krassimir Markov wrote:
>
> Dear Christophe and FIS Colleagues,
>
> I agree with the idea of meaning.
>
> The only thing I would add is the following:
>
> There are two types of reflections:
>
> 1. Reflections without meaning called DATA;
>
> 2. Reflections with meaning called INFORMATION.
>
> Friendly greetings
> Krassimir
>
>
> --
> Krassimir Markov
> Director
> ITHEA Institute of Information Theories and Applications
> Sofia, Bulgaria
> presid...@ithea.org
> ```

### Re: [Fis] INFORMATION: JUST A MATTER OF MATH

```I agree with Arturo.  I understand information exclusively as matter and
energy, and the diversity of their states through space/time.  What else is
there?  The alternative would be to accept ‘information’ as merely a heuristic
concept that helps us to communicate and make sense of our lives without the
goal of identifying real phenomena.  I think the freedom to create and use such
heuristic concepts is essential for many reasons, but we are constantly
challenged as scientists with distinguishing between these terms and those we
think and hope approximate real phenomena.  A grad student I worked with
suggested the term “tool words” to label terms we recognize as mainly
heuristic.  As an evolutionary biologist, I would suggest the term “fitness”
has been a very useful heuristic term, but that “fitness” does not actually
exist.  This statement might surprise or even put off many of my colleagues,
which I think illustrates the problem caused by failing to make this
distinction explicit.  As I have argued before, I think clearly distinguishing
between ‘information’ and ‘meaning’ would be a good first step in this
direction.

Regards,

Guy

Guy Hoelzer, Associate Professor
Department of Biology

Phone:  775-784-4860
Fax:  775-784-1302

On Sep 15, 2017, at 6:16 AM,
tozziart...@libero.it wrote:

Dear FISers,
I'm sorry for bothering you,
but I start not to agree from the very first principles.

The only language able to describe and quantify scientific issues is
mathematics.
Without math, you do not have observables, and information is observable.
Therefore, information IS energy or matter, and can be examined through
entropies (e.g., the Bekenstein-Hawking entropy).

And, please, colleagues, do not start to write that information is subjective
and it depends on the observer's mind. This issue has been already tackled by
the math of physics: science already predicts that information can be
"subjective", in the MATHEMATICAL frameworks of both relativity and quantum
dynamics' Copenhagen interpretation.
Therefore, the subjectivity of information is clearly framed in a TOTALLY
physical context of matter and energy.

Sorry for my polemic ideas, but, if you continue to define information on the
basis of qualitative (and not quantitative) science, information becomes
metaphysics, or sociology, or psychology (i.e., branches with doubtful
possibility of achieving knowledge, due to their current lack of math).

Arturo Tozzi

AA Professor Physics, University North Texas

Pediatrician ASL Na2Nord, Italy

Comput Intell Lab, University Manitoba

Original message
From: "Pedro C. Marijuan" <pcmarijuan.i...@aragon.es>
Date: 15/09/2017 14:13
To: "fis" <fis@listas.unizar.es>
Subject: [Fis] PRINCIPLES OF IS

Dear FIS Colleagues,

As promised herewith the "10 principles of information science". A couple of
previous comments may be in order.
First, what is in general the role of principles in science? I was motivated by
the unfinished work of philosopher Ortega y Gasset, "The idea of principle in
Leibniz and the evolution of deductive theory" (posthumously published in
1958). Our tentative information science seems to be very different from other
sciences, rather multifarious in appearance and concepts, and cavalierly moving
from scale to scale. What could be the specific role of principles herein?
Rather than opening homogeneous realms for conceptual development, these
information principles would appear as a sort of "portals" that connect with
essential topics of other disciplines in the different organization layers, but
at the same time they should try to be consistent with each other and provide a
coherent vision of the information world.
And second, about organizing the present discussion, I bet I was too optimistic
with the commentators scheme. In any case, for having a first glance on the
whole scheme, the opinions of philosophers would be very interesting. In order
to warm up the discussion, may I ask John Collier, Joseph Brenner and Rafael
Capurro to send some initial comments / criticisms? Later on, if the
commentators idea flies, Koichiro Matsuno and Wolfgang Hofkirchner would be
very valuable voices to put a perspectival end to this info principles
discussion (both attended the Madrid bygone FIS 1994 conference)...
But this is FIS list, unpredictable in between the frozen states and the
chaotic states! So, everybody is invited to get ahead at his own, with the only
customar```
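Arturo's mention of the Bekenstein-Hawking entropy is one place where "information as physics" is fully quantitative. A back-of-envelope sketch (rounded SI constants; the function name and the choice of a solar-mass example are editorial, not from the thread):

```python
import math

# Rounded physical constants (SI) for a back-of-envelope estimate.
G     = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
HBAR  = 1.0546e-34  # reduced Planck constant, J s
C     = 2.998e8     # speed of light, m/s
M_SUN = 1.989e30    # solar mass, kg

def bh_entropy_bits(mass_kg):
    """Bekenstein-Hawking entropy S = k_B * A * c^3 / (4 G hbar), in bits."""
    r_s = 2 * G * mass_kg / C**2              # Schwarzschild radius
    area = 4 * math.pi * r_s**2               # horizon area
    s_over_k = area * C**3 / (4 * G * HBAR)   # entropy in units of k_B
    return s_over_k / math.log(2)             # nats -> bits

print(f"{bh_entropy_bits(M_SUN):.2e} bits for one solar mass")
```

Since S scales with horizon area, and area with mass squared, doubling the mass quadruples the entropy, a distinctly physical behaviour for an information measure.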

### Re: [Fis] On mathematical theories and models in biology

```I personally consider metabolism to be at the core of what constitutes ‘life’,
so the notion of autopoiesis is very attractive to me.  It is also possible
that the richness of life as we know it depends on having metabolisms
(activity), genomes (memory), and reproduction combined.  The reductionistic
approach to singling out one of these three pillars of life as its essence may
be futile.  However, I want to point out that the most reduced version of
‘life’ I have seen was proposed by Terry Deacon in the concept he calls
“autocells”.  Terry has made great contributions to FIS dealing with related
topics, and I hope he will chime in here to describe his minimalist form of
life, which is not cellular and has neither a metabolism nor genetically
encoded memory.  Autocells do, however, reproduce.

Regards,

Guy

On Mar 29, 2016, at 1:55 PM, Louis H Kauffman
> wrote:

This is a reply to Plamen’s comment about autopoiesis. In their paper
Maturana, Uribe and Varela give a working model (computer model) for autopoiesis.
It is very simple, consisting of a substrate of nodal elements that tend to bond
when in proximity, and a collection of catalytic nodal elements that promote
bonding in their vicinity. The result of this dynamics is that carapaces of
linked nodal elements form around the catalytic elements and these proto-cells
tend to keep surviving the perturbations built into the system. This model
shows that cells can arise from a very simple dynamic geometric/topological
substrate long before anything as sophisticated as DNA has happened.

On Mar 29, 2016, at 2:54 PM, Stanley N Salthe
> wrote:

Plamen wrote:

I begin to believe that the transition from abiotic to biotic structures,
incl. Maturana-Varela-Uribe’s autopoiesis, may really have some underlying
matrix/”skeleton”/”programme” which has nothing in common with the nature of
DNA, and that DNA and RNA as we know them today may have emerged as secondary
or even tertiary “memory” of something underlying deeper below the
microbiological surface. It is at least worth thinking in this direction. I do
not mean necessarily the role of the number concept and Platonic origin of the
universe, but something probably much more “physical”

S: An interesting recently published effort along these lines is:

Alvaro Moreno and Matteo Mossio: Biological Autonomy: A Philosophical and
Theoretical Enquiry (History, Philosophy and Theory of the Life Sciences 12)
Springer

They seek a materialist understanding of biology as a system, attempting to
refer to the genetic system as little as possible.

I have until very recently attempted to evade/avoid mechanistic thinking in
regard to biology, but, on considering the origin of life generally while
keeping Howard Pattee's thinking in mind, I have been struck by the notion that
the origin of life (that is: WITH the genetic system) was the origin of
mechanism in the universe.  Before that coding system, everything was mass
action.  I think we still do not understand how this mechanism evolved.

STAN

On Tue, Mar 29, 2016 at 7:40 AM, Dr. Plamen L. Simeonov
> wrote:

Dear Lou, Pedro and All,

I am going to present a few opportunistic ideas related to what was said before
in this session. Coming back to Pivar’s speculative mechano-topological model
of life excluding genetics I wish to turn your attention to another author with
a similar idea but on a sound mathematical base, Davide Ambrosi with his resume
at
https://www.uni-muenster.de/imperia/md/content/cim/events/cim-mathmod-workshop-2015_abstracts.pdf:
“Davide Ambrosi:
A role for mechanics in the growth, remodelling and morphogenesis of living
systems.  In the XX century the interactions between mechanics and biology were
much biased by a bioengineering attitude: people were mainly interested in
evaluating the state of stress that bones and tissues undergo in order to
properly design prostheses and devices. However, in the last decades a new
vision is emerging. "Mechano-biology" is changing the point of view, with
respect to "Bio-mechanics", emphasizing the biological feedback. Cells, tissues
and organs do not only deform when loaded: they reorganize, they duplicate,
they actively produce dynamic patterns that apparently have multiple biological
aims.
In this talk I will concentrate on two paradigmatic systems where the interplay
between mechanics and biology is, in my opinion, particularly challenging: the
homeostatic stress as a driver for ```

### Re: [Fis] New Year Lecture: Aftermath

```Hi Terry,

I have used the term ‘perception’ in referring to in-formation that affects
internal structure or dynamics.  This would contrast with forms of potential
information that might pass through the system without being ‘perceived’.  For
example, we have a finite number of mechanisms we call senses, each of which is
sensitive to particular modes of information we encounter in our environment,
but we are not able to perceive every form of information that we encounter
(e.g., UV light).  I think you are using the term ‘interpretation’ to describe
the same thing.  Do you agree?  Do you think the notions of perception and
interpretation are effectively the same thing?

Cheers,

Guy

Guy Hoelzer, Associate Professor
Department of Biology

Phone:  775-784-4860
Fax:  775-784-1302
hoel...@unr.edu

On Apr 24, 2015, at 10:22 AM, Terrence W. DEACON dea...@berkeley.edu wrote:

Hi Pedro,

Indeed, you capture a fundamental point of my work. I entirely agree with your
comment about living processes and their internal informative organization.
The three exceedingly simple molecular model systems (forms of autogenesis)
that I discuss toward the end of the paper were intended to exemplify a minimal
life-like unit that—because of its self-reconstituting and self-repairing
features—could exemplify both an origin-of-life transition and a first, simplest
system exhibiting interpretive competence.
systems respond to disruption of their internal organizational coherence that
they can be said to also interpret aspects of their environment with respect to
this. My goal in this work is to ultimately provide a physico-chemical
foundation for a scientific biosemiotics, which is currently mostly exemplified
by analogies to human-level semiotic categories.

Sincerely, Terry

On Fri, Apr 24, 2015 at 5:34 AM, Pedro C. Marijuan pcmarijuan.i...@aragon.es wrote:
Dear Terry and colleagues,

I hope you don't mind if I send some suggestions publicly. First, thank you for
the aftermath, it provides appropriate closure to a very intense discussion
session. Second, I think you have encapsulated very clearly an essential point
(at least in my opinion):

Among these givens is the question of what is minimally necessary for
a system or process to be interpretive, in the sense of being able to utilize
present
intrinsic physical properties of things to refer to absent or
displaced properties or phenomena. This research question is ignorable
when it is possible to assume human or even animal interpreters as
part of the system one is analyzing. At some point, however, it
becomes relevant to not only be more explicit about what is being
assumed, but also to explain how this interpretive capacity could
possibly originate in a universe where direct contiguity of causal
influence is the rule.

My suggestion concerns the absence phenomenon (it has also appeared in some
previous discussions in this list, notably Bob's). You imply that there is
an entity capable of dynamically building upon an external absence, OK quite
clear, but what about internal absences? I mean that at the origins of
communication there could be the sensing of the internal, let's call them
functional voids, needs, gaps, deficiencies, etc. Cellularly there are some
good arguments for that; even in the 70's there was a metabolic code
hypothesis crafted on the origins of cellular signaling. For instance, one of
the most important environmental and internal detections concerns cAMP, which
means you/me are in energy trouble... some more evolutionary arguments could
be thrown in. Above all, this idea puts the life cycle and its self-production
needs at the center of communication, and at the very origins of the
interpretive capabilities. Until now I have not seen much reflection on
the life cycle as the true provider of both communications and meanings; maybe
it conduces to new avenues of thought interesting to explore...

All the best!
--Pedro

Pedro C. Marijuan wrote:

Dear FIS colleagues,
Herewith the comments received from Terry several weeks ago. As I said
yesterday, the idea is to properly conclude that session, not to restart
the discussion. Of course, scholarly comments are always welcome, but
conclusively and not looking for argumentative rounds. Remember that in
less than ten days we will have a new session on info science and library
science. best --Pedro

--

Retrospective comments on the January 2015 FIS discussion

Terrence Deacon (dea...@berkeley.edu)

During the bulk of my career since the early 1980s I studied brain
organization with a particular focus on its role in the production and
interpretation of communication```

### Re: [Fis] [Fwd: Re: Steps to a theory of reference significance] Terry Deacon

```Hi Terry,

I have a question about your ‘PS’.  I think of MEP as being constrained by
potentials and a limited set of material opportunities (the adjacent
possibilities).  I think of it as a thermodynamic version of natural selection
in which some alternative states are thermodynamically favored over others, but
this does not guarantee that dissipation will proceed to completion or that the
particular alternative that absolutely generates the most efficient or
effective dissipation will always be the manifested outcome (if there are a
number of similarly optimal paths available).  Contingency on idiosyncratic
configurations within and in the neighborhood of a system might lead the system
to follow a variety of alternative paths.  Would you argue that autogenesis is
not an MEP process from this somewhat fuzzy perspective?

Cheers,

Guy

Guy Hoelzer, Associate Professor
Department of Biology

Phone:  775-784-4860
Fax:  775-784-1302
hoel...@unr.edu

On Jan 9, 2015, at 3:35 AM, Pedro C. Marijuan pcmarijuan.i...@aragon.es
wrote:

Message from Terry Deacon

Original Message
Subject:  Re: [Fis] Steps to a theory of reference  significance
Date: Fri, 9 Jan 2015 03:32:22 +0100
From: Terrence W. DEACON dea...@berkeley.edu
To:   Pedro C. Marijuan pcmarijuan.i...@aragon.es

This very brief reply should be routed to the FIS list please...

One response: My choice of autogenesis is motivated by ...
1. It is the simplest dynamical system I have been able to imagine that
exhibits the requisite properties required for an interpretive system (i.e.
one that can assign reference and significance to a signal due to intrinsic
properties alone - that is these features are independent of any extrinsic
perspective). A simple organism is far too complex. As a result it is unclear
in such a system what is and is not informational (for example, one cannot just
assume that DNA molecules are intrinsically informational). As I note when
introducing this model, developing a simplest
but not too simple model system is the key to devising clear physical
principles.
2. Autogenesis is not the same as autopoiesis (which is only a description of
presumed requirements for life); rather, autogenesis is a well-described,
empirically testable molecular dynamic that is easily modelable in all
aspects. Autopoiesis fits with the class of models assuming that simple
autocatalysis is sufficient, and then simply adds (by assertion) the
(non-realized) assumption that autopoiesis can somehow be causally closed and
unitary, whereas in fact autocatalytic systems are intrinsically dissipative*
and subject to error catastrophe. More importantly, the assumption about
coherent finite unity and internal synergy is the critical one, and so it
needs to be the one feature that is explicitly modeled in order to understand
these aspects of information.

3. The self-regulating self-repairing
end-directed dynamic of autogenesis provides a disposition to preserve a
reference target state (even when its current state is far from it). This
serves as the necessary baseline for comparative assessment, without which
reference and significance cannot be defined because these are intrinsically
relativistic informational properties (there is a loose analogy here to the
3rd law of thermodynamics and the relativistic nature of thermodynamic
entropy).

* PS: Autogenesis is also not a Maximum Entropy Production process, because it
halts dissipation before degrading the essential self-preserving constraints on
which its persistence depends.

— Terry

On Thu, Jan 8, 2015 at 1:48 PM, Pedro C. Marijuan pcmarijuan.i...@aragon.es
mailto:pcmarijuan.i...@aragon.es wrote:

Dear Terry and colleagues,

Thanks a lot for the opening text! It is a well crafted Essay full
of very detailed contents. My impression is that the microphysics
of information has been solved elegantly --at least at the level of
today's relevant knowledge-- with your work and the works of related
authors, one of them Karl Friston, whose work could be linked as a
complementary approach to yours (in particular his recent Life as
we know it, Journal of the Royal Society Interface, 10: 20130475). His
Bayesian approach to life's organization, coupled with the (variational)
free energy minimization principle, conduces to the emergence of
homeostasis and a simple form of autopoiesis, as well as the
organization of perception/action later on. Thus, quite close to
the Essay, the very detailed points you deal with in section 4
(steps to a formalization of reference```

### Re: [Fis] Neuroinformation?

```Hi All,

Like many here, I am very interested in the notion of neuroinformation and the
contrast between information as static pattern or temporal process.  I want to
suggest a way to think of the static and process views of information as
identical concepts.  I take the static view to be something like the existence
of a physical gradient or contrast in state between proximate spaces.  The 2nd
law of thermodynamics tells us that all such gradients will tend to bread down
(disorganize) over time.  Therefore, maintenance of static information requires
a process.  This idea could apply nicely to neuroinformation.  For example,
memories can fade if they are not accessed occasionally.  From this point of
view, static contrasts and the processes that maintain them cannot be
separated, much like pattern and process cannot be separated in the dissipative
systems of Prigogine.

Regards,

Guy

Guy Hoelzer, Associate Professor
Department of Biology

Phone:  775-784-4860
Fax:  775-784-1302
hoel...@unr.edu

On Dec 4, 2014, at 6:57 AM, Krassimir Markov mar...@foibg.com wrote:

Dear Bob,
I think, there is no conflict between two points of view – information may be a
process and it may be a static depending of what kind of reflection it is.
For instance, we reflect the world around:
- as static - by photos, art images, sculptures, etc.;
- as dynamic - by movies, theater plays, ballet, etc.;
- and, at the end, by both types – by static text which creates dynamical
imaginations in our consciousness.
Friendly regards
Krassimir

PS: This is my second post for this week. So, I say: Goodbye to the next one!

From: Bob Logan lo...@physics.utoronto.ca
Sent: Thursday, December 04, 2014 3:54 PM
To: Joseph Brenner joe.bren...@bluewin.ch
Cc: fis@listas.unizar.es
Subject: Re: [Fis] Neuroinformation?

Dear all - I support Joseph's remarks and would suggest that information in
general is a process that unfortunately is formulated as a noun. Inspired by
Bucky Fuller's "I think I am a verb", I suggest that information is a verb. It
is a verb because it describes a process. Although that solves one problem, we
need to be able to describe a set of signs that have the potential to initiate
the process of informing through interpretation. I would not suggest we create
another word, but recognize that the word information has many meanings: when
it describes a process it has a verb-like quality to it, and when it describes
a set of signs that have the potential to be interpreted, and hence become
information, it is acting as a noun. I would also suggest that a simple
definition of the term information is not possible because its meaning is so
context dependent. This is true of all words, but even more so for information.
For those who agree with my sentiments the above is information, and for those
who do not it is nonsense. My best wishes to both groups,  Bob Logan
__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto
Chief Scientist - sLab at OCAD
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications

On 2014-12-04, at 6:40 AM, Joseph Brenner wrote:

Dear Dr. Isiegas,

I will add my support to the extended concept of information that inheres in
the work of Robert Ulanowicz and John Collier. I would just add that I like to
call it information-as-process, to call attention to its 'structure' being
dynamic, with individual neurones involved in a cyclic (better spiral or
sinusoidal) movement between states of activation and inhibition. I have
ascribed an extension of logic to this form of alternating actual and potential
states in complex processes at all levels of reality.

Best wishes,

Joseph B.

- Original Message - From: Robert E. Ulanowicz u...@umces.edu
To: Carolina Isiegas cisie...@gmail.com
Cc: fis@listas.unizar.es
Sent: Wednesday, December 03, 2014 6:30 PM
Subject: Re: [Fis] Neuroinformation?

Dear Dr. Isiegas:

I envision neuroinformation as the mutual information of the neuronal
network where synaptic connections are weighted by the frequencies of
discharge between all pairs of neurons. This is directly analogous to a
network of trophic exchanges among an ecosystem, as illustrated in
http://people.biology.ufl.edu/ulan/pubs/SymmOvhd.PDF.

Please note that this measure is different from the conventional
sender-channel-receiver format of communications theory. It resembles more
the structural information inhering in the neuronal network. John
Collier (also a FISer) calls such information "enformation" to draw
attention to its different nature.

With best wishes for success,

Bob Ulanowicz```
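Bob Ulanowicz's proposal above, the mutual information of a network whose links are weighted by discharge frequencies between pairs of neurons, can be sketched numerically. The code below is illustrative and not from the thread: it computes the average mutual information (AMI) of a weighted flow network in the style of his ecosystem work, and the three-node example network and its weights are hypothetical.

```python
import math

def average_mutual_information(flows):
    """Average mutual information (in bits) of a weighted flow network.

    `flows` maps (sender, receiver) pairs to non-negative weights, which
    could be, e.g., discharge frequencies between pairs of neurons.
    AMI = sum_ij (T_ij/T..) * log2(T_ij * T.. / (T_i. * T_.j)).
    """
    total = sum(flows.values())
    out_tot, in_tot = {}, {}
    for (i, j), t in flows.items():
        out_tot[i] = out_tot.get(i, 0.0) + t  # row (sender) marginals
        in_tot[j] = in_tot.get(j, 0.0) + t    # column (receiver) marginals
    ami = 0.0
    for (i, j), t in flows.items():
        if t > 0:
            ami += (t / total) * math.log2(t * total / (out_tot[i] * in_tot[j]))
    return ami

# Hypothetical three-neuron cycle with equal discharge-frequency weights:
cycle = {("a", "b"): 10.0, ("b", "c"): 10.0, ("c", "a"): 10.0}
print(average_mutual_information(cycle))  # log2(3) ≈ 1.585: fully determinate

# A uniform all-to-all network carries no constraint among the nodes:
diffuse = {(i, j): 1.0 for i in "abc" for j in "abc"}
print(average_mutual_information(diffuse))  # 0.0: completely diffuse
```

A perfectly determinate cycle yields the maximal AMI for three nodes, while uniform all-to-all connectivity yields zero, which is the sense in which the measure captures structural rather than sender-channel-receiver information.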

### Re: [Fis] The Travelers

``` dynamics of the hurricane change in response to this information.

Regards,

Guy

Guy Hoelzer, Associate Professor
Department of Biology

Phone:  775-784-4860
Fax:  775-784-1302
hoel...@unr.edu
___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis

```

### Re: [Fis] Informational Bookkeeping

```I have never understood the process of telomere shortening and this is a
tantalizing idea.  What factor might drive the evolution of a bookkeeping
mechanism like telomere shortening in cells?  I ask my question this way to
intentionally avoid the assumption that it must represent an adaptation
benefitting cell fitness.  It might be an adaptation of sorts, or it might be,
for example, a side effect of other cellular processes that are critical for
cellular function.

Cheers,

Guy

On Sep 9, 2014, at 12:09 AM, Pridi Siregar pridi.sire...@ibiocomputing.com
wrote:

Hello,

One process that could be viewed as a form of bookkeeping in cells is
telomeric shortening subsequent to cell division. Telomeric shortening, it
seems, is related to the diminishing life capital of each living cell,
just as our bank accounts diminish with spending.

This could be a good example where information is somewhat separated from
energy. Indeed, a cell's internal energy does not seem to diminish with
each cell division. What seems to be reduced is the potentia of each cell
to live a long life. This potentia must be related to the cell's capacity to
adapt to its environment, which in turn could be related to its
informational content... It's just an idea.

Best

Pridi

- Mail original -
De: Raquel del Moral rdelmoral.i...@aragon.es
À: fis@listas.unizar.es
Envoyé: Lundi 8 Septembre 2014 14:11:48
Objet: Re: [Fis] Informational Bookkeeping

Dear Pedro,

The concept of bookkeeping looks very interesting for biology; however,
I can't see clearly how to apply it below the level of nervous systems.
Counting may be found in most biomolecules, but is registering in a true
bookkeeping manner possible at the lower levels, e.g. in unicellulars?
Do you think that they modify their behaviour after checking their own
bookkeeping registers?

Just this brief comment!

Best,
Raquel

El 05/09/2014 14:14, Pedro Marijuan escribió:
Dear FIS colleagues,

A very interesting comment by Bob about energy as a bookkeeping device
in the other discussion track motivates these rough reflections.

Actually, within the culture of mechanics (following Frank Wilczek)
energy appears as the more reliable concept, beyond its cousins force
and mass. Mechanics, like most scientific theories, finally is but a
method to count upon variable aspects of simplified phenomena and
provide inter-subjective objectivity(?). Numbers are due to our mental
counting operations; and concepts, formulas and theories become
bookkeeping devices to obtain more complex counting that dovetail with
more complex phenomena. That our mental counting dovetails with nature's
pretended counting is what the experimental side of science tries to
establish. It becomes of great merit that energy constructs such as
those mentioned by Bob do their bookkeeping accurately, in spite of
their intrinsic limitations.

My concern with the views expressed in the other track is that
informational bookkeeping appears to be rather different from the
mechanical physical bookkeeping or counting. There are new aspects not
covered by the extensive and inflexible mechanical-dynamic counting,
and which are essential to the new informational organizations we are
discovering --and practicing around-- and to the new worldview that
presumably we should search and promote. Is there bookkeeping in life?
Do molecules count? Do bacteria or unicellulars bookkeep--and organisms?
And complex brains? And individuals? And social groups? And companies
and markets? And cities, regions and countries?

Admittedly it is a potpourri; but yes, there are some clear instances
where quite explicit a bookkeeping is maintained. It may be about
inextricable mixing, involving whatever aspects. But these bookkeepings
are made with attentional flexibility and different closure
procedures that allow new forms of compositional hierarchy to be
recognized; they are productively engaged in life cycles where
meaning is generated; they co-create new existential realms... In our
own societies, the  exaggerated importance of new informational devices
(historically: numbers, alphabets, books, calculi, computers, etc.)
derives from their facilitation and acceleration of all the enormous
bookkeeping activities that subtend the social complexity around.

Who knows, focusing on varieties of bookkeeping might be quite
productive!

best ---Pedro

*Pedro C. Marijuán Fernández*
Dirección de Investigación

Instituto Aragonés de Ciencias de la Salud (IACS)
Instituto de Investigación Sanitaria Aragón (IIS Aragón)
Centro de Investigación Biomédica de Aragón (CIBA)
Avda. San Juan Bosco, 13, planta 1
50009 Zaragoza
Tfno. +34 976 71 ```

### Re: [Fis] Krassimir's Information Quadruple and GIT.

```Dear Krassimir et al.,

I like your view very much with one exception.  I think it confounds
information with meaning, which I think can lead to problems.  For example, I
could give two people the same message written on your identical pieces of
paper.  It is written in English, but only one of the readers understands
English.  My message might be meaningful to one reader, but it cannot be
meaningful to the other.  I would argue that both pieces of paper contain the
same information.  In other words, for me it is important to recognize
information as existing in the absence of its appreciation or interpretation.
Perception and interpretation are generated by an agent, so they are not direct
representations of the information and (perhaps universally?) add some error or
distortion in the process.  I would suggest a revision to what you wrote as
follows:

Energy AND INFORMATION are objective phenomena.  PERCEPTION AND MEANING are
subjective phenomena.

Can anybody see a problem with this form of the statement?

Regards,

Guy Hoelzer

On Aug 25, 2014, at 11:51 AM, Krassimir Markov
mar...@foibg.com wrote:

Dear Colleagues,

Thank you for comments and remarks.
Many thanks to Mark for his interesting post.
Really, the correspondence between energy and information is fundamental and
needs to be clearly explained.

I want to present my point of view because it is different from other ones.

It is clear that energy is needed to create a reflection.
Without energy no internal changes (reflections) in the entities may be
realized.
This means that energy is needed to realize a reflection which may become
information for a given subject.
Without energy, information is impossible.

But the opposite correspondence does not exist.
Energy does not depend on information.
It exists in reality without subjects’ “decisions”.
Energy is an objective phenomenon; information is a subjective phenomenon.

Let us see a simple example.

Suppose we have two equal pieces of paper.
They contain some energy; let us assume that its quantities are equal in both
pieces.
In other words, for instance, if we burn these pieces they will release
practically the same quantities of energy.
If I have such a piece of paper and you have another such one, we may exchange
them as equivalent without any additional conditions.

Now let the pieces of paper be painted with some colors,
and let us assume again that the paint is in equal quantities on both pieces.
Again, we may exchange the pieces as equivalent without any additional conditions.

Finally, let the pieces of paper be painted as follows:
- the first piece is painted as USD 100 (one hundred dollars)
- the second one is painted as RUB 100 (one hundred rubles)
i.e. let us have two real banknotes.

Now, we will not agree to exchange these pieces of paper without additional
conditions.
As it is shown by Bloomberg, on 08/25/2014, 12.59:59,
(http://www.bloomberg.com/quote/USDRUB:CUR)
US DOLLAR-RUSSIAN RUBLE Exchange Rate is:
Price of 1 USD in RUB is 36.1646,
i.e. now the first piece of paper is equivalent to more than 36 pieces of the
second one.
Because of the information they carry for the subjects, the pieces have become
different, notwithstanding that the energy quantities are equal in both.
Subjective decisions play an important role in this case.

In conclusion, energy and information are different phenomena – objective
and subjective, respectively.

Energy may be explained by a triple (see Mark’s nice explanations about triples!):
(source, recipient, transition) = (x, y, f), with y = f(x).
Information has to be explained by a quadruple (source, recipient, evidence,
subject). Here it is important to remember Mark’s “Infological System” as
Subject.
Triples are the object of study of Mathematics; quadruples, of Informatics.

Friendly regards
Krassimir

From: Stanley N Salthe ssal...@binghamton.edu
Sent: Monday, August 25, 2014 4:51 PM
To: fis fis@listas.unizar.es
Subject: Re: [Fis] Fw: Krassimir's Information Quadruple and GIT. Quintuples?

Bob wrote:

Recall that some thermodynamic variables, especially work functions like the
Helmholtz and Gibbs free energies and exergy, are all tightly related to
information measures. In statistical mechanical analogs, for example, the
exergy becomes RT times the mutual information among the molecules

S: So, the more organized, the more potential available energy.

I happen to be a radical who feels that the term energy is a construct
with little ontological depth.

It is a bookkeeping device (a nice one, of course, but bookkeeping nonetheless).
It was devised to maintain the Platonic worldview. Messrs. Meyer and Joule simply
gave us the conversion factors to make it look like energy is constant.

S: It IS constant in the adiabatic boxes used to measure it.

*Real* energy is always in decline -- witness what happens to the work
functions I
just mentioned.

S```
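The statistical-mechanical relation Bob alludes to in the exchange above ("exergy becomes RT times the mutual information among the molecules") is commonly written, in one hedged textbook form, as RT times the relative entropy of the actual molecular distribution with respect to its equilibrium form:

```latex
\mathrm{Ex} \;\approx\; RT \sum_i p_i \ln\!\frac{p_i}{p_i^{\mathrm{eq}}}
```

Here \(p_i\) is the probability of molecular state \(i\) and \(p_i^{\mathrm{eq}}\) its equilibrium value; the sum is an information measure (relative entropy) that vanishes at equilibrium, which is consistent with Stan's remark that more organization means more available energy. The exact prefactor and form depend on the ensemble assumed, so this is a sketch rather than a derivation.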

### Re: [Fis] COLLECTIVE INTELLIGENCE

```I think of ‘collective intelligence’ as synonymous with collective ‘information
processing’.  I would not test for its existence by asking if group-level
action is smart or adaptive, nor do I think it is relevant to ask whether
‘collective intelligence’ informed or misinformed individuals.  I would say
that in the classic example of eusocial insect colonies (like honey bees, for
example) there is no reasonable doubt that information is processed at the
level of the full colony, which can be detected by the coordination of
individual activities into coherent colony-level behavior.  Synchronization and
complementarity of individual actions reflect the top-down influences of
colony-level information processing.

It is the existential question that I think is key here, and I hope our
conversation includes objective ways to detect the existence or absence of
instances where a ‘collective intelligence’ has manifested as a way to keep
this concept more tangible and less metaphorical.

Cheers,

Guy

On Mar 6, 2014, at 9:22 PM, Steven Ericsson-Zenith ste...@iase.us wrote:

Is there such a thing as Collective Intelligence?

I am concerned that the methods of the Harvard paper demonstrate nothing at all
and, however well intended, they appear to be insufficiently rigorous and one
might say unscientific.

If the question were: are there things that a group of individuals may achieve
that an individual may not, build the Pyramids or go to the Moon, for example,
then manifestly this is the case.

However, can we measure the objective efficiency of a group by considering the
problems solved by individuals working together in groups such that we may
identify whether there is an environment independent quantifiable addition or
loss of efficiency in all cases? Perhaps, but one suspects not.

Bottomline: I think you must stop worrying about collective intelligence and
speak to quantifiable efficiencies in all cases.

How does IT effect the existence or non-existence of Collective Intelligence?

The internet does not seem to have especially improved general intelligence -
it has made apparent the ignorance that was there all along. On the other
hand, it appears to have misinformed more individuals than it has benefitted.

Steven

--
Dr. Steven Ericsson-Zenith

http://iase.info

+1-650-308-8611

On Thursday, March 6, 2014, Pedro C. Marijuan pcmarijuan.i...@aragon.es wrote:
Dear John P. and FIS Colleagues,

Thanks for the kickoff text. It opens a discussion on new themes that have only
occasionally and very superficially surfaced in this list.
Intelligence, the information flow in organizations, distributed
knowledge, direct crowd enlistment in scientific activities... It sounds
rather esoteric, but in the historical perspective the phenomenon is far
from new. Along the biggest social transformations, the new information
orders have been generated precisely by new ways to circulate
knowledge/information across social agents--often kept away from the
previous informational order established. In past years, when the
initial Internet impact was felt, there appeared several studies on
those wide historical transformations caused by the arrival of new
social information flows --O'Donnell, Hobart and Schiffman, Lanham, Poe...

But there is a difference, in my opinion, in the topic addressed by John
P., it is the intriguing, more direct involvement of software beyond the
rather passive, underground role it generally plays.  Organizational
processes frozen into the artifact--though not fossilized. Information
Technologies are producing an amazing mix of new practices and new
networkings that generate growing impacts in economic activities, and in
the capability to create new solutions and innovations. So, the three
final questions are quite pertinent. In my view, there exist the
collective intelligence phenomenon, innovation may indeed benefit from
this new info-crowd turn,  and other societal changes  are occurring
(from new forms of social uprising  and revolt, to the detriment of the
natural info flows --conversation--, an increase of individual
isolation, diminished happiness indicators, etc.)

Brave New World? Not yet, but who knows...

best ---Pedro

Prpic wrote:
ON COLLECTIVE INTELLIGENCE: The Future of IT-Mediated Crowds
John Prpić
Simon Fraser University
pr...@sfu.ca

Software (including web pages and mobile applications etc) is the key
building block of the IT field in terms of human interaction, and can be
construed as an artifact that codifies organizational process “…in the form
of software embedded ```

### Re: [Fis] Physics of computing

```Greetings All,

While I like to think that I am not limited to reductionistic thinking, I find
it difficult to understand any perspective on information that is not limited
to physical manifestation. I would appreciate further justification for a
non-physicalist perspective on information.  How can something exist in the
absence of physical manifestation?  I am not interested in a metaphysical
perspective here, which might have heuristic value even if it is not 'real'.
The issue of 'content' and 'meaning' strikes me as entirely physical, so
mentioning those issues doesn't help me understand what non-physical
information might be.  I would say that if information is physically manifested
by contrasts (gradients, negentropy, …), then content or meaning refers to the
internal dynamics of complex systems induced by interaction between the system
and the physically manifested information.  If there is no effect on internal
dynamics, then the system did not 'perceive' the information.  If the
information merely causes a transient fluctuation of the internal dynamics,
then the perceived information was not meaningful to the system.  At least this
is a sketch of my view that I hope illustrates why the notions of 'content' and
'meaning' do not depart the physical realm for me.

Regards,

Guy

From: Pedro Clemente Marijuan Fernandez pcmarijuan.i...@aragon.es
Date: Fri, 16 Mar 2012 04:19:31 -0700
To: Foundations of Information Science fis@listas.unizar.es
Subject: Re: [Fis] Physics of computing

Dear discussants,

I tend to disagree with the motto information is physical if taken too
strictly. Obviously if we look downwards it is OK, but in the upward
direction it is different. Info is not only physical then, and the dimension of
self-construction along the realization of the life cycle has to be entered. Then
the signal, the info, has content and meaning. Otherwise, if we insist only
on the physical downward dimension, we have just conventional computing/info
processing. My opinion is that the notion of absence is crucial for advancing
in the upward direction, but useless in the downward.
By the way, I already wrote about info and the absence theme in a 1994 or 1995
paper in BioSystems...

best

---Pedro

walter.riof...@terra.com.pe wrote:

Thanks John and Kevin to update issues in information, computation, energy and
reality.

I would like to point out other articles more focused on how coherence and
entanglement are used by living systems (far from thermal equilibrium):

Engel G.S., Calhoun T.R., Read E.L., Ahn T.K., Mancal T., Cheng Y.C.,
Blankenship R.E., Fleming G.R. (2007) Evidence for wavelike energy transfer
through quantum coherence in photosynthetic systems. Nature, 446(7137): 782-786.

Collini E., Scholes G. (2009) Coherent intrachain energy migration in a
conjugated polymer at room temperature.  Science, vol. 323, no. 5912, pp. 369-373.

Gauger E.M., Rieper E., Morton J.J.L., Benjamin S.C., Vedral V. (2011)
Sustained Quantum Coherence and Entanglement in the Avian Compass. Phys. Rev.
Lett., 106: 040503.

Cai, J. et al. (2009) Dynamic entanglement in oscillating molecules.
arXiv:0809.4906v1 [quant-ph]

Sincerely,

Walter

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis

--
-
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Avda. Gómez Laguna, 25, Pl. 11ª
50009 Zaragoza, Spain
Telf: 34 976 71 3526 ( 6818) Fax: 34 976 71 5554
-

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis

```

### Re: [Fis] Meaning Information Theory ---From Gavin

```Hi Gavin,

and corresponding logic and mathematics, although I'm not sure I agree that
things are so bleak.  The 'implicit' part I detect (please let me know if I am
wrong) is that you seem to be confounding meaningfulness (semantics) with a
physical basis for information.  I personally embrace the notion that
information can only exist as physically manifested structure and process
(negentropy), but I don't connect meaningfulness to the existence of
information.  I see meaning as something that might or might not become
attached to information when it is perceived.  In fact, I think Shannon was
focused on meaning, rather than the physicality of information.  He was, after
all, working on codes and code-breaking.  Are you indeed tying physical
manifestation of information to meaningfulness?  If so, why?

Regards,

Guy

From: Gavin Ritz garr...@xtra.co.nz
Date: Mon, 24 Oct 2011 15:33:01 -0700
To: 'Christophe Menant' christophe.men...@hotmail.fr,
Foundations of Information Science fis@listas.unizar.es
Subject: [Fis] Meaning Information Theory ---From Gavin

Hi there Christophe
Thank you for your papers I have had a look through them to identify the
propositions and arguments.

And I guess this is my contention: none of this is based on any
evidence, tests, corroboration, or corresponding logic and mathematics.

It does not look like we are any further down the road than Ogden & Richards
(The Meaning of Meaning, 1923) or Plotkin (Darwin Machines, 1993) 70 years later.

I have delved into many of the original papers around information and cannot
find any corroborative evidence or propositions with logical arguments that can
highlight the concept of information per se as presented by the meaning
informationists (those who are not proposing Shannon’s theory).

If we are going into the concept of meaning, that would include human
knowledge, learning and creativity (is learning a creative act?), etc.

Your final conclusions are that you need some notion of constraint, possibly
the co-limit, subobject classifier, object-arrow, associativity, and identity
of Mathematical Category Theory.

An afterthought: the concept of meaning information also includes the concept
of memes presented by Dawkins in Chapter 11 of The Selfish Gene; there is not
one scrap of evidence, tests, or any factual data to conclude that the concept
of memes is anything but conjecture.

Dear Gavin,
As you find some interest for a Theory of Meaningful Information, it may be
pertinent to recall a systemic approach to meaning generation:
When a system submitted to a constraint (stay alive, avoid obstacle, ...)
receives from its environment an information that has a connection with the
constraint, it generates a meaning (a meaningful information) that will be
used to implement an action aimed at satisfying the constraint.
It’s this I don’t understand: where is the evidence and the tests to prove this
proposition? How do we know that this is what a biological system does? Where
is the evidence?

I have searched to find evidence for this statement, “receives information from
its environment”. It just cannot be proved; plainly there’s something wrong
here.

Regards
Gavin

The approach makes available a simple Meaning Generator System applicable to
all cases where you can define the system and the constraint. It is not Shannon
information theory. It links with Dretske and philosophy of mind. It has been
used in several evolutionary approaches.
2003 Entropy paper on subject: http://www.mdpi.org/entropy/papers/e5020193.pdf
2010 short paper: http://crmenant.free.fr/ResUK/MGS.pdf
Part of IACAP 2011 presentation: http://cogprints.org/7584/
Best
Christophe

Date: Mon, 24 Oct 2011 18:22:08 +0200
From: pcmarijuan.i...@aragon.es
To: fis@listas.unizar.es
Subject: [Fis] [Fwd: Re: FW: Meaning Information Theory] ---From Gavin

Message from Gavin Ritz

On Fri, Oct 21, 2011 at 12:02 AM, Gavin Ritz
garr...@xtra.co.nz wrote:

Stan, John list members

I have had a number of off list email dialogue with list members, from this
list and others.

There seems to be a group of listers that have a Theory of Meaningful
Information (It’s not Shannon’s mathematical Information theory), it’s all
about meaning and electrical communication (I guess in this case
neurological).

The common links seem to be Dawkins, Dennett, Searle and a few others.

Does anyone have any clear propositions, with their logical arguments,
evidence, tests, ```

### Re: [Fis] meaningful information

```This is an interesting question.  What is the meaning of meaning?  I would
define it as something like the effects of perception on a perceiving system.
Once a system has been affected it might change its behavior, but I would
hesitate to equate a behavioral response directly to the meaning of an
observation; I think behavior is too far removed from internal
meaningfulness.  I wouldn't be comfortable, for example, saying that
Pavlov's bell meant salivation to his dog subjects.

Regards,

Guy

On 7/20/11 11:47 AM, Steven Ericsson-Zenith ste...@semeiosis.org wrote:

There is a lot of concept overloading in the community involving the term
meaning. So it would help me if you and Antony could just give a one
sentence definition of the term. For example, for me:

meaning = the behavior that is the product of apprehending a sign.

Which is an extreme pragmatic definition in the spirit of Peirce. Note that
this definition excludes, or rather characterizes differently, descriptive
sentences of the form The dog runs toward the house. The meaning of which is
not that the dog runs toward the house, but the behavior of the apprehender.

With respect,
Steven

On Jul 20, 2011, at 10:41 AM, Loet Leydesdorff wrote:

Dear colleagues,

Some of you may be interested in this context in my forthcoming article
“Meaning as a sociological concept: A review of the modeling, mapping, and
simulation of the communication of knowledge and meaning”, Social Science
Information 50(3-4) (2011) 1-23. A preprint is available at
http://arxiv.org/ftp/arxiv/papers/1011/1011.3244.pdf .

I argue that the dynamics of meaning are very different from those of
information.

Best wishes,
Loet

Loet Leydesdorff
Professor, University of Amsterdam
Amsterdam School of Communications Research (ASCoR),
Kloveniersburgwal 48, 1012 CX Amsterdam.
Tel.: +31-20- 525 6598; fax: +31-842239111
l...@leydesdorff.net ; http://www.leydesdorff.net/

From: fis-boun...@listas.unizar.es [mailto:fis-boun...@listas.unizar.es] On
Behalf Of Pedro C. Marijuan
Sent: Wednesday, July 20, 2011 1:38 PM
To: fis@listas.unizar.es
Subject: Re: [Fis] meaningful information

Thanks, Anthony, for the info on your book. As you will see during future
discussion sessions (currently we are in the vacation pause) some parties in
this list maintain positions not far away from your own views. In our archive
you can check accumulated mails about the matter you propose --e.g.
discussions during the last spring. But I think you are right that the whole
biological scope of information has been rarely discussed.  best wishes
---Pedro

FIS website and discussions archives: see http://infoscience-fis.unizar.es/

I emailed an earlier version of the following contribution to the listserv a
few days ago and am interested in finding out if it is suitable for
dissemination and, if so, when it might be included. My main interest is in
promoting discussion about the approach it takes to dealing with the
observer-dependent aspects of information.

My book, Meaningful Information: The Bridge Between Biology, Brain and
Behavior, presents a new way of thinking about information and the important
role it plays in living systems. This opens up new avenues for exploring how
cells and organisms change and adapt, since the ability to detect and respond
to meaningful information is the key that enables them to receive their genetic
heritage, regulate their internal milieu, and respond to changes in their
environment. The types of meaningful information that different species and
different cell types are able to detect are finely matched to the ecosystems
in which they live, for natural selection has shaped what they need to know to
function effectively within them. Biological detection and response systems
range from the chemical configurations that govern genes and cell life to the
relatively simple tropisms that guide single-cell organisms, the rudimentary
nervous systems of invertebrates, and the complex neuronal structures of
mammals and primates. The scope of meaningful information that can be detected
and responded to reaches its peak in our own species, as exemplified by our
special abilities in language, cognition, emotion, and consciousness, all of
which are explored within this new framework.

http://www.springer.com/life+sciences/evolutionary+%26+developmental+biology/
book/978-1-4614-0157-5

I am eager to find out what members think about it.

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis

--
-
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto ```

### Re: [Fis] ON INFORMATION THEORY--Mark Burgin, Colophon

```Hi Mark,

The only part that I take exception to is at the end of your colophon.
Specifically, I disagree with the statement “it is evident that to consider
that everything IS information is unreasonable and contradicts principles of
science.”  I see contrast, or difference, as fundamental to the concept of
information.  All ‘things’ must be bounded such that there is a distinction
between the inside and outside of the thing; therefore I don’t see how it is
possible or reasonable for anything not to be information.

Regards,

Guy

On 6/7/11 6:34 PM, Mark Burgin mbur...@math.ucla.edu wrote:

Discussion colophon

Dear all participants of the discussion (active and passive),

I would like to express my gratitude to Pedro for asking me to start a
discussion about basic problems of information theory and methodology, in which
many qualified researchers have participated. I also appreciate efforts of all
active participants of the discussion, who shared their interesting ideas
related to information theory and practice, and especially to Joseph Brenner,
who expertly distilled the communications of different participants, separating
more or less direct answers to the suggested questions. As these questions have
quintessential importance for information theory and methodology, I would like
to suggest tentative answers to these questions, giving arguments in support of
this approach.

Question 1. Is it necessary/useful/reasonable to make a strict distinction
between information as a phenomenon and information measures as quantitative or
qualitative characteristics of information?

All educated people understand that a person and her/his measure, for example
height, are essentially different entities. It’s impossible to reduce a person
to one measure. The same is true for subatomic particles and other physical,
chemical and biological objects. However, when it comes to information, even
qualified researchers don’t feel a necessity to make a strict distinction
between information as a phenomenon and information measures, although there
are infinitely many information measures. We can often hear and read such
expressions as “Shannon information” or “Fisher information”.

Question 2. Are there types or kinds of information that are not encompassed by
the general theory of information (GTI)?

A grounded answer to this question depends on what we understand when we
say/write “types or kinds of information”, that is, on information definitions.
If we take intrinsic information definitions, then the answer is YES as it is
demonstrated in the book (Burgin, 2010).

At the same time, if we take all information definitions suggested by
different people, then the answer is NO because some of those definitions
define not information but something else, e.g., information measure or
knowledge or data. There are also other “definitions” that actually define
nothing. Naturally, these definitions and related concepts (if there are any)
are not encompassed by the GTI. However, GTI organizes all existing knowledge
on information and information processes in one unified system, allowing one to
discern information from other phenomena.

Question 3. Is it necessary/useful/reasonable to make a distinction between
information and an information carrier?

In the mundane life, it is possible not to make a distinction between
information and an information carrier. For instance, we do not make
distinctions between an envelope with a letter and the letter itself, calling
both things “a letter”, or between a book with a novel and the novel itself,
calling both things “a novel”.

At the same time, a proper theory of information demands that we make a
distinction between information and an information carrier, especially because
any thing
contains information and thus, is an information carrier, but it is evident
that to consider that everything IS information is unreasonable and contradicts
principles of science.

I would appreciate any feedback to the ideas from this e-mail.

Sincerely,

Mark

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis

```

### Re: [Fis] [Fwd: Re: [Fwd: Info Theory]--From John Collier

```Hi Gavin,

I’m not quite sure how to respond as you didn’t ask a particular question.

Waves are indeed about energy, which I think fits nicely into the scheme I
described regarding information.  I suggested a very simple definition of
information as a contrast.  Physical gradients provide a nice example of
contrast between different conditions on either side of a gradient.  Energy
generically fits this view whether you think about it in either particle (e.g.,
photon) or wave form.  I am not a physicist, but I think energy always exists
as some sort of localized concentration with a gradient between regions of
higher energy and regions of lower energy.  In this sense, energy can always be
considered as a spatially configured pattern, and thus as information.

I also agree that flows are about entropy production, and they must always be
channeled in a way that requires a structural configuration.  These flows
dissipate those gradients in the process, which diminishes the contrast and
thus the amount of information exhibited by the gradient.  I would describe the
emergent structure of such systems as information captured by the system, or
transferred to the system, as the gradient is diminished.  I see this as an
alternative way to say that the system captures free energy from the flow and
uses it to construct itself.  I generally see information as the inverse of
entropy, so the existence of information goes hand-in-hand with the existence
of entropy.  Whether information/entropy exist or are just heuristic concepts
is an issue for others to debate.  I do think, though, that it IS related to
baryonic matter.

I hope this helps to make more sense of my previous post.

Regards,

Guy

On 2/4/11 4:22 PM, Gavin Ritz garr...@xtra.co.nz wrote:

Hi there Guy

I'm still at a loss about the information you mention below.

If one talks about waves, light, or sound, these are all energy (frequency)
concepts. Chemistry and physics are really only about energy, entropy, and
transductions and conversions of energy from one form of matter to another.

Any flows of available energy are more than likely entropy production or free
energy. (Gibbs type free energy)

The only codes and notations are the ones we give it; they are of our own
making. If information does have an existence, then it's more than likely
related to non-baryonic matter.

After all we are making assumptions about a universe with only a less than 4%
understanding of its contents.

Regards
Gavin

Greetings All,

I want to second Joseph’s claim that something may be transferred as
information, even if Stan’s “stuff” itself is not transferred.  Waves, for
example, can often pass from one medium into another without a concomitant
transfer of stuff, and the form of the wave may be changed when it enters the
new medium.  The energy of the wave, which can generally be measured by its
physical manifestations (e.g., particle densities, free energy concentrations,
local gradients and potentials...) may be sustained in a temporally and
spatially coherent way as it flows.  I personally like to think about
information as contrast, such as with local gradients, and in this sense we
might say that it is the information itself that flows into a recipient.
Interpretation, then, involves the change in form that can occur in the new
medium.  Of course, information, like waves, is not always able to penetrate
any new medium or system.  It can be damped out in some transitions, and
amplified through resonance in others.

I think this perspective bridges some of the seemingly disparate views that
have been voiced over the last week.

Regards,

Guy

On 1/31/11 9:29 AM, joe.bren...@bluewin.ch joe.bren...@bluewin.ch wrote:

Dear All,

In coming to Krassimir's defense, I do not wish to abrogate the science of the
last 100-150 years, but to suggest only that the appeal to authority, here as
elsewhere, should not block criticism. The standard meaning of information is
also restricted in some senses.

The dimension that Krassimir and his source are pointing to is not just
poetic, but describes real interactions between sender and receiver. However, I
would criticize absolute statements such as nothing is transferred. In my
approach to logic (which I hope John includes in his various logics), it is
not necessary to make an absolute distinction between the concept of
information and its causal and material properties. They are dialectically

On the other hand, I think it is important to emphasize, as Krassimir does,
that there are properties of information that cannot be measured. This point,
and the others above, will not constitute an entire, monolithic Information
Theory, nor its entire essence. But they should be taken into account as part
of the common meanings of various theories which I, /pace/ John :-) find most
interesting.```
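Guy's recurring phrase "information as the inverse of entropy" can be given a toy quantitative reading. The sketch below is an editorial illustration, not anything proposed on the list: it treats negentropy, the gap between a distribution's Shannon entropy and the maximum possible for its size, as one way to score "contrast". A flat distribution (no gradient, no contrast) scores zero; a sharply peaked one scores high.

```python
import math

def shannon_entropy(p):
    """Shannon entropy of a discrete distribution p, in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def negentropy(p):
    """Departure from maximum entropy: log2(n) - H(p).
    A crude stand-in for 'information as contrast'."""
    return math.log2(len(p)) - shannon_entropy(p)

uniform = [0.25, 0.25, 0.25, 0.25]   # no contrast: a flat 'gradient'
peaked = [0.85, 0.05, 0.05, 0.05]    # strong contrast: a steep 'gradient'

print(negentropy(uniform))   # 0.0 bits: maximal entropy, no contrast
print(negentropy(peaked))    # over 1 bit: the peak is the 'distinction'
```

On this reading, dissipating a gradient corresponds to mixing the peaked distribution toward uniformity, driving its negentropy toward zero, which matches Guy's description of a flow that diminishes the contrast and thus the amount of information exhibited by the gradient.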

### Re: [Fis] [Fwd: Re: [Fwd: Info Theory]--From John Collier

```Greetings All,

I want to second Joseph’s claim that something may be transferred as
information, even if Stan’s “stuff” itself is not transferred.  Waves, for
example, can often pass from one medium into another without a concomitant
transfer of stuff, and the form of the wave may be changed when it enters the
new medium.  The energy of the wave, which can generally be measured by its
physical manifestations (e.g., particle densities, free energy concentrations,
local gradients and potentials...) may be sustained in a temporally and
spatially coherent way as it flows.  I personally like to think about
information as contrast, such as with local gradients, and in this sense we
might say that it is the information itself that flows into a recipient.
Interpretation, then, involves the change in form that can occur in the new
medium.  Of course, information, like waves, is not always able to penetrate
any new medium or system.  It can be damped out in some transitions, and
amplified through resonance in others.

I think this perspective bridges some of the seemingly disparate views that
have been voiced over the last week.

Regards,

Guy

On 1/31/11 9:29 AM, joe.bren...@bluewin.ch joe.bren...@bluewin.ch wrote:

Dear All,

In coming to Krassimir's defense, I do not wish to abrogate the science of the
last 100-150 years, but to suggest only that the appeal to authority, here as
elsewhere, should not block criticism. The standard meaning of information is
also restricted in some senses.

The dimension that Krassimir and his source are pointing to is not just
poetic, but describes real interactions between sender and receiver. However, I
would criticize absolute statements such as nothing is transferred. In my
approach to logic (which I hope John includes in his various logics), it is
not necessary to make an absolute distinction between the concept of
information and its causal and material properties. They are dialectically

On the other hand, I think it is important to emphasize, as Krassimir does,
that there are properties of information that cannot be measured. This point,
and the others above, will not constitute an entire, monolithic Information
Theory, nor its entire essence. But they should be taken into account as part
of the common meanings of various theories which I, /pace/ John :-) find most
interesting.

Best,

Joseph

Original message
From: pcmarijuan.i...@aragon.es
Date: 31.01.2011 17:35
To: fis@listas.unizar.es
Subject: [Fis] [Fwd: Re:  [Fwd:  Info Theory]--From John Collier

(Msg. from John Collier)

Unfortunately for your position, Krassimir, there is a well established usage
of information in physics going back to Szilard's discussion of Maxwell's
Demon in 1929, well before the dawn of communication theory. This usage is
firmly entrenched in physics, used by such notables as Gell-Mann, Wheeler and
Hawking. So as far as usage of the word information is concerned, you were
trumped long ago. I suggest that you, when using the word information make
clear that you are using a specific restricted meaning rather than the general
term. In fact I think that everyone on the list should practice similar hygiene.

The word information has a range of meanings that are related much like
Wittgenstein's family resemblances. It is perhaps a paradigmatic case of this.
Anything in common is pretty basic, and not very interesting, to my mind, but
worth working out in any case.

There are connections of information theory to various logics, including the
logic of distinctions and its extensionally equivalent propositional logic,
predicate logic, and various other logics of a more restricted realm. These are
all worth working out.

However I think it is pointless, or nearly so, to try to find the one true
meaning of 'information' (I use the philosopher's convention for single and
double quotes in this post). I wish people would just let it go, and learn to
be more flexible and open to different approaches that they don't find
intuitively or experientially appealing.

John

At 01:22 PM 1/31/2011, Pedro C. Marijuan wrote:

From: Krassimir Markov mar...@foibg.com
Sent: Sunday, January 30, 2011 2:13 AM
To: fis@listas.unizar.es
Subject: Re: [Fis] Info Theory

Dear Colleagues,

In the beginning of the XX-th century (approximately 100 years ago!) the great
Bulgarian poet Pencho Slaveikov wrote:

The speaker doesn't deliver his thought to the listener,
but his sounds and performances provoke the thought of the listener.

Between them performs a process like lighting the candle,
where the flame of the first candle is not transmitted to another flame,
but only causes it.

From my point of view, this is the essence of the Info Theory and, especially,
of the Communication Theory.

Really, nothing is transferred, but everything CAUSES our mind to 'light'.

'Information' is a human concept.
```

### Re: [Fis] fis Digest, Vol 543, Issue 19

```Dear Colleagues,

I have some sympathy for Pedro's call for acceptance of a fuzzy definition
for intelligence, or perhaps a large set of operational definitions. This
is familiar to me as an evolutionary biologist.  We treat the concept of
fitness exactly this way, and I think both concepts hold great heuristic
value even in fuzzy form.  My concern under these circumstances is that we
have a sufficiently clear definition that it sustains a cogent discussion.
If the definition is so fuzzy that disagreements commonly boil down to
presumptive differences, then serious discussion is likely to be
unproductive.  I would personally find it helpful to know what the
limitations are on the meaning of intelligence, and what operational
definitions are being used when individuals intend to address more narrow
definitions.  Is it acceptable for a single entity or action to be
considered intelligent by one observer and unintelligent by another?

Regards,

Guy

On 11/22/10 9:01 AM, Pedro Clemente Marijuan Fernandez
pcmarijuan.i...@aragon.es wrote:

Dear FIS colleagues,

very briefly stated (ugh, no spare time, devoured by ugly application
forms!), I think that quantification as Guy demands can only occur in
some small corners of our discussion areas, but not in the fundamental
ideas, not well crafted yet. For instance, I take from a recent response
of Raquel to Stan the notion of intelligence as the capability to
process information for the purpose of adaptation or problem solving
activities. In the case of cells, problems can be caused by the
environment, extracellular aggressions, communications, etc. Well, we
can quantify (and have already done) the portions of the signaling
system involved, their correlation with genome size, etc., but have not
developed a good conceptual integration of signaling with transcription
---and to my taste nobody has done it yet, as signaling means the
topological governance of an enormous gene network... I mean, a
premature emphasis on quantification may backfire and obfuscate
understanding of the big picture.

I understand Joseph's lamentations, but do not share them, as the logical
clarification of an intrinsically evolutionary phenomenon --without any
major discontinuity-- such as intelligence (at least in my view) becomes
too big or too daring an undertaking. To make better sense of the
evolutionary phenomenon of intelligence, I suggested populational
thinking (see msg. below). Now I add optimality to the mix, meaning
the presence or better the emergence of collective principles of
optimality that guide the distributed processes by the agent populations
participating in the game (roughly, optimization principles running
within cells, nervous systems, social markets). And a third ingredient, a
very subtle one, could be labeled the doctrine of limitation. It refers
to the consequences of the fundamental limitations of all participants, at
whatever level, in having complete info on the occurring collective
game, or complete processing capability. In my view, this is the
most difficult and consequential point --besides, it directly militates
against the God's-eye view we attribute to the scientific observer... we already

best wishes

---Pedro

Guy A Hoelzer escribió:
Pedro et al.,

My previous cautionary post did not get much traction in this thread, but I
still think my point was an important one to ensure that we are all talking
about the same thing.  My point was that “intelligence” is inherently
subjective (in the eye of the beholder), unless we can agree on the criterion
of performance quality.  I think this is necessary if we are to jump from
mere information processing (cascades of effects resulting from the input of
information to a system) to a notion of “intelligence”.  We could, for
example, define human intelligence as measured by performance on an IQ test.
We could more generally define intelligence in an evolutionary context as
measured by the fitness effects of information processing.  I am personally
not a big fan of either of these criteria.  John and Pedro seem to suggest
using the degree of “functionality” resulting from information processing as
a general criterion.  I am intrigued by this option, although I¹m not sure
how functionality can be measured objectively.

I wonder whether this point did not get much traction previously because
others disagree, or just don¹t think it is important.  If my point is both
correct and important, then I think we should agree on a sufficiently general
performance criterion for the evaluation of intelligence early in this
thread.  Is there a perspective on “intelligence” that would contradict this
point?

Regards,

Guy

On 11/19/10 4:11 AM, Pedro Clemente Marijuan Fernandez
pcmarijuan.i...@aragon.es wrote:

Dear John and FIS colleagues,

I much agree (below) with the return to the biological; also Gordana and ```

### Re: [Fis] fis Digest, Vol 543, Issue 19

```Pedro et al.,

My previous cautionary post did not get much traction in this thread, but I
still think my point was an important one to ensure that we are all talking
about the same thing.  My point was that “intelligence” is inherently
subjective (in the eye of the beholder), unless we can agree on the criterion
of performance quality.  I think this is necessary if we are to jump from mere
information processing (cascades of effects resulting from the input of
information to a system) to a notion of “intelligence”.  We could, for example,
define human intelligence as measured by performance on an IQ test.  We could
more generally define intelligence in an evolutionary context as measured by
the fitness effects of information processing.  I am personally not a big fan
of either of these criteria.  John and Pedro seem to suggest using the degree
of “functionality” resulting from information processing as a general
criterion.  I am intrigued by this option, although I’m not sure how
functionality can be measured objectively.

I wonder whether this point did not get much traction previously because others
disagree, or just don’t think it is important.  If my point is both correct and
important, then I think we should agree on a sufficiently general performance
criterion for the evaluation of intelligence early in this thread.  Is there a
perspective on “intelligence” that would contradict this point?

Regards,

Guy

On 11/19/10 4:11 AM, Pedro Clemente Marijuan Fernandez
pcmarijuan.i...@aragon.es wrote:

Dear John and FIS colleagues,

I much agree (below) with the return to the biological; also Gordana and Raquel
had already argued along these guidelines. It does not mean that things become
very much clearer initially in the connection between information and
intelligence, but there is room for advancement. Thus, in Yixin's question,
What is the precise relation between intelligence and information?, one of
the basic aspects to explore becomes populational thinking --not much
considered in AI schools (perhaps very secondarily in the neural networks
school.

In fact, in all realms of intelligence in Nature (cellular, nervous systems,
societies), we find populations of processing agents. In cells, it is the
population of many millions of enzymes and proteins performing catalytic tasks
and molecular recognition activities --any emphasis on molecular recognition
will fall short of the enormous importance this phenomenon has in biological
organization; it is the alpha and omega (Shu-Kun Lin has produced one of the
best approaches to the generality of this phenomenon). How do populations of
enzymes achieve an emergent capability of intelligence? Unfortunately, we can
show). The discussion on neuronal intelligence carries a similar problem, as
the neurodynamic underpinnings of animal behavior and animal intelligence still
lack a central theory (most of the debate on consciousness is but an
uninteresting quagmire)... Finally, a much debated contemporary topic related
to social intelligence deals with the problem-solving capacity of markets. A
very extended conception of social organization hinges on the faith that the
creativity of individuals coupled with the "invisible hand" of markets can
solve all problems, climate change included... Given the magnitude of
civilization survival problems of today, the topic of social intelligence
deserves some second thoughts.

Anyhow, the above were just tidbits. Taken seriously, populational thinking
can produce a new discourse in the relationship between information and
intelligence. I keep saying what I argued during the Beijing conference, we
need a new way of thinking.

best regards

---Pedro

This is a common situation in
biology. In fact I have been told that some
proteins pass through membranes through
successive conformational changes that remove
energy barriers to the transfer, much like the
simple experiment reported in the article. This
has been known for at least 15 years, I think.
Inasmuch as there is functionality here, semiotic
considerations may be relevant in this case. But
not in the case in the article. Intelligence is a
special case of the biological (so far).
Conformational change is even more important and
less dependent on the energetic substrate, and
more on other conformations and their changes (e.g., in inference).

The intelligent systems mainly do the same.

Everything does the same. It is how it is done that is important.

My best,
John

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis

```

### Re: [Fis] Tactilizing processing

```Hi All,

I appreciate this topic and discussion.  I find myself in strong agreement
with the basic point made by Stan and Bob.  Not all fluctuations penetrate
upwardly across levels of functional organization.  Structural resonance in
the organization at some level makes it sensitive to certain kinds of
fluctuations in embedded lower levels of organization.  John seems to raise
an issue that follows from this view.  That is, are there processes
(synchronization, harmonization, rhythmic entrainment...) that lead complex
adaptive systems to evolve mechanisms for sensing particular kinds of lower
level fluctuations?  It certainly seems that such mechanisms have indeed
evolved in biological systems, but is this a generic expectation?

Regards,

Guy

On 11/1/10 2:46 AM, John Collier colli...@ukzn.ac.za wrote:

At 09:13 AM 01/11/2010, Loet Leydesdorff wrote:
Dear colleagues,

It seems to me that we have a more elaborated apparatus for discussing the
distances of a perturbation across a number of interfaces.

Two information processing systems can be considered as structurally
coupled when the one cannot process without the other. A single
(system/environment) interface is then involved. If two interfaces are
involved, the in-between system mediates; the coupling can then be
considered as operational since the mediating system has to operate before
transfer can take place across the interfaces. When more than two interfaces
are involved, the coupling becomes increasingly loose, and another mechanism
thus has to be specified for the explanation.

There is an attempt to deal with this sort of thing at
http://complex.unizar.es/~yamir/papers/phys_rep_08.pdf

It is quite a bit more general. With my administrative load right now
I haven't had time to read the paper, just to glance over it. Their
central interest is not information or information processing,
but it is mentioned in several places.

Synchronization is another term like harmonization and
rhythmic entrainment.

A friend who sent me the reference says that the mathematics
starts off well, but gets shakier through the paper.

Now back to the budget.

John

--
Professor John Collier, Acting HoS  and Acting Deputy HoS
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://www.ukzn.ac.za/undphil/collier/index.html

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


```
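The synchronization and entrainment processes discussed in this exchange are often made concrete with the Kuramoto model of coupled phase oscillators, in which coupling above a critical strength drives an incoherent population into collective rhythm. A minimal sketch follows; the model choice, parameters, and function names are illustrative assumptions on my part, not anything proposed in the thread:

```python
import math
import random

def kuramoto_step(phases, omegas, k, dt):
    """One Euler step of the Kuramoto model:
    dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)."""
    n = len(phases)
    new_phases = []
    for theta, omega in zip(phases, omegas):
        coupling = k / n * sum(math.sin(other - theta) for other in phases)
        new_phases.append(theta + (omega + coupling) * dt)
    return new_phases

def order_parameter(phases):
    """Coherence r in [0, 1]: 0 = incoherent, 1 = fully synchronized."""
    n = len(phases)
    re = sum(math.cos(theta) for theta in phases) / n
    im = sum(math.sin(theta) for theta in phases) / n
    return math.hypot(re, im)

random.seed(1)
n = 50
phases = [random.uniform(0, 2 * math.pi) for _ in range(n)]  # incoherent start
omegas = [random.gauss(1.0, 0.05) for _ in range(n)]         # similar natural rhythms

r0 = order_parameter(phases)
for _ in range(2000):  # integrate to t = 20 with dt = 0.01
    phases = kuramoto_step(phases, omegas, k=2.0, dt=0.01)
r1 = order_parameter(phases)
```

With the coupling k well above the critical value for this frequency spread, the coherence r rises from its small initial value toward 1, which is one concrete sense in which "rhythmic entrainment" can arise from purely local interactions.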

### Re: [Fis] Revisiting... --From Dieter Gernert

```Hi all,

I have been enjoying the current discussion and appreciate Dieter’s focus on
process.  I am an evolutionary biologist, not a physicist, but I would like to
suggest one way in which some of the views expressed in different posts might
be reconciled.

From a simplistic point of view, I think it is fair to posit that spatial
pattern (e.g., the existence of particles) is manifested information, and that
pattern is generated by process (e.g., particle interaction).  Process itself
can also be viewed as information in the form of temporal pattern.  Pattern
and process are inextricably linked in self-organizing dissipative systems,
which represent a special class of “its”.  Other kinds of “its” include
artifacts of dissipative system dynamics, which stumble from one local entropy
peak to another under thermodynamic constraints.  Of course, particulate
artifacts can also be swept up in other thermodynamic cascades, including
those exploited by other dissipative systems.

The Prigogine notion of dissipative systems provides a compelling case, in my
view, for including both pattern and process in generic treatments of
information.

Regards,

Guy
--
Dr. Guy A. Hoelzer
Department of Biology, MS 314
Reno, NV  89557

On 9/29/10 3:38 AM, Pedro Clemente Marijuan Fernandez
pcmarijuan.i...@aragon.es wrote:

(herewith a very interesting text received off-line from a newcomer to our list
--welcome Dieter!---Pedro)

--

1. For many years I have highly esteemed the work of Michael Conrad – whom I
never could see or hear in person. So my study was restricted to reading some
papers and storing them in a separate file. I am very glad for the references
to more recent work.
2. Before making any comment on the transmitted text, I must admit that I do
not have sufficient knowledge on biology to give convincing remarks.
3. Modern physics must necessarily be physics at the Planck scale. I do not
know whether at this moment there is a sufficient, explicit physics at the
Planck scale such that one can build on this basis. Anyway, it must be a theory
of processes, not of particles.
4. Anti-entropy or negentropy are children of the classical Shannon-Weaver
theory, which is incorrectly (only due to a certain historical development)
called information theory. There are specific (narrow, local) situations in
biology where Shannon-Weaver is sufficient. But in the general case – and for a
modern, futuristic theory – it can really be doubted whether Shannon-Weaver
(here it is always meant: together with extensions and ramifications) will be
sufficient. It seems to me that the comprehensive theory is needed, which
(again for historical reasons) is named theory of pragmatic information. This
is not opposed to Shannon-Weaver, but the latter is included as a special case
(one can state conditions under which Sh.-W. will be adequate for a situation).
An overview (including the historical development) can be found in:
Gernert, D., Pragmatic information: historical development and general
overview. Mind and Matter, vol. 4 no. 2 (2006) 141-167.
Here I am really only a reporter and historian – I did not make concrete
5. For any concept setting out to connect the manifest and the unmanifest, a
mathematical structure is required which permits us to describe the manifest
and the nonmanifest and the interaction between both realms, or more precisely:
conditions for an influence to occur in a single situation. It seems to me that
one can do this along the lines sketched in my paper:
Gernert, D., Formal treatment of systems with a hidden organizing structure,
with possible applications to physics. Int. J. of Computing Anticipatory
Systems 16 (2004) 114-124.
It will become inevitable to use a vector space over C (the algebraic
field of complex numbers). The best candidates at this moment are C^3 and C^4
(such that we have 6- or 8-parametric manifolds – not 6 or 8 dimensions!). Equally
important is a measure for the similarity between complex structures. To both
issues I published proposals, and if there will be better ones, I shall quickly
6. Models like particle/anti-particle pair production are a matter of the
underlying physical structure; it will not contribute to explain the
interaction or non-interaction between two complex structures. Any answer to
the question "interaction between these two or not?" must take into account the
entire structure of those two.
7. I do not believe that consciousness has something to do with rather
elementary processes like the unmasking mentioned in the text. From the
viewpoint of a research strategy one can put off this question and first try to
understand the processes.

Kindest regards,

Dieter Gernert
Professor of Computer Science
Technical University of Munich```

### Re: [Fis] The Asymmetry of Information

```Hi Joseph,

This is an interesting topic having to do specifically with the way humans
process and weigh the validity of socially transmitted information.  I would
like to add entry order effects to the positive/negative bias you describe.
I personally view cognition as a process that generates and relies on heuristic
models, which are generally prone to entry order effects.  In this context I
think we tend to attribute validity to ideas or claims that we encounter first.
The validity-bar is raised for subsequently encountered alternatives.  Neither
of these biases seems to reflect natural probabilities, although I can imagine
that they might have had (still have?) selective value for our ancestors.

Regards,

Guy

on 11/12/09 9:26 AM, Joseph Brenner at joe.bren...@bluewin.ch wrote:

Dear Colleagues,

Here is a possible, and timely, FIS discussion topic:

The refusal of large numbers of people, especially health professionals, to be
vaccinated against H1/N1 flu poses the question of the information, if any, on
which they base this decision.

If one goes back to the beginning of the epidemic, one notes several systems of
highly negative (frightening) information: the new flu might be extremely
dangerous; the pharmaceutical industry will be unable to produce a vaccine;
even if it does, it will be dangerous because not tested sufficiently for
profit reasons.

Assuming the sources of this information are the media and certain experts,
one can see the asymmetry in terms of responsibility: if the media are right to
be negative, they look prescient; if they are wrong, and all goes well, as
appears to be the case, no one cares and their role in propagating the
information is quickly overlooked, if not forgotten. If, however, the media
information had been neutral or positive, i.e. there was no reason to be
alarmed, they might be attacked legally if a flu pandemic occurred. Negative
information sells better than positive information.

The really frightening result of the original negative information, however, is
that it seems to override the current positive information, statements by
Ministers of Health, etc: people don't get vaccinated, and the pandemic becomes
more probable!

My questions are to what extent has the theory of value-laden information and
this kind of dynamic been studied and what kind of public discussion of it is
desirable and possible?  I look forward to comments.

Best wishes,

Joseph

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis

```

### Re: [Fis] information(s)

```Hi Michel,

You are correct about the use and concept of information in English.
General use of the term "information" refers to a fuzzy concept that is
continuously distributed from none to much, so the plural form
"informations" feels incorrect.  Of course, in scientific discourses the
term has been operationally sharpened and discretized into bits.

Cheers,

Guy

on 12/6/08 6:35 AM, Michel PETITJEAN at [EMAIL PROTECTED] wrote:

Hello FISers.

Recently, one of my colleagues attracted my attention to the following point.
In French, we often use "information" as a countable quantity,
so that we can write "informations".
In English, it seems that it is unusual, if not incorrect, to do that.
(1) Could some native English-speaking FISers give their opinion about that?
(2) Could FISers from non-English-speaking countries tell us
what the situation is in their own language?

Thank you very much.

Michel.

Michel Petitjean,
DSV/iBiTec-S/SB2SM (CNRS URA 2096), CEA Saclay, bat. 528,
91191 Gif-sur-Yvette Cedex, France.
Phone: +331 6908 4006 / Fax: +331 6908 4007
E-mail: [EMAIL PROTECTED]
http://petitjeanmichel.free.fr/itoweb.petitjean.html

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


```

### Re: [Fis] Economic modeling

```Hi Robin (and other FISers),

I hope this isn't just being picky.

I would argue that both booms and busts are driven by positive feedback.
Buying begets more buying in one instance and selling begets selling in the
other.  Negative feedback tends to stabilize the dynamics of a system.

Regards,

Guy Hoelzer

on 11/14/08 5:00 AM, Robin Faichney   (by way of Pedro Marijuan
[EMAIL PROTECTED]) at [EMAIL PROTECTED] wrote:

Thursday, November 13, 2008, 7:54:55 PM, I wrote:

Not only economists have economic models.

In my opinion the most important complicating factor in economics, as
in other aspects of human culture, is the fact that every agent,
including institutions as well as individuals, models both other
agents and, in many cases, the system as a whole.

For instance, agents observe the economic behaviour of, and the
deals obtained by, many other agents. This informs consumer and
business confidence and is what enables the negative and positive
feedback loops that lead to booms and busts, respectively.

Oops! It is, of course, positive feedback that causes bubbles (in
particular sectors) and booms (across an economy), and negative
feedback that causes busts.

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis

```
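Guy's distinction can be made concrete with a toy price series; the model, its gain parameter, and all names here are my own illustrative assumptions, not anything from the exchange. A gain above 1 compounds each price change (buying begets buying: positive feedback), while a negative gain pushes back against the last change (negative feedback), settling the price near equilibrium:

```python
def simulate(feedback_gain, steps=50, p0=100.0, shock=1.0):
    """Toy price dynamics: each step's change is the previous change
    scaled by feedback_gain. Gain > 1 amplifies the initial shock
    (runaway boom/bust); negative gain damps it (stabilization)."""
    prices = [p0, p0 + shock]  # start with a small perturbation
    for _ in range(steps):
        change = prices[-1] - prices[-2]
        prices.append(prices[-1] + feedback_gain * change)
    return prices

boom = simulate(feedback_gain=1.05)    # positive feedback: runaway growth
stable = simulate(feedback_gain=-0.5)  # negative feedback: damped oscillation
```

The same mechanism run in reverse (selling begets selling) produces the bust, which is Guy's point that both booms and busts are positive-feedback phenomena, while negative feedback is what stabilizes.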

### Re: [Fis] Re: info meaning

```Dear Giuseppe et al.,

I find the issues of meaning and interpretation very interesting, but I
think this FIS discussion needs to find some common ground if we are to get
anywhere.  For example, Giuseppe wrote:

There is no purely physical status of information, since a physical
structure yields no information, per se.

I couldn't disagree more, although I'm not sure that we disagree at all in
substance.  I take structure, organization, contrast, and gradients to be
the physical essence of all information by definition.  This is why I see a
fundamental connection between information and entropy.  The problem is that
I find myself unwilling to suspend my personal lexicon in order to better
appreciate the substance of posts like the one below,  and others seem to be
equally inflexible with semantics in this context.   I wonder if we can
agree upon a set of terms for our discussion (and beyond?) that will help to
clarify the scope and limitations of the ideas we are discussing.

Here is my attempt to apply Stan's specification hierarchy to the levels
targeted for the term 'information' in our discussion:

(physical structure (observer perception and interpretation (signals and
communication)))

As I see it, there is nothing for an observer to perceive in the absence of
physical structure, and signals cannot transmit 'meaning' if observers are
unable to perceive and interpret them.  My personal preference is to ally
'information' with structure at the base of it all, but we should find a set
of terms to keep these levels distinct in our conversation that is agreeable
to all of us.  We may be working too hard in arguing about which of these
three levels is the basis of 'information'.

Regards,

Guy

on 10/15/07 10:04 AM, Giuseppe Longo at [EMAIL PROTECTED] wrote:

On Sunday 14 October 2007, mjs wrote:
If information is not physical, and therefore governed by physical
principles, then what is its ontological status?

Why should any scientific notion have a physical ontological status?
The issue is never ontological, but just theoretical: which theory, with its
own theoretical principles, can handle this or that notion? That is the
question.
And, within theories of the inert, within which physical (theoretical) principles?
Classical, relativistic, quantum?

Information is in signs and languages, it needs an interpreter, or a compiler
as in operational semantics (in computers).
In some contexts, information may be formalised by the same equations as
(neg-)entropy. But the coincidence of equations does not imply the formation of
the same invariants, the underlying objects: the wave equation applies to
water waves as well as to Quantum Mechanics (Schroedinger, modulo the passage
to the complex field and Hilbert spaces). In no way does a quantum state yield
the same invariants or intended physical object as a water wave: formalisms
may have very different (structural, physical) meanings.
The connection between information and (physical) entropy is not ontological;
indeed, not even theoretical, just formal: a theory requires both a formalism
and the formation of invariants (like with Noether's theorems in Physics:
invariance as symmetries defines the physical objects, by their properties;
no common invariants between Shannon and Boltzmann)

There is no purely physical status of information, since a physical
structure yields no information, per se. Signs must be implemented in
physical ink or digits, of course, but this needs a writer and, then, an
interpreter. This shows up clearly in the issue of finiteness.
In a finite space-time volume, typically, we can only put a finite number of
signs, thus of information.
But is there, per se, a finite amount of information in a finite space-time
volume?  What then about Riemann sphere, as a model of Relativity, which is
finite, but illimited? how much information does it contain? Infinite? The
question simply does not make sense, in absence of a specification of a
writer and an interpreter (or compiler).
And in a finite space-time volume in Quantum Physics? One needs a wave
equation in a possibly infinite-dimensional Hilbert space to talk of one
quanton within it; is this finite or infinite information?
A finite number of quanta may, of course, be represented by finitely many
independent state vectors, n say, but quantum superposition allows one to obtain
any result, as measure, in R^d, an infinite space.

What is this mystic, absolute reference to physical principles?  We just
have (current) theories.
Classical, relativistic, or quantum mechanical principles happen to be
incompatible, as entanglement or, more specifically, the quantum field has no
classical nor relativistic sense, as physical principles.
Which is the physical connection between the (wild) DNA and the (normal)
form of the nose? According to which physical theory can we relate them?
The differential method, as used in molecular biology, radically differs from
its use in ```

### Re: [Fis] info meaning

```Bob,

If the notions of Entropy and Shannon Information are alternative approaches
to characterize the same phenomenon in Nature, then the ways they have been
modeled would not necessarily reveal an underlying and fundamental
commonality.  I think that many of us suspect that this is the case and we
are striving to understand and articulate (model) the fundamental
commonality.  We may be wrong, of course, but your argument doesn't dissuade
me.  In fact, I must admit sheepishly that I'm not sure how one would go
about analyzing the relationship between these ideas in a way that could
dissuade me.  If such an analysis is not possible, then I suspect that the
question of a fundamental commonality will have to die slowly as progress
fails to occur.

Regards,

Guy

on 10/12/07 12:01 PM, bob logan at [EMAIL PROTECTED] wrote:

Loet et al - I guess I am not convinced that information and entropy
are connected. Entropy in physics has the dimension of energy divided
by temperature. Shannon entropy has no physical dimension - it is
missing the Boltzmann constant. Therefore how can entropy and Shannon
entropy be compared, let alone connected?

I am talking about information not entropy - an organized collection
of organic chemicals must have more meaningful info than an
unorganized collection of the same chemicals.

On 11-Oct-07, at 5:34 PM, Loet Leydesdorff wrote:

Loet - if your claim is true then how do you explain that a random soup of
organic chemicals has more Shannon info than an equal number of organic
chemicals organized as a living cell, where knowledge of some chemicals
automatically implies the presence of others and hence have less surprise
than those of the soup of random organic chemicals? -  Bob

Dear Bob and colleagues,

In the case of the random soup of organic chemicals, the maximum
entropy of the systems is set by the number of chemicals involved (N).
The maximum entropy is therefore log(N). (Because of the randomness of
the soup, the Shannon entropy will not be much lower.)

If a grouping variable with M categories is added the maximum entropy
is log(N * M). Ceteris paribus, the redundancy in the system increases
and the Shannon entropy can be expected to decrease.

In class, I sometimes use the example of comparing Calcutta with New
York in terms of sustainability. Both have a similar number of
inhabitants, but the organization of New York is more complex to the
extent that the value of the grouping variables (the systems of
communication) becomes more important than the grouped variable (N).
When M is extended to M+1, N possibilities are added.

I hope that this is convincing or provoking your next reaction.

Best wishes,

Loet

___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


```
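Loet's maximum-entropy arithmetic and Bob's dimensional objection can both be checked numerically. In the sketch below (my own illustration: treating chemical types as symbols of a probability distribution is an assumed simplification, and the "cell" distribution is invented for the example), a uniform "soup" over N types attains the maximum Shannon entropy log2(N), an organized distribution scores lower, the grouped maximum is log2(N*M) as Loet states, and multiplying by k_B ln 2 is what would restore the physical units Bob notes are missing:

```python
import math

def shannon_entropy(probs):
    """H = -sum p_i * log2(p_i), in bits (dimensionless)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 8  # number of chemical types (hypothetical)

# "Random soup": all types equally likely -> maximum entropy log2(N) = 3 bits.
soup = [1.0 / n] * n
h_soup = shannon_entropy(soup)

# "Organized cell": correlations concentrate probability mass, so knowing
# some chemicals leaves little surprise about the rest -> lower entropy.
cell = [0.65, 0.20, 0.10, 0.05, 0.0, 0.0, 0.0, 0.0]
h_cell = shannon_entropy(cell)

# Loet's grouping variable with M categories raises the maximum to log2(N*M).
m = 4
h_max_grouped = math.log2(n * m)

# Bob's point: thermodynamic entropy S = k_B * ln(W) carries units (J/K);
# Shannon's H is dimensionless unless multiplied by k_B * ln(2).
K_B = 1.380649e-23  # Boltzmann constant, J/K
s_soup_physical = K_B * math.log(2) * h_soup
```

The numbers bear out both sides: organization lowers Shannon entropy relative to the uniform soup, while the two quantities only share units after an explicit conversion by the Boltzmann constant.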

### Re: [Fis] Re: info meaning

```Greetings All,

In my view meaning¹ exists (or not) exclusively within systems.  It exists
to the extent that inputs (incoming information) resonate within the
structure of the system.  The resonance can either reinforce the existing
architecture (confirmation), destabilize it (e.g., cognitive
disequilibrium), or construct new features of the architecture (e.g.,
learning).  Social communication often involves the goal of re-constructing
architectural elements present in the mind of one agent by another agent.  I
am using highly metaphorical language here, but a very straightforward
example of this at the molecular level is the transfer of structural
information between prions and similar proteins folded in 'ordinary' ways.
In this sense, meaning itself cannot be transferred between agents; although
a new instance of meaning can be constructed.  This is essentially the idea
behind the Dawkins model of populations of memes (concept analogs of genes).

From this point of view, the 'exactness' of a meaning doesn't seem to make
sense.  A meaning defines itself without error.  It would make sense,
however, to talk about the degree of similarity between meanings when the
social goal was to replicate a particular instance of meaning.  Perhaps this
is what Jerry meant and I have over-analyzed the idea here, but if this is a
novel or erroneous perspective I would like to see some discussion of it.  I
guess my main point here is to separate the notion of meaningfulness from
the social context that demands the sharing of meanings and constrains the
construction of meanings to resonate at the level of the social network.

Regards,

Guy Hoelzer

on 10/2/07 3:24 AM, Pedro Marijuan at [EMAIL PROTECTED] wrote:

Dear colleagues,

Answering to a couple of Jerry's questions,

Under what circumstances can the speaker's meaning or the writer's meaning be
_exact_?

Is _meaning_ a momentary impulse with potential for settling into a local
minimum in the biochemical dynamic?

A previous point could be---what entities are capable of elaborating that
obscure item we call meaning? Just anything (eg, some parties have stated
that molecules or atoms may communicate), or only the living beings?

My understanding of what Bob has proposed along the POE guidelines is that
only the living cell would be capable --and of course, all the further more
complex organisms.  This point is of some relevance.

After decoding and interpretation of the organic codes, the meaning of my
message about meaning and information may have meaning to you.

Maybe. But I suffer some information overload (perhaps overload is just the
incapability to elaborate meaning under the present channels or means of
communication).

best

Pedro
=
Pedro C. Marijuán
Cátedra SAMCA
Institute of Engineering Research of Aragon (I3A)
Maria de Luna, 3. CPS, Univ. of Zaragoza
50018 Zaragoza, Spain
TEL. (34) 976 762761 and 762707, FAX (34) 976 762043
email: [EMAIL PROTECTED]
=

___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


```

### Re: [Fis] Continuing Discussion of Social and Cultural Complexity

```Greetings,

I agree with Loet and Pedro that it seems important to distinguish between
environmental constraints (including material constraints emanating from the
qualities of components of a system) and self-imposed limitations associated
with the particular path taken as a dynamical system unfolds through time.
In other words, I see some information being generated by the dynamics of a
system, much of which can emerge from the interaction between a system and
the constraints of its environment.  I have come to this view largely by
considering the process of biological development.  For example, I have come
to the conclusion that the genome is far from a blueprint of a phenotype,
although it is more than a static list of building parts.  I see the genome
as containing a small fraction of the information ultimately represented by
an adult organism, and I think that most of that information is generated
internally to the system as a consequence of the interaction between the
genome and its environment.

Regards,

Guy

on 2/27/07 6:24 AM, Pedro Marijuan at [EMAIL PROTECTED] wrote:

Dear colleagues,

As for the first track (planning vs. markets) I would try to plainly put
the informational problem in terms of distinction on the adjacent (Guy
has also argued in a similar vein). Social structures either in markets or
in central plans become facultative instances of networking within the
whole social set. Then the market grants the fulfillment of any
weak-functional bonding potentiality, in terms of say energy, speed,
materials or organization of process; while the planning instances restrict
those multiple possibilities of self-organization to just a few rigid
instances of hierarchical networking. This is very rough, but if we relate
the nodes (individuals living their lives) with the adjacency-networking
structure, there appears some overall congruence in info terms... maybe.

On the second track, about hierarchies and boundary conditions, shouldn't
we distinguish more clearly between the latter (bound. cond.) and
constraints? If I am not wrong, boundary conditions talk with our
system and mutually establish which laws have to be called into action,
which equations.. But somehow constraints reside within the laws, polishing
their parameter space and fine-tuning which version will talk, dressing
it more or less. These aspects contribute to making the general analysis of
the dynamics of open systems a pain in the neck--don't they? I will really

best regards

Pedro

___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


```

### RE: [Fis] Continuing Discussion of Social and Cultural Complexity

```Stan,

Aren't all constraints a form of information?  I see constraints as informing
the bounds of the adjacent possible and adjacent probable.  If this is correct,
then it would seem to render the economy as almost pure information.  In
fact, I think it would render all emergent systems as pure information.
Wouldn't it?

Regards,

Guy

-Original Message-
From: [EMAIL PROTECTED] on behalf of Stanley N. Salthe
Sent: Sat 2/24/2007 2:51 PM
To: fis@listas.unizar.es
Subject: Re: [Fis] Continuing Discussion of Social and Cultural   Complexity

Pedro said:

Dear Igor and Stan,

-snip-

The realm of economy is almost pure information. Rather than planning,
markets are very clever ways to handle informational complexity. They
partake of a number of formal properties (eg, power laws) indicating that they
work as info conveyors on global, regional, sectorial, and local scales.
Paradoxically, rational planning can take a man to the moon, or win a
war, but cannot bring bread and butter to the breakfast table every day.
Planning only, lacks the openness, flexibility, resilience, etc. of
markets. A combination of both, with relative market superiority, looks
better...
It is hard for me to visualize the economy as being almost pure
information!  This is to forget about so-called 'externalities' -- sources
and sinks, storms, wars, climate change -- even holidays!  The larger scale
material environment constrains the economy, while that (perhaps mostly as
information) constrains human action.

STAN

with regards,

Pedro

___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


```

### Re: [Fis] Joseph Tainter's Social and Cultural Complexity

```Dear Pedro and colleagues,

I want to respond only to the first paragraph of your recent post.

on 12/15/06 3:11 AM, Pedro Marijuan at [EMAIL PROTECTED] wrote:

Dear FIS colleagues,

I disagree with the comments by Steven and Stan on the nature of
complexity. How can one substantiate and quantify social complexity if the
previous complexity within the society's individuals has not been solved?
At the time being, there is no accepted rigorous evaluation of biological
complexity --neither number of genes, RNA transcripts, proteins, nor genome
size, chromosome number etc., provide individually any solid estimation;
together more or less. Perhaps, the only accepted single number as a proxy
of organismic complexity is the number of differentiated cell types
---becoming similar to Joe's approach in societies (social roles, or
professions, plus other issues related to number of artifacts, etc.).
[snip]

In my view, measures of complexity at one level of organization ought not
to depend on the details or complexity of the lower levels upon which that
level is built.  This is to me the essence of system emergence, which is the
functional unification of lower level parts.   These parts may or may not be
highly complex themselves.   In the social sphere of biology, the parts are
organisms, or groups of organisms, but I see the complexity of  a social
system as utterly independent of the complexity of organisms.  The stock
exchange, or the economy in general, is extraordinarily complex.   We would
indeed need an objective measure to compare the complexities of organisms to
those of economies, but economies need not be more complex than organisms.
The system manifested by food coops, for example, has very low complexity
compared to that of the people who compose the coop.

I am not saying that I expect there to be no correlation between a system's
complexity and the complexities of its component parts.  Indeed, I think
this is a reasonable expectation, because more complex parts are likely to
have a much more diverse and unpredictable range of behaviors than less
complex, or non-complex, parts.   However, this need not always hold true,
and it is not the only factor determining system complexity.

To sum up, I like the catch phrase "complexity breeds simplicity," because
it emphasizes the notion that functional unification through system
emergence releases us from the need to drill down to the bottom in order to
FULLY understand higher-order systems.  In other words, it frees us from the
tedious demands of the reductionist paradigm.

Regards,

Guy Hoelzer

___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis

```
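As an editorial aside on the complexity measures discussed above: Pedro's proxy (number of differentiated cell types) and Guy's food-coop example (few social roles) can be sketched in a few lines. This is a toy illustration, not anything proposed in the thread; the part inventories and the Shannon-entropy refinement are my own assumptions.

```python
from collections import Counter
from math import log2

def count_types(parts):
    """The simple proxy: complexity as the number of distinct part types."""
    return len(set(parts))

def shannon_complexity(parts):
    """A hypothetical refinement: entropy (in bits) of part-type frequencies."""
    counts = Counter(parts)
    n = len(parts)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical part inventories: cell types in a tiny organism vs.
# social roles in a food coop (Guy's low-complexity example).
organism = ["neuron", "muscle", "epithelial", "neuron", "muscle", "blood"]
coop = ["member"] * 10 + ["treasurer"]

# The coop system has fewer role types than the organism has cell types,
# illustrating that a higher-level system can be simpler than its parts.
assert count_types(organism) > count_types(coop)
assert shannon_complexity(organism) > shannon_complexity(coop)
```

Either measure depends only on the inventory of parts at the focal level, not on the internal complexity of the parts themselves, which is the independence Guy argues for.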

### Re: [Fis] genetics: the most outstanding problem, SOLVED

```Dear Arne,

I count myself as a realist and, to paraphrase your statement, I do not see
any reason in what you wrote to convert me.  Your points about the
limitations and biases of the mind regarding our understanding of reality
are good ones, and I think it is important for realists to keep these
cautions in mind.
However, your position seems to be that the one thing we can know exists is
the mind itself.  I see this as a useful proposition in the scientific study
of brain function and mental phenomena, but I don't see how it can permit
science to study anything else.  If we don't start with the premise that
there is a reality to study, and which can be understood in some way, it
would seem that science (other than a science of mind) would have no
validity.  In my view, science has demonstrated plenty of validity (e.g.,
modern technology), and I don't see how this could have been achieved if a
coherent reality did not exist or if science was not sufficiently effective
at revealing aspects of reality.  Therefore, I think that the realist's
perspective has proven itself.  Nevertheless, it will be interesting to
learn about the ways our minds can mislead us within the context of science.
It would be even more fantastic to learn about particular limitations of our
minds that might prevent us from understanding aspects of reality that might
exist, if it is possible to learn such a thing.

Regards,

Guy Hoelzer

on 11/8/06 1:28 AM, Arne Kjellman at [EMAIL PROTECTED] wrote:

Dear Karl and FIS-collegues,

Yes Karl, in spite of your touch of scorn, I can see 'things', but contrary
to realists I (like Bohr and Feynman and many others) don't place the 'thing'
of my perception into some obscure reality endowed with magic
properties like objectivity or accessibility, on a par with some
imaginary world of Platonic ideas. We place this concept where it occurs -
namely IN OUR MINDS. And this is the deciding difference between the realist
and the constructivist (or antirealist, if you prefer). And as I have told
you before - unless you can resolve the REALIST'S DILEMMA I can see no sound
reason for me to convert to realism - nor for you to remain one, no
matter how many mantras you are able to produce.

There is something most annoying in the realist's efforts to run away from
the realist's dilemma (the silence is total) - I mean after all almost
everybody can understand that a physicist using a measuring instrument he
doesn't understand (lacks a model of) cannot understand the status of the
entities or phenomena he is measuring. When he claims these entities to be
the things of a common world he, to my mind, simply admits he doesn't
understand the worldly things. So what, then, is the point of discussing
man's eventual reception of information from such unknown worlds? All we know
then is its scientific name - reality. And this parallels the way other
believers use the word God. In saying so I do not say we are not entitled
to believe in a God, Evolution, or reality - or that there might be something
like this beyond human reason. Not at all - I simply say that certain
knowledge in these questions is beyond human knowing and doesn't belong to
science. 'Whereof one cannot speak, thereof one must be silent' -
Wittgenstein.

Therefore we had better discuss WHAT WE KNOW ABOUT - our impressions - and
clarify the realist's dilemma, because I think such a discussion could
resolve the realist/antirealist controversy and provide a useful step
towards the understanding of information.

Best Arne

- Original Message -
From: Karl Javorszky [EMAIL PROTECTED]
To: 'Stanley N. Salthe (by way of Pedro Marijuan[EMAIL PROTECTED])'
[EMAIL PROTECTED]; fis@listas.unizar.es
Sent: Tuesday, November 07, 2006 4:04 PM
Subject: [Fis] genetics: the most outstanding problem, SOLVED

Dear Stan,

In your last posting, you said:
SS:  Of course, the origin of the genetic system is arguably the most
outstanding problem facing natural science.  It seems that, other than the
(to me) unconvincing RNA World idea, there is no compelling model of it.

The model that the RNA (together with the DNA) is a sequence and that the
genetic mechanism copies the information from a sequence (the dna/rna) into
a nonsequenced assembly (the living organism) and from there (by means of
the ovaries and the testes) back into a sequence is a quite compelling
model.

The term "information" has been shown in this chatroom to mean the cuts
that segregate, separate and distinguish summands;
The term "sequence" has been defined by Peano;
The term "nonsequenced (= commutative) assembly" is indeed hairy, as there
exists no definition for multidimensional partitions, although this is what
it means;
The term "copies" means a filter restriction on a set of entries into a
database (a restricted, in the optimal case, bijective map between```
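An editorial aside: Karl's sequence -> nonsequenced assembly -> sequence picture can be made concrete if a "nonsequenced (commutative) assembly" is read as a multiset. That reading and the toy data below are my assumptions, not Karl's; the point is only that the map from sequences to multisets is many-to-one, so returning to a sequence needs extra ordering information (the "cuts").

```python
from collections import Counter

# A "sequence" is ordered; a "nonsequenced (commutative) assembly" is read
# here as a multiset: only the count of each element survives.
seq1 = ["a", "t", "g", "c", "g"]
seq2 = ["g", "g", "a", "c", "t"]   # same letters, different order

assembly1 = Counter(seq1)
assembly2 = Counter(seq2)

# Distinct sequences collapse to the same assembly, so sequence -> assembly
# is many-to-one; reconstructing a particular sequence from the assembly
# requires ordering information beyond the element counts.
assert seq1 != seq2
assert assembly1 == assembly2
```

In this reading, the copy map is injective only on the counts, which is why the reverse step (assembly back to sequence) is the hairy part of the model.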

### Re: [Fis] Post-concluding remarks:Realism/anturealism: Laws of nature?

```Hi Bob,

I doubt we disagree in substance here, but I would take issue with the
statement that there are no laws for biology in the same sense as the laws
of physics, because I think the laws of physics apply in all realms.  In
other words, the laws of physics are not limited to physics in an
exclusionary way, because all other disciplines exist within the bounds of
physics.  Therefore, the laws of physics are also laws of biology to me.
After picking this nit, I would agree that there are no additional,
proprietary laws of this sort within biology that do not extend to
non-biological physical systems.

Cheers,

Guy Hoelzer

on 10/26/06 7:16 AM, Robert Ulanowicz at [EMAIL PROTECTED] wrote:

On Thu, 26 Oct 2006, Andrei Khrennikov wrote:

If we follow Arne's line of realism/antirealism, then what should
we say about LAWS OF NATURE? I think that we would come to the
conclusion that there are no such laws at all. Such a conclusion is not
astonishing in the light of modern views of QM. Since QM (by the
conventional Copenhagen interpretation) declared the death of
determinism (and not because of our inability to find such
deterministic dynamics, but because quantum randomness is irreducible),
it seems that at the quantum level we are not able to formulate physical
laws. We are able only to find some statistical correlations.

I think that this is a totally wrong position. As Newton was, I am also
surprised by the harmony and consistency in Nature. They could not be just a
product of our social agreement. Well, finally Newton came to the idea
of God, who was responsible for this harmony.

Dear Andrei:

Like Walter Elsasser, I believe there are no laws for biology in the same
sense as the laws of physics.

Yes, I agree there is regularity and order in the biological world.

Whence the order? Processes, not law.

http://www.cbl.umces.edu/~ulan/ISEPP.DOC

The best,
Bob

-
Robert E. Ulanowicz|  Tel: (410) 326-7266
Chesapeake Biological Laboratory   |  FAX: (410) 326-7378
P.O. Box 38|  Email [EMAIL PROTECTED]
1 Williams Street  |  Web http://www.cbl.umces.edu/~ulan
Solomons, MD 20688-0038|
--

___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


```