Re: [Fis] What is “Agent”?

2017-10-19 Thread Bob Logan
Dear Terry and FIS friends - I agree with all that Terry has said about agency. 
I do wish, however, to point out that an agent has choice and a non-agent has 
none. I would suggest that the defining characteristic of an agent is choice, 
and therefore an agent must be a living organism and all living organisms are 
agents. Agents/living organisms have choice, or are capable of choice or 
agency, and they are the only things that have choice or can interpret 
information. Abiotic non-agents do not have information because they have no 
choice. We humans can have information about abiotic objects, but those objects 
themselves do not have that information, as they have no mind to be informed. 
That includes this email post: it is abiotic and has no agency. It has 
information by virtue of your reading it, because you are able to interpret the 
visual signs with which I have recorded my thoughts. Marshall McLuhan would add 
to my comments that “the user is the content”, as well as saying that Shannon’s 
work was not a theory of information but a “theory of transportation”. I think 
of Shannon’s work in a similar light: I do not regard it as a theory of 
information but as a theory of signals. Shannon himself said his theory was not 
about meaning, and I ask what is information without meaning? That is why I say 
Shannon only had a theory of signals. 

Another insight of McLuhan’s, that of figure and ground, is useful for 
understanding why we have so many different definitions of information. McLuhan 
maintained that one could not understand a figure unless one understood the 
ground in which it operates. (McLuhan might have gotten this idea from his 
professor at Cambridge, I. A. Richards, who said that in order to communicate 
one needs to feedforward [he coined the term, by the way] the context of what 
one is communicating.) The different definitions of information we have 
considered are a result of the different contexts in which the term information 
is used. We should also keep in mind that all words are metaphors, and metaphor 
literally means to carry across, derived from the Greek meta (literally 
'across') and phorein (literally 'to carry'). So the word information has been 
carried across from one domain or area of interest to another. It entered the 
English language as the noun associated with the verb 'to inform', i.e. to form 
the mind. Here is an excerpt from my book What Is Information? (available for 
free at demopublishing.com):
"Origins of the Concept of Information - We begin our historic survey of the 
development of the concept of information with its etymology. The English word 
information according to the Oxford English Dictionary (OED) first appears in 
the written record in 1386 by Chaucer: 'Whanne Melibee hadde herd the grete 
skiles and resons of Dame Prudence, and hire wise informacions and techynges.' 
The word is derived from Latin through French by combining the word 'inform', 
meaning to give form to the mind, with the ending 'ation', denoting a noun of 
action. This earliest definition refers to an item of training or molding of 
the mind.” This is why abiotic objects have no information, as I claimed above: 
they have no mind that can be informed.

I hope that by informing you of the origin of the word information I have shed 
some light on our confusion about what information is and why we have so many 
definitions of it. For that matter, it might even shed some light on what an 
agent is. Got the ticket? If so, that makes me a ticket agent. I hope you get 
the joke. all the best - Bob


__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.researchgate.net/profile/Robert_Logan5/publications
https://www.physics.utoronto.ca/people/homepages/logan/


On Oct 19, 2017, at 7:11 PM, Terrence W. DEACON  wrote:

AUTONOMOUS AGENCY: The definition I propose for autonomous agency is open to 
challenge. Of course, there are many ways that we use the term 'agent' in more 
general and metaphoric ways. I am, however, interested in the more fundamental 
conception from which these derived uses stem. I do not claim that this 
definition is original, but rather that it is what we implicitly understand by 
the concept. So if this is not your understanding, I am open to suggestions for 
modification.  

I should add that it has been a recent goal of my work to describe an 
empirically testable simplest model system that satisfies this definition. 
Those of you who are familiar with my work will recognize that this is what I 
call an autogenic or teleodynamic system. In this context, however, it is only 
the adequacy of the definition that I am interested in exploring. As in many of 
the remarks of others on this topic, it is characterized by strange-loop 
recursivity, self-reference, and physicality. And it may be worthwhile 
describin

Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

2017-10-14 Thread Bob Logan
Hello Sung and Arturo. Entropy is a measure of disorder and, for an isolated 
system, ΔS ≥ 0. If entropy is zero at T = 0 K because there is no disorder at 
absolute zero, then entropy can only increase from T = 0 K. If that is the 
case, how can entropy ever be negative?

Arturo asked me to share a private email I sent to him about the elephant and 
the 3 blind men. He urged me to share it with the group. Because we are limited 
to 2 posts per week I waited until I had something else to post. So here by 
Arturo’s request is our correspondence:

Dear Bob, 
A nice story!
I think you must share it with FISers.

...even if I think that, while one blind man is touching the elephant, another 
is touching a lion, and another a deer...
In other words, your tale says that a single elephant does exist, while I'm not 
so sure...
However, don't worry, I will not make a public critique of your nice tale!

--
Sent from Libero Mail for Android

Friday, 6 October 2017, 02:15PM +02:00 from Bob Logan lo...@physics.utoronto.ca 
<mailto:lo...@physics.utoronto.ca>:

Caro Arturo - and thanks for your feedback. 

The discussion of info on the FIS list (re what is info) is like the three 
blind men inspecting an elephant. "It is a rope," said the blind man holding 
the tail; "no, it is a snake," said the blind man holding the trunk; "no, it is 
a tree," said the blind man touching the elephant’s leg.

I refrained from using this story on the FIS list as it might come off as 
insulting - what do you think - should I share it with the group? 
tanti auguri - Bob 


__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.researchgate.net/profile/Robert_Logan5/publications
https://www.physics.utoronto.ca/people/homepages/logan


On Oct 14, 2017, at 9:25 PM, Sungchul Ji  wrote:

Hi Arturo,

I agree.  Entropy can be negative MATHEMATICALLY, as Schroedinger assumed.
But what I am claiming is that this may be a mathematical artifact, since, 
according to the Third Law of Thermodynamics, there is no negative entropy.  

All the best.

Sung


From: tozziart...@libero.it 
Sent: Friday, October 13, 2017 6:02 PM
To: Sungchul Ji
Cc: fis@listas.unizar.es
Subject: Re[2]: Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION
 
Dear Sung, 
I'm sorry, but the "Unreasonable Effectiveness of Mathematics" still holds 
true.  
Forget philosophical concepts like Yin and Yang because, in some cases and 
contexts, entropy is negative.  
Just to give an example:
"Since the entropy H(S|O) can now become negative, erasing a system can result 
in a net gain of work (and a corresponding cooling of the environment)."
https://www.nature.com/nature/journal/v474/n7349/full/nature10123.html
--
Sent from Libero Mail for Android
Friday, 13 October 2017, 10:11PM +02:00 from Sungchul Ji 
s...@pharmacy.rutgers.edu <mailto:s...@pharmacy.rutgers.edu>:

Hi Arturo,

(1)  I don't understand where you got (or how you can justify) S = 1 J/K in 
your statement,

"With the same probability mass function, you can see that H = S/(ln(2)*kB), so 
setting S = 1 J/K gives a Shannon entropy of 1.045×10^23 bits."
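The arithmetic in the quoted statement can be checked directly; a minimal 
Python sketch, assuming only the CODATA value of Boltzmann's constant:

```python
import math

k_B = 1.380649e-23          # Boltzmann constant in J/K
S = 1.0                     # thermodynamic entropy in J/K, as in the quote

# H = S / (ln(2) * k_B) converts thermodynamic entropy to Shannon bits
H = S / (math.log(2) * k_B)
print(f"H = {H:.4g} bits")  # about 1.045e+23 bits, matching the quote
```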

(2) I can see how one can get H = S/(ln(2)*k_B) mathematically, but what does 
this equality mean physically?
(3) This reminds me of what Schroedinger did when he came up with the 
conclusion that "negative entropy" is equivalent to "order", which led to 
Brillouin's so-called the "negentropy Principle of Information (NPI)" [1, 2].

Simply by multiplying both sides of the Boltzmann equation by negative one, 
Schroedinger obtained the following formula:

 -S = -k ln W = k ln (1/W)

and then equating W with disorder, D, led him to 

- S = k ln (1/D).

Since (1/D) can be interpreted as the opposite of "disorder", namely, "order", 
he concluded that

"negative entropy = order".

As you can see, the above derivation is mathematically sound but the result 
violates the Third Law of Thermodynamics, according to which thermodynamic 
entropy cannot be less than zero.
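Sung's point can be illustrated numerically: for any W ≥ 1, S = k ln W is 
non-negative, so the sign-flipped quantity -S = k ln(1/W) is non-positive and 
cannot itself be a thermodynamic entropy. A minimal sketch, with the number of 
microstates W chosen arbitrarily for illustration:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
W = 1e6            # number of microstates (illustrative value)

S = k * math.log(W)            # Boltzmann entropy, >= 0 whenever W >= 1
neg_S = k * math.log(1.0 / W)  # Schroedinger's sign-flipped quantity, <= 0

print(S, neg_S)  # equal magnitudes, opposite signs
```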

Thus, in 2012 I was led to formulate what I called the "Schroedinger paradox" 
as follows [3]

"Schroedinger's paradox refers to the mathematical equations, concepts, or 
general statements that are formally true but physically meaningless." 

(4) If my argument in (3) is valid, this may provide an example of what may be 
called 

the "Unreasonable Ineffectiveness of Mathematics"

which, together with Wigner's "Unreasonable Effectiveness of Mathematics", may 
constitute a Yin-Yang pair of mathematics.  

All the best.

Sung

References:
   [1]  Brillouin, L. (1953). Negentropy Principle of Information. J. Applied 
Phys. 24(9), 1152-1163.
   [2]  Brillouin, L. (1956). Science and Information Theory. Academic Press.

Re: [Fis] Heretic

2017-10-05 Thread Bob Logan
Dear Arturo - I enjoyed your expression of your opinion because of its 
directness and honesty, even though I do not quite agree with everything you 
said. I enjoyed it because it provoked the following thoughts.

Yes, you are right: there seems to be a variety of opinions as to just what 
information is. All of them are correct and all of them are wrong, including 
mine, which I will share with you in a moment. They are right in that they 
describe some aspect of the notion of information, and they are all wrong 
because they are attempting to be precise, and that is not possible. All words, 
including the word ‘information’, are metaphors, and a metaphor cannot be right 
or wrong - it can only be illuminating if inspired, or irrelevant if too 
narrow. I am afraid, caro Arturo, that there cannot be a scientific definition 
of ‘information’ because definitions cannot be falsified, and as Karl Popper 
once suggested, for a proposition to be scientific it has to be falsifiable. Of 
course this is Popper’s definition of science, so some may disagree. So I am 
with you so far. But where I have to disagree is when you call the activity of 
trying to define information a useless activity. I think it is useful, if only 
for us to see the various dimensions of this notion.

Now, as promised, my thoughts re: what is information? In fact I have written a 
whole book on the subject, which I invite all FISers to read free of charge, as 
it is available in an open access format at demopublishing.com.
The availability of the book for free is part of an experiment in which I 
wanted to explore whether a book could be a two-way form of communication 
between an author and his or her readers. So FISers, please help yourself to my 
book, and if you do, please honour me with a comment or two, as the Web site 
where you access the book also has provisions for your feedback. PS - The book 
is also available in hard copy from Amazon.

So now for my definition of information as can be found in the book.

• Data are the pure and simple facts without any particular structure or 
organization, the basic atoms of information, 

• Information is structured data, which adds meaning to the data and gives it 
context and significance,

• Knowledge is the ability to use information strategically to achieve one's 
objectives, and

• Wisdom is the capacity to choose objectives consistent with one's values and 
within a larger social context.


In the book I also quote T. S. Eliot, whose lines of poetry provide another 
perspective on wisdom, knowledge and information:

Where is the wisdom we have lost in knowledge?
Where is the knowledge we have lost in information? - T. S. Eliot

My definition of information, as well as that of T. S. Eliot, does not 
encompass the notion of physicists who talk about information in terms of 
Wheeler’s "it from bit" idea. 
For me, inanimate objects have no information because they have no choice. They 
slavishly follow the laws of physics. Only biological, living organisms have 
information, because they have choice, and information is that which allows 
them to make their choices. And information is that which they perceive through 
their senses, from the simplest bacteria to us humans, whom e. e. cummings 
described as "fine specimen(s) of hypermagical ultraomnipotence". So this is my 
second notion of what is ‘information’.
Even a book is not a form of information. It is the record of information 
created by its author, and it is a medium that allows its readers to recreate 
that original information of its author. From a McLuhan perspective we could 
also ask: is information the medium or the message? McLuhan would say they are 
the same, since he said 'the medium is the message'. And he would also agree 
that it is the reader who recreates information when the book is read, since he 
also said “the user is the content”. 
Since composing this response, a post from Lars-Göran Johansson has appeared 
with which I am in agreement.
Best wishes to all - Bob Logan


__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.researchgate.net/profile/Robert_Logan5/publications
https://www.physics.utoronto.ca/people/homepages/logan/

On Oct 4, 2017, at 1:49 PM, tozziart...@libero.it wrote:

 Forwarded message  From: tozziart...@libero.it 
<mailto:tozziart...@libero.it> To: Alex Hankey alexhan...@gmail.com 
<mailto:alexhan...@gmail.com> Date: Wednesday, 4 October 2017, 07:37PM +02:00 
Subject: Re[2]: [Fis] Heretic

Dear Prof. Hankey,
I come from a free country, where everybody can express his own opinion, in 
particular if his opinion is not totally stupid.  
The times of Giordano Bruno and the Inquisition are gone.  


--
Sent from Libero Mail for Android

Wednesday, 4 October 2017, 06:20PM +02:00 from Alex Hankey alexhan...@gmail.com 
<mailto:alexhan...@gm

Re: [Fis] Can the can drink beer ? - No way!

2017-03-26 Thread Bob Logan
Hello Krassimir - I agree with the sentiments you expressed - they seem to 
parallel my thoughts.

I am often puzzled by the use of the term ‘information’ by physicists with 
regard to the info of material objects. The way the term information is used in 
physics, such as in Wheeler’s "it from bit", does not conform to my 
understanding of information as a noun describing the process of informing. How 
can abiotic matter be informed when it cannot make any choices? Living 
organisms make choices and use information to make those choices, and this 
holds for all living creatures from bacteria to humans, including physicists 
:-). The only information involved when physicists use the word information to 
describe our universe is that associated with the physicists becoming informed 
of what is happening in the universe they observe. I am happy that they want to 
discuss this info, but I believe there is a need to distinguish between info 
(biotic) and info (abiotic) as used in physics. The use of a single word, 
information, for both categories is confusing, at least for me. This ambiguity 
reminds me of Shannon's use of the term entropy to define his notion of 
information, having taken the advice of von Neumann. A story is told that 
Shannon did not know what to call his measure, and von Neumann advised him to 
call it entropy because nobody knows what it means and it would therefore give 
Shannon an advantage in any debate (Campbell, Jeremy (1982). Grammatical Man: 
Information, Entropy, Language, and Life. New York: Simon and Schuster, p. 32). 
Shannon defined information in a way that he admitted was not necessarily about 
meaning. Information without meaning has no meaning for me.  Kind regards to 
all - Bob Logan



__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.researchgate.net/profile/Robert_Logan5/publications
https://www.physics.utoronto.ca/people/homepages/logan/

On Mar 26, 2017, at 5:39 AM, Krassimir Markov  wrote:

Dear Brian, Arturo, Karl, Alex, Lars-Goran, Gyuri, and FIS colleagues,
 
Thank you for your remarks!
 
What is important is that every theory has its own understanding of the 
concepts it uses.
For “foreigners”, their meaning may be strange or unknown.
Sometimes, concepts of one theory contradict the corresponding concepts of 
another theory.
 
For years, I have met many different definitions of the concept “information” 
and many more kinds of its use,
from materialistic up to weird points of view...
 
To clear my own understanding, I shall give you a simple example:
 
CAN THE CAN DRINK BEER ?
 
CAN THE CAN EXCHANGE BEER WITH THE GLASS ?
 
The can is used by humans for some goals, for instance to store some beer for a 
given period.
But the can itself “could not understand” its own functions and what it can do 
with the beer it contains.
All its functionality is a model in human consciousness.
The can cannot exchange beer with the glass if there is no human activity or 
activity of additional devices invented by humans to support this.
 
Further:
 
CAN THE ARTIFICIAL LEG WALK?
You know the answer ... A human with an artificial leg can walk ...
All the functionality of the artificial leg is a result of human consciousness, 
modeling, and invention. 
 
In addition:
 
IS “PHYSICAL INFORMATION” INFORMATION?
If it is, the first question is how to measure the quantity and quality of such 
“information”, and who can do this? 
I prefer the answer “NO” – “physical information” is a concept which means 
something else, not “information” as it is in my understanding.
From my point of view, “physical information” is a kind of reflection (see the 
“Theory of reflections” of T. Pavlov).
Every reflection may be assumed to be information iff (if and only if) there 
exists a subjective information expectation to be resolved by the given 
reflection.
For physical information this law is not satisfied. Because of this, I prefer 
to call this phenomenon simply “a reflection”.
 
And so on ...
 
 
Finally:
 
Humans have invented many kinds of prostheses, including ones for our 
intellectual functionalities, i.e. many different kinds of electronic devices 
which, in particular, can generate some electrical, light, etc. impulses, which 
we assume to be “information”; usually a combination of impulses we assume to 
be a structure to be recognized by us as “information”. 
 
A special kind of prostheses are Robots. They have some autonomous 
functionalities but are still very far from living consciousness. The level of 
complexity of a robot’s consciousness is far from a human’s. Someone may say 
that robots understand and exchange “information”, but still they only react to 
incoming signals following the instructions given by humans. Their functioning 
is similar to human functioning, but only similar. They may reco

Re: [Fis] I finished my thesis work. The defense was March, 21th - Thank you very much to all list members :-)

2017-03-23 Thread Bob Logan
Dear Moisés - congratulations and a hearty mazel tov. I look forward to your 
continued work. Please email a digital version of your thesis if possible - Bob


__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.researchgate.net/profile/Robert_Logan5/publications
https://www.physics.utoronto.ca/people/homepages/logan/

On Mar 23, 2017, at 10:48 AM, Moisés André Nisenbaum 
 wrote:

Dear FISers.
Finally I finished my doctoral work :-)
The FIS list messages, the survey, and the interviews were very important data 
sources for my research.
This mailing list, and its members, are amazing! :-)
Thank you all, very much!!!

Among other things, I was able to classify and group more than 5000 messages 
posted from 1997 to 2016 and found that the most discussed topic was 
Information and Physics. But there are other interesting things :-)
I'm going to submit an extended abstract for IS4SI with some of the results.
If anyone wants to know more details, please contact me. 

Special thanks to my friend Rafael, for introducing FIS to me, to Pedro for the 
support and incentive and to Bob Logan for the many conversations and the 
excellent reception at Toronto.

Um abraço!!!

Moisés

-- 
Moisés André Nisenbaum
Doutor em Ciência da Informação (IBICT/UFRJ)
Instituto Federal do Rio de Janeiro - IFRJ
Campus Rio de Janeiro
moises.nisenb...@ifrj.edu.br 
<mailto:moises.nisenb...@ifrj.edu.br>___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] What is information? and What is life?

2016-12-20 Thread Bob Logan
Loet - thanks for the mention of our (Kauffman, Logan et al.) definition of 
information, which is a qualitative description of information. As to whether 
one can measure information with our description, my response is no, but I am 
not sure that one can measure information at all. What units would one use to 
measure information? E = mc^2 contains a lot of information, but the amount of 
information depends on context. A McLuhan one-liner such as 'the medium is the 
message' also contains a lot of information, even though it is only 5 words or 
26 characters long. 

Hopefully I have provided some information, but how much information is 
impossible to measure.

Bob




__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.researchgate.net/profile/Robert_Logan5/publications
https://www.physics.utoronto.ca/people/homepages/logan/

On Dec 20, 2016, at 3:26 AM, Loet Leydesdorff  wrote:

Dear colleagues, 
 
A distribution contains uncertainty that can be measured in terms of bits of 
information.
Alternatively: the expected information content H of a probability distribution 
is H = -Σ p_i log2(p_i).
H is further defined as probabilistic entropy using Gibbs’ formulation of the 
entropy, S = -k_B Σ p_i ln(p_i).
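As a concrete illustration of this operational definition, here is a short 
Python sketch computing H for an example distribution (the distribution itself 
is an arbitrary, invented example):

```python
import math

def shannon_H(p):
    """Expected information content of a probability distribution, in bits."""
    return -sum(p_i * math.log2(p_i) for p_i in p if p_i > 0)

# An arbitrary example distribution
p = [0.5, 0.25, 0.25]
print(shannon_H(p))  # 1.5 bits of uncertainty
```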
 
This definition of information is an operational definition. In my opinion, we 
do not need an essentialistic definition by answering the question of “what is 
information?” As the discussion on this list demonstrates, one does not easily 
agree on an essential answer; one can answer the question “how is information 
defined?” Information is not “something out there” which “exists” otherwise 
than as our construct.
 
Using essentialistic definitions, the discussion tends not to move forward. For 
example, consider Stuart Kauffman’s and Bob Logan’s (2007) definition of 
information “as natural selection assembling the very constraints on the 
release of energy that then constitutes work and the propagation of 
organization.” I asked several times what this means and how one can measure 
this information. Hitherto, I only obtained the answer that colleagues who 
disagree with me will be cited. :-) Another answer was that “counting” may lead 
to populism. :-)
 
Best,
Loet
 
Loet Leydesdorff 
Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)
l...@leydesdorff.net; http://www.leydesdorff.net/
Associate Faculty, SPRU, University of Sussex; 
Guest Professor Zhejiang Univ., Hangzhou; 
Visiting Professor, ISTIC, Beijing;
Visiting Professor, Birkbeck, University of London; 
http://scholar.google.com/citations?user=ych9gNYJ&hl=en 

 
From: Dick Stoute [mailto:dick.sto...@gmail.com ] 
Sent: Monday, December 19, 2016 12:48 PM
To: l...@leydesdorff.net 
Cc: James Peters; u...@umces.edu ; Alex Hankey; FIS 
Webinar
Subject: Re: [Fis] What is information? and What is life?
 
List,
 
Please allow me to respond to Loet about the definition of information stated 
below.  
 
1. the definition of information as uncertainty is counter-intuitive 
("bizarre"); (p. 27)
 
I agree.  I struggled with this definition for a long time before realising 
that Shannon was really discussing "amount of information" or the number of 
bits needed to convey a message.  He was looking for a formula that would 
provide an accurate estimate of the number of bits needed to convey a message 
and realised that the amount of information (number of bits) needed to convey a 
message was dependent on the "amount" of uncertainty that had to be eliminated 
and so he equated these.  
 
It makes sense to do this, but we must distinguish between "amount of 
information" and "information".  For example, we can measure amount of water in 
liters, but this does not tell us what water is and likewise the measure we use 
for "amount of information" does not tell us what information is. We can, for 
example equate the amount of water needed to fill a container with the volume 
of the container, but we should not think that water is therefore identical to 
an empty volume.  Similarly we should not think that information is identical 
to uncertainty.
 
By equating the number of bits needed to convey a message with the "amount of 
uncertainty" that has to be eliminated Shannon, in effect, equated opposites so 
that he could get an estimate of the number of bits needed to eliminate the 
uncertainty.  We should not therefore consider that this equation establishes 
what information is. 
 
Dick
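Dick's distinction between "information" and "amount of information" can be 
made concrete: Shannon's measure is just a count of bits, here the number 
needed to single out one message among N equally likely alternatives. A minimal 
sketch, with the message-set sizes invented for illustration:

```python
import math

def bits_needed(n_messages):
    """Bits required to single out one of n equally likely messages."""
    return math.log2(n_messages)

# Picking one of 8 equally likely messages takes 3 bits;
# the measure says nothing about what any message means.
print(bits_needed(8))   # 3.0
print(bits_needed(26))  # about 4.7 (e.g. one letter of the alphabet)
```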
 
 
On 18 December 2016 at 15:05, Loet Leydesdorff <mailto:l...@leydesdorff.net> 
wrote:
Dear

Re: [Fis] What is information? and What is life?

2016-12-19 Thread Bob Logan
Dear Dick - I loved your analysis. You are right on the money. It also explains 
why Shannon dominated the field of information. He had a mathematical formula, 
and there is nothing more appealing to a scientist than a mathematical formula. 
But you are right: his formula only tells us how many bits are needed to 
represent some information, but tells us nothing about its meaning or its 
significance. As Marshall McLuhan said about Shannon information, it is figure 
without ground. A figure only acquires meaning when one understands the ground 
in which it operates. So Shannon’s contribution to engineering is excellent, 
but it tells us nothing about the nature or impact of information, as you 
wisely pointed out. Thanks for your insight.

I would like to refer to your insight the next time I write about info and want 
to attribute you correctly. Can you tell me a bit about yourself, like where 
you do your research? Thanks - Bob Logan


__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications


On Dec 19, 2016, at 6:48 AM, Dick Stoute  wrote:

List,

Please allow me to respond to Loet about the definition of information stated 
below.  

1. the definition of information as uncertainty is counter-intuitive 
("bizarre"); (p. 27)
 
I agree.  I struggled with this definition for a long time before realising 
that Shannon was really discussing "amount of information" or the number of 
bits needed to convey a message.  He was looking for a formula that would 
provide an accurate estimate of the number of bits needed to convey a message 
and realised that the amount of information (number of bits) needed to convey a 
message was dependent on the "amount" of uncertainty that had to be eliminated 
and so he equated these.  

It makes sense to do this, but we must distinguish between "amount of 
information" and "information".  For example, we can measure amount of water in 
liters, but this does not tell us what water is and likewise the measure we use 
for "amount of information" does not tell us what information is. We can, for 
example equate the amount of water needed to fill a container with the volume 
of the container, but we should not think that water is therefore identical to 
an empty volume.  Similarly we should not think that information is identical 
to uncertainty.

By equating the number of bits needed to convey a message with the "amount of 
uncertainty" that has to be eliminated Shannon, in effect, equated opposites so 
that he could get an estimate of the number of bits needed to eliminate the 
uncertainty.  We should not therefore consider that this equation establishes 
what information is. 

Dick


On 18 December 2016 at 15:05, Loet Leydesdorff <mailto:l...@leydesdorff.net> 
wrote:
Dear James and colleagues,

 

Weaver (1949) made two major remarks about his coauthor (Shannon)'s 
contribution:

 

1. the definition of information as uncertainty is counter-intuitive 
("bizarre"); (p. 27)

2. "In particular, information must not be confused with meaning." (p. 8)

 

The definition of information as relevant for a system of reference confuses 
information with "meaningful information" and thus sacrifices the surplus value 
of Shannon's counter-intuitive definition.

 

an information observer

that integrates interactive processes such as

physical interactions such as photons stimulating the retina of the eye, 
human-machine interactions (this is the level that Shannon lives on), 
biological interactions such as body temperature relative to touching ice or a 
heat source, social interactions such as this forum started by Pedro, economic 
interactions such as the stock market, ... [Lerner, page 1].

 

We are in need of a theory of meaning. Otherwise, one cannot measure meaningful 
information. In a previous series of communications we discussed redundancy 
from this perspective.

 

Lerner introduces the mathematical expectation E[Sap] (the difference between a 
priori entropy and a posteriori entropy), which is distinguished from the 
notion of relative information Iap (Lerner, page 7).

I = Σ q_i log2(q_i/p_i) expresses in bits of information the information 
generated when the a priori distribution p is turned into the a posteriori one 
q. This follows within the Shannon framework without needing an observer. I use 
this equation, for example, in my 1995 book The Challenge of Scientometrics 
(Chapters 8 and 9), with a reference to Theil (1972). The relative information 
is defined as H/H(max).
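The quantity described here, the bits of information generated when an a priori 
distribution is turned into an a posteriori one, can be sketched as the 
Kullback-Leibler divergence; the two distributions below are invented purely 
for illustration:

```python
import math

def info_generated(posterior, prior):
    """Bits of information generated when the prior distribution is turned
    into the posterior one (Kullback-Leibler divergence)."""
    return sum(q * math.log2(q / p)
               for q, p in zip(posterior, prior) if q > 0)

prior = [0.5, 0.5]      # a priori distribution (illustrative)
posterior = [0.9, 0.1]  # a posteriori distribution (illustrative)
print(info_generated(posterior, prior))  # about 0.531 bits
```

Note that the measure is zero when nothing changes (posterior equals prior), 
consistent with no information being generated.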

 

I agree that the intuitive notion of information is derived from the Latin 
“in-formare” (Varela, 1979). But most of us no longer use “force” and “mass” 
in the intuitive (Aristotelian) 

Re: [Fis] Fwd: What is life?

2016-12-18 Thread Bob Logan
Hello Loet, Bob U, Alex et alia - I agree with Bob U when he wrote "The problem 
is that information is not an absolute." As I have stated before on this list, 
information is relative and depends on the context.

And I agree with Loet when he wrote, "Different systems of reference, of 
course, can attribute different meanings to the same information."
In fact McLuhan expressed Loet's thought long ago with the one-liner "the user 
is the content". Applying this idea to Shannon’s original paper might explain 
why we have so many definitions of information: each reader of his original 
paper has their own interpretation of Shannon, because, as McLuhan said, "the 
user is the content". 


For this user, Shannon information theory should be called Shannon signal 
theory, as Shannon himself said his theory was concerned not with meaning but 
only with the accuracy of sending a signal from point A to point B. What does 
information without meaning mean anyway? 

I enjoy these conversations about the meaning of information and even wrote a 
book on the subject entitled What Is Information? (Available for free at 
demopublishing.com) but I get the feeling we are like the dog chasing its own 
tail.  I agreed with Bob U’s point that information is not an absolute and then 
I agreed with Loet’s disagreement with Bob U’s point. Talk about a dog chasing 
its tail. 

And if this is not enough consider another McLuhan one-liner: "The medium is 
the message” in which he suggests that there is additional information in the 
medium that carries the signals from point A to point B independent of those 
signals. 

cheers - Bob L



__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications



On Dec 18, 2016, at 2:16 AM, Loet Leydesdorff  wrote:

The problem is that information is not an absolute. The same code when measured 
against different references (English vs. Spanish in this case) will yield 
different measures. It's the obverse of the Third Law of Thermodynamics. See 
>
 
Dear Bob, 
 
It seems to me that you confuse information with what information means for a 
system of reference. Different systems of reference, of course, can attribute 
different meanings to the same information.
 
@Alex: this confusion is unfortunately pervasive. Unlike Shannon-type 
information, “information” is often defined (following Bateson and MacKay) as “a 
difference which makes a difference”, without articulating that the second 
difference presumes the specification of a system of reference. 
 
A series of differences of the first type can be considered as a probability 
distribution that contains uncertainty. “A difference which makes a 
difference”, however, can be considered as “meaningful information”. In my 
opinion, this “meaningful information” should not be equated with information 
because one then uses the same word for two different things and thus generates 
confusion.
 
Best,
Loet
 
Loet Leydesdorff 
Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)
l...@leydesdorff.net  ; 
http://www.leydesdorff.net/  
Associate Faculty, SPRU,  University of Sussex; 
Guest Professor Zhejiang Univ. , Hangzhou; 
Visiting Professor, ISTIC,  Beijing;
Visiting Professor, Birkbeck, University of London; 
http://scholar.google.com/citations?user=ych9gNYJ&hl=en 

 
 
___
Fis mailing list
Fis@listas.unizar.es 
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis 



Re: [Fis] A provocative issue

2016-12-11 Thread Bob Logan
Bravo Arturo - I totally agree - in a paper I co-authored with Stuart Kauffman 
and others we talked about the relativity of information and the fact that 
information is not an absolute. Here is the abstract of the paper and an 
excerpt from the paper that discusses the relativity of information. The full 
paper is available at: 
https://www.academia.edu/783503/Propagating_organization_an_enquiry

Best wishes - Bob Logan

Kauffman, Stuart, Robert K. Logan, Robert Este, Randy Goebel, David Hobill and 
Ilya Shmulevich. 2007. Propagating Organization: An Inquiry. Biology and 
Philosophy 23: 27-45.

Propagating Organization: An Enquiry - 
Stuart Kauffman, Robert K. Logan, Robert Este, Randy Goebel, David Hobill and 
Ilya Shmulevich

Institute for Systems Biology, Seattle, Washington

 Abstract: Our aim in this article is to attempt to discuss propagating 
organization of process, a poorly articulated union of matter, energy, work, 
constraints and that vexed concept, “information”, which unite in far from 
equilibrium living physical systems. Our hope is to stimulate discussions by 
philosophers of biology and biologists to further clarify the concepts we 
discuss here. We place our discussion in the broad context of a “general 
biology”, properties that might well be found in life anywhere in the cosmos, 
freed from the specific examples of terrestrial life after 3.8 billion years of 
evolution. By placing the discussion in this wider, if still hypothetical, 
context, we also try to place in context some of the extant discussion of 
information as intimately related to DNA, RNA and protein transcription and 
translation processes. While characteristic of current terrestrial life, there 
are no compelling grounds to suppose the same mechanisms would be involved in 
any life form able to evolve by heritable variation and natural selection. In 
turn, this allows us to discuss at least briefly, the focus of much of the 
philosophy of biology on population genetics, which, of course, assumes DNA, 
RNA, proteins, and other features of terrestrial life. Presumably, evolution by 
natural selection – and perhaps self-organization - could occur on many worlds 
via different causal mechanisms.

Here we seek a non-reductionist explanation for the synthesis, accumulation, 
and propagation of information, work, and constraint, which we hope will 
provide some insight into both the biotic and abiotic universe, in terms of 
both molecular self reproduction and the basic work energy cycle where work is 
the constrained release of energy into a few degrees of freedom. The typical 
requirement for work itself is to construct those very constraints on the 
release of energy that then constitute further work. Information creation, we 
argue, arises in two ways: first information as natural selection assembling 
the very constraints on the release of energy that then constitutes work and 
the propagation of organization. Second, information in a more extended sense 
is “semiotic”, that is about the world or internal state of the organism and 
requires appropriate response. The idea is to combine ideas from biology, 
physics, and computer science, to formulate explanatory hypotheses on how 
information can be captured and rendered in the expected physical 
manifestation, which can then participate in the propagation of the 
organization of process in the expected biological work cycles to create the 
diversity in our observable biosphere.

Our conclusions, to date, of this enquiry suggest a foundation which views 
information as the construction of constraints, which, in their physical 
manifestation, partially underlie the processes of evolution to dynamically 
determine the fitness of organisms within the context of a biotic universe.


Section 4. The Relativity of Information

 In Section 2 we argued that the Shannon conception of information is not 
directly suited to describing the information of autonomous agents that 
propagate their organization. In Section 3 we defined a new form of 
information, instructional or biotic information, as the constraints that 
direct the flow of free energy to do work.

The reader may legitimately ask the question “isn’t information just 
information?”, i.e., an invariant like the speed of light. Our response to this 
question is no; we then clarify what seems arbitrary about the definition of 
information. Instructional or biotic information is a useful definition for 
biotic systems, just as Shannon information was useful for telecommunication 
channel engineering, and Kolmogorov (Shiryayev 1993) information was useful for 
the study of information compression with respect to Turing machines.

The definition of information is relative and depends on the context in which 
it is to be considered. There appears to be no such thing as absolute 
information that is an invariant that applies to all circumstances. Just as 
Shannon defined information

Re: [Fis] Fis Digest, Vol 32, Issue 13

2016-11-14 Thread Bob Logan
Dear FISers - I enjoyed Jose’s distinction of meaning and sense making. 
Employing McLuhan’s notion of figure and ground in which one can only 
understand a figure in terms of the ground in which it operates, I would 
therefore say that meaning is figure and sense-making is the ground in which 
the full significance of the meaning emerges. The same information with 
identical meanings can have very different significances to two different 
recipients of that information. Put another way, in terms of a McLuhan 
one-liner, "the user is the content".

I hope you got the meaning and the significance of my remark - best wishes - Bob

__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications










On Nov 13, 2016, at 10:26 PM, Jose Javier Blanco Rivero  
wrote:

Dear Malcolm,

I think that it is useful to distinguish between sense-making (Sinn in German, 
sentido in Spanish) and meaning (Bedeutung, significado). Meaning is 
linguistic, while sense-making mixes linguistic and non-linguistic dimensions. 
For the social sciences, like intellectual history, this distinction helps to 
further clarify the difference between semantics (a field of meaning) and 
social structure (communicative information-processing structures, like codes 
and communication media, in Luhmann's terms). 
I am aware that in physics this might not be quite convincing...

Best,

On Nov 12, 2016, 4:43 PM, "Malcolm Dean"  wrote:
To an animal about to be attacked and eaten, the meaning of an approaching 
predator is quite clear.

Obviously, meaning is produced by, within, and among Observers, and not by 
language.

Meaning may be produced *through* language, not *in* language, as a medium of 
interaction (aka communication).

I wish scientific specialists had more awareness of the effects of their 
specialization.

Malcolm Dean

 
Date: Sat, 12 Nov 2016 20:29:21 +0100
From: "Loet Leydesdorff" mailto:l...@leydesdorff.net>>
To: "'Alex Hankey'" mailto:alexhan...@gmail.com>>, "'FIS 
Webinar'"
mailto:Fis@listas.unizar.es>>
Subject: Re: [Fis] Is quantum information the basis of spacetime?

Dear Alex and colleagues,

Thank you for the reference; but my argument was about meaning. Meaning can 
only be considered as constructed in language. Other uses of the word are 
metaphorical. For example, the citation to Maturana.

Information, in my opinion, can be defined content-free (a la Shannon, etc.) 
and then be provided with meaning in (scholarly) discourses. I consider physics 
as one among other scholarly discourses. Specific about physics is perhaps the 
universalistic character of the knowledge claims. For example: "Frieden's 
points apply to quantum physics as well as classical physics." So what? This 
seems to me a debate within physics without much relevance for non-physicists 
(e.g., economists or linguists).
 
Loet Leydesdorff
Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)

___
Fis mailing list
Fis@listas.unizar.es 
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis 




Re: [Fis] Is quantum information the basis of spacetime?

2016-11-04 Thread Bob Logan
Hello Andrei - I am with you - I share your sentiment. Information only 
pertains to living organisms and entails some signals that help them make a 
choice. A black hole makes no choices - it is ruled by the laws of physics. 
Abiotic systems have no information. A book is a set of signals that a reader 
can convert into information if they know the language in which the book is 
written. A book written in Urdu contains no information for me, other than that 
it appears to be a set of signals that contains information for a reader of the 
language in which the book was written. Who reads a black hole? How does it 
contain information that makes a difference? When we launch a satellite to 
orbit the Earth we do not say that the sun is informing the satellite how to 
behave. The satellite is just following the laws of physics. It has no choice 
and so it is not being informed. There are many different forms of information 
(biotic and Shannon, as described in the 2007 paper Propagating Organization: 
An Inquiry by Kauffman, Logan et al. in Biology and Philosophy 23: 27-45), so 
we do not need to complicate things even more by describing the laws of 
physics as the communication of information.
__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications

On Nov 4, 2016, at 4:17 AM, Andrei Khrennikov  wrote:

 Dear all, 
I want to comment on the so-called information approach to physics. By 
speaking with hundreds of leading experts in quantum foundations, I found that 
nobody can rigorously define the basic term "information" which is so widely 
used in their theories and discussions; the answers are of the form 
"information is the basic entity", which cannot be defined in other terms. 
Well, my impression is that without a novel understanding and definition of 
information all these "theories" are practically empty - well, very good 
mathematical exercises. Maybe I am too critical... But I have spent so much 
time trying to understand what people are talking about. The output is ZERO.

all the best, andrei

Andrei Khrennikov, Professor of Applied Mathematics,
Int. Center Math Modeling: Physics, Engineering, Economics, and Cognitive Sc.
Linnaeus University, Växjö, Sweden
My RECENT BOOKS:
http://www.worldscientific.com/worldscibooks/10.1142/p1036
http://www.springer.com/in/book/9789401798181
http://www.panstanford.com/books/9789814411738.html
http://www.cambridge.org/cr/academic/subjects/physics/econophysics-and-financial-physics/quantum-social-science
http://www.springer.com/us/book/9783642051005


From: Fis [fis-boun...@listas.unizar.es] on behalf of Gyorgy Darvas 
[darv...@iif.hu]
Sent: Thursday, November 03, 2016 10:23 PM
To: John Collier; fis
Subject: Re: [Fis] Is quantum information the basis of spacetime?

John:
The article describes the conflicting attitudes very well. It is interesting 
to see the diverse arguments together.
I agree, some think so, some do not. I am of the latter view, but this does 
not matter much.
Gyuri

On 2016.11.03. 19:52, John Collier wrote:
Apparently some physicists think so.

https://www.scientificamerican.com/article/tangled-up-in-spacetime/?WT.mc_id=SA_WR_20161102

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier




___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis





[Fis] What is Information? - Propagating Organization in the Biosphere, the Symbolosphere, the Technosphere and the Econosphere

2016-06-15 Thread Bob Logan
Dear FIS colleagues - I received three complimentary emails re my paper, 
Propagating Organization: An Enquiry, the paper I wrote with Stuart Kauffman 
and others and which I shared with the list. As a result  I thought some of you 
might be interested in the book I wrote based on that paper entitled What is 
Information? - Propagating Organization in the Biosphere, the Symbolosphere, 
the Technosphere and the Econosphere with a foreword written by Terry Deacon.

The ebook version of the book is available for free at demopublishing.com. 
Please feel free to have a look at it and grab a copy of it - Thanks - Bob Logan


__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications










___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] "Mechanical Information" in DNA

2016-06-08 Thread Bob Logan
Thanks to Moises for the mention of my paper with Stuart Kauffman. If anyone is 
interested in reading it one can find it at the following Web site:

https://www.academia.edu/783503/Propagating_organization_an_enquiry

Here is the abstract:

Propagating Organization: An Inquiry. 
Stuart Kauffman, Robert K. Logan, Robert Este, Randy Goebel, David Hobill and 
Ilya Shmulevich. 

2007. Biology and Philosophy 23: 27-45.

Abstract

Our aim in this article is to attempt to discuss propagating organization of 
process, a poorly articulated union of matter, energy, work, constraints and 
that vexed concept, “information”, which unite in far from equilibrium living 
physical systems. Our hope is to stimulate discussions by philosophers of 
biology and biologists to further clarify the concepts we discuss here. We 
place our discussion in the broad context of a “general biology”, properties 
that might well be found in life anywhere in the cosmos, freed from the 
specific examples of terrestrial life after 3.8 billion years of evolution. By 
placing the discussion in this wider, if still hypothetical, context, we also 
try to place in context some of the extant discussion of information as 
intimately related to DNA, RNA and protein transcription and translation 
processes. While characteristic of current terrestrial life, there are no 
compelling grounds to suppose the same mechanisms would be involved in any life 
form able to evolve by heritable variation and natural selection. In turn, this 
allows us to discuss at least briefly, the focus of much of the philosophy of 
biology on population genetics, which, of course, assumes DNA, RNA, proteins, 
and other features of terrestrial life. Presumably, evolution by natural 
selection – and perhaps self-organization - could occur on many worlds via 
different causal mechanisms.

Here we seek a non-reductionist explanation for the synthesis, accumulation, 
and propagation of information, work, and constraint, which we hope will 
provide some insight into both the biotic and abiotic universe, in terms of 
both molecular self reproduction and the basic work energy cycle where work is 
the constrained release of energy into a few degrees of freedom. The typical 
requirement for work itself is to construct those very constraints on the 
release of energy that then constitute further work. Information creation, we 
argue, arises in two ways: first information as natural selection assembling 
the very constraints on the release of energy that then constitutes work and 
the propagation of organization. Second, information in a more extended sense 
is “semiotic”, that is about the world or internal state of the organism and 
requires appropriate response. The idea is to combine ideas from biology, 
physics, and computer science, to formulate explanatory hypotheses on how 
information can be captured and rendered in the expected physical 
manifestation, which can then participate in the propagation of the 
organization of process in the expected biological work cycles to create the 
diversity in our observable biosphere.

Our conclusions, to date, of this enquiry suggest a foundation which views 
information as the construction of constraints, which, in their physical 
manifestation, partially underlie the processes of evolution to dynamically 
determine the fitness of organisms within the context of a biotic universe.








__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications










On Jun 8, 2016, at 4:40 PM, Moisés André Nisenbaum 
 wrote:

Hi, John. It is amazing!!
I would like to highlight the word "constraints" in the caption of the DNA 
diagram (http://phys.org/news/2016-06-layer-dna.html):
"The rigid base-pair model is forced, using 28 constraints (indicated by red 
spheres), into a left-handed superhelical path that mimics the DNA conformation 
in the nucleosome. Credit: Leiden Institute of Physics"
The same word is used by Bob Logan and Stuart Kauffman to relate mechanical 
concepts with 'information' (http://philpapers.org/rec/KAUPOA).
Could there be any parallel between these two approaches?

Also, one usually thinks of "DNA" as associated with the biological sciences, 
but this research was done at the Leiden Institute of Physics! Of course, to do 
current (complex, innovative) science you must have an interdisciplinary 
approach.

Best regards.

Moisés

2016-06-08 16:40 GMT-03:00 John Collier mailto:colli...@ukzn.ac.za>>:
A previously hypothesized “second layer” of information in DNA may have been 
isolated.

 

http://phys.org/news/2016-06-layer-dna.html 
<http://phys.org/news/2016-06-layer

Re: [Fis] NEW DISCUSSION SESSION: THE BIOLOGIC

2016-03-06 Thread Bob Logan
Please resend the attachment 


__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications










> On Mar 6, 2016, at 8:22 AM, PEDRO CLEMENTE MARIJUAN FERNANDEZ 
>  wrote:
> 
> Dear FISers and New Colleagues,
> 
> Louis H. Kauffman (Department of Mathematics, Statistics and Computer 
> Science, University of Illinois at Chicago) is going to start another short 
> discussion session, this time it will be focussed on the biologic. He will 
> concentrate on the relationships of formal systems and biology, "to consider 
> and reconsider philosophical and phenomenological points of view in relation 
> to natural science and mathematics."
> 
> If there is any trouble in his kickoff message with the attached presentation 
> (ineffable filters of Unizar server!) we would arbitrate some dropbox 
> solution, as we did in the previous session.
> 
> Best regards to all,
> 
> --Pedro
> fis list coordination
>  
> ___
> Fis mailing list
> Fis@listas.unizar.es 
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis 
> 
___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Miracles and Natural Order Fis Digest, Vol 23, Issue 24

2016-02-22 Thread Bob Logan
Dear Friends - the miracle is not that there is stuff but rather that the 
stuff of which we are made can ask the questions: why is there stuff and what 
does it mean? The fact that these questions cannot be answered does not matter 
because it is always the questions that are most important and not the 
answers. Warm regards and happy pondering - Bob Logan


__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications










> On Feb 22, 2016, at 3:07 PM, John Collier  wrote:
> 
> Dear fis list: <>
>  
> It is impossible to have a system that does not have some order in it. Even a 
> mathematically random collection has regions that are ordered. So order itself 
> is hardly miraculous, given that there is something. 
>  
> I agree with Gordana that the question of why there is something rather than 
> nothing is currently, and on any projection of current understanding, 
> inexplicable. Invoking God doesn’t help (why God, rather than nothing?). It 
> would be nice to be able to show that some being is necessary, but so far all 
> attempts (and there are many in history of thought) have failed.
>  
> John Collier
> Professor Emeritus and Senior Research Associate
> University of KwaZulu-Natal
> http://web.ncf.ca/collier <http://web.ncf.ca/collier>
>  
> From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Gordana 
> Dodig-Crnkovic
> Sent: Monday, 22 February 2016 12:53 PM
> To: l...@leydesdorff.net; 'Bruno Marchal'; 'fis Science'
> Subject: [Fis] Miracles and Natural Order Fis Digest, Vol 23, Issue 24
>  
> To me the miracle is not so much order, as it is relation, and thus as Loet 
> says “order is always constructed (by us)”-
> but the miracle is the very existence of anything (us, the rest of the 
> universe).
> Why there is something rather than nothing (that would be much simpler)?
> To me miracle is how it all started. From vacuum fluctuations? But where the 
> vacuum comes from?
> But then, why should we call it a miracle? 
> Perhaps the better name is just natural law, finally equally inexplicable and 
> given, but it sounds more general and less mystical.
>  
> Best,
> Gordana
>  
>  
> From: Fis  <mailto:fis-boun...@listas.unizar.es>> on behalf of Loet Leydesdorff 
> mailto:l...@leydesdorff.net>>
> Organization: University of Amsterdam
> Reply-To: "l...@leydesdorff.net <mailto:l...@leydesdorff.net>" 
> mailto:l...@leydesdorff.net>>
> Date: Monday 22 February 2016 at 20:36
> To: 'Bruno Marchal' mailto:marc...@ulb.ac.be>>, 'fis 
> Science' mailto:fis@listas.unizar.es>>
> Subject: Re: [Fis] Fis Digest, Vol 23, Issue 24
>  
> All worldviews begin in a miracle. No exceptions.
> 
>  
> I agree. Nevertheless, we should, and can, minimize the miracle. 
>  
> Why would one need a worldview? The whole assumption of an order as a Given 
> (in a Revelation) is religious. Order is always constructed (by us) and 
> can/needs to be explained. 
>  
> No “harmonia praestabilita”, but ex post. No endpoint omega. No cosmology, 
> but chaology.
>  
> With due respect for those of you who wish to hold on to religion or nature 
> as a given; however, vaguely defined.
>  
> Best,
> Loet
>  
> ___
> Fis mailing list
> Fis@listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] _ Pirate Bay of Science

2016-02-13 Thread Bob Logan
Dear FISers fyi - Bob




> Begin forwarded message:
> 
> 
> 
>> http://www.sciencealert.com/this-woman-has-illegally-uploaded-millions-of-journal-articles-in-an-attempt-to-open-up-science
>>  
>> 
>> 
>> Researcher illegally shares millions of science papers free online to spread 
>> knowledge
>> Welcome to the Pirate Bay of science.
>> FIONA MACDONALD, 12 FEB 2016
>> A researcher in Russia has made more than 48 million journal articles - 
>> almost every single peer-reviewed paper ever published - freely available 
>> online. And she's now refusing to shut the site down, despite a court 
>> injunction and a lawsuit from Elsevier, one of the world's biggest 
>> publishers.
>> 
>> For those of you who aren't already using it, the site in question is 
>> Sci-Hub, and it's sort of like a Pirate Bay of the science world. It was 
>> established in 2011 by neuroscientist Alexandra Elbakyan, who was frustrated 
>> that she couldn't afford to access the articles needed for her research, and 
>> it's since gone viral, with hundreds of thousands of papers being downloaded 
>> daily. But at the end of last year, the site was ordered to be taken down by 
>> a New York district court - a ruling that Elbakyan has decided to fight, 
>> triggering a debate over who really owns science. 
>> 
>> "Payment of $32 is just insane when you need to skim or read tens or 
>> hundreds of these papers to do research. I obtained these papers by pirating 
>> them," Elbakyan told Torrent Freak last year. "Everyone should have access 
>> to knowledge regardless of their income or affiliation. And that’s 
>> absolutely legal."
>> 
>> If it sounds like a modern day Robin Hood struggle, that's because it kinda 
>> is. But in this story, it's not just the poor who don't have access to 
>> scientific papers - journal subscriptions have become so expensive that 
>> leading universities such as Harvard and Cornell have admitted they can no 
>> longer afford them. Researchers have also taken a stand - with 15,000 
>> scientists vowing to boycott publisher Elsevier, in part for its excessive 
>> paywall fees.
>> 
>> Don't get us wrong, journal publishers have also done a whole lot of good - 
>> they've encouraged better research thanks to peer review, and before the 
>> Internet, they were crucial to the dissemination of knowledge.
>> 
>> But in recent years, more and more people are beginning to question whether 
>> they're still helping the progress of science. In fact, in some cases, the 
>> 'publish or perish' mentality is creating more problems than solutions, with 
>> a growing number of predatory publishers now charging researchers to have 
>> their work published - often without any proper peer review process or even 
>> editing.
>> 
>> "They feel pressured to do this," Elbakyan wrote in an open letter to the 
>> New York judge last year. "If a researcher wants to be recognised, make a 
>> career - he or she needs to have publications in such journals."
>> 
>> That's where Sci-Hub comes into the picture. The site works in two stages. 
>> First of all, when you search for a paper, Sci-Hub tries to immediately 
>> download it from fellow pirate database LibGen. If that doesn't work, 
>> Sci-Hub is able to bypass journal paywalls thanks to a range of access keys 
>> that have been donated by anonymous academics (thank you, science spies).
>> 
>> This means that Sci-Hub can instantly access any paper published by the big 
>> guys, including JSTOR, Springer, Sage, and Elsevier, and deliver it to you 
>> for free within seconds. The site then automatically sends a copy of that 
>> paper to LibGen, to help share the love.  
>> 
>> It's an ingenious system, as Simon Oxenham explains for Big Think:
>> 
>> "In one fell swoop, a network has been created that likely has a greater 
>> level of access to science than any individual university, or even 
>> government for that matter, anywhere in the world. Sci-Hub 
>> repres

Re: [Fis] Shouldn't we talk in FIS about LIGO and Gravitational Waves experimental measures and informational consequences?

2016-02-13 Thread Bob Logan
Hi Moises and FIS colleagues. The gravity wave that was detected last Sept 
informed us of the event of two black holes colliding 1.3 billion years ago - 
we learned how much energy was released, which was approximately 3 times the 
mass of the sun times c squared. We also learned that the event was very short 
but that the power (energy/time) of the event was 50 times the power of all 
the stars in the universe. Gravity waves can only provide info for large 
accelerations of mass because the gravitational force is so weak. So they 
cannot be used to send info from Earth-based agents to each other. My advice 
is stick to E&M radiation. :-) Bob
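The figures quoted above can be checked with back-of-the-envelope arithmetic. A minimal sketch - the ~3 solar masses is the number quoted in the message, while the ~20 ms burst duration is an assumed illustrative figure, not a quoted one:

```python
# Back-of-the-envelope check of the gravitational-wave figures quoted above.
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # mass of the sun, kg

# About 3 solar masses were converted to gravitational radiation: E = m * c^2
energy_j = 3 * M_SUN * C**2
print(f"Energy released: {energy_j:.2e} J")

# The burst lasted on the order of tens of milliseconds; assuming ~20 ms
# (an illustrative figure) gives the power as energy / time:
duration_s = 0.02
power_w = energy_j / duration_s
print(f"Power: {power_w:.2e} W")
```

The result is on the order of 10^47 J released and 10^49 W of power, which is the scale behind the "more powerful than all the stars" comparison.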


__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications










> On Feb 12, 2016, at 6:34 PM, Moisés André Nisenbaum 
>  wrote:
> 
> Dear FISers.
> 
> Yesterday LIGO (Laser Interferometer Gravitational-Wave Observatory) 
> announced, for the first time, the detection of gravitational waves, predicted 
> by Einstein 100 years ago.
> Wouldn't gravitational waves be a good discussion topic for FIS?
> We all know that regular waves can carry information. Can gravitational waves 
> carry information? Which kind?
> Panelists (I saw some videos) say that "gravitational waves carry information 
> about astronomical phenomena never before observed by humans".
> The question is: is it possible to "modulate" some information into 
> gravitational waves?
> I found 17 articles at Scopus containing "information" and "gravitational 
> waves" in the title, so I think people are working on this...
> I think it is a very important moment in the history of science and it is 
> closely related to 'information'.
> 
> Surely we have colleagues (specialists) in FIS who can share interesting 
> thoughts about this theme.
> I talked with Pedro about it, and he agreed we may have a discussion about it 
> after the two that are scheduled.
> So, this message is not for starting a discussion. It is only to register 
> this fabulous new way that we now have to 'listen' to the universe.
> 
> Links:
> http://www.ligo.org/news/media-advisory.php
> https://ligo.caltech.edu/
> https://www.advancedligo.mit.edu/
> http://www.ligo.org/
> https://www.ligo.org/partners.php
> 
> Um abraço!
> Moisés
> 
> -- 
> Moisés André Nisenbaum
> Doutorando IBICT/UFRJ. Professor. Msc.
> Instituto Federal do Rio de Janeiro - IFRJ
> Campus Rio de Janeiro
> moises.nisenb...@ifrj.edu.br 
> ___
> Fis mailing list
> Fis@listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis



[Fis] _ Re: _ Re: _ Closing lecture

2016-02-02 Thread Bob Logan
Krassimir - I enjoyed your post and your definition of information. For more 
definitions of information, especially the notion of the relativity of 
information, you might wish to see my book 
What is Information? - Propagating Organization in the Biosphere, the 
Symbolosphere, the Technosphere and the Econosphere, which is available for free 
at the following websites:

demopublishing.com or demopublishing.com/book/what-is-information

Since I am offering this book for free, this shameless promotion of my book is 
not done for commercial gain, although I will mention that the book is also 
available in a printed codex format through Amazon.


Here is the excerpt on the relativity of information from What Is Information?, 
for those who might be interested in this idea.

The Relativity of Information

Robert M. Losee (1997), in an article entitled "A Discipline Independent 
Definition of Information" published in the Journal of the American Society for 
Information Science, defines information as follows:

Information may be defined as the characteristics of the output of a process, 
these being informative about the process and the input. This discipline 
independent definition may be applied to all domains, from physics to 
epistemology.
The term information, as the above definition seems to suggest, is generally 
regarded as some uniform quantity or quality, which is the same for all the 
domains and phenomena it describes. In other words, information is an invariant 
like the speed of light, the same in all frames of reference. The origin of the 
term information, or the actual meaning of the concept, is all taken for granted. 
If ever pressed on the issue, most contemporary IT experts or philosophers will 
revert to Shannon's definition of information. Some might also come up 
with Bateson's definition that information is the difference that makes a 
difference. Most would not be aware that the Shannon and Bateson definitions of 
information are at odds with each other. Shannon information does not make a 
difference because it has nothing to do with meaning; it is merely a string of 
symbols or bits. On the other hand, Bateson information, which as we discovered 
should more accurately be called MacKay information, is all about meaning. And 
thus we arrive at our second surprise, namely the relativity of information. 
Information is not an invariant like the speed of light, but depends on the 
frame of reference or context in which it is used.
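The claim that Shannon's measure is indifferent to meaning can be seen in a small sketch (Python; illustrative only): a sentence and a random shuffle of its characters carry exactly the same Shannon entropy, even though the shuffle destroys the meaning entirely.

```python
import random
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """Per-symbol Shannon entropy H = -sum(p * log2(p)) over character frequencies."""
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in Counter(text).values())

meaningful = "the quick brown fox jumps over the lazy dog"
scrambled = "".join(random.sample(meaningful, len(meaningful)))

# Shuffling preserves the symbol statistics, so the Shannon measure
# cannot distinguish the meaningful string from the meaningless one.
print(round(shannon_entropy(meaningful), 6) == round(shannon_entropy(scrambled), 6))  # True
```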

We discovered in our review of POE that Shannon information and biotic or 
instructional information are quite different. Information is not an absolute 
but depends on the context in which it is being used. Shannon information is 
a perfectly useful tool for telecommunication channel engineering. Kolmogorov 
(Shiryayev 1993) information, which is defined as the minimum computational 
resources needed to describe a program or a text and is related to Shannon 
information, is useful for the study of information compression with respect to 
Turing machines. Biotic or instructional information, on the other hand, is not 
equivalent to Shannon or Kolmogorov information and, as has been shown in POE, is 
the only way to describe the interaction and evolution of biological systems 
and the propagation of their organization.

POE refers to the following paper:   Kauffman, Stuart, Robert K. Logan, Robert 
Este, Randy Goebel, David Hobill and Ilya Shmulevich. 2007. Propagating 
organization: an enquiry. Biology and Philosophy 23: 27-45.
 

> On Feb 2, 2016, at 6:44 AM, Krassimir Markov  wrote:
> 
> Dear Howard,
>  
> Thank you very much for your great effort and nice explanation!
> I like it!
>  
> Only what I needed to see is a concrete answer to the question “what it the 
> Information?”
> You absolutely clearly described it and I totally agree with your 
> considerations.
> Only what is needed is to conclude with a short definition.
> I think it may be the next:
>  
> The Information is a reflection which may be interpreted by its receiver in 
> the context the receiver has in his/her memory.
>  
> From this definition many consequences follow. In future we may discuss them.
>  
> Friendly regards
> Krassimir
>  
> PS:
> Dear FIS Colleagues,
> 1. At the ITHEA web side, the conferences for year 2016 have been announced.
> One of them is the XIV-th International Conference on “General Information 
> Theory”.
> Please visit link:
> http://www.ithea.org/conferences/conferences.html 
> 
> Welcome in Varna, Bulgaria !
>  
> 2. May be it will be interesting to read the paper, published in our
> International Journal “I

[Fis] _ Re: _ RE: Cho 2016 The social life of quarks

2016-01-21 Thread Bob Logan
Dear Colleagues - Using the terms communication and information to talk about 
the interaction of quarks, and all non-sentient particles for that matter, is 
strictly metaphoric, and as a teacher of the poetry of physics how could I 
object? I do not, but at the same time I want to invoke my relativity of 
information principle so that we are clear about the terms communication and 
information as applied to quarks and other non-sentient particles, as opposed to 
the sentient folks who can read this post, unlike any quark anywhere in the 
universe. In a paper I co-authored with Stuart Kauffman and others, including R. 
Este, R. Goebel, D. Hobill and I. Shmulevich, entitled Propagating 
Organization: An Enquiry, we invoked the principle of the relativity of 
information, namely that the word information refers to many different 
phenomena and that its meaning depends on the context in which the word is 
used. 

best wishes - Bob Logan 

PS Here is the abstract:

Abstract

Our aim in this article is to attempt to discuss propagating organization of 
process, a poorly articulated union of matter, energy, work, constraints and 
that vexed concept, “information”, which unite in far from equilibrium living 
physical systems. Our hope is to stimulate discussions by philosophers of 
biology and biologists to further clarify the concepts we discuss here. We 
place our discussion in the broad context of a “general biology”, properties 
that might well be found in life anywhere in the cosmos, freed from the 
specific examples of terrestrial life after 3.8 billion years of evolution. By 
placing the discussion in this wider, if still hypothetical, context, we also 
try to place in context some of the extant discussion of information as 
intimately related to DNA, RNA and protein transcription and translation 
processes. While characteristic of current terrestrial life, there are no 
compelling grounds to suppose the same mechanisms would be involved in any life 
form able to evolve by heritable variation and natural selection. In turn, this 
allows us to discuss at least briefly, the focus of much of the philosophy of 
biology on population genetics, which, of course, assumes DNA, RNA, proteins, 
and other features of terrestrial life. Presumably, evolution by natural 
selection – and perhaps self-organization - could occur on many worlds via 
different causal mechanisms.

Here we seek a non-reductionist explanation for the synthesis, accumulation, 
and propagation of information, work, and constraint, which we hope will 
provide some insight into both the biotic and abiotic universe, in terms of 
both molecular self reproduction and the basic work energy cycle where work is 
the constrained release of energy into a few degrees of freedom. The typical 
requirement for work itself is to construct those very constraints on the 
release of energy that then constitute further work. Information creation, we 
argue, arises in two ways: first information as natural selection assembling 
the very constraints on the release of energy that then constitutes work and 
the propagation of organization. Second, information in a more extended sense 
is “semiotic”, that is about the world or internal state of the organism and 
requires appropriate response. The idea is to combine ideas from biology, 
physics, and computer science, to formulate explanatory hypotheses on how 
information can be captured and rendered in the expected physical 
manifestation, which can then participate in the propagation of the 
organization of process in the expected biological work cycles to create the 
diversity in our observable biosphere.

Our conclusions, to date, of this enquiry suggest a foundation which views 
information as the construction of constraints, which, in their physical 
manifestation, partially underlie the processes of evolution to dynamically 
determine the fitness of organisms within the context of a biotic universe.

The whole article is available at: 
https://www.academia.edu/783503/Propagating_organization_an_enquiry


And here is the section on the relativity of information:

Section 4. The Relativity of Information

 

In Section 2 we have argued that the Shannon conception of information is not 
directly suited to describe the information of autonomous agents that propagate 
their organization. In Section 3 we have defined a new form of information, 
instructional or biotic information, as the constraints that direct the flow of 
free energy to do work.

 

The reader may legitimately ask the question “isn’t information just 
information?”, i.e., an invariant like the speed of light. Our response to this 
question is no, and to then clarify what seems arbitrary about the definition 
of information. Instructional or biotic information is a useful definition for 
biotic systems just as Shannon information was useful for telecommunication 
channel engineering, and Kolmogorov (Shiryayev 1993) information was useful for 
the study of

Re: [Fis] Cho 2016 The social life of quarks

2016-01-16 Thread Bob Logan
Stan et al - you honour me by asking the question. We know that matter (and 
here I do not include dark matter or dark energy) is made up of a small number 
of quarks and gluons. As we go to higher and higher energies we will continue to 
create these "freaks of nature" - freaks in the sense that we create the 
conditions for them to come into existence using our high-energy colliders. I 
am sure they occur naturally in stars from time to time, but they do 
not have any long-term effects - they are a passing fancy, a novelty, and an 
amusing one at that. Perhaps they will help us understand the quark-gluon 
interaction. The analogy I drew with the transition from prokaryotes to 
eukaryotes that I sent to Malcolm was my indulging in science-based poetry. 
BTW, since 1971 I have taught an undergrad course called the Poetry of Physics 
(also the title of a book available on Amazon) to teach physics to humanities 
students without using math, to promote science literacy among humanists.

Another analogy that came to mind was that of the proliferation of nucleic acids 
made up of the same four bases: C, G, A, and T. They are the quarks of biology, 
and their chemical bonds the gluons.  

Metaphorically yours - Bob Logan

On 2016-01-16, at 10:33 AM, Stanley N Salthe wrote:

> One way to complicate anything is to smash it into bits.  I wonder, Bob, if 
> you would comment on this point as a former particle physicist!
> 
> STAN
> 
> On Fri, Jan 15, 2016 at 11:13 PM, Malcolm Dean  wrote:
> Yes. I don't know enough about Biology, but I noticed the 3+2 business some 
> time ago. I'm automatically suspicious of theories which are "vexingly 
> complex" (QCD) and only "beautiful" (String Theory) to a few people with 
> certain math backgrounds. But the Two and the Three have been important to 
> humans for thousands of years. I think Nature is actually very simple, but we 
> get overwhelmed and confused by its enormous scales and by our attempts to 
> manage observation by (necessarily) creating over-simplified Objects.
> 
> M.
> 
> Malcolm Dean
> Member, Higher Cognitive Affinity Group, BRI
> Research Affiliate, Human Complex Systems, UCLA
> Member, BAFTA/LA
> On Google Scholar
> 
> On Fri, Jan 15, 2016 at 6:47 PM, Bob Logan  wrote:
> The eukaryote came about by two prokaryotes joining together, and the 5-quark 
> combo can be thought of as a nucleon (3 quarks) and a meson (2 quarks) 
> combining, and the 4-quark state as 2 mesons combining. By this logic perhaps 
> there will be a 6-quark beast if 2 nucleons combine.
> 
> 
> On 2016-01-15, at 4:17 PM, Malcolm Dean wrote:
> 
>> Could you specify the relata?
>> 
>> Malcolm
>> 
>> On Jan 15, 2016 5:31 AM, "Bob Logan"  wrote:
>> Hi Malcolm - thanks for this article that supports my notion that my former 
>> field of particle physics is becoming like biology. The 4 and 5 quark combos 
>> represent an analogy of the transition in biology from prokaryotes to 
>> eukaryotes. :-) - Bob
>> 
>> 
>> On 2016-01-14, at 7:39 PM, Malcolm Dean wrote:
>> 
>>> http://science.sciencemag.org/content/351/6270/217.summary
>>> Science 351(6270):217-219, 15 January 2016; DOI: 
>>> 10.1126/science.351.6270.217
>>> The social life of quarks
>>> Adrian Cho
>>> 
>>> Particle physicists at Europe's CERN laboratory in Switzerland say they 
>>> have observed bizarre new cousins of the protons and neutrons that make up 
>>> the atomic nucleus. Protons and neutrons consist of other particles called 
>>> quarks, bound by the strong nuclear force. By smashing particles at high 
>>> energies, physicists have blasted into fleeting existence hundreds of other 
>>> quark-containing particles. Until recently, all contained either two or 
>>> three quarks. But since 2014, researchers working with CERN's Large Hadron 
>>> Collider have also spotted four- and five-quark particles. Such tetraquarks 
>>> and pentaquarks could require physicists to rethink their understanding of 
>>> quantum chromodynamics, or they could have less revolutionary implications. 
>>> Researchers hope that computer simulations and more collider studies will 
>>> reveal how the oddball newcomers are put together, but some wonder whether 
>>> experiments will ever provide a definitive answer.
>>> 
>>> ...
>> 
>> 
>> 
>> 
> 
> 
> 



Re: [Fis] Shannon-Weavers' Levels A, B, C.

2015-10-15 Thread Bob Logan
Dear Loet - I agree with your remarks: "Bob and his colleagues define 
information (2008; p. 28) as “natural selection assembling the very constraints 
on the release of energy that then constitutes work and the propagation of 
organization.” This may have meaning in a biological framework, in which 
selection is considered “natural” resulting in organization(s). In the cultural 
domain, organization (of meaning) remains constructed and contingent; selection 
is never “natural”, but based on codified expectations."

Our definition that you kindly referred to was meant strictly for the biotic 
domain, and we contrasted it with symbolic information.

Thanks for the mention of our work and best wishes - Bob


On 2015-10-15, at 8:38 AM, Loet Leydesdorff wrote:

> Dear Marcus, Mark, Bob, and colleagues,
>  
> My ambition was a bit more modest: the paper does not contain a theory of 
> meaning or a theory of everything. It is an attempt to solve a problem in the 
> relation between sociology (i.c. Luhmann) focusing on meaning processing (and 
> autopoiesis) and (Shannon-type) information theory. Luhmann left this problem 
> behind by defining information as a selection, while in my opinion entropy is 
> a measure of diversity and therefore variation. I was very happy to find the 
> clues in Weaver’s contributions; Katherine Hayles has signaled this 
> previously.
>  
> Another author important in the background is Herbert Simon who specified the 
> model of vertical differentiation (1973), but without having Maturana & 
> Varela’s theory of autopoiesis for specification of the dynamics. I agree 
> with Luhmann that one has to incorporate ideas from Husserl about horizons of 
> meaning and Parsons’ symbolically generalized media as structuring these 
> horizons for understanding the differentia specifica of the social as 
> non-biological.
>  
> Mark more or less answers his own questions, don’t you? The constraints of 
> the body provide the contingency. The options are not given, but constructed 
> and need thus to be perceived, either by individuals or at the organizational 
> (that is, social) level. The contingency also positions (as different from 
> others) with whom we can then entertain “double contingencies” as the basis 
> for generating variation in the communication. How this works and feeds back 
> on the persons involved seems to me the subject of other disciplines like 
> psychology and neurology. The subject of study is then no longer (or no 
> longer exclusively) res cogitans.
>  
> For example, if a deaf person is provided with a cochlear implant, s/he may 
> enter other domains of perception and be able to provide other contributions 
> to the communication. The double contingencies between him/her and others can 
> be expected to change.
>  
> Bob and his colleagues define information (2008; p. 28) as “natural selection 
> assembling the very constraints on the release of energy that then 
> constitutes work and the propagation of organization.” This may have meaning 
> in a biological framework, in which selection is considered “natural” 
> resulting in organization(s). In the cultural domain, organization (of 
> meaning) remains constructed and contingent; selection is never “natural”, 
> but based on codified expectations. The codes steer the system from above. 
> Differently from biological and engineered systems, this next-order level 
> does not have to be slower than the systems level (Simon). Expectations can 
> proliferate intersubjectively at higher speeds than we can follow. For 
> example, we have to catch up with the literature. Stock exchanges operate 
> faster than local markets because of the more sophisticated codes that 
> mediate the financial exchanges.
>  
> Maturana (1978, at p. 56) introduced the biologist as super-observer who does 
> not participate in the biological phenomena under study, but constructs them: 
> “Thus, talking human beings dwell in two non-intersecting phenomenal 
> domains.” (italic added). Systems which operate exclusively in terms of 
> expectations and anticipations of future states cannot be found in nature; 
> they can only be considered reflexively. They allow us to de- and reconstruct 
> in terms of improving the models, and thus sometimes find new options for 
> technological intervention. Paradoxically, biology as a science is itself 
> part of this cultural domain. For example, we have access to our body only in 
> terms of perceptions (that are steered by expectations) and at the other end 
> by knowledge-based interventions.
>  
> This is my second posting for this week.
>  
> Best,
> Loet
>  
> Loet Leydesdorff
> Professor Emeritus, University

Re: [Fis] Shannon-Weavers' Levels A, B, C.

2015-10-14 Thread Bob Logan
The list is far from complete - organization is a form of information, and so is 
DNA; see Kauffman, Logan et al., Propagating Organization: An Enquiry, at 
www.academia.edu/783503/Propagating_organization_an_enquiry

Regards to all my FIS friends - Bob


On 2015-10-14, at 12:38 PM, Marcus Abundis wrote:

> RE Mark Johnson's post of Thu Oct 1 09:47:13 on Bateson and imagination . . .
>  – Me Too!
> 
> RE Loet & Stan's postings beginning Thu Oct 1 21:19:50 . . . 
> >  I suggest to distinguish between three levels (following Weaver): <
> > A. (Shannon-type) information processing ; <
> > B. meaning sharing using languages;<
> > C. translations among coded communications.<
> > So, here we have a subsumptive hierarchy"<
> 
> I was wondering if this note means to imply an *all inclusive* list of traits 
> to be considered in modeling information? Or, alternatively . . . what would 
> such an all inclusive list look like?
> 
> Thanks!
> 
>  
> 
> Marcus Abundis
> about.me/marcus.abundis
> 
>   
> 



Re: [Fis] FIS newcomer

2015-06-24 Thread Bob Logan
Thanks to Moises Andre Nisenbaum I became acquainted with the very first person 
to write a computer program, namely Ada Lovelace. Her opinion, which I also hold 
and which seems to represent Bruno's opinion as well, can be found in a quote of 
hers in the Wikipedia article about her. Here is an excerpt from that article, 
which I share for your edification and amusement. All the best - Bob

The notes are longer than the memoir itself and include (in Section G[57]), in 
complete detail, a method for calculating a sequence of Bernoulli numbers with 
the Engine, which would have run correctly had the Analytical Engine been built 
(only his Difference Engine has been built, completed in London in 2002).[58] 
Based on this work, Lovelace is now widely considered the first computer 
programmer[1] and her method is recognised as the world's first computer 
program.[59]
Section G also contains Ada Lovelace's famous dismissal of artificial 
intelligence. She wrote that "The Analytical Engine has no pretensions whatever 
to originate anything. It can do whatever we know how to order it to perform. 
It can follow analysis; but it has no power of anticipating any analytical 
relations or truths."
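Lovelace's Note G computed Bernoulli numbers on the Analytical Engine. As a modern sketch of the same quantity (using the standard binomial recurrence, not Lovelace's actual sequence of Engine operations):

```python
from fractions import Fraction
from math import comb

def bernoulli(n: int) -> Fraction:
    """Bernoulli number B_n via the standard recurrence
    sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1, with B_0 = 1."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        B.append(-Fraction(sum(comb(m + 1, j) * B[j] for j in range(m)), m + 1))
    return B[n]

# B_1 = -1/2, B_2 = 1/6, B_4 = -1/30; odd-index values beyond B_1 are zero.
print(bernoulli(2))  # 1/6
```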
On 2015-06-24, at 8:26 AM, Bruno Marchal wrote:

> 
> On 23 Jun 2015, at 07:13, Emanuel Diamant wrote:
> 
>> My dear FIS-friends,
>>  
>> I apologize for not withstanding the pace of our discussion – you are 
>> already busy with the problem of “meaning” (Steven) and I am still preparing 
>> to answer Howard’s letter about linguistic biology…
>>  
>> Dear Howard,
>>  
>> Thank you for your suggestion to “add yet one more approach to the list: 
>> linguistic biology”. Unfortunately, I cannot accept it – because it is 
>> redundant and tautological.
>>  
>> My definition of information is: Information is a linguistic description of 
>> structures observable in a given data set. (I apologize for not providing 
>> any arguments justifying this statement. Interested people can go to 
>> my old papers in arXiv, Research Gate or on my web site 
>> http://www.vidia-mant.info.)
>>  
>> For the reasons provided just above (and elsewhere), any use of the term 
>> “Cognitive” implies the use of the term “information” (information 
>> processing) and, thus, already contains linguistic descriptions of data 
>> structures in a given data set (in a given object). Therefore, strengthening 
>> Cognitive biology with Linguistic biology is simply a tautology.
>>  
>> I also cannot accept the allusion to the Guenther Witzany’s work (as an 
>> attempt to justify the backup of Linguistic biology). Meanwhile, Witzany 
>> himself illuminates the issue in his response to Jerry Chandler (20.06.2015). 
>> I myself was enlightened about the subject by a 2004 paper of Eshel 
>> Ben-Jacob (et al) “Bacterial linguistic communication and social 
>> intelligence”, Trends in Microbiology, vol. 12, no. 8, pp. 366-372, August 
>> 2004. I have cited it in my 2009 paper “Some considerations on how the human 
>> brain must be arranged in order to make its replication in a thinking 
>> machine possible”, (.arXiv:1002.0184 [pdf]). I do not want to spend much 
>> more time on this issue and to draw our discussion farther in this direction.
>>  
>> Finally, I do not agree either with your statement that “each approach uses 
>> a helpful metaphor”. Brain as a computer metaphor (dominating in the past 
>> century) has exhausted its life cycle, “Computational” approach today is a 
>> harmful and a dangerous relict. It would be wise not to galvanize it again.
> 
> 
> People should not confuse the computationalist thesis in cognitive science, 
> and the use of this or that type of machine as a metaphor to understand the 
> brain functioning.
> 
> In fact those two things oppose themselves. It can be shown that IF we are 
> machine, then we cannot know for sure which machine we are, nor which 
> computations support us. Information arise from our statistical distribution 
> in the infinitely many computations (already present in a tiny segment of 
> arithmetic). The math shows that he quantum appearances are justified from 
> that computationalist hypothesis.
> 
> In fact computationalism appears to be a vaccine against all reductionistic 
> metaphors. The ideally correct machine, like Peano Arithmetic or 
> Zermelo-Fraenkel set theory, already know that their soul is not a machine, 
> when translating Theaetetus definition of the soul (the knower) with Gödel 
> technic (as I did).
> 
> To believe in the negation of computationalism means to believe in some magic 
> or in some special actual infinities playing some rôle in the brain. 
> 
> Many people still believe that mechanism and materiali

Re: [Fis] IS4IS Vienna 2015

2015-06-11 Thread Bob Logan
To those who attended the IS4IS conference in Vienna: my apologies for missing 
my talk and the cruise on the Danube. I had to leave early Sunday morning for 
Romania because my wife's mother had a stroke, and I therefore missed giving my 
presentation. I never had a chance to say goodbye to the many new friends I made 
in Vienna. All the best to each of you. And for those who did not make it to 
this fabulous congress, there is Gothenburg, Sweden in 2017, to be organized by 
Gordana. I highly recommend you attend. With kind regards - Bob
On 2015-06-09, at 7:14 PM, Pedro C. Marijuan wrote:

> Dear FIS Colleagues,
> 
> The IS4IS (& FIS) Conference in Vienna is just over. It has been a nice 
> occasion of sociality, an opportunity to meet in person discussants we had been 
> in contact with many years ago -- more than three or four dozen FIS parties were 
> there, including a number of colleagues from China. Several hundred people 
> attended beyond our own circle, and that represented a very good 
> opportunity to enlarge the discussions and the contacts. In the next discussion 
> sessions some of the new parties will bring their original themes to our 
> list. For the time being we could continue the discussion we were maintaining 
> on the Informational Fourth Domain of Science, as Loet and Moises did the 
> other day.
> 
> Thanks are due to Wolfgang for his excellent organization job, including an 
> unforgettable cruise up the Danube river. Thanks to Robert for his 
> Secretariat job, and to other IS4IS members too; and best greetings to 
> Gordana as the new Acting President of IS4IS.
> 
> best wishes--Pedro
> 
> -- 
> -
> Pedro C. Marijuán
> Grupo de Bioinformación / Bioinformation Group
> Instituto Aragonés de Ciencias de la Salud
> Centro de Investigación Biomédica de Aragón (CIBA)
> Avda. San Juan Bosco, 13, planta X
> 50009 Zaragoza, Spain
> Tfno. +34 976 71 3526 (& 6818)
> pcmarijuan.i...@aragon.es
> http://sites.google.com/site/pedrocmarijuan/
> -
> 



Re: [Fis] THE FOURTH GREAT DOMAIN OF SCIENCE: INFORMATIONAL?

2015-05-24 Thread Bob Logan
I believe Moises meant this email for Pedro and all of FIS, so I am copying you 
all on my reply to Moises.

On 2015-05-24, at 7:17 AM, Moisés André Nisenbaum wrote:

> Hi, Pedro, Bob and FISers.
> It is interesting that the original post led us to a variety of very 
> important subjects. Thank you, Pedro and Bob, for resuming, replying and 
> sending more ideas about those subjects.
> I understand that one of the greatest jobs of Information Science is to study 
> how science was organized and how scientists communicate, historically since 
> the first paper was published in Philosophical Transactions in 1666. With the 
> advent of the Information Society, this organization of science is changing. 
> Because of the huge number of disciplines, inter- and transdisciplinarity have 
> become more and more important. In my opinion, Bob’s idea of “Scientific 
> Undisciplinarity” can be the starting point of interdisciplinarity. However, I 
> believe what Japiassu (a great Brazilian philosopher) said: that 
> interdisciplinarity is impossible without disciplinarity. 

This was my point too when I wrote: "Now I am not saying that learning a 
discipline is a bad thing as it provides solid training and an understanding 
of how a set of principles describes certain phenomena. It is a model of how a 
scientific, scholarly or artistic practice can be carried out. As long as one 
does not become a disciple of one's discipline or disciplines, they can be very 
useful for creating a new discipline or going beyond one's discipline."
> 
> Returning to the Four Great Domains, it is important to understand that it is 
> a “model” that we are using to understand this new way of organizing Science 
> and scientific communication. Like all models, this approach has 
> advantages but also limitations that we must recognize and deal with. For 
> example, in his model, Rosenbloom proposes that disciplines in “Humanities 
> are part of a broad conception of Social Sciences great scientific domain” 
> (which is a big limitation). 
Good point
> 
> To make my idea clear, here are my core questions: 
> 1) Can the scientific disciplines be represented by a combination of four 
> Great Scientific Domains?
Science that is value-free can be represented by a combination of four Great 
Scientific Domains, but we need science with values - what good is knowledge if 
it is not put to good use to benefit humankind?
The four great science domains are not enough - they give us knowledge, but we 
also need wisdom and hence humanistic studies. 

> 2) Is the Informational the fourth Great Scientific Domain? Informational or 
> computing does not matter - they are similar: you cannot do information 
> without computing and similarly you cannot do computing without information 
> - and why choose? Why not five Great Scientific Domains and a few humanistic 
> ones as well?
> 3) Is the choice of the great domains arbitrary? YES
> 
> The third question can be thought of as an analogy (to be verified). The idea is 
> that disciplines in domains can be analogous to "events" in space-time and 
> thus can have a graphic representation (not scientometric) and some 
> symmetries (coordinate transformations, for example).
> 
> My goal is to try to verify these questions empirically and I believe that 
> analysis of maps of science, as developed by Loet, can be a good approach.

Yes, a good approach, but you need to do more than classify - we need to 
synthesize science with values and with human-centric concerns.
> 
> In Brazil, we send “hugs” (“abraços” in Portuguese) at the end of messages. 
> So,
> 
> Abraços
> Moisés.

Re-abracos and trans-abracos a todos/tutti/all - Bob
> 
> Reference:
> JAPIASSU, Hilton. Interdisciplinaridade e patologia do saber. Rio de Janeiro, 
> Imago, 1976.
> 
> -- 
> Moisés André Nisenbaum
> Doutorando IBICT/UFRJ. Professor. Msc.
> Instituto Federal do Rio de Janeiro - IFRJ
> Campus Maracanã
> moises.nisenb...@ifrj.edu.br



[Fis] THE FOURTH GREAT DOMAIN OF SCIENCE: INFORMATIONAL? - What is a discipline?

2015-05-23 Thread Bob Logan
Dear Colleagues - I have been reading the posts in this thread and enjoying the 
conversation. I started playing with the notion of discipline and came up with 
these undisciplined playful thoughts, which I believe provide an interesting, or 
at least an alternative, perspective on the notion of a discipline. A discipline 
is a tool, a way of organizing ideas that result from scientific inquiry or any 
other form of scholarly activity, and even artistic activity. Now every tool 
provides both service and disservice. All of the posts so far have dealt with 
the service of discipline. Here are some thoughts about the possible disservice 
of discipline. Please take the following with a grain of salt. I believe the 
notion of a discipline is antithetical to scientific inquiry in the sense 
that it confines one's thinking to the boundaries of a discipline. One should not 
be disciplined by a discipline but be free to go beyond the boundaries of that 
discipline. Note that the root of the word discipline is disciple. If one is to 
be free to explore new ideas and new phenomena, one should not be a disciple of 
the scientists or thinkers that created a discipline. Now I am not saying that 
learning a discipline is a bad thing as it provides solid training and an 
understanding of how a set of principles describes certain phenomena. It is a 
model of how a scientific, scholarly or artistic practice can be carried out. 
As long as one does not become a disciple of one's discipline or disciplines, 
they can be very useful for creating a new discipline or going beyond one's 
discipline. Perhaps the notion of trans-disciplinary is not such a bad notion 
if one thinks of trans as beyond. 

As to the notion that there are these four super-categories of disciplines or 
great domains of science - physics, biology, social science, and the 4th domain, 
which is computing or information depending on how one likes to classify things - 
here are some thoughts. I find these classification schemes and their 
inter-relations fascinating and useful. But I believe another challenge worthy 
of consideration is the interaction of the great domains of science with the 
great domains of the humanities, ethics, and the arts. How do we connect the 
knowledge of the sciences with the wisdom of how to best use that knowledge to 
benefit humankind? Here are some thoughts I developed before this thread began 
that might be pertinent to our current discussion. Science can be thought of as 
organized knowledge, given that etymologically the word science derives from 
the Latin to know: 
en.wiktionary.org/wiki/science
From Old French science, from Latin scientia (“knowledge”), from 
sciens, the present participle stem of scire (“know”).

Data, Information, Knowledge, Wisdom: The relationship of data, information, 
knowledge and wisdom

“Where is the wisdom we have lost in knowledge?

Where is the knowledge we have lost in information?” – TS Eliot

“Where is the meaning we have lost in information?” – RK Logan

“• Data are the pure and simple facts without any particular structure or 
organization, the basic atoms of information,

• Information is structured data, which adds meaning to the data and gives it 
context and significance,

• Knowledge is the ability to use information strategically to achieve one's 
objectives, and

• Wisdom is the capacity to choose objectives consistent with one's values and 
within a larger social context (Logan 2014).”

While checking out the etymology of science I encountered the following on 
http://www.luminousgroup.net/2013/05/on-etymology-of-science.html

“This might be a good time to examine the etymology of the word science. It 
comes from the Latin scientia, from sciens, which means having knowledge, from 
the present participle of scire, meaning to know, probably—and here's where it 
gets exciting—akin to the Sanskrit Chyati, meaning he cuts off, and Latin 
scindere, to split, cleave.”

Science operates by cutting off questions of value. And this is why I advocate 
considering the four great domains of science together with the great domain of 
the humanities, the arts and ethics. The greatest challenges facing humanity are 
not just increasing our store of knowledge through science but also choosing how 
to deploy our scientific knowledge in the best interest of humankind. 

So ends my challenge to Moises Nisenbaum and Ken Herold, with thanks for 
stimulating this interesting conversation.

with kind regards -Bob
__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications


Re: [Fis] Concluding the Lecture

2015-02-19 Thread Bob Logan
I join Pedro in thanking all of the participants of the Deacon conversation - I 
enjoyed it and would love to receive more comments offline - Bob
On 2015-02-16, at 8:02 AM, Pedro C. Marijuan wrote:

> Dear FIS Colleagues,
> 
> February is well advanced, and it is time to put a definite end to the New 
> Year Lecture. We have had a very interesting discussion time, though rather 
> silent in these final days. It was nice counting on Terry's chairmanship; he 
> has had very hard work with all those responses--thanks a lot to him. 
> Thanks are also due to Bob Logan for his involvement in organizing the 
> Lecture and to the Pirates thought collective for their participation. 
> In the next session, in ten days or so, we will approach the global 
> phenomenon of intelligence: artificial and natural, rational and emotional, 
> scientific and artistic, East and West... let us wait and see.
> 
> Also, hearing from Ken Herold on Library Science and from Moises Andre on 
> interaction between disciplines was sort of a nice surprise: the connection 
> between the traditional approach and the new one becomes a highly strategic 
> goal for information science. Maybe it is one of the topics we have to 
> address in a specific discussion session. Moises, Ken--does it sound 
> interesting?
> 
> Best wishes to all
> 
> --Pedro
> 
> 


Re: [Fis] Concluding the Lecture? - In Praise of Teleodynamics

2015-02-05 Thread Bob Logan
sms that 
evolve and reproduce themselves. This idea is extended in two ways. First by 
showing that organization, science, and economics can also be construed as 
organisms that evolve and reproduce themselves. Secondly we make use of the 
notion of teleodynamics that Deacon (2012) introduced and developed in his book 
Incomplete Nature to explain the nature of life, sentience, mind and a self 
that acts in its own interest. It is suggested that culture, language, 
organization, science, economics and technology (CLOSET) like living organisms 
also act in their own self-interest, are self-correcting and are to a certain 
degree autonomous even though they are obligate symbionts dependent on their 
human hosts for the energy that sustains them. Specifically it will be argued 
that the individual members of CLOSET are essentially teleodynamic systems, 
which Deacon defines as “self-creating, self-maintaining, self-reproducing 
individuated systems (ibid, 325).”  

I hope some of my FIS colleagues will be interested in these two articles and 
will provide me with some feedback.

Best wishes - Bob Logan


On 2015-02-04, at 6:33 AM, Pedro C. Marijuan wrote:

> No problem Bob, we can prolong the NY Lecture some extra days. My concern was 
> the overload that these final messages --more intense and argumentative-- 
> could be causing on Terry's time budget. It is up to him whether he wants to 
> continue responding in the current regime, for instance until February the 
> 15th (that means 12 extra days), or whether he prefers to finalize right now and 
> afterwards behave as a common participant, limited to two responding messages 
> per week. We would start the next discussion session some weeks later, so 
> there might be room for continuing the debate, but as an aftermath of the 
> finalized Lecture. In my experience, putting limits on things clarifies the 
> panorama and favors the debate. Very rarely have we had moderation conflicts 
> in this list--which I personally attribute to the general good mood of FISers. 
> Nevertheless as a moderator I have to take care that we are not invaded by a 
> cacophony of messages that block interesting exchanges, as happened in the 
> first years of this list (18 years old!), and that our lecturing invitees do 
> not get into unnecessary burdens... Navigating between Scylla and 
> Charybdis is not always easy! 
> best--Pedro 
> 
> Bob Logan wrote:
>> 
>> Dear Pedro, Terry and Fellow FISers - 
>> 
>> I was composing the email below when your email appeared asking us not to 
>> respond any further to Terry's final remarks. I disagree with this arbitrary 
>> cutoff as I was about to send out what follows below. It also seems an 
>> abridgement of free speech to ask us not to discuss an issue we might be 
>> interested in. Perhaps I am unfamiliar with the ground rules of the FIS list, 
>> but the other listservs I belong to have never attempted to cut off a topic. 
>> There have been occasions where they have asked an individual who posts too 
>> often not to turn the list into their own bully pulpit. Anyway, as the guy 
>> who suggested that we ask Terry to lead a FIS conversation, I will exercise 
>> the prerogative to share my thoughts one more time. I would also be prepared 
>> to accept your restriction if you had given us advance notice with an exact 
>> deadline for shutting down this thread.
>> Here is what I had written when you sounded the bell as a death knell to 
>> this discussion. It is submitted with respect and the undertaking to abide 
>> by the referee's decision and not comment on Terry's final remarks, although 
>> I would love to hear from my colleagues their final thoughts on Terry's 
>> teleodynamic approach - Bob 
>> 
>> In order to respect the "only 2 per week" constraint, here are my comments on 
>> the flurry of recent posts in this thread. There is one caveat with which I 
>> wish to preface my remarks and it is this:
>> I am a member of Terry's research team and therefore I am biased, but I would 
>> like to share with my FIS colleagues why I believe the teleodynamic 
>> approach that Terry has developed is the best game in town for understanding 
>> the origin of life and the nature of information.
>> 
>> Pedro wrote on Jan 30:
>> "At your convenience, during the first week of February or so we may put an 
>> end to the ongoing New Year Lecture --discussants willing to enter their 
>> late comments should hurry up. Y

Re: [Fis] Concluding the Lecture? - In Praise of Teleodynamics

2015-02-03 Thread Bob Logan
> 
> This is what I am trying to accomplish. Though deceptively simple, I
> believe that the autogenic model system is just sufficiently complex
> to provide complete illumination of each of the critical defining
> features of the information concept—sign medium properties (entropies,
> uncertainty, constraint), reference (aboutness), significance
> (function, value, normativity), and interpretation (adaptation,
> intelligence)—while not artificially simplifying the issue by ignoring
> one or the other of these facets.
> 
> Because of its simplicity none of these basic concepts are left in the
> dark as black boxes or excluded as taboo concepts. But of course,
> working at such a basic level means that more complex
> phenomena such as thinking, subjectivity, language, and culture (to mention
> only a few) are not yet well illuminated by this light. This isn't to
> suggest that other pursuits in these other domains should be
> abandoned—for they at least clear away some of the underbrush creating
> paths that will help to ease the linkage between the different
> subterritories when finally the light brightens (to continue the
> metaphor). I just believe that this middle level is where the light
> best illuminates all the critical foundational issues.
> 
> I don't expect agreement, but so far I haven't felt that the specific
> components of this proposal have been addressed in this thread. And in
> these closing days of discussion (as well as in future privately
> shared emails after this window closes) I hope to receive some
> suggestions and constructive criticisms pointing to where I might go
> next with this approach.
> 
> Thanks for all your inputs.  Terry
> 
> On 1/30/15, Bob Logan  wrote:
>> Thanks Pedro for your remarks. We have not reached our destination as you
>> point out but the important thing is to enjoy the journey which I certainly
>> have. It is inevitable that with such a slippery concept as information that
>> there will be different destinations depending on the travellers but what I
>> like about FIS in general and the dialogue that Terry prompted in particular
>> is the interesting ideas and good company I encountered along the way. As
>> for your remark about searching where there is light I suggest that we pack
>> a flashlight for the next journey to be led by our tour guide Zhao Chuan.
>> One common theme for understanding the importance of both information and
>> intelligence for me is interpretation and context (figure/ground or
>> pragmatics). Thanks to all especially Terry for a very pleasant journey. -
>> Bob
>> On 2015-01-30, at 8:25 AM, Pedro C. Marijuan wrote:
>> 
>>> Dear Terry and colleagues,
>>> 
>>> At your convenience, during the first week of February or so we may put an
>>> end to the ongoing New Year Lecture --discussants willing to enter their
>>> late comments should hurry up. Your own final or concluding comment will
>>> be appreciated.
>>> 
>>> Personally, my late comment will deal with the last exchange between Bob
>>> and Terry, It is about the point which follows:  "...there was no thesis
>>> other than the word information is a descriptor for so many different
>>> situations and that it is a part of a semantic web - no roadmap only a
>>> jaunt through the countryside of associations - a leisurely preamble."
>>> In my own parlance, we have been focusing this fis session on the
>>> microphysical foundations of information (thermodynamic in this case)
>>> which together with the quantum would look like the definitive foundations of
>>> the whole field, or even of the whole "great domain of information." But
>>> could it be so? Is there such a thing as a "unitary" foundation? My
>>> impression is that we are instinctively working "where the light is",
>>> recalling the trite story of the physicist who has lost his car keys and
>>> is looking closest to the street lamp.  The point I suggest is that the
>>> different informational realms are emergent in the strongest sense: almost
>>> no trace of the underlying information realms would surface. Each realm
>>> has to invent throughout its own engines of invention the different
>>> in

Re: [Fis] Concluding the Lecture?

2015-01-30 Thread Bob Logan
Thanks Pedro for your remarks. We have not reached our destination as you point 
out but the important thing is to enjoy the journey which I certainly have. It 
is inevitable that with such a slippery concept as information that there will 
be different destinations depending on the travellers but what I like about FIS 
in general and the dialogue that Terry prompted in particular is the 
interesting ideas and good company I encountered along the way. As for your 
remark about searching where there is light I suggest that we pack a flashlight 
for the next journey to be led by our tour guide Zhao Chuan. One common theme 
for understanding the importance of both information and intelligence for me is 
interpretation and context (figure/ground or pragmatics). Thanks to all 
especially Terry for a very pleasant journey. - Bob
On 2015-01-30, at 8:25 AM, Pedro C. Marijuan wrote:

> Dear Terry and colleagues,
> 
> At your convenience, during the first week of February or so we may put an 
> end to the ongoing New Year Lecture --discussants willing to enter their late 
> comments should hurry up. Your own final or concluding comment will be 
> appreciated.
> 
> Personally, my late comment will deal with the last exchange between Bob and 
> Terry, It is about the point which follows:  "...there was no thesis other 
> than the word information is a descriptor for so many different situations 
> and that it is a part of a semantic web - no roadmap only a jaunt through the 
> countryside of associations - a leisurely preamble." 
> In my own parlance, we have been focusing this fis session on the 
> microphysical foundations of information (thermodynamic in this case) which 
> together with the quantum would look like the definitive foundations of the 
> whole field, or even of the whole "great domain of information." But could it 
> be so? Is there such a thing as a "unitary" foundation? My impression is that 
> we are instinctively working "where the light is", recalling the trite story 
> of the physicist who has lost his car keys and is looking closest to the 
> street lamp. The point I suggest is that the different informational realms 
> are emergent in the strongest sense: almost no trace of the underlying 
> information realms would surface. Each realm has to invent throughout its own 
> engines of invention the different informational & organizational principles 
> that sustain its existence. It is not obligatory that there will be a 
> successful outcome. To the extent to which this plurality of foundations is 
> true, solving the microphysical part would be of little help in adumbrating 
> the neuronal/psychological or the social information arena.
> 
> The roadmap Bob suggests is an obligatory exploration to advance; we may 
> disagree on the ways and means, but not on the overall goal. It is a 
> mind-boggling exercise as we have to confront quite different languages and 
> styles of thinking. For instance, the next session we will have at FIS (in a 
> few weeks) is an attempt at an excursion on "Intelligence Science". Presented 
> by Zhao Chuan, the aim is to confront the phenomenon of intelligence from a 
> global perspective amalgamating science (artificial intelligence), emotions, 
> and art (poetic and pictorial). Not easy, but we will try.
> 
> Anyhow, Terry, we much appreciate your insights and the responses you have 
> produced throughout the Lecture. It was a nice intellectual exercise.
> 
> Best wishes to all---Pedro
> 
> 


[Fis] Fwd: Some ideas on the multidimensionality of information.

2015-01-26 Thread Bob Logan
Dear FIS colleagues - here is an exchange between Terry Deacon and me that I 
thought would be of interest to you. When I sent this email to Terry and the 
Pirates it was stopped at the border of the FIS listserv because I had copied 
too many people. So I am sending it to you separately. Perhaps you should read 
Terry's email to me first, which is a response to my Jan 25 email to him and the 
Pirates, copied to FIS. I am not sure if my Jan 25 posting made it onto the 
FIS listserv. If you have not seen it, it is just after Terry's response to me, 
also on Jan 25 - warm regards to all - Bob

Terry - there was no thesis other than that the word information is a descriptor 
for so many different situations and that it is part of a semantic web - no 
roadmap, only a jaunt through the countryside of associations - a leisurely 
preamble. The post was meant to stimulate new ways to think about information 
and to provoke new thoughts like the one in your post below. Your response is 
valuable so I have shared it with the rest of the T&P team and our FIS friends. 
It provides another context for the triad you introduced a few days ago of 
significance, reference and medium, which inspired my mind map. By the way, you 
also seem to have made an association with that fourth element you mentioned, 
namely that interpretation is required to turn data into facts.

I called my musings a mind map because it mapped out all the associations with 
information that your triad plus interpretation provoked. It started out by my 
recording different associations that popped into my head, which included some 
poetry. Being a somewhat uninhibited kind of guy, I thought it was worth 
sharing, and I am glad I did because of your significant association of the 
triad of significance, reference and medium with wisdom, knowledge and Shannon 
information, and the notion that data aren't facts unless interpreted.

One more thought about wisdom from the Talmud: 
Ethical Teachings - Selections from Pirkei Avot
4:1 Ben Zoma said, "Who is wise? The one who learns from everyone, as it is 
said, 'From all who would teach me, have I gained understanding.' [Psalms 
119:99] 


warm regards to all - Bob


On 2015-01-25, at 9:24 PM, Terrence W. DEACON wrote:

> “Where is the wisdom we have lost in knowledge?
> Where is the knowledge we have lost in information?”
> – TS Eliot
> 
> His hierarchical prescinding is the essence of the significance
> (wisdom) - reference (knowledge) - medium (Shannon information) nested
> hierarchy.
> 
> Data aren't "facts" until interpreted and their reference is determined.
> 
> Bob, I'm afraid I don't quite get a clear sense of your intended
> thesis. So before I go online please provide a road map. — Terry


>> On 1/25/15, Bob Logan  wrote:
>>> Dear FISers -I have been following the FIS conversation re Terry's paper. I
>>> have let Terry and Jeremy carry the burden of the dialogue with FIS. As an
>>> FISer and a Pirate I have been neutral and did not want to enter the fray
>>> but I now have something worth sharing - some of it stimulated by the FIS
>>> dialogue and some by internal Terry and the Pirates conversations within our
>>> research group. I have a rather long post to make up for my absence in the
>>> conversation to date. I hope that I will have the benefit of your comments.
>>> I have just shared this paper with my T&P colleagues through our normal
>>> email channel.
>>> 
>>> The Many Dimensions of Information; No Word is an Island  – A Mind Map
>>> 
>>> Bob Logan
>>> 
>>> Prolegomena and an Abstract: The concept of information has many dimensions and
>>> is described in many different ways. It has many different associations. It
>>> has many different definitions. It has many different interpretations. It
>>> has many different interpreters. This is an attempt to identify all of these
>>> associations, definitions, interpretations, and interpreters. It is in a
>>> certain sense a mind map but it is the map of my mind, my definitions, my
>>> associations, my interpretations, what is significant for me about
>>> information, what information means for me, the thoughts that thinking about
>>> information inspire and the thinkers that I believe have and can provide
>>> insights into the nature of information. It is a catalogue. It is a
>>> hypothesis. It has been compiled by induction, deduction and abduction. For
>>> you the reader it is to communicate the complexity and many dimensions of
>>> information. For me it is a starting point to rethink everything I ever
>>> thought about information including
>>> 
>>> 1. My reading of Incomplete

[Fis] Some ideas on the multidimensionality of information.

2015-01-25 Thread Bob Logan
Dear FISers -I have been following the FIS conversation re Terry's paper. I 
have let Terry and Jeremy carry the burden of the dialogue with FIS. As an 
FISer and a Pirate I have been neutral and did not want to enter the fray but I 
now have something worth sharing - some of it stimulated by the FIS dialogue 
and some by internal Terry and the Pirates conversations within our research 
group. I have a rather long post to make up for my absence in the conversation 
to date. I hope that I will have the benefit of your comments. I have just 
shared this paper with my T&P colleagues through our normal email channel.

The Many Dimensions of Information; No Word is an Island  – A Mind Map

Bob Logan

Prolegomena and an Abstract: The concept of information has many dimensions and is 
described in many different ways. It has many different associations. It has 
many different definitions. It has many different interpretations. It has many 
different interpreters. This is an attempt to identify all of these 
associations, definitions, interpretations, and interpreters. It is in a 
certain sense a mind map but it is the map of my mind, my definitions, my 
associations, my interpretations, what is significant for me about information, 
what information means for me, the thoughts that thinking about information 
inspire and the thinkers that I believe have and can provide insights into the 
nature of information. It is a catalogue. It is a hypothesis. It has been 
compiled by induction, deduction and abduction. For you the reader it is to 
communicate the complexity and many dimensions of information. For me it is a 
starting point to rethink everything I ever thought about information including

1. My reading of Incomplete Nature by Terrence Deacon

2. The discussions I have had with Terry and the Pirates (Terry is Terrence 
Deacon, and the Pirates are the group that meets with Terry more or less once a 
week in his home in Berkeley California with others like me joining by Skype. 
The group continues those weekly discussions by email).

3. My FIS (Foundations of Information Science) listserv discussions.

4. The paper I co-authored with Stuart Kauffman and others entitled “The 
Propagation of Information: An Enquiry” where we posited that “the constraints 
that allow autonomous agents to channel free energy into work are connected to 
information: in fact, simply put, the constraints are the information, are 
partially causal in the diversity of what occurs in cells, and are part of the 
organization that is propagated.”

5. My book What is Information? - Propagating Organization in the 
Biosphere, the Symbolosphere, the Technosphere and the Econosphere (Logan 2014)

Acknowledgement: This mind map project is inspired by Terry Deacon’s remarks 
during our Jan 21 T&P session and by an email he sent the following day, where 
he wrote,
 
Maxwell formulated the laws of electromagnetism not as one equation but as four 
interrelated equations, each defining a fundamental relation, i.e. formalizing 
the findings of Gauss, Faraday, and Ampere. Perhaps to formalize information we 
will need at least three: corresponding to medium properties, referential 
properties, and significance properties— and possibly a fourth defining 
interpretation (though this may be what the
three together define)

I. Words

The Semantic Web: Words are interconnected –they form a Semantic Web. Their 
meaning arises in association with all the other words in their language and, 
as is the case with the word information, in association with its Latin and 
French origins. Words are entangled, networked, interdependent, interconnected, 
interwoven, elements of a web, contextualized.

No word is an island entire of itself; every word is a piece of the language 
from which it emerges, a part of the language; the death of any association 
with a word diminishes it because every word is involved with every other word. 
Never ask what is the exact meaning of a word or for whom that word has 
meaning; depending on all your experiences that word tolls for thee and has a 
particular meaning for thee. (A riff on John Donne’s 'No Man is an Island'):

No man is an island entire of itself; every man
is a piece of the continent, a part of the main;
if a clod be washed away by the sea, Europe
is the less, as well as if a promontory were, as
well as any manner of thy friends or of thine
own were; any man's death diminishes me,
because I am involved in mankind.
And therefore never send to know for whom
the bell tolls; it tolls for thee.  

That no word is an island is especially true of the following words:

Information, Inform, Form, Formal, Formal Cause, Formality, Formation (in 
English this word has the meaning of a form of organization whereas in French 
it has the meaning of training).  And these associations are just the beginning.

Formal cause links to efficient cause, material cause and final cause à la 
Ar

Re: [Fis] Neuroinformation?

2014-12-05 Thread Bob Logan
Caro Francesco - I enjoyed your post, particularly your sentence: "Un'opera 
d'arte o un bene culturale è nello stesso tempo informato e informatore". I 
interpreted it to mean "a work of art or a cultural good is at the same time 
informed and an informer." In other words it is informed and it informs. It 
informs the one who beholds the work of art, but by whom is the work of art 
informed? Grazie e auguri - Bob Logan
__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications

On 2014-12-05, at 1:15 AM, Francesco Rizzo wrote:

> Dear All,
> Krassimir Markov is right. Information is a static-dynamic spatio-temporal 
> process. A work of art or a cultural good is at the same time informed and 
> an informer. For this reason it is better to speak of trans-information. 
> Neuroinformation is the highest and most complete form of emo-rational 
> trans-information (rational intelligence plus emotional intelligence). It is 
> articulated in signification, information, and communication: a semiotic 
> triad indispensable for understanding and interpreting every existence and 
> every knowledge of the physical, psychic and metaphysical world. No natural, 
> human or social science can do without it.
> Thanks and best wishes to Carolina Isiegas.
> Francesco Rizzo.
> 
> 
> 2014-12-04 15:57 GMT+01:00 Krassimir Markov :
> Dear Bob,
> I think there is no conflict between the two points of view – information may 
> be a process and it may be static, depending on what kind of reflection it is.
> For instance, we reflect the world around:
> - as static - by photos, art images, sculptures, etc.;
> - as dynamic - by movies, theater plays, ballet, etc.;
> - and, finally, by both types – by static text which creates dynamic 
> imaginations in our consciousness.
> Friendly regards
> Krassimir
>  
> PS: This is my second post for this week. So, I say: Goodbye to the next one!
>  
>  
>  
> From: Bob Logan
> Sent: Thursday, December 04, 2014 3:54 PM
> To: Joseph Brenner
> Cc: fis@listas.unizar.es
> Subject: Re: [Fis] Neuroinformation?
>  
> Dear all - I support Joseph's remarks and would suggest that information in 
> general is a process that unfortunately is formulated as a noun. Inspired by 
> Bucky Fuller's "I think I am a verb," I suggest that "information is a verb." 
> It is a verb because it describes a process. Although that solves one problem, 
> we need to be able to describe a set of signs that have the potential to 
> initiate the process of informing through interpretation. I would not suggest 
> we create another word, but recognize that the word information has many 
> meanings: when it describes a process it has a verb-like quality, and when it 
> describes a set of signs that have the potential to be interpreted, and hence 
> to become information, it is acting as a noun. I would also suggest that a 
> simple definition of the term information is not possible because its meaning 
> is so context dependent. This is true of all words, but even more so for 
> information. For those who agree with my sentiments the above is information, 
> and for those who do not it is nonsense. My best wishes to both groups, Bob Logan
> __
>  
> Robert K. Logan
> Prof. Emeritus - Physics - U. of Toronto
> Chief Scientist - sLab at OCAD
> http://utoronto.academia.edu/RobertKLogan
> www.physics.utoronto.ca/Members/logan
> www.researchgate.net/profile/Robert_Logan5/publications
>  
> On 2014-12-04, at 6:40 AM, Joseph Brenner wrote:
> 
>> Dear Dr. Isiegas,
>> 
>> I will add my support to the extended concept of information that inheres in 
>> the work of Robert Ulanowicz and John Collier. I would just add that I like 
>> to call it information-as-process, to call attention to its 'structure' 
>> being dynamic, with individual neurones involved in a cyclic (better spiral 
>> or sinusoidal) movement between states of activation and inhibition. I have 
>> ascribed an extension of logic to this form of alternating actual and 
>> potential states in complex processes at all levels of reality.
>> 
>> Best wishes,
>> 
>> Joseph B.
>> 
>> - Original Message - From: "Robert E. Ulanowicz" 
>> To: "Carolina Isiegas" 
>> Cc: 
>> Sent: Wednesday, December 03, 2014 6:30 PM
>> Subject: Re: [Fis] Neuroinformation?
>> 
>> 
>> Dear Dr. Isiegas:
>> 
>> I envisio

Re: [Fis] Neuroinformation?

2014-12-04 Thread Bob Logan
Dear all - I support Joseph's remarks and would suggest that information in 
general is a process that unfortunately is formulated as a noun. Inspired by 
Bucky Fuller's "I think I am a verb," I suggest that "information is a verb." 
It is a verb because it describes a process. Although that solves one problem, 
we need to be able to describe a set of signs that have the potential to 
initiate the process of informing through interpretation. I would not suggest 
we create another word, but recognize that the word information has many 
meanings: when it describes a process it has a verb-like quality, and when it 
describes a set of signs that have the potential to be interpreted, and hence 
to become information, it is acting as a noun. I would also suggest that a 
simple definition of the term information is not possible because its meaning 
is so context dependent. This is true of all words, but even more so for 
information. For those who agree with my sentiments the above is information, 
and for those who do not it is nonsense. My best wishes to both groups, Bob Logan
__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications

On 2014-12-04, at 6:40 AM, Joseph Brenner wrote:

> Dear Dr. Isiegas,
> 
> I will add my support to the extended concept of information that inheres in 
> the work of Robert Ulanowicz and John Collier. I would just add that I like 
> to call it information-as-process, to call attention to its 'structure' being 
> dynamic, with individual neurones involved in a cyclic (better spiral or 
> sinusoidal) movement between states of activation and inhibition. I have 
> ascribed an extension of logic to this form of alternating actual and 
> potential states in complex processes at all levels of reality.
> 
> Best wishes,
> 
> Joseph B.
> 
> - Original Message - From: "Robert E. Ulanowicz" 
> To: "Carolina Isiegas" 
> Cc: 
> Sent: Wednesday, December 03, 2014 6:30 PM
> Subject: Re: [Fis] Neuroinformation?
> 
> 
> Dear Dr. Isiegas:
> 
> I envision neuroinformation as the mutual information of the neuronal
> network where synaptic connections are weighted by the frequencies of
> discharge between all pairs of neurons. This is directly analogous to a
> network of trophic exchanges within an ecosystem, as illustrated in
> <http://people.biology.ufl.edu/ulan/pubs/SymmOvhd.PDF>.
> 
> Please note that this measure is different from the conventional
> sender-channel-receiver format of communications theory. It resembles more
> the "structural information" inhering in the neuronal network. John
> Collier (also a FISer) calls such information "enformation" to draw
> attention to its different nature.
> 
> With best wishes for success,
> 
> Bob Ulanowicz
> 
>> Dear list,
>> 
>>I have been reading during the last year all these interesting
>> exchanges. Some of them terrific discussions! Given my scientific
>> backgound
>> (Molecular Neuroscience), I would like to hear your point of view on the
>> topic of neuroinformation, how information "exists" within the Central
>> Nervous Systems. My task was experimental; I was interested in
>> investigating the molecular mechanisms underlying learning and memory,
>> specifically, the role of the cAMP-PKA-CREB signaling pathway in such
>> brain
>> functions (In Ted Abel´s Lab at the University of Pennsylvania, where I
>> spent 7 years). I generated several genetically modified mice in which I
>> could regulate the expression of this pathway in specific brain regions
>> and
>> in which I studied the effects of upregulation or downregulation at the
>> synaptic and behavioral levels. However, I am conscious that the
>> "information flow" within the mouse Nervous System is far more complex
>> that
>> in the "simple" pathway that I was studying...so, my concrete question for
>> you "Fishers" or "Fisers", how should we contemplate the micro and macro
>> structures of information within the neural realm? what is
>> Neuroinformation?
>> 
>> Best wishes,
>> 
>> 
>> --
>> Carolina Isiegas
>> ___
>> Fis mailing list
>> Fis@listas.unizar.es
>> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
>> 
> 
> 
> ___
> Fis mailing list
> Fis@listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis 
> ___
> Fis mailing list
> Fis@listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
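
[Editor's note] Ulanowicz's proposal in the thread above, the mutual information of a network whose links are weighted by pairwise discharge frequencies, corresponds to the average mutual information measure he applies to trophic exchange networks. A minimal sketch in Python; the toy matrices are invented for illustration and are not data from his paper:

```python
from math import log2

def average_mutual_information(T):
    """Average mutual information (bits) of a flow matrix T, where
    T[i][j] weights the connection from node i to node j (e.g. the
    discharge frequency between a pair of neurons, or a trophic
    exchange between two ecosystem compartments)."""
    total = sum(sum(row) for row in T)
    rows, cols = len(T), len(T[0])
    out_m = [sum(T[i]) / total for i in range(rows)]                 # sender marginals
    in_m = [sum(T[i][j] for i in range(rows)) / total for j in range(cols)]  # receiver marginals
    ami = 0.0
    for i in range(rows):
        for j in range(cols):
            p = T[i][j] / total            # joint probability of link (i, j)
            if p > 0:                      # 0 * log 0 terms contribute nothing
                ami += p * log2(p / (out_m[i] * in_m[j]))
    return ami

# Maximally constrained network: each sender feeds exactly one receiver.
ordered = [[1 if i == j else 0 for j in range(4)] for i in range(4)]
# Maximally unconstrained network: every sender feeds every receiver equally.
uniform = [[1] * 4 for _ in range(4)]

print(average_mutual_information(ordered))  # 2.0 (high constraint, 4 nodes)
print(average_mutual_information(uniform))  # 0.0 (no constraint)
```

The measure is highest when the network's flows are tightly channeled and zero when every node exchanges equally with every other, which is the sense in which constraint and information coincide here.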


Re: [Fis] "The Travelers"

2014-10-23 Thread Bob Logan
Dear Stan - could you clarify the last sentence of yours? Perhaps I 
misinterpreted it - are you saying that context in a purely physical abiotic 
situation is somehow related to interpretation and hence information? I 
apologize in advance if I mis-interpreted your remarks.

In framing my advance apology to you, Stan, I inadvertently used the term 
mis-interpreted. This sparked the following idea: mis-information is due to 
misinterpretation by the receiver, whereas dis-information is due to the 
intended deception of the sender.

A further thought about whether abiotic physical processes can be construed as 
information: meaning, and hence information, can only exist for a system that 
has a purpose, a telos, or an end it wishes to achieve, i.e. a biotic system 
such as a living organism or even a cell. "So-called information" without 
meaning is only signals. And even there, to say that the sun's gravitational 
pull on the earth is a signal is to engage in anthropomorphic thinking. And to 
suggest that the sun's gravitational pull on the earth is information does not 
make sense, because there is no way that anything can have meaning for the 
earth. The earth has no objective or purpose, the Gaia hypothesis 
notwithstanding. For us earthlings it is another matter. We have figured out 
that the sun exerts a gravitational pull on the earth, and the statement to 
that effect has meaning for those able to grasp elementary physics; but the 
gravitational pull is not information in itself - only a description of the 
gravitational pull of the sun on the earth is information.

Bob

__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications

On 2014-10-23, at 9:27 AM, Stanley N Salthe wrote:

> Pedro wrote:
> 
> PM: Regarding the theme of physical information raised by Igor and Joseph, 
> the main problematic aspect of information (meaning) is missing there. One 
> can imagine that as two physical systems interact, each one may be 
> metaphorically attributed with meaning respect the changes experimented. But 
> it is an empty attribution that does not bring any further interesting aspect.
> 
> SS: I have advanced (  On the origin of semiosis.  Cybernetics and Human 
> Knowing 19 (3): 53-66. 2012 ) the idea that whenever context influences 
> importantly any reaction which, even in the physical realm, might be viewed 
> as an informational exchange, there is the forerunner of the interpretation 
> of an interaction. Such a simple 'interpretation' (proto-interpretation) 
> would then be the forerunner of meaning generation.  When context importantly 
> influences the outcome of a physical interaction, this brings a "further 
> interesting aspect" beyond the purely physical.
> 
> STAN 
> ___
> Fis mailing list
> Fis@listas.unizar.es
> http://listas.unizar.es/cgi-bin/mailman/listinfo/fis

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Physical Informatics (J.Brenner)

2014-10-21 Thread Bob Logan
Thanks for this Bob U - very helpful - I will read your paper with great 
interest - it was nice to learn that not only am I observant but also obversant 
or perhaps you are the obversant one - all kidding aside I believe your insight 
is extremely useful and I am passing it on to Deacon's group Terry and the 
Pirates. - Bob L


On 2014-10-20, at 4:27 PM, Robert E. Ulanowicz wrote:

> Dear Bob,
> 
> What you are saying is the obverse of the third law of thermodynamics. The
> third law says that entropy (viz., disorder) can only be measured with
> respect to some reference condition. Since information is the complement
> of entropy in Bayesian informatics, then the obverse becomes, "Information
> can only be measured with respect to some reference state". (It may be the
> same one used to pin down entropy.) Changing the reference state changes
> the values for both information and entropy.
> 
> I tried elaborating those relationships in my FIS paper
> .
> 
> The best,
> Bob U.
> 
>> Dear all - my take on this post is that the question of whether physical
>> processes are information is like the question: Is there a sound if a tree
>> falls in the forest and no one is there to listen? This is like the Zen
>> koan: "what is the sound of one hand clapping" If no one is in the forest
>> are the trees information? Well for sure they are trees but as to whether
>> or not they are information that is strictly dependent on the point of
>> view of the respondent. For me they are just trees and here is why I think
>> so. For me information is about a process. The noun information relates to
>> the verb inform. If no one is being informed there is no information. In
>> the same way that if no one or thing is there being loved (verb) there is
>> no love (noun). If no one is engaged in the activity of loving (a verb)
>> there is no love (a noun). If there is no one being informed (a verb) then
>> there is no information (a noun). Now one can talk about an object or a
>> phenomenon having the possibility of informing someone which to my mind is
>> potential information which is what I would call the physical processes
>> that take place in our universe. A book written in Urdu is potential
>> information because an Urdu reader can be informed by it. For me as a
>> non-Urdu speaker there is very little information other than someone went
>> to the trouble of writing out a text with Urdu letters and hence there is
>> probably information there for an Urdu speaker reasoning why would any one
>> make the effort to create such an object unless that person wanted to
>> inform Urdu speakers. Just as one person's food is another person's poison
>> so it is that one person's information is just for another persons merely
>> a physical phenomenon such as processes in nature, ink on paper, sounds or
>> EM signals. Shannon developed a theory of signals in which some of those
>> signals have the ability to inform some recipients. I hope this collection
>> of words has informed you other than giving you the knowledge of my view
>> as to what constitutes information. Thanks to Joseph, Pedro, and Igor for
>> the opportunity to reflect on the nature of information. If you enjoyed my
>> post and would like to learn more about my views on information please
>> send me an email off line and I will send you an email version of my book
>> What is Information?  Propagating Organization in the Biosphere, the
>> Symbolosphere, the Technosphere and the Econosphere  for free. And now you
>> know what an infomercial is. This was an infomercial because of my offer
>> to share my book with you erudite scholars of FIS whose posts I always
>> enjoy. With kind regards - Bob
>> __
>> 
>> Robert K. Logan
>> Prof. Emeritus - Physics - U. of Toronto
>> Chief Scientist - sLab at OCAD
>> http://utoronto.academia.edu/RobertKLogan
>> www.physics.utoronto.ca/Members/logan
>> www.researchgate.net/profile/Robert_Logan5/publications
>> 
>> On 2014-10-20, at 1:57 PM, PEDRO CLEMENTE MARIJUAN FERNANDEZ wrote:
>> 
>>> 
>>> 
>>> - Original Message -
>>> From: Joseph Brenner
>>> To: Igor Gurevich ; Pedro C. Marijuan ; fis
>>> Sent: Monday, October 20, 2014 8:40 AM
>>> Subject: Re: [Fis] Physical Informatics contains fundamental results
>>> which impossible to get only by physical methods
>>> 
>>> Dear Igor, Dear Gerhard and Colleagues,
>>> 
>>> In Igor's summary of his recent work, I read the following absolutely
>>> critical statement:
>>> " It is shown that the expansion of the Universe is the source of
>>> information formation, wherein a variety of physical processes in an
>>> expanding Universe provide information formation." I take this as
>>> meaning that the expansion of the Universe as such does not produce
>>> information.
>>> 
>>> Gerhard's formulation is slightly different (my paraphrase):
>>> "The first asymmetry in energy distribution, following the si
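
[Editor's note] Bob Ulanowicz's remark in this thread, that information (like entropy under the third law) can only be measured with respect to some reference state, can be illustrated numerically. This is a hypothetical sketch using relative entropy as a stand-in for his Bayesian formalism; the distributions are invented for illustration:

```python
from math import log2

def relative_information(p, q):
    """Kullback-Leibler divergence D(p || q) in bits: the information in an
    observed distribution p measured against a reference distribution q.
    Illustrative only; not Ulanowicz's exact formalism."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

observed = [0.7, 0.2, 0.1]
uniform_ref = [1 / 3, 1 / 3, 1 / 3]   # one choice of reference state
skewed_ref = [0.5, 0.3, 0.2]          # a different reference state

# The same observation yields a different information value when the
# reference state changes, which is the point being made above.
print(relative_information(observed, uniform_ref))
print(relative_information(observed, skewed_ref))
```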

Re: [Fis] Physical Informatics… (J.Brenner)

2014-10-20 Thread Bob Logan
Dear all - my take on this post is that the question of whether physical 
processes are information is like the question: Is there a sound if a tree 
falls in the forest and no one is there to listen? This is like the Zen koan: 
"what is the sound of one hand clapping" If no one is in the forest are the 
trees information? Well for sure they are trees but as to whether or not they 
are information that is strictly dependent on the point of view of the 
respondent. For me they are just trees and here is why I think so. For me 
information is about a process. The noun information relates to the verb 
inform. If no one is being informed there is no information. In the same way 
that if no one or thing is there being loved (verb) there is no love (noun). If 
no one is engaged in the activity of loving (a verb) there is no love (a noun). 
If there is no one being informed (a verb) then there is no information (a 
noun). Now one can talk about an object or a phenomenon having the possibility 
of informing someone which to my mind is potential information which is what I 
would call the physical processes that take place in our universe. A book 
written in Urdu is potential information because an Urdu reader can be informed 
by it. For me as a non-Urdu speaker there is very little information other than 
someone went to the trouble of writing out a text with Urdu letters and hence 
there is probably information there for an Urdu speaker reasoning why would any 
one make the effort to create such an object unless that person wanted to 
inform Urdu speakers. Just as one person's food is another person's poison, so 
it is that one person's information is for another person merely a physical 
phenomenon, such as processes in nature, ink on paper, sounds or EM 
signals. Shannon developed a theory of signals in which some of those signals 
have the ability to inform some recipients. I hope this collection of words has 
informed you other than giving you the knowledge of my view as to what 
constitutes information. Thanks to Joseph, Pedro, and Igor for the opportunity 
to reflect on the nature of information. If you enjoyed my post and would like 
to learn more about my views on information please send me an email off line 
and I will send you an email version of my book What is Information?  
Propagating Organization in the Biosphere, the Symbolosphere, the Technosphere 
and the Econosphere  for free. And now you know what an infomercial is. This 
was an infomercial because of my offer to share my book with you erudite 
scholars of FIS whose posts I always enjoy. With kind regards - Bob
__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan
www.researchgate.net/profile/Robert_Logan5/publications

On 2014-10-20, at 1:57 PM, PEDRO CLEMENTE MARIJUAN FERNANDEZ wrote:

> 
> 
> - Original Message -
> From: Joseph Brenner
> To: Igor Gurevich ; Pedro C. Marijuan ; fis
> Sent: Monday, October 20, 2014 8:40 AM
> Subject: Re: [Fis] Physical Informatics contains fundamental results which 
> impossible to get only by physical methods
> 
> Dear Igor, Dear Gerhard and Colleagues,
>  
> In Igor's summary of his recent work, I read the following absolutely critical 
> statement:
> " It is shown that the expansion of the Universe is the source of information 
> formation, wherein a variety of physical processes in an expanding Universe 
> provide information formation." I take this as meaning that the expansion of 
> the Universe as such does not produce information. 
>  
> Gerhard's formulation is slightly different (my paraphrase):
> "The first asymmetry in energy distribution, following the singularity, is 
> the source of information formation".
>  
> My question is, therefore, how best to combine these insights. For example, 
> we may say that the variety of physical processes are all the consequence of, 
> and subsequently reflect, a first asymmetry.
>  
> It is also interesting to note that the approaches of both Igor and Gerhard 
> imply the emergence of information through the interactional impact 
> (informational interactions) of fundamental forces on particles, extended by 
> Gerhard to somewhat higher levels of organization (life) than Igor.  
>  
> I look forward to further discussion of these fundamental issues.
>  
> Sincerely,
>  
> Joseph
> - Original Message -
> From: Igor Gurevich
> To: Pedro C. Marijuan ; fis
> Sent: Wednesday, October 01, 2014 9:47 AM
> Subject: [Fis] Physical Informatics contains fundamental results which 
> impossible to get only by physical methods
> 
> Dear Pedro C. Marijuan,
> Dear colleagues,
> I send you "The main results of Gurevich I.M. (Physical Informatics contains 
> fundamental results which impossible to get only by physical methods)" 
> and "Igor Gurevich: Main publications in English" . 
> With best wishes.
> Igor Gurevic

Re: [Fis] COLLECTIVE INTELLIGENCE

2014-03-06 Thread Bob Logan
Dear John, Pedro and Fis colleagues.

John P has provided us with a delicious set of questions that are hard to 
resist. I want to address the questions he raised from a traditional point of 
view. No doubt the questions John has raised are motivated by the enormous 
possibilities that digital IT and in particular the Internet have made 
possible. The perspective I want to bring up is an historic one. Let me answer 
the questions John raises from that perspective.

>> Is there such a thing as Collective Intelligence? 

Yes - Long before we had digital IT and electrically configured IT such as the 
telegraph, telephone, radio and TV and long before we had the printing press 
and even long before we had writing we had collective intelligence as a result 
of spoken language and culture. What is a culture, after all, but a form of 
collective intelligence? Eric Havelock called myths the tribal encyclopedia. 
With writing, the collectivity of intelligence grew wider, as evidenced by the 
scholars of Ancient Greece, who created a collective intelligence through their 
writing. The printing press was the next ramping up of collective intelligence, 
as the circle of intelligences contributing to a particular project 
dramatically increased. The ability to have a reliable way of storing and 
sharing experimental data contributed in no small way to the scientific 
revolution. Other fields of study thrived as a result of print IT such as 
philosophy, literature, history, economics etc etc. The printing press also 
contributed to the emergence of modern democracy. With the coming of 
electricity and electrically configured IT the collectivity of intelligence 
passed through another phase transition. Marshall McLuhan, reflecting on this 
development well before the emergence of digital IT, wrote:

"The university and school of the future must be a means of total community 
participation, not in the consumption of available knowledge, but in the 
creation of completely unavailable insights. The overwhelming obstacle to such 
community participation in problem solving and research at the top levels, is 
the reluctance to admit, and to describe, in detail their difficulties and 
their ignorance. There is no kind of problem that baffles one or a dozen 
experts that cannot be solved at once by a million minds that are given a 
chance simultaneously to tackle a problem. The satisfaction of individual 
prestige, which we formerly derived from the possession of expertise, must now 
yield to the much greater satisfactions of dialogue and group discovery. The 
task yields to the task force." (Convocation address, U. of Alberta, 1971)

And now we come to the next phase transition in collective intelligence that we 
may identify with the Internet and other forms of digital IT. This development 
is both new and old at the same time. It is old as I have argued since language 
and culture, writing, the printing press, electric mass media each represented 
an internet of sorts metaphorically speaking. What is new is the magnitude and 
scale of the collectivity today, which allows a total democratization of view 
points and insights. Since a quantitative change can also be a qualitative 
change, the current era of intelligence collectivities is new, and one might 
even say a revolutionary change: for example, a transition from representative 
democracy to participatory democracy. To conclude: yes, there is such a thing as 
Collective Intelligence. It has been with us since the emergence of Homo 
sapiens and it defines the human condition. As we push ahead to explore new 
frontiers of collective intelligence, it is prudent to take into account our 
past experience with this phenomenon. Plus ça change, plus c'est la même chose. 

>> How does IT affect the existence or non-existence of Collective 
>> Intelligence? 

I believe I have answered this question above. 

I will address the other questions in the fullness of time but now I must 
return to my duties. Thanks to John and Pedro for raising such an interesting 
topic.

I look forward to our discussion and any comments on my remarks.

With kind regards - Bob Logan

PS - One of the duties that has cut short my response to John's questions is 
the organization of The Pages Conference on the Future of the Book, taking 
place next week in Toronto. The book is another form of IT that contributes to 
collective intelligence, is being radically transformed by digital IT, and is a 
topic worthy of discussion in the context of Collective Intelligence. For those 
who are interested in this particular sub-topic, you may wish to join my Google 
Group Rethinking the Book. If you are interested, shoot me an email and I will 
invite you to join this Google Group. And here is the abstract of the 
Future of the Book conference I am helping to organize.

The Future of the Hybrid Book  (e-book + p-book)   - 10:30 to noon  - Main Hall

The Future of  Poetry Publishing- 10:30 to noon – Tiki

Re: [Fis] Fw: Responses

2014-01-16 Thread Bob Logan
Hi Christophe - I enjoyed your response - full of meaningful information - :-)  
Your point is well taken. I agree what might be meaningful information for one 
agent might be meaningless for another. I can add another example to your list 
of examples which I encountered some time ago. An author whose name I forget 
pointed out that a book written in Urdu is information for a literate Urdu 
speaker but perhaps not for those that cannot read Urdu. According to the 
definitions of Donald MacKay in 1969 and Gregory Bateson in 1973, 'information is 
a distinction that makes a difference' and 'information is a difference that 
makes a difference' respectively. Meaningless information does not cut it by 
their definitions as it does not make a difference. Of course one could define 
information as a distinction or a difference that has the potential to make a 
difference for some agent. It seems to me that defining what is information is 
no easy task. My conclusion from our discussion is that depending on how you 
define information context can be an important part of what is information. The 
notion of information is extremely nuanced with multiple meanings and we seem 
to have only one word for it, as pointed out by Shannon himself. In the abstract 
to his paper "The Lattice Theory of Information," Shannon (1953) wrote, "The word 
"information" has been given many different meanings by various writers in the 
general field of information theory. It is likely that at least a number of 
these will prove sufficiently useful in certain applications to deserve further 
study and permanent recognition. It is hardly to be expected that a single 
concept of information would satisfactorily account for the numerous possible 
applications of this general field. The present note outlines a new approach to 
information theory, which is aimed specifically at the analysis of certain 
communication problems in which there exist a number of information sources 
simultaneously in operation."

MacKay made a distinction between 'selective information' as defined by 
Shannon's formula and 'structural information', which indicates how 'selective 
information' is to be interpreted.

"Structural information must involve semantics and meaning if it is to succeed 
in its role of interpreting selective or Shannon information. Structural 
information is concerned with the effect and impact of the information on the 
mind of the receiver and hence is reflexive. Structural information has a 
relationship to pragmatics as well as semantics where pragmatics tries to 
bridge the explanatory gap between the literal meaning of a sentence and the 
meaning that the speaker or writer intended. Shannon information has no 
particular relation to either semantics or pragmatics. It is only concerned 
with the text of a message and not the intentions of the sender or the possible 
interpretations of the receiver (Logan 2014)."

The above material is from my book What is Information? to be published 
simultaneously as a printed book and an e-book by DEMO press in Toronto. I 
would be happy to share this with you as a PDF, Christophe or any other reader 
that finds the above information meaningful and interesting.

Thank you, Christophe, for providing me with the opportunity to muse some more 
about the meaning of information, especially 'meaningless information'. I quite 
enjoy going down the rabbit hole. All the best, Bob
__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan


On 2014-01-16, at 5:52 AM, Christophe wrote:

> Dear Bob, 
> Thanks for your answer. 
> So for you, information is always meaningful.
> Such a statement is surprising when many examples display cases of 
> meaningless information. 
> The well-known Chinese Room Argument: a sentence written in Chinese is 
> meaningless to a non-Chinese-speaking reader.
> A vervet monkey alarm is meaningful information for other vervet monkeys but 
> is meaningless for a passing dog. 
> Cyphered information is meaningful or meaningless depending on whether the 
> receiver has the cyphering key.
> We can agree that the meaning of information does not exist by itself but is 
> the result of an interpretation by a system. The interpretations of given 
> information can deliver different meanings, and "no meaning" is a possible 
> outcome. Downgrading the interpretation of information to signals is 
> surprising.
> Now, regarding information theory, I still understand it in a scientific 
> context, as it is part of mathematics and computing. Things can be 
> different indeed if you look at applications of IT (to linguistics, 
> psychology, …). 
> But the key point may be that disregarding the possibility for meaningless 
> information shuts a road in analyzing the possibilities for computers to 
> understand us. A lot is still to be done in this area and

Re: [Fis] The Interaction Man

2013-12-08 Thread Bob Logan
Dear John - I agree with your distinction between information and 
communication. What is essential for communication is the interpretation of the 
information. If I cannot interpret the information there is no communication. 
What Shannon leaves out of his theory of signals (this is not a typo; I believe 
that the notion of Shannon's work as information theory is a category error) is 
the interpretation of the receiver. The notion that a random set of numbers is 
the maximum amount of information seems ludicrous to me, as what interpretation 
can one make of a random set of numbers? John, one slight quibble: you refer to 
Shannon's "model of communication". How can he have a model of communication if 
he makes no allowance for interpretation? He was concerned with the accuracy 
of transmitting a set of signs from point A to point B. Krassimir wrote: 
"Communication is a process of exchanging of "signals, messages" with different 
degree of complexity (Shannon)." Without the ability to interpret the signals 
there is no communication. If someone speaks to me in Navaho or Mandarin they 
will exchange signals with me but their level of communication will be close to 
nil. All I will be able to infer is that they want to communicate with me.
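
Shannon's schematic — source, encoder, channel, decoder, destination — can be 
sketched in a few lines of Python. Everything here (the 2-bit codebook, the 
symbols, the function names) is invented for illustration; the point of the 
sketch is that decoding presupposes a shared code, which is as close as the 
model comes to "interpretation":

```python
import random

# A minimal sketch of Shannon's source -> encoder -> channel -> decoder chain.
# The 2-bit codebook and the symbols are invented for illustration only.
CODEBOOK = {"A": "00", "B": "01", "C": "10", "D": "11"}
DECODEBOOK = {v: k for k, v in CODEBOOK.items()}

def encode(message):
    """Map each source symbol to its binary codeword."""
    return "".join(CODEBOOK[s] for s in message)

def channel(bits, noise=0.0, rng=None):
    """Transmit bits, flipping each with probability `noise`."""
    rng = rng or random.Random(0)
    return "".join(b if rng.random() >= noise else str(1 - int(b)) for b in bits)

def decode(bits, codebook):
    """Decoding presupposes a shared codebook -- the receiver's 'code'."""
    return "".join(codebook[bits[i:i + 2]] for i in range(0, len(bits), 2))

sent = "BADCAB"
received = channel(encode(sent))             # noiseless transmission
assert decode(received, DECODEBOOK) == sent  # shared code: message recovered
# A receiver holding a different codebook still gets the signals intact,
# but recovers a different message -- signals without communication:
other = {"00": "D", "01": "C", "10": "B", "11": "A"}
assert decode(received, other) != sent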

I hope that my exchange of signals is interpretable and that I have 
communicated with you, John and other members of FIS. 

The expression of this hope leads to the following thought. In an exchange 
between two intelligent agents who speak the same language but have made 
different assumptions about an issue they are discussing, there is often a 
breakdown in communication because their interpretations of the assumptions 
upon which their exchange of signals is based are so different. This brings to 
mind I. A. Richards' notion that in order for communication to occur one has to 
feedforward the context of what one wants to say. He once suggested that 
perfect communication would only occur if the two communicants had identical 
experiences, and since this is not possible, absolutely perfect communication is 
not possible. However, one can improve one's communication by feedforwarding the 
context. So my feedforward to you and the FIS audience is that I worked with 
Marshall McLuhan from 1974 until his passing in 1980; he was a student of I. A. 
Richards, and he (McLuhan) believed communication is affected by both the 
content of the message and the medium or channel by which the signals are 
exchanged, so that "the medium is the message."

all the best - Bob Logan


On 2013-12-08, at 7:41 AM, John Collier wrote:

> At 12:38 AM 2013/12/05, Krassimir Markov wrote:
>> Dear Pedro and FIS Colleagues,
>> This discussion is full with interesting ideas.
>> What I want to add is that I distinguish the concepts "communication" and
>> "information interaction", which reflect similar phenomena but at different
>> levels of the hierarchy of life.
>> Communication is a process of exchanging of "signals, messages" with
>> different degree of complexity (Shannon).
>> Information interaction is an exchange of information models. It is specific
>> only to intelligent agents, not to lower levels of living matter (bio
>> molecules, cells, organs).
>> The main feature of intelligent agents is decision making based on
>> information models.
>> Information interaction is impossible without communication.
>> Friendly regards
>> Krassimir
> 
> I would agree with distinguishing between communication and 
> information interaction, but I infer exactly the opposite conclusion. 
> Communication, it seems to me (and also according to the setup that 
> is the basis for Shannon's approach) requires coding and decoding 
> modules, but information transmission does not; it requires only a 
> channel. Information needs to be decoded (given meaning) to be 
> communication. At least that is what I read off of Shannon's model of 
> communication theory.
> 
> Maybe Krassimir is not talking about Shannon type information, and 
> has a different model in mind. If this is the case, it would be nice 
> if he were to make it explicit, since most people today at least 
> start with Shannon's approach (or one of the two rough equivalents 
> discussed by Kolmogorov) as a basis that at least gives the syntax of 
> information, with channel, coding and decoding required for a full 
> communications channel, but left undefined by Shannon (though recent 
> work has tried to fill these gaps).
> 
> Cheers,
> John
> 
> --
> Professor John Collier colli...@ukzn.ac.za
> Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
> T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
> Http://web.ncf.ca/coll

[Fis] feedforward

2013-11-24 Thread Bob Logan
Dear FIS colleagues - I have just completed the first draft of a paper entitled 
Feedforward, I. A. Richards, Cybernetics and Marshall McLuhan. Feedforward is a 
fascinating concept developed by I. A. Richards which I posit had a significant 
impact on the work of Marshall McLuhan. I am attaching the first page of the 
article in the body of this email. I am looking for feedback so I am 
feedforwarding you the first page of the article. If you are interested in 
receiving the whole article email me off line and I will email it to you. 
Thanks - Bob

Feedforward, I. A. Richards, Cybernetics and Marshall McLuhan

Robert K. Logan

lo...@physics.utoronto.ca



Abstract: I. A. Richards' development of feedforward is reviewed. The impact of 
feedforward on the work of Marshall McLuhan is then surveyed and shown to have 
influenced his use of figure/ground, the user as content, the idea that the 
content of a new medium is some older medium, the use of the probe, effects 
preceding causes, the avoidance of a point of view, and roles versus jobs.

Feedback is a commonly used term that most people are familiar with. 
Googling the term feedback resulted in about 2.48 billion hits. Less familiar 
is the term feedforward, which elicited only about 2 million hits, less than 1% 
of the hits for feedback. The concept of feedforward, which I will introduce 
in this essay, is a very powerful one that was first formulated by I. A. 
Richards in 1951 and subsequently had an important impact on the work 
of Marshall McLuhan. The thesis that I intend to develop in this essay is that 
I. A. Richards' notion of feedforward had a feedforward effect on the work of 
Marshall McLuhan and helped, or at the very least influenced, McLuhan to 
develop a number of his key ideas, including:

1. his notion of figure/ground,

2. the user is the content,

3. the content of a new medium is some older medium,

4. the use of the probe as a research tool,

5. the idea that effects can precede causes, and

6. the notion that a point of view is best avoided in doing research, and

7. the prevalence of roles versus jobs in the electric age.

We will first examine Richards’ development and use of the notion of 
feedforward in his study of rhetoric and then study how the notion of 
feedforward impacted McLuhan’s approach to the study of media.

I. A. Richards’ area of research was rhetoric, which he considered to be more 
than just the art of persuasion. Richards was concerned with the accuracy of 
human communication. He considered the field of rhetoric to be about finding 
remedies for avoiding misunderstandings and hence improving communication as 
well as understanding how words work. He believed the notion of feedforward was 
an important tool for achieving these ends. Feedforward is basically a form of 
pragmatics where pragmatics is the use of context to assist meaning.

Richards considered his formulation of feedforward to have been one of his most 
important accomplishments. In an article entitled The Secret of "Feedforward", 
which he was invited to write for the Saturday Review as a summary of his 
life's work, he wrote:

The process by which any venture of [a] creative sort finds itself, and so 
pursues its end, is something I have learned, I hope, something about. Indeed, 
I am not sure I have learned anything else as important… I realize now what a 
prime role belongs to what I called “feedforward” in all our doings. 
Feedforward, as I see it, is the reciprocal, the necessary condition of what 
the cybernetics and automation people call “feedback.”



The term feedforward according to the Oxford English Dictionary (OED) was first 
introduced into the English language by I. A. Richards in 1951 at the 8th Macy 
Conference entitled Cybernetics: Circular Causal and Feedback Mechanisms in 
Biological and Social Systems in a talk entitled “Communication Between Men: 
The Meaning of Language.”


__

Robert K. Logan
Chief Scientist - sLab at OCAD
Prof. Emeritus - Physics - U. of Toronto 
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan






___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] Shameless promotion of McLuhan Misunderstood: Setting the Record Straight

2013-10-25 Thread Bob Logan
cLuhan for the digital age. Although many of 
McLuhan's insights were meant to interpret communication phenomena in the 
electronic age, Professor Logan convincingly shows that much of what the man 
had to say about TV could also be applied to today's media environment, 
characterized by digital interactive media, fractured attention, and 
information overload. McLuhan Misunderstood: Setting the Record Straight proves 
that classic authors and their works are beyond categorization, irreducible to 
a single message, and inexhaustible in the possibilities of being; it 
demonstrates that McLuhan's thought—much like the media of communication he 
sought to understand—is alive and in constant flux." --Laureano Ralon, 
Figure/Ground Communication Blogger, Canada

"Understanding media is not easy. Back in the 1960s Marshall McLuhan opened our 
eyes up and expanded our vision of the media ecology. Understanding McLuhan has 
never been easy either ("I don't necessarily agree with everything I say," said 
McLuhan. Just imagine the rest). Thanks to Bob Logan we can now get closer to a 
full understanding of McLuhan's complex and amazing vision of contemporary 
culture." --Carlos A. Scolari, Universitat Pompeu Fabra, Spain

"More than anyone else, Robert K. Logan has kept Marshall McLuhan's thought 
alive over the generations. And now that the academic landscape seems finally 
ready for a thorough rereading of McLuhan's work, we are deeply fortunate to 
have professor Logan still here with us to clarify and help us understand it. 
With stunning lucidity, scholarly precision and good humor, Bob Logan makes 
McLuhan's thinking accessible to readers of the 21st century. His book 
impressively shows, and with apparent ease, how many of McLuhan's ideas still 
hold relevance today. It is an essential introduction and an absolute must read 
for everyone interested in one of the most intriguing and provocative thinkers 
of recent intellectual history." --Yoni Van Den Eede, Vrije Universiteit 
Brussel, Belgium

"Not only does Bob Logan's McLuhan Misunderstood not misunderstand McLuhan and 
set the record straight, but the book also provides one of the best 
understandings of McLuhan around. Logan worked with Marshall McLuhan in the 
1970s, and is one 
of the very few scholars who obtained his understanding of McLuhan not only 
from McLuhan's writings and lectures, but from all-important conversations, the 
top of the line in the acoustic realm. This special savvy shows throughout the 
volume, and makes it required reading for all who seek to better understand the 
media of the 21st century." --Paul Levinson, author of Digital McLuhan and New 
New Media, USA

About the Author
Robert K. Logan, PhD is professor emeritus of physics at the University of 
Toronto, fellow of St. Michael's College, and chief scientist at the sLab, OCAD 
University. He collaborated and published with Marshall McLuhan from 1974 to 
1980. He is the author of a dozen books, including one coauthored with McLuhan, 
The Future of the Library: An Old Figure in a New Ground, as well as The 
Alphabet Effect (1984, 2004), The Sixth Language (2000, 2004), Understanding New 
Media (2011), and What Is Information? (2013).
__

Robert K. Logan
Chief Scientist - sLab at OCAD
Prof. Emeritus - Physics - U. of Toronto 
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan








Re: [Fis] Praxotype

2013-10-15 Thread Bob Logan
Thanks John for alerting us to the terms praxotype and cognotype. I have a 
simpler formula, which I made use of in my book The Extended Mind: The Emergence 
of Language, the Human Mind and Culture. Words are simply concepts and hence 
thinking tools. Before verbal language, hominids communicated by mimesis, i.e. 
hand signals, facial gestures, body language and prosody (non-verbal 
vocalization such as grunts). As the complexity of hominid existence increased, 
mimesis no longer had the requisite variety for everyday life. Conceptualization 
was needed. Verbal language emerged, in which our words were our first concepts. 
The word water, for example, was a concept that united all our percepts of the 
water we drank, washed with, and cooked with, and the water that fell as rain or 
was found in rivers, lakes or the sea. With language, the brain, which before 
was a percept engine, bifurcated into the human mind, capable of 
conceptualization and hence of planning and large-scale coordination. Verbal 
language allowed us to deal with matters not immediately available in space and 
time. I claim that the emergence of verbal language represented three 
simultaneous bifurcations: from mimetic communication to verbal language; from 
the brain as a percept engine to the mind capable of conceptualization; and 
from hominids to fully human Homo sapiens.

for more details visit 
http://www.academia.edu/783502/The_extended_mind_understanding_language_and_thought_in_terms_of_complexity_and_chaos_theory

or 

http://www.academia.edu/783504/The_extended_mind_The_emergence_of_language_the_human_mind_and_culture

cheers - Bob Logan

On 2013-10-15, at 2:54 AM, John Collier wrote:

> This term might be useful in the context of the present discussion, 
> especially in the contest of coordinated practice(s). Cognotype might also be 
> useful. I think these might lead to a more fine-grained analysis of the more 
> integrative sociotype.
>  
> http://blogs.scientificamerican.com/guest-blog/2013/09/27/words-are-thinking-tools-praxotype/
>  
> ___
> fis mailing list
> fis@listas.unizar.es
> https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis

__

Robert K. Logan
Chief Scientist - sLab at OCAD
Prof. Emeritus - Physics - U. of Toronto 
http://utoronto.academia.edu/RobertKLogan
www.physics.utoronto.ca/Members/logan








Re: [Fis] Collier's Metaphysics

2013-05-26 Thread Bob Logan
Bruno - An interesting definition of God: omnipresent, omnipotent and 
omniscient. And equally mysterious: Turing emulable and Robinson Arithmetic. 
How does one derive an understanding of the real world (the acoustic world) 
from axioms? The scientific method of observation, generalization, hypothesis 
building and testing seems something worth emulating. Let's call it scientific 
method emulable. You cannot derive science from math or logic, nor can you 
prove anything about the real world with math or logic. And I can prove it, if 
you accept Popper's axiom that for a proposition to be scientific it has to be 
falsifiable. Well, if you prove something is true then it cannot be falsified 
and hence it is not a scientific proposition. I think the trouble with your 
fallacy, Bruno, is that it is wrong (this is a Marshall McLuhan gag that he 
used all the time and made famous in his cameo appearance in Annie Hall). 
Bruno, do not take this playful approach to your post as a personal attack. I 
am just playing with the ideas you put out there for our enlightenment. And as 
McLuhan said, if you don't like these ideas I have others. With kind regards 
and thanks for your stimulating post - Bob


On 2013-05-26, at 11:46 AM, Bruno Marchal wrote:

> Jerry,
> 
> On 26 May 2013, at 16:58, Jerry LR Chandler wrote:
> 
>> John:
>> 
>> I have followed your writings for many years - perhaps more than two decades 
>> now.  
>> 
>> Frankly, from the perspective of a hardcore realist, I find much of your 
>> written work to be highly metaphysical in nature, including the sentences 
>> which I cited in the post of May 17, 2013.
>> 
>> Your notion of metaphysics appears to so extremely narrowly restricted that 
>> you can exempt your own highly metaphysical writings from your definition of 
>> metaphysics.  In fact, the traditional usage of the term "metaphysics" is 
>> not narrowly restricted.
>> 
>> On numerous occasions, you assert your views as a MIT trained physicist. Yet 
>> in this immediate exchange, the responsibility for the assertions are 
>> attributed to others. Puzzling.  
>> 
>> Have you every given any serious metaphysical thought to the scientific 
>> meaning of the phrase "it from bit"?  
>> 
>> Perhaps the dichotomy of your perspectives is amply illustrated by the title 
>> of your book:
>> 
>> "Every Thing Must Go"
>> 
>> This title itself is a simple logical assertion. It expresses logical 
>> necessity.
>> 
>> If this assertion is true, what would remain?
>> 
>> Life?
>> Matter?
>> Mentation?
>> John Collier?
>> MIT style physics?
>> Mathematics?
>> Philosophy of science?
>> Metaphysics?
>> Nothing?
>> 
>> This title alone expresses a deep and profound metaphysical perspective. 
>> 
>> At heart, I am a simple man, in love with nature, logic and mathematics. 
>> From my perspective, your voluminous metaphysical writings tend to be 
>> contrary to my experience of nature, logic and mathematics. 
>> 
>> "It from bit"?  Really?
>> 
>> My question of May 17, 2013 remains open:
>> 
>>> How would a rational realist distinguish this metaphysical perspective from 
>>> witchcraft or magic?
>> 
>> 
>> Cheers
>> 
> 
> If we assume that there is a level of description such that we are Turing 
> emulable, then ontologically everything can go except some term of a first 
> order Turing universal theory. I use Robinson Arithmetic to fix the things.
> 
> Then nothing does go, as it can be shown how the appearances of the physical 
> laws can be explained in that theory (with computationalism at the metalevel, 
> or not). This leads to a derivation of physics from the additive and 
> multiplicative theory of numbers, making the theory testable (comp + 
> classical theory of knowledge). Then, thanks to Gödel, Löb and Solovay, it is 
> possible to distinguish what is true for the subject and what the subject 
> can justify.
> 
> It saves us from reductionism, as it shows that the universal machines can 
> defeat all complete theories about them. It shows that numberland is already 
> full of life, and where the information fluxes are born. Eventually it makes 
> us more ignorant, as the arithmetical truth is big and beyond the 
> computable. And the computable lives there, surrounded by the non-computable.
> 
> Information is a key notion, but, imo, not as fundamental as the universal 
> machine, or the universal number, which can interpret that information, and 
> react to it: not always in a predictable way, though. Another key is the 
> distinction between first person and third person views, singular and plural. 
> A simple notion of first person indeterminacy explains why the psycho-brain 
> identity can't work, and suggests a precise formulation of the mind-body problem.
> 
> This approach generalizes to arithmetic what Everett did for quantum 
> mechanics. It shows that Church's thesis rehabilitates a Pythagorean form of 
> Neoplatonism. It is counter-intuitive, but it explains why.
> 
> It from bit? Yes. Even already some shadows of t

Re: [Fis] About FIS 2005

2013-04-15 Thread Bob Logan
Dear Xueshan - regarding Nalewajski's conjecture that molecular systems have 
information, I am skeptical. The word information originated with the idea of 
forming the mind, according to the OED. Information, as far as I am concerned, 
requires a sentient being to receive and understand it. Molecules and atoms 
react to forces, not information. They have no idea of the forces acting on 
them. They are not informed, as they have no sentience that can be informed. 
Information requires an interpretant for which the signal has meaning. 
Shannon's information theory is merely signal theory, as all he is concerned 
with is how well a set of symbols or a signal is transmitted from the sender 
to the receiver. The ability of the receiver to decipher or interpret the 
signal has no bearing on the reception of Shannon information. Shannon 
information has nothing to do with meaning. A set of random numbers has 
the maximum amount of Shannon information and yet has no meaning. If my set of 
symbols has meaning for you, whether or not you agree with the premise they 
represent, then they are information. As for a molecule or even a flower or a 
penguin, they are not information. In other words, information has to inform, 
as a grammatical analysis of the word information implies. A representation 
represents, a contradiction contradicts, a saturation saturates, and in general 
an "X"tion "X"es; therefore information informs, or at least has the 
capability of informing. So while a text in the Basque or Albanian language 
might not inform me because of my inability with these languages, it is 
capable of informing those familiar with the Basque and Albanian languages 
respectively and is therefore information. A random set of letters cannot 
inform anyone, yet it has maximum Shannon information. Information is a 
tricky thing. 
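
The formal claim here — that a random sequence maximizes Shannon information 
per symbol while carrying no meaning — is easy to check numerically. The 
following sketch is illustrative and not drawn from Logan's book; it estimates 
the empirical per-symbol entropy of uniformly random letters and of a 
repetitive English phrase:

```python
import math
import random
from collections import Counter

def entropy_per_symbol(text):
    """Empirical Shannon entropy H = -sum(p * log2(p)) over symbol frequencies."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

rng = random.Random(42)
random_letters = "".join(rng.choice("abcdefghijklmnopqrstuvwxyz")
                         for _ in range(100_000))
english_text = ("information is a distinction that makes a difference " * 2000)
english_text = english_text.replace(" ", "")

h_random = entropy_per_symbol(random_letters)   # close to log2(26), about 4.70 bits
h_english = entropy_per_symbol(english_text)    # noticeably lower: redundancy

# Uniformly random letters approach the per-letter maximum of log2(26);
# redundant (and therefore interpretable) text carries fewer bits per letter.
assert h_random > h_english
assert abs(h_random - math.log2(26)) < 0.02
```

The entropy measure registers statistical structure only; it assigns the 
highest value to exactly the sequences no one can interpret, which is Logan's 
point about meaning lying outside Shannon's measure.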

This line of thought raises the question of whether or not DNA is information. 
DNA does not inform a sentient being, yet it catalyzes and hence instructs 
how RNA is produced, which in turn catalyzes and instructs how proteins are 
created, which in turn give rise to bodily functions. Therefore we suggested 
that DNA represents a different form of information from Shannon information, 
which we called biotic or instructional information. The argument can be found 
in the paper Propagation of Organization: An Enquiry by Stuart Kauffman, 
Robert K. Logan, Robert Este, Randy Goebel, David Hobill and Ilya Shmulevich, 
published in 2007 in Biology and Philosophy 23: 27-45. I am happy to share this 
paper with anyone requesting it. 

Bob Logan

On 2013-04-14, at 9:59 PM, Xueshan Yan wrote:

> 
> Dear Michel,
> 
> Thank you!
> 
> I am very familiar with your FIS 2005 website long before.
> 
> Have you read the Polish chemist Nalewajski's book:
> Information theory of molecular systems (Elsevier, 2006), I
> really want to know if there are INFORMATON that play a role
> between two atoms, or two molecules, or two supramolecules
> as Jean-Marie Lehn said.
> 
> As to FIS 2005, I need a review of each of the four FIS
> conferences held in Madrid, Vienna, Paris, and Beijing, but
> a general review of FIS 2005 has not been given by anyone
> so far.
> 
> Best regards,
> 
> Xueshan
> 9:59, April 15, 2013  Peking University
> 
> 
>> -Original Message-
>> From: Michel Petitjean [mailto:petitjean.chi...@gmail.com]
> 
>> Sent: Sunday, April 14, 2013 6:19 PM
>> To: Yan Xueshan
>> Subject: Re: About FIS 2005
>> 
>> Dear Xueshan,
>> As far as I know, there is no longer report, but I am at your
>> disposal if you wish to get more: please feel free to ask me.
>> Also you may have a look at the programme, the proceedings,
>> and all that is available from the main welcome page:
>> http://www.mdpi.org/fis2005/
>> Best, Michel.
>> 
>> 
>> 2013/4/14 Xueshan Yan :
>>> 
>>> Dear Michel,
>>> 
>>> May I ask you a favor?
>>> 
>>> Do you have any more detailed review about FIS 2005, except
>>> your FIS 2005 brief conference report published in
>>> http://www.mdpi.org/entropy/htm/e7030188.htm?
>>> 
>>> Best regards,
>>> 
>>> Xueshan
>>> 17:47, April 14, 2013
>>> 
>> 
> 

__

Robert K. Logan
Chief Scientist - sLab at OCAD
Prof. Emeritus - Physics - U. of Toronto 
www.physics.utoronto.ca/Members/logan






Re: [Fis] fis Digest, Vol 570, Issue 2

2013-04-14 Thread Bob Logan
Dear Dr. Yan

I make reference to the MacKay quote in my article that can be found here: 
www.mdpi.com/2078-2489/3/1/68/pdf

I make reference to 
 Hayles, K. How We Became Posthuman; University of Chicago Press: Chicago, IL, 
USA, 1999.

where you will find an extensive discussion of Donald MacKay's formulation of 
his definition of information: "Information is a distinction that makes a 
difference" (MacKay).

If you google  "Information is a distinction that makes a difference" you will 
find many more references to this quote.

with kind regards - Bob Logan


On 2013-04-14, at 4:52 AM, Xueshan Yan wrote:

> "Information is a distinction that
> makes a difference"

__

Robert K. Logan
Chief Scientist - sLab at OCAD
Prof. Emeritus - Physics - U. of Toronto 
www.physics.utoronto.ca/Members/logan






Re: [Fis] [Fwd: SV: Science, Philosophy and Information. An Alternative Relation] S.Brier

2013-02-12 Thread Bob Logan
Dear Friends - etymology might be of some help in responding to Steven's 
question, "When you say "philosophers" do you mean "theorists?" And, if not, 
what distinguishes the two?" The philosopher is a lover of wisdom and 
philosophy is the love of wisdom. The theorist is the one that sees. Theory and 
theatre have the same root. The theorist is associated with science and science 
is about knowledge. There is a vast difference between knowledge and wisdom as 
"every wise man's son doth know". And then there is information. In my book 
Collaborate to Compete I suggest the following relationship between wisdom, 
knowledge, information, data and values.

"• Data are the pure and simple facts without any particular structure or 
organization, the basic atoms of information.

• Information is structured data, which adds meaning to the data and gives it 
context and significance.

• Knowledge is the ability to use information strategically to achieve one's 
objectives, and

• Wisdom is the capacity to choose objectives consistent with one's values and 
within a larger social context.

These definitions lead to our definition of KM:

Knowledge management is the collaborative organizational activity of creating 
the environment, both attitudinally and technologically, so that knowledge can 
be accessed, shared and created within an organization in a way that all of the 
experiences and knowledge within the enterprise, including that of all its 
staff, customers, suppliers and business partners, can be organized to achieve 
the enterprise's objectives and reinforce its values.

Knowledge, Value and Values

"Structuring or processing data to convert it into information gives that data 
added meaning and added value. Data is not information until it is processed. 
Information is not intelligence until it is efficiently communicated" (Vine, 
2000). In the same way that information structures data, knowledge structures 
information, giving it additional levels of meaning and providing it with 
utility. Information by itself, without knowledge, does not have utility. 
Wisdom augments and guides knowledge through values. Knowledge by itself does 
not create well-being. In fact, knowledge by itself can be very destructive and 
can be and has been used for evil ends. For example, at the root of our 
environmental crisis is the blind use of scientific and technological knowledge 
without the guidance of ecological wisdom. The laissez faire use of science, 
unmoderated by concern for human values, creates disservice in the long run. 
The instantaneous access to information that computers provide encourages the 
use of knowledge to achieve short-term goals without concern for the long-term 
effects. Wisdom and prudence buffer us from this impulse. At the core of the 
environmental challenges facing us today is the fact that data is transformed 
instantaneously into information and that the knowledge needed to exploit that 
information can be developed much faster than the wisdom that is needed to 
guide the use of that knowledge."

Now to return to the question that Steven asked: What would "a philosophical 
component" of information theory look like?

It depends on what information theory one is considering. If one's theory of 
information does not deal with meaning, then it cannot deal with values, nor 
with wisdom, and hence not with philosophy, the love of wisdom. So how could 
one have a philosophy of information theory? If, on the other hand, one's 
theory of information embraces meaning and hence value, one's theory of 
information would consider how information is used to achieve one's objectives 
consistent with one's values. 

Let me give the last word to a poet who raises still more questions that any 
philosopher of information theory must consider:

"Where is the wisdom we have lost in knowledge?

 Where is the knowledge we have lost in information?"   - T. S. 
Eliot

I hope my "philosophical" musings are of some help in this FIS dialog - Bob 
Logan 

PS - it occurs to me that philosophy is largely about opinion and is largely 
subjective, whereas science is objective and consists of propositions that can 
be falsified. Philosophical propositions are not falsifiable. Their value to 
any user or reader lies totally in the values of that user or reader, as "every 
wise man's son doth know". The line "every wise man's son doth know" comes from 
Shakespeare's play Twelfth Night and appears in the song O Mistress Mine of 
Feste who is described by Curio as follows: "Feste, the jester, my lord; a fool 
that the lady Olivia's father took much delight in."


On 2013-02-11, at 6:58 PM, Steven Ericsson-Zenith wrote:

> John,
> 
> When you say "philosophers" do you mean "theorists?" An

Re: [Fis] The Information Flow

2012-11-13 Thread Bob Logan
Hey Stan - I agree with the way you characterize the role of logic as a 
linguistic mechanism. Logic connects one set of statements, the premises, with 
another set of statements, the conclusion. Without challenging your remarks I 
would suggest that, as with the poets, it is sometimes useful to set aside the 
dictates of logic. McLuhan talked about the reversal of cause and effect. By 
this he meant, in the case of artists, that they start with the effect they 
wish to create and then find the causes that will produce it. In the case of 
technology McLuhan correctly noted that the effect of the telegraph was the 
cause of the telephone. But for me the interesting phenomenon where the logic 
of cause and effect does not hold is that of emergence and self-organization. 
In an emergent system, in which the properties of the system cannot be derived 
from, reduced to, or predicted from the properties of the components, the 
notion of cause and effect does not hold. The reductionist program of logical 
thinking does not do much to help us understand emergent phenomena. It is not 
that logic is wrong; it is that it is irrelevant. 
So if one is an emergentist one cannot be a mechanist. That is simple logic. ;-)
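To make the emergence point concrete with a toy model (my addition, not from 
the original exchange, assuming Python): in Conway's Game of Life the update 
rule is purely local and says nothing about motion, yet the five-cell "glider" 
pattern travels diagonally across the grid, a system-level behavior that is 
nowhere stated in the rule itself:

```python
from collections import Counter

def step(cells):
    """One Game of Life update. cells is a set of live (x, y) coordinates."""
    # Count how many live neighbors every cell (live or dead) has.
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next step with 3 live neighbors, or 2 if already alive.
    return {c for c, n in neighbor_counts.items()
            if n == 3 or (n == 2 and c in cells)}

# The glider: the local birth/death rule says nothing about movement,
# yet after 4 steps the whole pattern reappears shifted by (1, 1).
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)
print(state == {(x + 1, y + 1) for (x, y) in glider})  # True
```

The glider's travel is a property of the pattern as a whole, not of any single 
cell or rule clause, which is the sense of emergence at issue here.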

warm regards - Bob


On 2012-11-13, at 9:30 AM, Stanley N Salthe wrote:

> Bruno said --
> > but this does not mean that Mechanism is
> a good *explanation* of anything. On the contrary, I prefer to look at
> it as a tool, perhaps a simplifying tool, to *formulate* the problems
> (notably the mind-body problem), to explain it is not yet solved, even
> in that simplifying frame, etc.
> 
> Quite right.  Logic is a linguistic mechanism, and all of our philosophical 
> and scientific efforts are mediated by it.  We (except poets) are all 
> mechanists!
> 
> STAN
> 
> 
> On Tue, Nov 13, 2012 at 8:50 AM, Bruno Marchal  wrote:
> Dear Robert and FIS colleagues,
> 
> 
> On 12 Nov 2012, at 16:35, Robert Ulanowicz wrote:
> 
> > Dear Pedro,
> >
> > Rowman & Littlefield is coming out with a volume along those lines
> > entitled "Beyond Mechanism"
> >  > >
> >
> > As for our Chinese colleagues, I find them more open to non-mechanical
> > scenarios than are anglophones.
> 
> The problem is that few people defending mechanism are aware that
> mechanism is incompatible with weak materialism (the primary existence
> of a physical universe, that is more or less the current dogma/
> paradigm). Most proponents of Mechanism have still the 19th century
> conception of Mechanism, which is refuted by theoretical computer
> science/mathematical logic.
> 
> Mechanical entities are intrinsically related to non mechanical
> scenario. You cannot genuinely be open to mechanism without being open
> to the non-mechanical. Most predicate applying to machine are
> undecidable, non mechanical, etc. Machines, notably when they are self-
> observing, are confronted to the non mechanical, and *can* overcome it
> by relying on non mechanically generable informations.
> 
> I am not defending Mechanism, but as a logician I do invalidate
> widespread misconception on machines, and notably I try to explain
> that Digital Mechanism (computationalism) is quite the opposite of
> reductionism. I would even say that it might be used as a vaccine
> against reductionism in the exact and human science.
> 
> Mechanism, in the weak sense I am using, is quite plausible, as there
> are no evidences against it, but this does not mean that Mechanism is
> a good *explanation* of anything. On the contrary, I prefer to look at
> it as a tool, perhaps a simplifying tool, to *formulate* the problems
> (notably the mind-body problem), to explain it is not yet solved, even
> in that simplifying frame, etc. We are very ignorant of what are
> machines, and probably so, in case we assume we are ourself Turing
> emulable (with or without oracles).
> 
> 
> > All three of my books are being
> > translated into Chinese. The first one, "Growth and Development:
> > Ecosystems Phenomenology" has already been published.
> 
> Congratulation !
> 
> Best,
> 
> Bruno
> 
> > Quoting PEDRO CLEMENTE MARIJUAN FERNANDEZ :
> >
> >> Dear colleagues,
> >>
> >> Yes, the foundations are trembling... as usual during quite long a
> >> time. Maybe too many aspects have to be put into line in order to
> >> have new, more consistent foundations for human knowledge. Until now
> >> the different crises of Mechanics, the dominant scientific culture,
> >> have been "solved" at the small price of sweeping conceptual
> >> inconsistencies under the rug of brand new fields or subdisciplines
> >> while at the same time fictive claims of the unity of science,
> >> reductionism, etc. were upheld. Good for mechanics, as probably
> >> there were few competing options around --if any. Bad for the whole
> >> human knowledge, as multidisciplinary "schizophrenia" has been
> >> assumed as the natural state of 

Re: [Fis] Stephen Wolfram discussing his ANKS in Reedit this Monday

2012-05-15 Thread Bob Logan
Dear Friends - please do not take the following critique as disrespectful, but 
these amusing thoughts came into my head as I read that the cosmos is engaged 
in computing and is a cosmic computer. During the age of polytheism the 
different aspects of the cosmos were at war with each other and the cosmos was 
a battleground. Then with monotheism the cosmos bifurcated into the good side 
with angels fluttering about God sitting on a throne and the cosmos was 
basically praying and doing all kinds of good things except for the fallen 
angels who stoked the fires of the inferno somewhere in the netherworld. 

Then came Newton or as Alexander Pope put it

Nature and nature's laws lay hid in night;
God said "Let Newton be" and all was light.

Suddenly the cosmos was a machine, a clockworks, that God created, set in 
motion and the cosmos mechanically followed the Creator's law.

My couplet for our current time with attribution to Alexander Pope is:

Now we are in the brave new age of information  
God said "Let Turing be" and all was computation.

Suddenly the cosmos has evolved into a cosmic-super-computer having evolved 
from the clockwork universe which had in turn evolved from the dual domain of 
God's heaven and the Devil's hell which had in turn evolved from the 
battleground of the gods. What's next? Well, from biology we got the Gaia 
hypothesis, the earth as an organism. What would be the next step? Yes, you 
guessed it: the Cosmic Organism. Just google "cosmic organism" and you will 
find some 20,200 hits.

It does not stop there either - here are the following cosmoses as revealed by 
Google

quantum cosmos - 194,000 hits

string theory cosmos - 10,000 hits

holographic cosmos - 2,200 hits

black hole cosmos - 20,200 hits

In fact, take any metaphor from science or technology and someone will have a 
theory of how our cosmos operates in terms of that metaphor. 

With the concept of the multiverse or multiple universes we are back to the 
polytheistic world since every universe will have its own God. 

My conclusion is that there is one cosmos with multiple ways of describing it 
each of which employs a particular metaphor. The computational universe is just 
one example in a long line of cosmic metaphors and as our science and 
technology evolves so will the metaphors to describe this cosmos of ours. 

with kind regards - Bob Logan



On 2012-05-15, at 9:42 AM, Gordana Dodig-Crnkovic wrote:

> Dear Bob,
> 
> I am not sure if I have right to reply, but you make a very important remark.
> 
> The answer is: computing nature performs much more than existing computers.
> 
> What is computable in computing nature is what nature is able to perform 
> through its continuous changes.
> Dialectical processes are also typical in nature and thus in the framework of 
> computing nature, those also are computations.
> 
> In short the question is: what kind of computations are those dialectical 
> processes?
> 
> That is what we want to learn.
> 
> All the best,
> Gordana
> 
> 
> -Original Message-
> From: Robert Ulanowicz [mailto:u...@umces.edu] 
> Sent: den 15 maj 2012 15:36
> To: Gordana Dodig-Crnkovic
> Cc: Bruno Marchal; fis@listas.unizar.es
> Subject: Re: [Fis] Stephen Wolfram discussing his ANKS in Reedit this Monday
> 
> Quoting Gordana Dodig-Crnkovic :
> 
> 
>> 2.   Whatever changes in the states of the physical world there  
>> are, we understand them as computation.
> 
> Dear Gordana,
> 
> I'm not sure I agree here. For much of what transpires in nature (not  
> just in the living realm), the metaphor of the dialectic seems more  
> appropriate than the computational. As you are probably aware,  
> dialectics are not computable, mainly because their boundary value  
> statements are combinatorically intractable (sensu Kauffman).
> 
> It is important to note that evolution (which, as Chaisson contends,  
> applies as well to the history of the cosmos [and even the symmetrical  
> laws of force]) is driven by contingencies, not by laws. Laws are  
> necessary and they enable, but they cannot entail.
> 
> Regards,
> Bob
> 
> 
> ___
> fis mailing list
> fis@listas.unizar.es
> https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis

__

Robert K. Logan
Chief Scientist - sLab at OCAD
Prof. Emeritus - Physics - U. of Toronto 
www.physics.utoronto.ca/Members/logan




___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] Incomplete Nature - Completing the discussion that Terry started

2012-04-26 Thread Bob Logan
Hi Terry - I just finished your book literally 2 minutes ago. I just had to 
tell you what a brilliant and moving piece of work it is. I plan to go through 
it and write a lengthy review of it just to attempt to nail down all the ideas 
you developed in this amazing tome. And I agree "there is more here than stuff".

The first thing that comes to mind is that a conference should be organized 
around this book so we can discuss as a community the implication of the 
challenges you have presented. This is an important piece of work the 
implications of which need to be discussed.

I would be happy to help organize such a conference, though I do not think I 
have the resources to do so myself at OCAD University, but I will try. I can 
think of some organizations to which this idea would appeal. Perhaps the 
complexity conferences organized by Yaneer at NECSI would be an appropriate 
venue. I vividly remember a conversation that you and Stu had, in which I 
participated in a peripheral way, at one of Yaneer's conferences a number of 
years ago, which I believe was held in Boston.

Another possibility is the Society for Chaos Theory in Psychology & Life 
Sciences conference co-ordinated by Stephen Guastello

Another possibility is someone in the FIS group based primarily in Europe might 
want to do something.

{If any of the groups I mentioned are interested in a conference on Terry's 
book Incomplete Nature, please contact me so I can volunteer to help.}

I would want to involve Stuart Kauffman in such a conference as you refer to 
him frequently. I am very proud that you began Chapter 13 with the quote from 
Stu that was contained in the paper I co-authored with him and others.

Perhaps there is a conference already being planned in which case I would like 
to participate. 

Finally, you asked that I remind you to write a short piece for the special 
issue of the e-journal MDPI Information. So this is another reminder. An 
excerpt from your book would do as well.

Thanks so much for writing such a great book and stimulating so much thought. 

Bob
__

Robert K. Logan
Chief Scientist - sLab at OCAD
Prof. Emeritus - Physics - U. of Toronto 
www.physics.utoronto.ca/Members/logan




___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] [Fwd: Re: Physics of computing]--Plamen S.

2012-04-06 Thread Bob Logan
Dear Loet - sorry for the delayed response but I just returned from a 4 city 
lecture tour in Brazil introducing my book What is Information? and lecturing 
on Marshall McLuhan. I enjoyed your comments that you so elegantly framed. 

Terry Deacon in his wonderful book Incomplete Nature extends the "difference 
that makes a difference" definition of information to energy and defines work 
as energy that makes a difference. The discussion is on pp. 332-35. It is a 
fascinating discussion in a fascinating book. I cannot recommend it too highly. 

Thanks for your comments again and all the best - Bob


On 2012-03-18, at 2:57 AM, Loet Leydesdorff wrote:

> Dear Bob,
>  
> Yes, I agree: the difference that makes a difference is operationally 
> generated by a receiving system; information itself is nothing but a series 
> of differences (contained in a probability distribution). The selection 
> mechanisms in the receiving systems that position the incoming uncertainty 
> have to be specified (as hypotheses). Meaningful information emerges from 
> selecting the signal from the noise.
>  
> The meaningful information (the differences that make a difference) can again 
> be communicated as information (for example, in and among biological 
> systems). Thus, the operation is recursive and the communication / 
> autopoiesis continues. Meaning can only be communicated by systems which are 
> able to entertain a symbolic order reflexively such as human beings and in 
> interhuman discourses.
>  
> I’ll read the book by Reading.
> Best,
> Loet
>  
> Loet Leydesdorff
> Professor, University of Amsterdam
> Amsterdam School of Communications Research (ASCoR), 
> Kloveniersburgwal 48, 1012 CX Amsterdam. 
> Tel.: +31-20- 525 6598; fax: +31-842239111
> l...@leydesdorff.net ; http://www.leydesdorff.net/ ; 
> http://scholar.google.com/citations?user=ych9gNYJ&hl=en
>  
> From: fis-boun...@listas.unizar.es [mailto:fis-boun...@listas.unizar.es] On 
> Behalf Of Bob Logan
> Sent: Saturday, March 17, 2012 10:55 PM
> To: Stanley N Salthe
> Cc: fis
> Subject: Re: [Fis] FW: [Fwd: Re: Physics of computing]--Plamen S.
>  
> Stan - great formula but as I learned from Anthony Reading who wrote a lovely 
> book on information Meaningful Information - it is the recipient that brings 
> the meaning to the information. 
>  
> PS My book What is Information was been translated into Portuguese and 
> published in Brazil where I am doing a 4 city, 5 university speaking tour. 
> The book has not yet appeared in English but it is scheduled to be published 
> soon by Demo press.
>  
> Regards from Brazil - Bob
>  
>  
>  
> On 2012-03-17, at 11:17 AM, Stanley N Salthe wrote:
> 
> 
> Concerning the meaning (or effect) of information (or constraint) in general, 
> I have proposed that context is crucial in modulating the effect -- in all 
> cases.  Thus: it would be like the logical example:
>  
>  Effect = context a   x   Constraint ^context b
>  
> STAN
>  
>  
>  
> 
> On Fri, Mar 16, 2012 at 2:18 PM, Christophe Menant 
>  wrote:
> Dear FISers, 
> Indeed information can be considered downwards (physical & meaningless) and 
> upwards (biological & meaningful). The difference being about interpretation 
> or not. 
> It also introduces an evolutionary approach to information processing and 
> meaning generation.
> There is a chapter on that subject in a recent book 
> (http://www.amazon.co.uk/Information-Computation-Philosophical-Understanding-Foundations/dp/toc/9814295477).
>  
> “Computation on Information, Meaning and Representations.An Evolutionary 
> Approach”
> Content of the chapter:
> 1. Information and Meaning. Meaning Generation
> 1.1. Information.Meaning of information and quantity of information
> 1.2. Meaningful information and constraint satisfaction. A systemic approach
> 2. Information, Meaning and Representations. An Evolutionary Approach 
> 2.1. Stay alive constraint and meaning generation for organisms
> 2.2. The Meaning Generator System (MGS). A systemic and evolutionary approach
> 2.3. Meaning transmission
> 2.4. Individual and species constraints. Group life constraints. Networks of 
> meanings
> 2.5. From meaningful information to meaningful representations
> 3. Meaningful Information and Representations in Humans
> 4. Meaningful Information and Representations in Artificial Systems
> 4.1. Meaningful information and representations from traditional AI to 
> Nouvelle AI. Embodied-situated AI
> 4.2. Meaningful representations versus the guidance theory of representation
> 4.3. Meaningful information and representations versus the enactive approach
> 5. Conclusion and Continuation
> 5.1. Conclusion
> 5.2. Continuation
> A version 

Re: [Fis] [Fwd: Re: Physics of computing]--Plamen S.

2012-03-18 Thread Bob Logan
Dear Stanley - how can there be information in the abiotic world? Information 
is the noun associated with the verb to inform or informing. A rock cannot be 
informed. An abiotic entity cannot be informed. Information begins with life. 
A bacterium can be informed but not an abiotic entity. When we look at stars 
or the moon or a fossil, they are not information. Our interpretation of the 
things in nature we observe, biotic or abiotic, is the information. Perhaps I 
am missing something, but that is how I see things from my naive point of 
view. The star, the moon or the fossil are not signs unless you believe that 
God exists and he or she made these signs for us to interpret. What do you 
mean when you say that semiosis is a universal phenomenon? 

best Bob
On 2012-03-18, at 11:48 AM, Stanley N Salthe wrote:

> As my first posting for this week:
> 
> Bob, Loet -- I respond by clarifying that my meaning in this little equation 
> is that (following Sebeok) semiosis is a universal phenomenon.  The system of 
> interpretance in my effort here is the LOCALE.  It is such locales that have 
> evolved into organisms and social systems.  In organisms and other distinct 
> systems of interpretance, the sign is the context for interpretation.  So, in 
> the little equation, I am GENERALIZING semiosis into abiotic Nature.
> 
> STAN
>   
> 
> On Sun, Mar 18, 2012 at 2:57 AM, Loet Leydesdorff  
> wrote:
> Dear Bob,
> 
>  
> 
> Yes, I agree: the difference that makes a difference is operationally 
> generated by a receiving system; information itself is nothing but a series 
> of differences (contained in a probability distribution). The selection 
> mechanisms in the receiving systems that position the incoming uncertainty 
> have to be specified (as hypotheses). Meaningful information emerges from 
> selecting the signal from the noise.
> 
>  
> 
> The meaningful information (the differences that make a difference) can again 
> be communicated as information (for example, in and among biological 
> systems). Thus, the operation is recursive and the communication / 
> autopoiesis continues. Meaning can only be communicated by systems which are 
> able to entertain a symbolic order reflexively such as human beings and in 
> interhuman discourses.
> 
>  
> 
> I’ll read the book by Reading.
> 
> Best,
> 
> Loet
> 
>  
> 
> Loet Leydesdorff
> 
> Professor, University of Amsterdam
> Amsterdam School of Communications Research (ASCoR), 
> Kloveniersburgwal 48, 1012 CX Amsterdam. 
> Tel.: +31-20- 525 6598; fax: +31-842239111
> l...@leydesdorff.net ; http://www.leydesdorff.net/ ; 
> http://scholar.google.com/citations?user=ych9gNYJ&hl=en
> 
>  
> 
> From: fis-boun...@listas.unizar.es [mailto:fis-boun...@listas.unizar.es] On 
> Behalf Of Bob Logan
> Sent: Saturday, March 17, 2012 10:55 PM
> To: Stanley N Salthe
> Cc: fis
> Subject: Re: [Fis] FW: [Fwd: Re: Physics of computing]--Plamen S.
> 
>  
> 
> Stan - great formula but as I learned from Anthony Reading who wrote a lovely 
> book on information Meaningful Information - it is the recipient that brings 
> the meaning to the information. 
> 
>  
> 
> PS My book What is Information was been translated into Portuguese and 
> published in Brazil where I am doing a 4 city, 5 university speaking tour. 
> The book has not yet appeared in English but it is scheduled to be published 
> soon by Demo press.
> 
>  
> 
> Regards from Brazil - Bob
> 
>  
> 
>  
> 
>  
> 
> On 2012-03-17, at 11:17 AM, Stanley N Salthe wrote:
> 
> 
> 
> 
> Concerning the meaning (or effect) of information (or constraint) in general, 
> I have proposed that context is crucial in modulating the effect -- in all 
> cases.  Thus: it would be like the logical example:
> 
>  
> 
>  Effect = context a   x   Constraint ^context b
> 
>  
> 
> STAN
> 
>  
> 
>  
> 
>  
> 
> On Fri, Mar 16, 2012 at 2:18 PM, Christophe Menant 
>  wrote:
> 
> Dear FISers, 
> Indeed information can be considered downwards (physical & meaningless) and 
> upwards (biological & meaningful). The difference being about interpretation 
> or not. 
> It also introduces an evolutionary approach to information processing and 
> meaning generation.
> There is a chapter on that subject in a recent book 
> (http://www.amazon.co.uk/Information-Computation-Philosophical-Understanding-Foundations/dp/toc/9814295477).
>  
> “Computation on Information, Meaning and Representations.An Evolutionary 
> Approach”
> Content of the chapter:
> 1. Information and Meaning. Meaning Generation
> 1.1. Information.Meaning of information and quantity of information
> 1.2. Meaningful informat

Re: [Fis] FW: [Fwd: Re: Physics of computing]--Plamen S.

2012-03-17 Thread Bob Logan
Stan - great formula but as I learned from Anthony Reading, who wrote a lovely 
book on information, Meaningful Information, it is the recipient that brings 
the meaning to the information. 

PS My book What is Information has been translated into Portuguese and 
published in Brazil where I am doing a 4 city, 5 university speaking tour. The 
book has not yet appeared in English but it is scheduled to be published soon 
by Demo press.

Regards from Brazil - Bob



On 2012-03-17, at 11:17 AM, Stanley N Salthe wrote:

> Concerning the meaning (or effect) of information (or constraint) in general, 
> I have proposed that context is crucial in modulating the effect -- in all 
> cases.  Thus: it would be like the logical example:
> 
>  Effect = context a   x   Constraint ^context b
> 
> STAN
> 
> 
> 
> 
> On Fri, Mar 16, 2012 at 2:18 PM, Christophe Menant 
>  wrote:
> Dear FISers, 
> Indeed information can be considered downwards (physical & meaningless) and 
> upwards (biological & meaningful). The difference being about interpretation 
> or not. 
> It also introduces an evolutionary approach to information processing and 
> meaning generation.
> There is a chapter on that subject in a recent book 
> (http://www.amazon.co.uk/Information-Computation-Philosophical-Understanding-Foundations/dp/toc/9814295477).
>  
> “Computation on Information, Meaning and Representations.An Evolutionary 
> Approach”
> Content of the chapter:
> 1. Information and Meaning. Meaning Generation
> 1.1. Information.Meaning of information and quantity of information
> 1.2. Meaningful information and constraint satisfaction. A systemic approach
> 2. Information, Meaning and Representations. An Evolutionary Approach 
> 2.1. Stay alive constraint and meaning generation for organisms
> 2.2. The Meaning Generator System (MGS). A systemic and evolutionary approach
> 2.3. Meaning transmission
> 2.4. Individual and species constraints. Group life constraints. Networks of 
> meanings
> 2.5. From meaningful information to meaningful representations
> 3. Meaningful Information and Representations in Humans
> 4. Meaningful Information and Representations in Artificial Systems
> 4.1. Meaningful information and representations from traditional AI to 
> Nouvelle AI. Embodied-situated AI
> 4.2. Meaningful representations versus the guidance theory of representation
> 4.3. Meaningful information and representations versus the enactive approach
> 5. Conclusion and Continuation
> 5.1. Conclusion
> 5.2. Continuation
> A version close to the final text can be reached at 
> http://crmenant.free.fr/2009BookChapter/C.Menant.211009.pdf
> 
> As Plamen says, we may be at the beginning of a new scientific revolution. 
> But I’m afraid that an understanding of the meaning of information needs 
> clear enough an understanding of the constraint at the source of the meaning 
> generation process. And even for basic organic meanings coming from a “stay 
> alive” constraint, we have to face the still mysterious nature of life. And 
> for human meanings, the even more mysterious nature of human mind.
> This is not to discourage our efforts in investigating these questions. Just 
> to put a stick in the ground showing where we stand. 
> Best,
> Christophe 
> Date: Fri, 16 Mar 2012 13:47:28 +0100
> From: pcmarijuan.i...@aragon.es
> To: fis@listas.unizar.es
> Subject: [Fis] [Fwd: Re: Physics of computing]--Plamen S.
> 
>  Mensaje original 
> Asunto:   Re: [Fis] Physics of computing
> Fecha:Fri, 16 Mar 2012 13:24:38 +0100
> De:   Dr. Plamen L. Simeonov 
> Para: Pedro C. Marijuan 
> Referencias:  <20120316041607.66ffc68000...@1w8.tpn.terra.com> 
> <4f6321c3.5000...@aragon.es>
> 
> 
> +++
> 
> Dear All,
> 
> I could not agree more with Pedro's opinion. The referred article is 
> interesting indeed. but, information is only physical in the narrow sense 
> taken by conventional physicalistic-mechanistic-computational approaches. 
> Such a statement defends the reductionist view at nature: sorry. But 
> information is more than bits and Shannon's law, and biology has far more to 
> offer. I think we are at the beginning of a new scientific revolution. So, we 
> may need to take our (Maxwell) "daemons" and (Turing) "oracles" closer under 
> the lens. In fact, David Ball, the author of the Nature paper approached me 
> after my talk in Brussels in 2010 on the Integral Biomathics approach and 
> told me he thinks it were a step in the right direction: biology driven 
> mathematics and computation. 
> 
> By the way, our book of ideas on IB will be released next month by Springer: 
> http://www.springer.com/engineering/computational+intelligence+and+complexity/book/978-3-642-28110-5
> If you wish to obtain it at a lower price (65 EUR incl. worldwide delivery) 
> please send me your names, mailing addresses and phone numbers via email to: 
> pla...@simeio.org. There must be at least 9 orders to keep that discount 
> price..
> 
> Best,
> 
> Plamen
> 
> 
>

Re: [Fis] WG: stuff and non-stuff

2012-02-28 Thread Bob Logan
Dear Joe - thanks for the honourable mention - I am in fact working my way 
through Terry's masterpiece (on p. 200) and agree with your evaluation of it. 
I am 
making careful notes which I will be happy to share with the FIS team. Best 
wishes to all - Bob

 
On 2012-02-28, at 1:16 PM, joe.bren...@bluewin.ch wrote:

> Dear Pedro, John and Colleagues,
> 
> The article by Terrence Deacon in the book referred to by John is entitled 
> "What is Missing from Theories of Information?" and, as Pedro has indicated, 
> it and Deacon's new book Incomplete Nature. How Mind Emerged from Matter may 
> be major new additions to the foundations of information. Among other things, 
> far from supporting "it from bit", Deacon provides expert arguments against 
> this position, adopted indeed in a majority of the other articles in the 
> Davies compendium.
> 
> Deacon's key point is that what is missing from theories is operation in 
> reality of constraints, extending their role discussed previously by Stuart 
> Kauffmann, Bob Logan, Bob Ulanowicz, Stan and John himself and focussing on 
> what, as the consequence of constraints, is absent in information and other 
> complex processes.
> 
> I hope that many colleagues will make the effort to access this material so 
> that we may achieve a critical mass for its discussion and evaluation.
> 
> Best wishes,
> 
> 
> Ursprüngliche Nachricht
> Von: pcmarijuan.i...@aragon.es
> Datum: 21.02.2012 18:02
> An: 
> Betreff: [Fis] stuff and non-stuff
> 
> Dear FIS colleagues,
> 
> John's comments below on that book are quite interesting. Most approaches to 
> information rely on "stuff" and "organization of stuff" --information is 
> inevitably physical, as Rolf Landauer put long ago. However, "non stuff" and 
> "organization of "non stuff" might be taken as central ideas too, e.g. in 
> Deacon's approach --through the notion of absence. Deacon is one of the main 
> contributors of that book, and author of another very recent info book that 
> has already been referred in this list, by Joseph I think.
> 
> My further point, to connect with an unfinished message on info science 
> teaching some weeks ago, is that genuine informational entities, those 
> capable of making "distinctions" that are used for self-constructing in 
> permanent communication with the medium, deserve a special status within the 
> whole info science studies. These distinctional entities are but the great 
> players of the absence game... Therfor info science teaching should cover 
> "central themes", "multidisciplinary recombinations", and the comparative 
> study of "informational-distinctional entities."
> 
> Best wishes to all!
> 
> ---Pedro
> John Collier escribió:
>> 
>> Hi all,
>> I am reviewing a book edited by Paul Davies and Niels Henrik Gregersen 
>> titled Information and the Nature of Reality: From Physics to Metaphysics. 
>> There is a lot of quasireligious stuff that I find hard to swallow, mostly 
>> by people I have never heard of before, but many of the chapters are by 
>> well-known scholars who have been influential in physics and biology, as 
>> well as the history of science. The most common thread through the articles 
>> is that the world is not made up of "stuff" (matter), and that the idea has 
>> been problematic since its introduction. Instead the world is made of 
>> information (the "It from Bit" view). Interesting book, even if you don't 
>> agree with it.
>> John
>> 
>> Professor John Collier  
>> Philosophy, University of KwaZulu-Natal
>> Durban 4041 South Africa
>> T: +27 (31) 260 3248 / 260 2292
>> F: +27 (31) 260 3031
>> email: colli...@ukzn.ac.za>>> On 2012/01/23 at 07:18 PM, in message 
>> <4f1d967a.8070...@aragon.es>, "
> 
> --
> -
> Pedro C. Marijuán
> Grupo de Bioinformación / Bioinformation Group
> Instituto Aragonés de Ciencias de la Salud
> Avda. Gómez Laguna, 25, Pl. 11ª
> 50009 Zaragoza, Spain
> Telf: 34 976 71 3526 (& 6818) Fax: 34 976 71 5554
> pcmarijuan.i...@aragon.es
> http://sites.google.com/site/pedrocmarijuan/
> -
> 
> 
> 
> 
> ___
> fis mailing list
> fis@listas.unizar.es
> https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis

__

Robert K. Logan
Chief Scientist - sLab at OCAD
Prof. Emeritus - Physics - U. of Toronto 
www.physics.utoronto.ca/Members/logan




___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] THEORY AND SCIENCE From QTQ

2012-01-11 Thread Bob Logan
Caro Colleagues - Etymology is always a useful exercise - theory and theatre 
come from the same root 'to see' and science from the root 'to know'. "The word 
theoria θεωρία, meant 'a looking at, viewing, beholding', and referring to 
contemplation or speculation (Wikipedia article on theory)." Now we most often 
say that to see is to believe or to know. This translates into the idea that 
theory helps us to believe or to know, that is, to do science. But Marshall 
McLuhan said, "I did not see it until I believed it." In other words, to 
believe or to know is a necessary condition for being able to see, which 
translates into the idea that we need science to make a theory. Science is a 
way of organizing theories and 
theories are a way of organizing science. The relation of science and theory is 
a chicken and egg problem. It is a question of emergence. The relation of 
science and theory is one of non-linear dynamics. One needs science to make a 
theory and a theory or theories to make science. As is the case with emergent 
phenomena one cannot predict what theories will emerge from science or what 
science will emerge from theories. This is my theory of science and my science 
of theories.

I hope you enjoyed my playful take on the relation of science and theory which 
I offer as a serious resolution to the challenges raised in this thread. I also 
hope you will comment.

with kind regards - Bob Logan

 
On 2012-01-11, at 2:42 AM, Krassimir Markov wrote:

> Dear QTQ and FIS Colleagues,
>  
> I am afraid we did not define the terms “theory” and “science” before 
> starting the discussion.
> Let us first clarify what they mean and only then draw conclusions.
>  
> It is clear that theories are part of science, but science is something 
> more.
>  
> In our area, Information Science is much more than any single theory of 
> information.
>  
> Of course, “What is information?” is the basic question, but it is followed 
> by the questions “How is it used by living organisms?”, “Why is it needed by 
> social structures?”, “Why is one and the same reflection information for one 
> subject but not for another?”, and “Can information be totally destroyed or 
> not, i.e. does information depend on the physical (material) world or 
> not?”, etc.
>  
> ---
>  
> About the journals:
>  
> I have more than 35 year experience in editing and publishing scientific 
> collections and journals.
> My personal position is: “The Variety and Independence cause Development!” 
>  
> Every journal has its own policy and there is no sense in discussing its 
> rules.
> My personal position is presented in the name of my first Int. Journal: 
> “Information Theories and Applications”.
> Yes – Theories.
>  
> Some scientists prefer to be protected from “spam” papers by the reviewers, 
> but this has a simple solution – “Do not read papers at all!”
>  
> The science needs new ideas.
> Sometimes the proper way lies not in the fat books but in the thin papers. 
> We have a long history of development, but are we sure that everything in it 
> (especially concerning information) is absolutely correct and may be 
> accepted without doubt?
>  
> ---
>  
> Happy New 2012 Year!
> Friendly regards
>  
> Krassimir
>  
>  
>  
>  
> From: Pedro C. Marijuan
> Sent: Tuesday, January 10, 2012 5:48 PM
> To: fis@listas.unizar.es
> Subject: [Fis] [Fwd: THEORY AND SCIENCE] From QTQ
>  
> 
> 
>  Mensaje original 
> Asunto:   THEORY AND SCIENCE
> Fecha:Tue, 10 Jan 2012 10:49:58 +0800
> De:   whhbs...@sina.com
> Responder a:  whhbs...@sina.com
> Para: Pedro C. Marijuan mailto:pcmarijuan.i...@aragon.es, mjs 
> mailto:m...@aiu.ac.jp, Joseph Brenner mailto:joe.bren...@bluewin.ch, fislist 
> mailto:fis@listas.unizar.es
> 
> 
> Dear Pedro, Dear Marcin, Dear Joseph, Dear FIS Colleagues,
> Theory is important and necessary, but theory is different from science; 
> theory is a growing view or hypothesis. We should remember Russell's paradox 
> and the third crisis in the foundations of mathematics, we should remember 
> Aristotle and Galileo, and we should not forget the article about MDMA by 
> Jan Hendrik Schön in 2002. Let us remember that history usually goes on 
> record according to governmental volition. Furthermore, a lie is not 
> science; consider, for example, Hwang's cloning experiments in stem-cell 
> research.
> Newton's mechanics is science and computer science is true; however, there 
> is no information science. The sole criterion for truth or science is 
> practice.
> Of course, after the Hwang affair, Science (the journal) gets its wrist 
> slapped for publishing a fraudulent stem-cell paper. Instead, the committee 
> recommends that papers received by the journal should be divided into 
> uncontroversial and 

Re: [Fis] [Fwd: THEORY AND SCIENCE] From QTQ

2012-01-10 Thread Bob Logan
Dear Gyorgy - well said - I agree with you and thank you for your excellent 
comments - Bob Logan


On 2012-01-10, at 11:23 AM, Gyorgy Darvas wrote:

> I disagree!
> Theory is a part, an important element, of science.
> Practice should check, and can confirm or falsify, a theory, but cannot 
> replace or discredit its mission.
> 
> As regards journals, my position is that they must give space to 
> controversial ideas. The task of a journal is to provide a forum for 
> discussion. A journal (i.e., its editors) and the reviewers do not need to 
> share the opinions put forward in the submitted papers. Scientific truth can 
> be reached only if controversial ideas are discussed. That means scientific 
> truth is shaped as a result of discussion. The task of the reviewers is not 
> to take over from the scientific community the responsibility of deciding on 
> the correctness of the ideas, incl. theories, expressed in the papers. (They 
> can check whether the mathematical derivations are not mistaken, whether 
> experiments are documented and reproducible, etc.) Ideas can be discussed 
> only if they are published for a wide circle of scientists, so that anyone 
> who wants can express her/his supporting or opposing ideas and arguments. 
> This is a key to the development of science.
> 
> Regards,
> Gyuri
> 
> 
> 
> At 16:48 2012.01.10, Pedro C. Marijuan wrote:
> 
> 
>>  Mensaje original  
>> Asunto: THEORY AND SCIENCE 
>> Fecha: Tue, 10 Jan 2012 10:49:58 +0800 
>> De: whhbs...@sina.com 
>> Responder a: whhbs...@sina.com 
>> Para: Pedro C. Marijuan , mjs , 
>> Joseph Brenner  , fislist  
>> 
>> 
>> Dear Pedro, Dear Marcin, Dear Joseph, Dear FIS Colleagues, 
>> 
>> Theory is important and necessary, but theory is different from science; 
>> theory is a growing view or hypothesis. We should remember Russell's 
>> paradox and the third crisis in the foundations of mathematics, we should 
>> remember Aristotle and Galileo, and we should not forget the article about 
>> MDMA by Jan Hendrik Schön in 2002. Let us remember that history usually 
>> goes on record according to governmental volition. Furthermore, a lie is 
>> not science; consider, for example, Hwang's cloning experiments in 
>> stem-cell research. 
>> 
>> Newton's mechanics is science and computer science is true; however, there 
>> is no information science. The sole criterion for truth or science is 
>> practice.
>> 
>> Of course, after the Hwang affair, Science (the journal) gets its wrist 
>> slapped for publishing a fraudulent stem-cell paper. Instead, the committee 
>> recommends that papers received by the journal should be divided into 
>> uncontroversial and controversial, and the latter gone over with the fine 
>> comb of a new check. This is a way to arrive at science, too.
>> 
>> Best wishes
>> 
>> 
>> 
>> QTQ
>> 
>> 
>> 
>> 
>> 
> 
> Recent publications online:
> - Mathematical description of a so far undisclosed symmetry of nature:
> http://arxiv.org/abs/0811.3189v1
> - Physical consequences of a new gauge-symmetry and the concluded 
> conservation law:
> http://www.springerlink.com/content/g28q43v2112721r1/
> - Spontaneous symmetry breaking in non-Euclidean systems:
> http://www.springerlink.com/content/k272555u06q2074w/?p=14dd4c9c5b5e4c1396b3e4855a87e9e2&pi=1
>  
> __
> Gyorgy Darvas
> E-mail / Skype;  S Y M M E T R I O N
> Mailing address: c/o G. Darvas; 29 Eotvos St., Budapest, H-1067 Hungary
> Phone: 36 (1) 302-6965;  
> Monograph: Symmetry;  Course of lectures
> ___
> 







Re: [Fis] Curious chronicle

2010-07-19 Thread bob logan
Stanley - I have often had this thought. Better to compete on the football 
pitch than the battlefield. The term hero is used for those who excel in 
either form of battle. Bob


On 19-Jul-10, at 2:12 PM, Stanley N Salthe wrote:

Has anyone suggested the function of contact sports to be the 'moral 
equivalent' of war? Many young men require this kind of excitement because of 
their hormone mix.


STAN

On Mon, Jul 19, 2010 at 10:55 AM, Pedro C. Marijuan  
 wrote:

Dear FISers,

Looking for an informational explanation of soccer, or of other sports, as 
Joseph was asking, one can look at the internal side of the event. Then, as 
Jorge and Bob have done, one can discuss the panorama of networking 
relationships or the "ascendancy" of the different elements. While agreeing 
with the interest of these approaches, one can also look outward and ask 
about the social importance attributed to this type of spectacle. It is 
interesting that today a lot of economic activities, like sports, may be 
ascribed to ephemeral "information production" -- think of entertainment, 
news, fashions, e-networks, communications, tourism, etc. Maybe this is the 
fastest growing segment, even in spite of the global crisis. Why the 
increasing predominance of "panem et circenses"?
A speculative point may be that complex societies are caught in an 
information paradox: the more they grow in aggregate complexity, the weaker 
the structure of basic social relations (and interesting information) around 
the individual becomes. According to Dunbar's "social brain hypothesis", 
these complex societies deviate progressively from the evolutionary 
networking structure of our species. Thus "info" surrogates of whatever type 
become more and more necessary for the individual and for the society as a 
whole, although they probably work worse and worse. If this is so, it makes 
sense that in the "information era" depression has become the leading 
incapacitating pathology (ahead of flu).


Unfortunately, the victory at the world championship has been so  
ephemeral!


best wishes

Pedro
PS. As a question to Karl: to what extent are directed graphs (or generic 
networks) equivalent to multidimensional partitions? Would the description of 
"ascendancy" in terms of partitions make sense?



Robert Ulanowicz escribió:


Dear Jorge and Fis members:

The method is intriguing, but rather ad-hoc.

I and colleagues in marine science have directly used
information-theoretic indexes to evaluate the dynamically most
important nodes and links in a quantified network. I'm convinced it
could be applied as well to players on a team:

Ulanowicz, R.E. and D. Baird. 1999. Nutrient controls on ecosystem
   dynamics:  The Chesapeake mesohaline community.  J.  Mar.
   Sys. 19:159-172
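The flavour of index Ulanowicz describes can be sketched numerically. The following is a minimal illustration of an ascendency-style calculation (the average mutual information of a flow matrix, scaled by total throughput); the three-compartment network and its flow values are invented for the example and are not taken from the cited paper:

```python
import numpy as np

def ami_and_ascendency(T):
    """Average mutual information (bits) and ascendency of a flow matrix
    T[i, j] = flow from compartment i to compartment j."""
    total = T.sum()
    out_flow = T.sum(axis=1)  # total outflow of each compartment
    in_flow = T.sum(axis=0)   # total inflow of each compartment
    ami = 0.0
    for i, j in zip(*np.nonzero(T)):
        ami += (T[i, j] / total) * np.log2(T[i, j] * total
                                           / (out_flow[i] * in_flow[j]))
    return ami, total * ami  # ascendency A = total throughput * AMI

# Hypothetical 3-compartment network; the flow values are invented.
T = np.array([[0.0, 8.0, 2.0],
              [0.0, 0.0, 6.0],
              [1.0, 0.0, 0.0]])
ami, A = ami_and_ascendency(T)
print(round(ami, 3), round(A, 3))
```

Because AMI is a mutual information it is never negative; links whose flows are more constrained (less evenly spread) raise it, which is what makes it a candidate measure of a node's or link's dynamical importance.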

The best,
Bob Ulanowicz


- 


Robert E. Ulanowicz|  Tel: +1-352-378-7355
Arthur R. Marshall Laboratory  |  FAX: +1-352-392-3704
Department of Biology  |  Emeritus, Chesapeake Biol. Lab
Bartram Hall 110   |  University of Maryland
University of Florida  |  Email 
Gainesville, FL 32611-8525 USA |  Web www.cbl.umces.edu/~ulan>
- 
-



Quoting Jorge Navarro López :



Dear FIS collegaes,

Hi! This is my first posting to the list. My name is Jorge Navarro and I am 
working with Pedro on Systems Biology and Network Science. Following Joseph's 
proposal, I have found an interesting paper about a satisfactory theory of 
information applicable to team sports:

*Quantifying the Performance of Individual Players in a Team  
Activity*


http://www.plosone.org/article/info:doi/10.1371/journal.pone.0010937


I think that, formally, one can say a lot about which team activities become 
interesting and exciting to watch, and which other activities are dull and 
boring. I was playing soccer myself until a few years ago (as a forward), 
like Villa :-), and I am very interested in the informational side of sports, 
soccer of course.

VIVA ESPAÑA!!!

Kind Regards

Jorge


--









Re: [Fis] Derrida's "diferAnce" and Kolmogorov's Information Operator

2010-02-22 Thread bob logan
Dear Joseph - once again your post was most stimulating, provocative  
and enjoyable. Kolmogorov's definition of information that you quote  
is most interesting but like Shannon's definition incorporates the  
notion that information is a quantitative concept that can be  
measured. The Bateson definition that you refer to was a critique of  
this notion of information as a quantitative measure. The criticism  
began with  MacKay (1969 Information, Mechanism and Meaning.  
Cambridge MA: MIT Press.) who wrote "Information is a distinction  
that makes a difference" which Bateson (1973 Steps to an Ecology of  
Mind. St. Albans: Paladin Frogmore.) then built on to come up with  
the more popular: "Information is a difference that makes a  
difference". MacKay was the first to critique Shannon's quantitative  
definition of information when Shannon wrote his famous definition:   
"We have represented a discrete information source as a Markoff  
process. Can we define a quantity, which will measure, in some sense,  
how much information is ‘produced’ by such a process, or better, at  
what rate information is produced?" – Shannon (1948. A mathematical theory of 
communication. Bell System Technical Journal, vol. 27, pp. 379-423 and 
623-656, July and October, 1948.)
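Shannon's question about the rate at which a Markoff source "produces" information has a standard quantitative answer, the entropy rate. A minimal sketch, with transition matrices made up for illustration:

```python
import numpy as np

def entropy_rate(P):
    """Entropy rate in bits per symbol of a stationary Markov source with
    transition matrix P (rows sum to 1)."""
    # Stationary distribution: left eigenvector of P for eigenvalue 1.
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    pi = pi / pi.sum()
    logs = np.zeros_like(P)
    mask = P > 0
    logs[mask] = np.log2(P[mask])
    return float(-np.sum(pi[:, None] * P * logs))

# A source that tends to repeat its last symbol is more predictable, so it
# produces information at a lower rate than a memoryless fair coin.
sticky = np.array([[0.9, 0.1],
                   [0.1, 0.9]])
fair = np.array([[0.5, 0.5],
                 [0.5, 0.5]])
print(entropy_rate(sticky))  # ~0.469 bits/symbol
print(entropy_rate(fair))    # ~1.0 bits/symbol
```

This is precisely a measure of the signal, in line with the point above: it quantifies how much "surprise" the source emits per symbol, and says nothing about what the symbols mean.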


According to Claude Shannon (1948, p. 379) his definition of  
information is not connected to its meaning. Weaver concurred in his  
introduction to Shannon’s A Mathematical Theory of Communication when  
he wrote: “Information has ‘nothing to do with meaning’ although it  
does describe a ‘pattern’." Shannon also suggested that information  
in the form of a message often contains meaning but that meaning is  
not a necessary condition for defining information. So it is possible  
to have information without meaning, whatever that means.


Not all of the members of the information science community were  
happy with Shannon’s definition of information. Three years after  
Shannon proposed his definition of information Donald Mackay (1951)  
at the 8th Macy Conference argued for another approach to  
understanding the nature of information. The highly influential Macy  
Conferences on cybernetics, systems theory, information and  
communications were held from 1946 to 1953 during which Norbert  
Wiener’s newly minted cybernetic theory and Shannon’s information  
theory were discussed and debated with a fascinating  
interdisciplinary team of scholars which also included Warren  
McCulloch, Walter Pitts, Gregory Bateson, Margaret Mead, Heinz von  
Foerster, Kurt Lewin and John von Neumann. MacKay argued that he did  
not see “too close a connection between the notion of information as  
we use it in communications engineering and what [we] are doing here…  
the problem here is not so much finding the best encoding of symbols… 
but, rather, the determination of the semantic question of what to  
send and to whom to send it.” He suggested that information should be  
defined as “the change in a receiver’s mind-set, and thus with  
meaning” and not just the sender’s signal (Hayles 1999b, p. 74). The  
notion of information independent of its meaning or context is like  
looking at a figure isolated from its ground. As the ground changes  
so too does the meaning of the figure.


The last two paragraphs are an excerpt from my new book What is  
Information? to be published by the University of Toronto Press in  
late 2010 or early 2011. Your post Joseph has stimulated the  
following thoughts that I hope to add to my new book before it is  
typeset.


As MacKay and Bateson have argued there is a qualitative dimension to  
information not captured by the Shannon Weaver quantitative model nor  
by Kolmogorov's definition. Information is multidimensional. There is  
a quantitative dimension as captured by Shannon and Kolmogorov and a  
qualitative one of meaning as captured by MacKay and Bateson but one  
can think of other dimensions as well. In responding to a  
communication by Joseph Brenner on the Foundations of Information  
(FIS) listserv I described the information that he communicated as  
stimulating, provocative and enjoyable. Brenner cited the following  
Kolmogorov definition of information as “any operator which changes  
the distribution of probabilities in a given set of events.”  
Brenner's information changed the distribution of my mental events to  
one of stimulation, provocation and enjoyment and so there is  
something authentic that this definition of Kolmogorov captures that  
his earlier cited definition of information as "the minimum  
computational resources needed to describe a program or a text" does  
not. We therefore conclude that not only is there a relativistic component to 
information but it is also multidimensional and not unidimensional, as is the 
case with Shannon information.
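Kolmogorov's "minimum computational resources" measure is uncomputable in general, but any off-the-shelf compressor yields a crude upper bound on it, which makes the quantitative dimension easy to illustrate. A sketch, with invented example data:

```python
import os
import zlib

def complexity_upper_bound(data: bytes) -> int:
    """Compressed size in bytes: a crude, computable upper bound on the
    'minimum computational resources needed to describe' the data."""
    return len(zlib.compress(data, 9))

regular = b"ab" * 500        # 1000 bytes of pure repetition
random_ = os.urandom(1000)   # 1000 random bytes, essentially incompressible

print(complexity_upper_bound(regular))   # small: a short description suffices
print(complexity_upper_bound(random_))   # close to (or above) 1000 bytes
```

The contrast makes the point of the paragraph above concrete: the repetitive string and the random bytes can carry very different quantitative measures while the measure itself remains silent about meaning.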


Joseph - many thanks for your stimulating post - I look forward to  
your comments on this riff on your thoughts. - Bob


Re: [Fis] The Information Cluster

2009-10-27 Thread bob logan
Dear Pedro - I want to reinforce the point you make in 3. KNOWLEDGE 
ACCRETION & RECOMBINATION. I completely agree with the sentiments expressed 
there. I like to say that the soft issues are the hardest problems. The 
science and technology issues are easy to solve. Our scientific and 
technological knowledge has outstripped our ability to integrate these 
technical advances into our social systems, with disastrous results, and the 
worst is yet to come, such as the possibility of a runaway greenhouse effect. 
I applaud your clarion voice.

Bob


On 27-Oct-09, at 8:20 AM, Pedro C. Marijuan wrote:

> Dear All,
>
> My impression after the recent --and intense-- exchange on "knowledge
> definition" is that the discussion should also be centered in the  
> quest
> for a new conceptual information cluster, globally, rather than trying
> to achieve consensus upon a single term, or proceeding term by term.
> Well, here is my particular bio-inspired proposal:
>
> 1. INFORMATION
> Which has to be treated as "distinction on the adjacent". Temporarily
> leaving aside the adjacency part, a discussion on the "grammar of
> distinctions" is needed. Thus, a possible concrete case could be   
> color
> vision.  How do we build the panorama of color nuances out from the
> different types of cones activated in the retina? If the idea of
> multidimensional partitions can be applied, this will be an excellent
> test on how far we can go along that road. Colors arisen from
> combinations of colors should respond to, and be captured by, the
> partitional treatment (including the curious "disappearance" of  
> colors...).
>
> 2. INFORMATION PROCEEDING
> The central idea is that, actually, information  is never "processed",
> except in the realm of artificial systems. As natural information
> becomes internalized by the living subject, it is mingled with the
> internal life processes, and proceeds up to the point of
> irretrievability. Thus, information always changes the subject (no  
> such
> thing as a "separate" observer in the information realm). Those  
> changes
> in the subject, echoing late Tom Stonier in this very list (in the  
> 90's)
> posting on "semantic metabolism", would include further aspects about
> meaning generation, response, relevance to the life cycle, closure,  
> and
> fitness.
>
> 3. KNOWLEDGE ACCRETION & RECOMBINATION
> Two  exemplary cases might be envisioned for both accretion and
> recombination: protein domains in the prokaryotic world, and the
> disciplinary knowledge of our societies. Perhaps, at the time  
> being, the
> social discussion looks more "timely". The point is that past
> conceptions of knowledge do not help in the dramatic knowledge
> recombination efforts needed to confront global problems of today. Per
> se, the idealizations of the scientific method eliminate the social
> process of accretion of specialized knowldege and the later
> social-institutional recombination of the isolated disciplines, which
> are often seen as mere pragmatic, "dirty", pork-barrel matters... a  
> new
> science, information science, with a completely different abstraction
> procedure is needed (no rhetoric!)
>
> best greetings
>
> Pedro
>
> PS. Finally I am working on a web site (awfully in construction yet),
> see below. Comments will be appreciated.
>
> -- 
>
> -
> Pedro C. Marijuán
> Grupo de Bioinformación / Bioinformation Group
> Instituto Aragonés de Ciencias de la Salud
> Avda. Gómez Laguna, 25, Pl. 11ª
> 50009 Zaragoza. España / Spain
> Telf: 34 976 71 3526 (& 6818) Fax: 34 976 71 5554
> pcmarijuan.i...@aragon.es
> http://sites.google.com/site/pedrocmarijuan/
> -
>
>
>
>
>




Re: [Fis] Neurodynamic Central Theory

2009-09-25 Thread bob logan
Pedro - I was interested in this project but thought there was no room for a 
Canadian. I would love to participate if that is at all possible. My work on 
the origin of language in my book The Extended Mind might be of some help to 
the Neurodynamic Central Theory project. Here are two chapters from The 
Extended Mind, in which I argue that the human mind and language emerged 
together, that might be useful for this project. If I can be of help let me 
know. With greetings to all my European friends. - Bob


On 25-Sep-09, at 6:29 AM, Pedro C. Marijuan wrote:

> Dear FIS colleagues,
>
> Thanks to Karl and Joe for their responses. Sorry that I cannot answer
> them, but herein is the draft we are currently working in. We continue
> looking for partners (the easiest if they belong to the European  
> Union).
> Comments are welcome...
>
> best wishes
>
> Pedro
> -- 
> -
>
>  *
> *
>
> *TOWARDS A NEURODYNAMIC CENTRAL THEORY--REVISITING CAJAL'S "LAW OF
> MAXIMUM ECONOMY"*
>
>
>
> Arguably, one of the most dramatic absences in the contemporary
> scientific system occurs in the neurosciences. The ongoing
> neurocomputational, neuromolecular, neuroinformatic and neuroimaging
> revolutions (to name but a few of the emerging disciplines responsible
> for the enormous experimental data-accumulation taking place in the
> neurosciences) have not yet been accompanied by a parsimonious theoretical
> synthesis. Thereupon, the lack of a central neurodynamic theory
> playing a "circulating role" between all the neural fields, similar to
> the Darwinian theory's central role in the biological realm (or
> classical mechanics in physics), is creating an intellectual vacuum that
> negatively influences the neurosciences themselves, the humanities and
> the social sciences.
>
> The theoretic development herein envisioned will be grounded,
> historically, in Cajal's  law of "maximum economy in space, time, and
> interconnecting matter" (Textura, pp. 95-106), which will be revisited
> from different perspectives. On the one side, both M. Leyton (1992,
> 2001) and K.P. Collins & P.C. Marijuán (1997) have proposed  
> overarching
> optimization principles related to symmetry-operations in the neural
> maps that somehow imply minimization of excitations (echoing ideas
> already discussed by Freud himself, and more recently by Barlow,
> Rossler, Edelman, and Arbib; see also Cherniak, Griffin, Chen, etc.
> about structural optimization). Then, the emergence of the Action /
> Perception cycle as a basic building block to organize adaptive  
> behavior
> would be inscribed within the "closure" operations of the ongoing
> neurocomputational minimization processes performed upon the  
> excitation
> /inhibition collective variable, topologically distributed, defined  
> both
> globally and locally as a sort of "entropy" in the neuronal  
> mappings and
> rhythms (Gibson, Turvey, Kelso, Buzsáki). Further, the integration of
> such basic minimization processes within the main brain architectures
> --cerebellar, thalamocortical, limbic, frontal-- has to be achieved  
> in a
> way that simultaneously produces the outcomes of adaptive behavior,
> meaning, memory contents, and the possibility of frontal "conceptual"
> guidance. Recent works of authors such as Rodolfo Llinás, Alain  
> Berthoz,
> Jeff Hawkins, Tomás Ortiz, and Joaquín Fuster (e.g., the "cognits"
> proposal) would imply advancements in those particular areas for the
> proposed scheme.
>
> Indeed, the quest for a neurodynamic optimization principle in the
> neurosciences, and the central theory to be elaborated upon, that we
> must credit to Cajal's insights one hundred years ago, appears as a
> long-term interdisciplinary enterprise. If finally successful, it  
> would
> boost our present understanding of information processes in the  
> Central
> Nervous System and in a plethora of related disciplines, from
> neuro-robotics to social fields...
>
> Note: FISers Michael Leyton, Andrei Khrennikov, Fivos Panetsos, Rafa
> Lahoz, and Pedro Marijuan are already involved in this tentative
> project, Tomás Ortiz and Joaquin Fuster, quite probably, can be  
> enlisted
> too.
>
>
>
>
>
>




Re: [Fis] Breaking my silence

2008-06-30 Thread bob logan
Hi Stan - thanks for commenting on my post. I really liked your remark 
highlighted in pink. But I have a problem understanding the remarks in blue. 
Perhaps you could clarify for me. Why is it also true for any dissipative 
system, and what exactly do you mean by a dissipative system? Also, I do not 
see how Shannon info can be used to roughly assess relative configurations. 
Also, why does a tornado have more possible macroscopic conformations than a 
bird, and a bird more than a snail? Finally, what is a conformation?


These questions are not posed to challenge your assertions but rather  
to help me understand them.


Thanks - Bob L

On 29-Jun-08, at 10:03 PM, Stanley Salthe wrote:


Replying to Bob Logan, then to Pedro, then to Ted -
-snip-

Bob said:


The reader will find in this paper an argument that Shannon info does not 
work for biological systems precisely because, as has been pointed out in 
the discussion, evolution cannot be predicted. This reinforces Bob U's 
remark: "In my judgement there are far too many folks who want to use the 
Shannon entropy itself as the measure of information, and I believe that 
doing so erects major impediments to grasping what information truly is." 
Bob U's remark is right on the money according to POE.



  The material reason that Shannon information cannot be used to
calculate information carrying capacity in biology (or for any
dissipative structure) is that there is no way to find the
complete repertoire of any such system.  Thus, it is not
technologically 'useful'.  However, it does carry conceptual weight
nevertheless.  It can be used to roughly assess relative
configurations.  Thus, a tornado has more possible macroscopic
conformations than does a bird, and a bird has more than a snail.
---





Re: [Fis] concluding the session & season greetings

2007-12-20 Thread bob logan
Merry Christmas to all and it is I that need to thank you FISers for your 
thoughtful and meaningful responses to my musings. It has been a great 
experience for me. All the best for the New Year
  - Bob



[Fis] An invitation

2007-11-26 Thread bob logan
Dear FIS friends - please find enclosed two items: an invitation to my book 
launch for those in the Greater Toronto Area, and an order form for those 
wishing to order my book at a 20% discount. This book is about the origin of 
language, the human mind and culture, all of which are central to the study 
of media ecology. All the best - Bob Logan

University of Toronto Press is pleased to invite you to a book launch 
celebrating the publication of The Extended Mind: The Emergence of Language, 
the Human Mind, and Culture by Robert K. Logan. 4:30pm - 7pm, Tuesday, 
December 4th, 2007, Beal Institute for Strategic Creativity, Ontario College 
of Art & Design, 100 McCaul St., Suite 600, Toronto, Ontario. Media inquiries 
please contact Andrea-Jo Wilson at 416.978.2239 ext. 248.

Special 20% discount: The Extended Mind (cloth), reg. $39.95, disc. $31.96, 
ISBN 978-0-8020-9303-5.

The ability to communicate through language is such a fundamental part of 
human existence that we often take it for granted, rarely considering how 
sophisticated the process is by which we understand and make ourselves 
understood. In The Extended Mind, acclaimed author Robert K. Logan examines 
the origin, emergence, and co-evolution of language, the human mind, and 
culture.

Building on his previous study, The Sixth Language (2000), and making use of 
emergence theory, Logan seeks to explain how language emerged to deal with 
the complexity of hominid existence brought about by toolmaking, control of 
fire, social intelligence, coordinated hunting and gathering, and mimetic 
communication. The resulting emergence of language, he argues, signifies a 
fundamental change in the functioning of the human mind: a shift from 
percept-based thought to concept-based thought. This study will be of 
particular interest to linguists because of the way in which the origin of 
language is tied to the emergence of cognitive science and culture.

From the perspective of the Extended Mind model, Logan provides an 
alternative to and critique of Noam Chomsky's approach to the origin of 
language. He argues that language can be treated as an organism that evolved 
to be easily acquired, obviating the need for the hard-wiring of Chomsky's 
Language Acquisition Device. In addition, Logan shows how, according to this 
model, culture itself can be treated as an organism that has evolved to be 
easily attained, revealing the universality of human culture as well as 
providing an insight as to how altruism might have originated. Bringing 
timely insights to a fascinating field of inquiry, The Extended Mind will be 
of interest to readers in a wide range of disciplines.

Robert K. Logan is a professor emeritus in the Department of Physics at the 
University of Toronto.

Orders: Order Department, University of Toronto Press, 5201 Dufferin Street, 
Toronto ON, Canada M3H 5T8. Phone: 1-800-565-9523 or 416-667-7791. Fax: 
1-800-221-9985 or 416-667-7832. Web: www.utppublishing.com


[Fis] more thoughts about Shannon info

2007-11-06 Thread bob logan
Dear colleagues - please forgive my lapse in communications. I have  
been studying the question of Shannon info and have come up with the  
following thoughts I want to share with you. As always, comments are  
solicited. Rather than answering each point raised in our recent  
email exchanges I decided to do some research and try to answer a  
number of inquiries all at once.


The inspiration for adopting the word entropy in information theory  
comes from the close resemblance between Shannon's formula and the  
very similar formula from thermodynamics: S = -k ∑ p_i ln p_i .
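[Editorial illustration, not part of the original post.] The resemblance can be made concrete in a few lines: Shannon's H and the Gibbs formula differ only in the logarithm base and the Boltzmann constant k, which carries the physical dimension.

```python
import math

def shannon_entropy(probs):
    """H = -sum p_i * log2(p_i), in bits (zero-probability terms contribute 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs, k=1.380649e-23):
    """S = -k * sum p_i * ln(p_i), in joules per kelvin."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# A fair coin: maximum uncertainty over two equiprobable outcomes.
p = [0.5, 0.5]
print(shannon_entropy(p))  # 1.0 bit
print(gibbs_entropy(p))    # k * ln 2, roughly 9.57e-24 J/K

# The two formulas differ term by term only by the factor k * ln 2 per bit,
# which is exactly Bob's later point that Shannon entropy is dimensionless.
```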
Entropy is related to the second law of thermodynamics, which states that:


Energy spontaneously tends to flow only from being concentrated in  
one place
to becoming diffused or dispersed and spread out. Energy is also  
governed by the first law of thermodynamics, which states that energy  
cannot be destroyed or created.


Is there an equivalent first and second law for information?

Entropy is used to describe systems that are undergoing dynamic  
interactions like the molecules in a gas. What is the analogy with  
Shannon entropy or information?


Is Shannon’s formula really the basis for a theory of information or  
is it merely a theory of signal transmission?


Thermodynamic entropy involves temperature and energy in the form of  
heat, which is constantly spreading out. The change in entropy is  
defined as ΔS = ΔQ/T. What are the analogies for Shannon entropy?


There is the flow of energy in thermodynamic entropy but energy is  
conserved, i.e. it cannot be destroyed or created.


There is the flow of information in Shannon entropy but is  
information something that cannot be destroyed or created as is the  
case with energy? Is it conserved? I do not think so because when I  
share my information with you I do not lose information but you gain  
it and hence information is created. Are not these thoughts that I am  
sharing with you, my readers, information that I have created?


Shannon entropy quantifies the information contained in a piece of  
data: it is the minimum average message length, in bits. Shannon  
information, as the minimum number of bits needed to represent data,  
is similar to the formulations of Chaitin information or Kolmogorov  
information. Shannon information has functionality for engineering  
purposes, but since this is information without meaning it is better  
described as a measure of the amount and variety of the signal that  
is transmitted, and not described as information. Shannon information  
theory is really signal transmission theory. Signal transmission is a  
necessary but not a sufficient condition for communication. There is  
no way to formulate the semantics, syntax or pragmatics of language  
within the Shannon framework.
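[Editorial illustration, not part of the original post.] The claim that Shannon entropy is the minimum average message length in bits can be checked on a small example: the empirical entropy of a string lower-bounds the bits per symbol of any lossless code for it.

```python
import math
from collections import Counter

def entropy_bits_per_symbol(text):
    """Empirical Shannon entropy of a string: a lower bound on the
    average number of bits per symbol for any lossless code."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A highly structured message compresses far below the naive 8 bits/char:
msg = "abababababababab"
print(entropy_bits_per_symbol(msg))  # 1.0 — only two equiprobable symbols
```

Note this measures only the signal statistics, which is exactly Bob's point: nothing in the number says what the message means.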


Re: [Fis] info & meaning

2007-10-12 Thread bob logan
hi stan - I think we need to agree to disagree - for me final cause  
involves purpose or intention. A gas that expands into a larger  
volume has no intention or purpose and therefore one cannot talk  
about final cause because there is no agent - I did enjoy your  
argument but I just cannot buy it - Bob



On 12-Oct-07, at 3:01 PM, bob logan wrote:

Loet et al - I guess I am not convinced that information and  
entropy are connected. Entropy in physics has the dimension of  
energy divided by temperature. Shannon entropy has no physical  
dimension - it is missing the Boltzmann constant. Therefore how can  
entropy and Shannon entropy be compared, let alone connected?


I am talking about information not entropy - an organized  
collection of organic chemicals must have more meaningful info than  
an unorganized collection of the same chemicals.



On 11-Oct-07, at 5:34 PM, Loet Leydesdorff wrote:

Loet - if your claim is true then how do you explain that a  
random soup of
organic chemicals has more Shannon info than an equal number of  
organic
chemicals organized as a living cell, where knowledge of some  
chemicals
automatically implies the presence of others and hence less  
surprise

than in the soup of random organic chemicals? -  Bob


Dear Bob and colleagues,

In the case of the random soup of organic chemicals, the maximum
entropy of the system is set by the number of chemicals involved  
(N).
The maximum entropy is therefore log(N). (Because of the  
randomness of

the soup, the Shannon entropy will not be much lower.)

If a grouping variable with M categories is added the maximum entropy
is log(N * M). Ceteris paribus, the redundancy in the system  
increases

and the Shannon entropy can be expected to decrease.

In class, I sometimes use the example of comparing Calcutta with New
York in terms of sustainability. Both have a similar number of
inhabitants, but the organization of New York is more complex to the
extent that the value of the grouping variables (the systems of
communication) becomes more important than the grouped variable (N).
When M is extended to M+1, N possibilities are added.

I hope that this is convincing or provoking your next reaction.

Best wishes,


Loet
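[Editorial illustration of Loet's argument; the values of N and M below are arbitrary, not from the exchange.] Adding a grouping variable raises the maximum entropy from log(N) to log(N*M); if the observed entropy stays put, the redundancy R = 1 - H/H_max increases.

```python
import math

def shannon_entropy(probs):
    """H = -sum p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

N = 8                                  # kinds of chemicals in the soup
H_max_soup = math.log2(N)              # maximum entropy: log2(8) = 3 bits
H_soup = shannon_entropy([1 / N] * N)  # random soup is uniform, so H equals H_max

M = 4                                  # add a grouping variable with M categories
H_max_grouped = math.log2(N * M)       # maximum entropy rises to log2(32) = 5 bits
H_observed = H_soup                    # ceteris paribus: observed entropy unchanged

redundancy_soup = 1 - H_soup / H_max_soup            # 0.0: no redundancy
redundancy_grouped = 1 - H_observed / H_max_grouped  # 0.4: redundancy increased
print(redundancy_soup, redundancy_grouped)
```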




Re: [Fis] info & meaning

2007-10-12 Thread bob logan
Loet et al - I guess I am not convinced that information and entropy  
are connected. Entropy in physics has the dimension of energy divided  
by temperature. Shannon entropy has no physical dimension - it is  
missing the Boltzmann constant. Therefore how can entropy and Shannon  
entropy be compared, let alone connected?


I am talking about information not entropy - an organized collection  
of organic chemicals must have more meaningful info than an  
unorganized collection of the same chemicals.



On 11-Oct-07, at 5:34 PM, Loet Leydesdorff wrote:

Loet - if your claim is true then how do you explain that a random  
soup of
organic chemicals has more Shannon info than an equal number of  
organic
chemicals organized as a living cell, where knowledge of some  
chemicals
automatically implies the presence of others and hence less  
surprise

than in the soup of random organic chemicals? -  Bob


Dear Bob and colleagues,

In the case of the random soup of organic chemicals, the maximum
entropy of the system is set by the number of chemicals involved (N).
The maximum entropy is therefore log(N). (Because of the randomness of
the soup, the Shannon entropy will not be much lower.)

If a grouping variable with M categories is added the maximum entropy
is log(N * M). Ceteris paribus, the redundancy in the system increases
and the Shannon entropy can be expected to decrease.

In class, I sometimes use the example of comparing Calcutta with New
York in terms of sustainability. Both have a similar number of
inhabitants, but the organization of New York is more complex to the
extent that the value of the grouping variables (the systems of
communication) becomes more important than the grouped variable (N).
When M is extended to M+1, N possibilities are added.

I hope that this is convincing or provoking your next reaction.

Best wishes,


Loet




Re: [Fis] info & meaning

2007-10-11 Thread bob logan
Dear Colleagues - I have combined my responses to a number of  
comments (from Ted, Stan and Loet) in one email. My comments are in  
red and the comments of the others that I am commenting on are in blue.


Hi Ted - I did not understand why you claim Kauffman's approach is  
algebraic.


At the highest level, I think, there's a decision each of us has to  
make about the nature of the abstraction space we want to work in.  
Kauffman is famously on record as believing that the preferred space  
is algebraic. He goes further in stating that it is the ONLY space,  
all others illusory, or less fundamental. Without burdening you with  
that rather indefensible weight, it's clear to me that what you have  
presented is clearly in this camp.



Hi Stan - once again I enjoyed your remarks amplifying your original  
comment. I would like to add that science only deals with formal  
cause and not final cause. Final cause is for philosophers, social  
critics and theologians.


Bob said:



Hi Stan - interesting ideas - I resonate with the thought that the
meaning of info is associated with Aristotle's final cause -  
cheers Bob




Here I follow up with an extract from a text I am working on at present,
just to amplify this a bit more:

 Finally, what is the justification for considering meaning  
generally to be
associated with finality?  Why not formal causes as well?  Final cause  
is the
'why' of events, while formal causes carry the 'how' and 'where',  
material
causes the local 'readiness', and efficient cause the 'when' (Salthe,
2005).  It seems clear, when choosing among these, that the meaning
(significance, purport, import, aim -- Webster's New Collegiate  
Dictionary)
of an event must be assigned to its final causes.  Formal and material
causes are merely enabling, while efficient cause only forces or  
triggers.
The example of a New Yorker cartoon captures some of my meaning  
here.  Two
aliens are standing outside of their spaceship, which has apparently  
landed
on Earth, as we see spruce trees burning all around them in a fire  
that we
infer was triggered by their landing. One of them says: "I know what  
caused it
-- there's oxygen on this planet".  If we think that is amusing, we know
implicitly why formality cannot carry the meaning of an event.  In  
natural
science formality has been used to model the structure of an  
investigated
system, and so is not suited to carrying its teleological tendencies as well.
Formality marks what will happen where, but not also 'why' it  
happens. The
causal approach itself is required if we are trying to extend semiosis
pansemiotically to nature in general. Natural science discourse is built
around causality, and so attempts to import meaning into it require  
it to
be assimilated to causation.

Loet - if your claim is true then how do you explain that a random  
soup of organic chemicals has more Shannon info than an equal number  
of organic chemicals organized as a living cell, where knowledge of  
some chemicals automatically implies the presence of others and hence  
less surprise than in the soup of random organic  
chemicals? -  Bob



On 7-Oct-07, at 6:47 AM, Loet Leydesdorff wrote:


Dear Bob and colleagues,

Although I know that this comment was made in responding to another  
comment, let me react here because I think that this is not correct:
The point I am making is that organization is a form of  
information which Shannon theory does not recognize.


Shannon's theory is a mathematical theory which can be used in an  
application context (e.g., biology, electrical engineering) as a  
methodology. This has been called entropy statistics or, for  
example, statistical decomposition analysis (Theil, 1972). The  
strong methodology which it provides may enable us to answer  
theoretical questions in the field of application.


An organization at this level of abstraction can be considered as a  
network of relations and thus be represented as a matrix. (Network  
analysis operates on matrices.) A matrix can be considered as a two- 
dimensional probability distribution which contains an uncertainty.  
This uncertainty can be expressed in terms of bits of information.  
Similarly, for all the submatrices (e.g., components and cliques)  
or for any of the row or column vectors. Thus, one can recognize  
and study organization using Shannon entropy-measures.
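[Editorial illustration, not part of the original post; the relation matrix below is made up.] Loet's method of treating a network of relations as a two-dimensional probability distribution, and measuring its uncertainty in bits, can be sketched like this:

```python
import math

def shannon_entropy(probs):
    """H = -sum p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A small organization as a network of relations: a weighted matrix.
matrix = [
    [0, 3, 1],
    [3, 0, 0],
    [1, 0, 2],
]
total = sum(sum(row) for row in matrix)

# Treat the whole matrix as a two-dimensional probability distribution.
joint = [cell / total for row in matrix for cell in row]
print(shannon_entropy(joint))  # uncertainty of the whole network, in bits

# Any row vector (one node's relations) can be analyzed the same way:
row = matrix[0]
row_probs = [c / sum(row) for c in row if c > 0]
print(shannon_entropy(row_probs))
```

The same computation applies to any submatrix (component, clique) or column vector, which is how organization becomes measurable with Shannon entropy.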


The results, of course, have still to be appreciated in the  
substantive domain of application, but they can be informative to  
the extent of being counter-intuitive.


Best wishes,


Loet
Loet Leydesdorff
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20- 525 6598; fax: +31-20- 525 3681
[EMAIL PROTECTED] ; http://www.leydesdorff.net/

Now available: The Knowledge-Based Economy: Modeled, Measured,  
Simulated. 385 pp.; US$ 18.95
The Self-Organization of the Knowledge-Based Society; The Challenge  
of Scientometrics






Fwd: [Fis] info & meaning

2007-10-06 Thread bob logan



Begin forwarded message:


From: bob logan <[EMAIL PROTECTED]>
Date: October 5, 2007 12:26:46 PM EDT (CA)
To: Pedro Marijuan <[EMAIL PROTECTED]>
Subject: Re: [Fis] info & meaning

Dear Colleagues - I am overwhelmed by all the responses, in terms of  
both their number and their depth. This has been a very busy time as a  
new school year has just begun. I have just signed a contract for my  
new book Understanding New Media: Extending Marshall McLuhan, which  
requires a manuscript by the end of Oct., my book The Extended  
Mind: The Emergence of Language, the Human Mind and Culture is  
released on Oct. 22 and I have to plan a book launch and publicity  
for it, and finally I hosted Katherine Hayles this week in Toronto,  
where she spoke at the Ontario College of Art and Design. So this is  
my excuse and I hope you will forgive my long silence.


So here go my comments in red, as many as I can make, with more to  
follow. First of all let me share a wonderful quote from Claude  
Shannon that supports my hypothesis that information is a relative  
thing and is context dependent.


This paper appears in: Information Theory, IEEE Transactions on
Publication Date: Feb 1953
Volume: 1,  Issue: 1
On page(s): 105- 107
ISSN: 0018-9448
Posted online: 2003-04-22 14:03:12.0

Abstract

The word "information" has been given many different meanings by  
various writers in the general field of information theory. It is  
likely that at least a number of these will prove sufficiently  
useful in certain applications to deserve further study and  
permanent recognition. It is hardly to be expected that a single  
concept of information would satisfactorily account for the  
numerous possible applications of this general field. The present  
note outlines a new approach to information theory which is aimed  
specifically at the analysis of certain communication problems in  
which there exist a number of information sources simultaneously in  
operation. A typical example is that of a simple communication  
channel with a feedback path from the receiving point to the  
transmitting point. The problem is to make use of the feedback  
information for improving forward transmission, and to determine  
the forward channel capacity when the best possible use is made of  
this feedback information. Another more general problem is that of  
a communication system consisting of a large number of transmitting  
and receiving points with some type of interconnection network  
between the various points. The problem here is to formulate the  
best systems design whereby, in some sense, the best overall use of  
the available facilities is made. While the analysis sketched here  
has not yet proceeded to the point or a complete solution of these  
problems, partial answers have been found and it is believed that a  
complete solution may be possible.



On 24-Sep-07, at 11:56 AM, Pedro Marijuan wrote:


Dear Bob and colleagues,

Thanks for the scholarly work. It was nice reading the vast  
landscape, historical and otherwise, you have covered around that  
new biotic interpretation of information as "Propagating  
Organization" or POE. Perhaps in this list we have already arrived  
at a truce on Shannonian matters -- notwithstanding a general  
agreement with your criticisms in that respect, the problem may be  
a lack of interesting alternative generalizations, inclusive  
enough so that the Shannonian theory gets its proper place as a  
great theory of communication, as a great theory of physical  
structures (in the guise of "physical information theory"), and  
also as a great theory in data analysis.


I agree Shannon theory is powerful but I also agree with Shannon  
that its scope is limited.


Thus I am sympathetic with integrative quests such as POE, but  
given that disagreements are usually the most valuable stuff for  
discussions, I would raise the following points:


 --1. Constraints and boundary conditions are conflated (see  
below), with some more emphasis on the former. Thus, taking into  
account that most chemical constraints are in the form of  
"activation energies" and "free energies", which establish the  
kinetics of the multitude of chemical reactions and molecular  
transformations in the cell, how may this heterogeneous collection of  
variables and parameters receive a form of global treatment, as  
POE seems to imply? I have seen no hints in the paper.


 ...we defined a new form of information, which we called  
instructional or biotic information, not with Shannon, but with  
constraints or boundary conditions. The amount of information  
will be related to the diversity of constraints and the diversity  
of processes that they can partially cause to occur.


As a result of my conversations with Hayles I believe I need to put  
more emphasis on the processes that the constraints make possible.  
So I plan to think more about the relationship of bound

Re: [Fis] Re: Logan's 9-21 info & meaning

2007-10-06 Thread bob logan


On 27-Sep-07, at 2:58 PM, Stanley N. Salthe wrote:



(2)  I use entropy production as the general key here because whatever
happens produces entropy, and all configurations will constrain this
production to less than that which would be produced in an  
explosion of the
same energy gradient.  Thus, any reduction in informational entropy  
in a

given system has the effect of reducing the rate of physical entropy
production by that system relative to what would be the case if the
constraints had not appeared.  However, the maximum entropy production
principle (MEP) suggests that constraints will tend to appear that  
will allow
the greatest physical entropy production consistent with the  
integrity of

the system.
Stan, is this the same idea as Wiener's? For Wiener, on the other  
hand, "life is an island of negentropy amid a sea of disorder"  
(ibid.).





The Relativity of Information

In POE we associated biotic or instructional information with the
organization that a biotic agent is able to propagate. This  
contradicts
Shannon's definition of information and the notion that a random  
set or
soup of organic chemicals has more Shannon information than a  
structured

and organized set of organic chemicals found in a living organism.
S:  This 'notion' would be valid IF the comparison was strictly  
between
a soup of subatomic particles and the arrangement of subatomic  
particles

entrained by a biotic organization.
The point I am making is that organization is a form of information  
not taken into account by Shannon



This argument completely contradicts the notion of information of  
a system
biologist who would argue that a biological organism contains  
information.

 S: That is true because the systems biologist talks about
macromolecules and the cellular configurations constraining them.



This argument contradicts the notion that information is an  
invariant like

the speed of light, the same in all frames of reference.

 S: As above, a material system exists at several scales, and
information at one scale can have no meaning, as such, at another  
scale.


Agreed




"Shannon's theory defines information as a probability function  
with no
dimension, no materiality, and no necessary connection with  
meaning. It is
a pattern not a presence (Hayles 1999a, p. 18)." The lack of a  
necessary
connection to meaning of Shannon information is what distinguishes  
it from
biotic information. Biotic information obviously has meaning,  
which is the

propagation of the organism's organization.
 S: This is why I have suggested that the meaning of  
information is
related to final causation.  The 'propagation' referred to by Bob  
is the

final cause regarding biological reproductive activities.
Once again I agree with Stan, and this idea is related to Fredkin's  
idea that the meaning of information is connected to how it is  
interpreted.



I would interpret the signals transmitted between Shannon's sender  
and

receiver as data. Consistent with MacKay and Bateson's position,
information makes a difference when it is contextualized and  
significant.
Knowledge and wisdom represent higher order applications of  
information
beyond the scope of this study. The contextualization of data so  
that it

has meaning and significance and hence operates as information is an
emergent phenomenon. The communication of information cannot be  
explained
solely in terms of the components of the Shannon system consisting  
of the
sender, the receiver and the signal or message. It is a much more  
complex

process than the simplified system that Shannon considered for the
purposes of mathematizing and engineering the transmission of  
signals.
First of all it entails the knowledge of the sender and the  
receiver, the
intentions or objectives of the sender and the receiver in  
participating
in the process and finally the effects of the channel of  
communication

itself, as in McLuhan's notion that "the medium is the message". The
knowledge and intention of the sender and the receiver, as well as the
effects of the channel, all affect the message that is transmitted  
by the

signal, in addition to its content.
  S: Of course, none of this INVALIDATES Shannon, but merely  
states
that in most natural contexts it would be woefully underdetermining  
even if

it could be calculated.

Agreed





The Meaning of Information in Biotic Systems

Biotic or instructional information, defined in POE as the  
constraints
that allow an autonomous agent, i.e. a living organism, to convert  
free

energy into work so that the living organism is able to propagate its
organization through growth and replication, is intimately  
connected with
meaning. "For Shannon the semantics or meaning of the message does  
not

matter, whereas in biology the opposite is true. Biotic agents have
purpose and hence meaning" (Kauffman et al. 2007).
  S: This is a bit tricky, insofar as careful consideration  
suggests
that non-

Re: [Fis] Info & meaning

2007-10-06 Thread bob logan
Hi Stan - interesting ideas - I resonate with the thought that the  
meaning of info is associated with Aristotle's final cause - cheers Bob



On 25-Sep-07, at 4:39 PM, Stanley N. Salthe wrote:


First I comment on Pedro's:


The "information overload" theme (in the evolution of social modes of
communication), is really intriguing. Should we take it, say, in its
prima facie? I am inclined to put it into question, at least to  
have an

excuse and try to unturn that pretty stone...


The question of information overload connects with my theory of  
senescence

(1993 book Development & Evolution: Complexity and Change in Biology),
which is that it results from the continual taking in of  
information after
a system has become definitive (all material systems necessarily  
continue
to get marked), while at the same time there is less loss of  
information

(matter is 'sticky').  The result is that a system becomes slower to
respond to perturbations, at least because lag times increase as a  
result
of a multiplication of channels (increased informational friction),  
and

because some channels effect responses at cross purposes.


and then
Walter's:

(2) In the same way: how can 'meaning' arise? In naturalist terms, or

imposed by us?


I have recently concluded that meaning can be naturalized by  
aligning it

with finality in the Aristotelian causal analysis (material/formal,
efficient/final).  I assert that final cause has not really been  
eliminated

from physics and other sciences, but has been embodied in variational
principles like the Second Law of thermodynamics, and in evolutionary
theory - in Fisher's fundamental theorem of natural selection, as  
well as

in some interpretations of the 'collapse of the wave function' in QM.

STAN




[Fis] Re: Logan response to the various comments of FISites

2007-10-06 Thread bob logan

Hi Pedro et al.

My comments are in red - I know that I surpassed my limit of four per  
week so I am commenting on a number of contributions in this post - I  
use x to divide one section from another -
With this post I am up to date. I apologize for my delayed responses.  
My overall reaction to all of the posts is one of deep appreciation.  
I look forward to the continued dialogue.


 Bob Logan



Hi Jerry et al.



I quite enjoyed this post below. The way the 3.486 billion possible  
interpretations are whittled down to 1 or possibly 2 meanings is  
through pragmatics - pragmatics or context is very important for  
understanding meaning - Bob



On 1-Oct-07, at 10:10 AM, Jerry LR Chandler wrote:


To the FIS List:

I mean to say a bit about meaning and information.

The context I wish to start with is natural language, that is, the  
meaning of everyday language as used in a novel, a play, between  
two lovers at the breakfast table, in a political convention,  
during a financial transaction or in a chemical lecture.


The choice of meaning under such circumstances is open to  
interpretation by those present, the speaker and the listener(s).


How do listeners attach meaning to the spoken words?


Consider the Porphyrian decision tree and apply it to the possible  
meanings of each word in a sentence composed of twenty words.   
Consider the possibility that each word of the sentence is  
restricted to only three meanings.  Then 3,486,784,401 possible  
combinations of meanings can be created by the sentence, that  
is, 3 multiplied by itself twenty times.
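[Editorial aside.] Jerry's combinatorial count can be checked directly:

```python
# 20 words, each restricted to 3 possible meanings: 3**20 combinations.
combinations = 3 ** 20
print(combinations)  # 3486784401
```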


Does such a sentence have _a_ meaning?






On 2-Oct-07, at 6:24 AM, Pedro Marijuan wrote:

Dear colleagues,

Answering to a couple of Jerry's questions,


Under what circumstances can the speaker's meaning or the writer's  
meaning be _exact_?


Is _meaning_ a momentary impulse with potential for settling into  
a local minimum in the biochemical dynamic?


A previous point could be---what entities are capable of  
elaborating that obscure item we call "meaning"? Just anything (eg,  
some parties have stated that molecules or atoms may communicate),  
or only the living beings?


My understanding of what Bob has proposed along the POE guidelines  
is that only the living cell would be capable -- and of course, all  
the further more complex organisms.  This point is of some relevance.


After decoding and interpretation of the organic codes, the  
meaning of my message about meaning and information may have  
meaning to you.


Maybe. But I suffer some information overload (perhaps "overload"  
is just the incapability to elaborate meaning under the present  
channels or means of communication).




The way meaning finally emerges in oral communication is through  
dialogue, so as to establish a context for an ambiguous meaning.  
Context is king!




 
xx


On Oct 2 Guy Hoelzer wrote:
In my view meaning exists (or not) exclusively within systems.  It  
exists

to the extent that inputs (incoming information) resonate within the
structure of the system.  The resonance can either reinforce the  
existing

architecture (confirmation), destabilize it (e.g., cognitive
disequilibrium), or construct new features of the architecture (e.g.,
learning).
I like this contribution and the comments made by Stan Salthe also on  
Oct 2 - they parallel Fredkin's idea:
"The meaning of information is given by the processes that interpret  
it."  Would you agree, Guy and Stan?


And for Søren Brier, who commented on Stan's comment also on Oct 2, I  
would ask: do the processes that interpret info constitute a semiotic  
ontology?



On Oct 3 Walter Riofrio comments:

In (2) Pedro understands that Bob is proposing that only living beings  
(from living cells to more complex organisms) are capable of  
elaborating (and transmitting) meaning in information.


My approach to this issue is: we could understand meaning in  
information (or, meaningful information) only in living systems (and  
I propose even back to the systems which opened the doors of the  
prebiotic world). That is the way - I think - 'information with  
meaning' arises in the physical universe.


I agree it is only living and perhaps prebiotic things that have  
processes, namely propagating their organization, and that are  
therefore capable of interpreting information and hence, according to  
Fredkin, providing it with meaning. When a rock is acted upon by  
earth's gravity it does not have to interpret, because it has no  
options; it can only behave as causality demands. Living things make  
choices - bacteria decide to swim towards or away from a substance  
depending on their interpretation of whether it is food or toxin. The  
meaning of a glucose gradient to a b

Re: [Fis] info & meaning

2007-10-06 Thread bob logan
Karl et al - I agree there is lots of value in past FIS discussions  
of info but as Shannon himself  opined


"It is hardly to be expected that a single concept of information  
would satisfactorily account for the numerous possible applications  
of this general field."


I only offered one such concept, namely instructional or biotic  
information. And I agree with Karl: now the

task is to figure out how to manage the consequences.



On 25-Sep-07, at 9:54 AM, karl javorszky wrote:



There is more well-prepared work in FIS than meets the eye. Let us not
restart from inventing the idea of information. This is well done.  
Now the

task is to figure out how to manage the consequences.

All the best:
Karl





Re: [Fis] More introductions to the FIS list

2007-09-05 Thread bob logan
(I am reintroducing this message from Bob some weeks ago that the server 
had left in the cold. It is an interesting preparation for the next 
discussion session, to start soon ... ---Pedro)


---

Hi Pedro et al.

The emergence of new phenomena occurs at the edge of chaos. The
emergence of a new medium, which is both a technique of communication
and information organization, must therefore occur at the edge of
chaos, which arises whenever there is some form of information
overload. I believe that info overload is the engine of creation for
new media. For example, writing emerged to deal with an overload of
economic data in Sumer, as described in my book The Sixth Language and
my previous post to the FIS list that Pedro commented on July 10th.

Bob


On 10-July-07, Pedro Marijuan wrote:



Dear FIS colleagues,

Thanks to Bob for all his mindful contents. Given the many ideas
to comment on in his draft, I will briefly refer to a couple of them
just at the beginning of the paper. First, on "concepts" as the
basic stuff of human language-communication. And second, on the
notion of "information overload".

About the former, if one is interested in the "motor" side of the
action-perception cycle that seemingly organizes the neurodynamics
of vertebrate brains, there might be alternative approaches to
"concepts". Alain Berthoz's general vision, and more concretely, J.
Fuster, who made a few years ago (2003) the proposal of "cognits",
might deserve to be entered into the discussion... one of the central
points is that nowadays we are missing a good, fertile synthesis
around the enormous experimental accumulation in neuroscience in
recent decades (as claimed by Edelman, Arbib, etc.). Anyhow, I leave
it here, as this theme will probably reappear in the session next
September.

The "information overload" theme (in the evolution of social modes
of communication), is really intriguing. Should we take it, say,
in its prima facie? I am inclined to put it into question, at
least to have an excuse and try to unturn that pretty stone...

best regards

Pedro



___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


[Fis] Fwd: Bob Logan's introduction to the FIS list: A Problem in Physics

2007-07-06 Thread bob logan


From: bob logan <<mailto:[EMAIL PROTECTED]>[EMAIL PROTECTED]>
Date: July 5, 2007 11:56:29 AM EDT (CA)
To: "Joseph Brenner" <<mailto:[EMAIL PROTECTED]>[EMAIL PROTECTED]>
Subject: Re: Bob Logan's introduction to the FIS list: A Problem in Physics

Dear Joseph - thanks so much for your email which has been most 
stimulating. I read the two papers you attached but found them hard 
sledding, ie. difficult to read because of my lack of a background in logic 
which I always found too formal. I must admit in some ways I am 
undisciplined and have trouble with details. I googled you and found two 
items of yours actually more helpful. They were PARACONSISTENCY AND 
TRANSCONSISTENCY IN THE LOGIC OF STEPHANE


LUPASCO and Stephane Lupasco and Florentin Smarandache: Conflicting Logics 
of Contradiction and an Included Middle
by
Joseph E. Brenner (I think I 
only found the abstract of the 2nd paper- perhaps you could send the whole 
article.
Perhaps you could help me by stating the 3 laws of classical or Aristotlean 
logic which in agreement with you I find too confining. McLuhan observed 
that all technologies provide both service and disservice. I see logic as a 
technology to sort out our thinking. It's service is obvious but its 
disservice is that it excludes rich possibilities and tends to crowd out 
empirical thinkng as was the case with the Classical Greeks. Aristotle was 
a great logician, drama critic and not a bad biologist but his physics was 
atrocious. He argued that a ball dropped from the mast of a moving ship 
would fall to the deck behind the mast in the opposite direction of motion. 
Very logical but wrong. If he actually dropped a ball from the mast of a 
moving ship he would have discovered that his logic was wrong. or as 
McLuhan liked to joke his fallacy was wrong. Parminides argued A could not 
change to B because non-A could not be as it was a contradiction in terms. 
He succeeded in convincing every Greek philosopher that they had to have 
something in their pilosophical system that did not change (atoms, 4 
elements of earth, air, fire and water, Plato's ideal forms, Aristotle's 
aetherial heavens. Another consequence that I argue in the attached chapter 
from my book the Alphabet Effect is the Greeks missed the concept of zero 
because non-being, i.e. zero could not be.


Also could you explain the included middle - my guess is that it means both 
A and not-A can be true or have some truth to them and some falsity as well.


I also find it hard to absorb the idea of another form of reality ala 
Nicolescu. For me the reality is that things are simultaneously true, false 
and evolving into something new in their Adjacent Possible (see Propagating 
Organization An Enquiry in Section 7 of 
<http://www.physics.utoronto.ca/~logan>www.physics.utoronto.ca/~logan for a 
better understanding of the Adjacent Possible.) The triality I see is that 
I am Bob, non-Bob and the transition to the new-Bob simultaneously because 
as Heraclitus described it I am in total flux and you cannot encounter the 
same Bob twice because I am everchanging and in fact you cannot encounter 
me even once because I am a verb and not a noun.


I lookk forward to your responses to my musings - Bob
ps - this was fun

On 4-Jul-07, at 6:19 AM, Joseph Brenner wrote:


Dear Bob,

I am very grateful to Pedro for having placed me on the FIS list also and 
thus enabled me to see your note. My initial training was as an Organic 
Chemist (Ph.D. U. of Wisconsin, 1958 (ugh)) but my career was in corporate 
development with the Du Pont Company. Since my retirement in 1994, I have 
become an amateur philosopher and logician, concentrating my effort on the 
non-propositional "logic of reality" of the Franco-Romanian thinker 
Stéphane Lupasco (Bucharest, 1900 - Paris, 1988). This logical system is 
grounded in the quantum mechanics of Planck, Pauli and Heisenberg and 
assumes a principle of duality, a dynamic opposition (e.g., of intensive 
and extensive character) at the heart of energy and hence of all 
phenomena. I thus look forward very much to reading the papers available 
on your site, and attach, for what it is worth, a recent talk I presented 
at a logic conference that outlines my system (LIR).


This leads me to my question: since I am totally outside academia, I have 
no ready access to working physicists. And yet my system depends on the 
correctness of the attached paragraph, LIR and the Four Constants of 
Physics, from a manuscript I am working on. Might I ask you to comment on 
the latter, or, if you do not have the time, to suggest an accessible 
reference in which the properties of the "Four Constants" are discussed?


I hope that my further activities with Pedro and the Trancoso Group may 
lead to the occasion of our meeting, in Spain or Portugal, or Switzerland. 
If you are passing through Geneva, it is only 1-1/2 hours by car from 
w

[Fis] Bob Logan's introduction to the FIS list

2007-07-03 Thread bob logan

From: bob logan <<mailto:[EMAIL PROTECTED]>[EMAIL PROTECTED]>
Date: June 30, 2007 8:48:19 AM EDT (CA)
To: <mailto:fis@listas.unizar.es>fis@listas.unizar.es
Subject: Bob Logan's introduction to the FIS list


Dear, Cher, Caro, Liebe Colleagues

I am honoured to have been invited to this list and look forward to our 
discussions. I was trained as an elementary particle physicist at MIT and 
contributed to that field for a number of years. My most memorable result 
was to show that elementary particles behave as Regee poles, i.e. they have 
complex values of spin when the are exchanged virtually (1965). I became a 
physics prof at the U. of Toronto 1968 where in 1971 I introduced a course 
that I teach today called the Poetry of Physics and the Physics of Poetry 
teaching physics without math and studying literature and art related to 
science. It was then I began my career as an "intellectual tourists" 
exploring the relationship of science and the humanities and the social 
sciences. My journey has taken me into many different academic and 
practical territories. My Poetry of Physics course caught the attention of 
Marshall McLuhan with whom I collaborated for 6 exciting years researching 
the impact of media of communications. This study led me to study the 
impact of alphabetic writing on the development of Western culture, the 
origin and evolution of language including speech, writing, math, science, 
computing and the Internet. I also wandered into future studies, knowledge 
management, collaboration studies, industrial design and systems biology. I 
founded and operated with a spouse a company that engaged in computer 
training, Web development and knowledge management consulting 1982 to 2000. 
Along the way I became involved in Canadian politics as a policy advisor to 
Prime Minister Trudeau and several cabinet ministers. I am currently a 
professor emeritus, teaching one semester per year and actively pursuing my 
interests all of which are consistent with the FIS project. I am interested 
in the meaning of information which I believe has many different 
manifestations. Along with a team headed by Stuart Kauffman we formulated, 
based on our study of biotic information, the relativity of information, 
i.e. the notion that information is not an invariant like the speed of 
light but is a relative quality depending on the context in which it is 
being used. We also concluded that the "meaning" of life (or living 
organisms) is the propagation of their organization. I apologize for this 
lengthy introduction to a 50 year career that began when I began my studies 
at MIT in 1957. For more details and access to my most important papers and 
the first chapter of most of my books please visit my Web site at 
<http://www.physics.utoronto.ca/~logan>www.physics.utoronto.ca/~logan. 
Thank you for your attention and I look forward with enthusiasm to our 
conversations and interactions. You can reach me by email at 
<mailto:[EMAIL PROTECTED]>[EMAIL PROTECTED]


Bob Logan aka Robert K. Logan

___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis