Re: [Fis] The shadows are real !!!

2018-02-25 Thread John Collier
Inclined to agree with Joseph. I would like to point out that there are 
different meanings of "real", and one has to be clear about one's 
metaphysics to make the idea (somewhat) clear. Peirce, for example, 
would call Plato's shadows (which aren't really shadows at all) real, 
but not existent. The sort of shadows that we normally experience are both 
real and existent on Peirce's account.


John

On 2018/02/26 4:58 AM, joe.bren...@bluewin.ch wrote:


Dear FISers,

With all due respect to Krassimir, Sung, and his son, it is
becoming a matter of scientific interest that statements by them
and others, to the effect that "systematic research into that of
which the 'shadows' are a part" has not been done, are made
routinely. First of all, the logic in reality of Lupasco, about
which I have been talking here for 10 years, includes a new
mereology in which the dynamic relations between part and whole
are set out for discussion. Second, while the 'diagram' of
Merleau-Ponty may be considered interesting as philosophy and as
a foundation of religious belief, I see no reason to include it,
without heavy qualification, in a discussion of the foundations
of information science.

Thank you,

Joseph



Message d'origine
De : s...@pharmacy.rutgers.edu
Date : 25/02/2018 - 15:04 (PST)
À : ag...@ncf.ca, fis@listas.unizar.es
Objet : Re: [Fis] The shadows are real !!!

Hi Krassimir,


I agree with you that "/The shadows are real/ but only a part
of the whole. What is needed is systematic research into that
of which they are a part."


In my previous post, I was suggesting that Shadows are a part
of the irreducible triad consisting of *Form (A)*, *Shadow (B)*
and *Thought (C)*.  The essential notion of the ITR
(Irreducible Triadic Relation) is that A, B, and C cannot be
reduced to any one or a pair of the triad.  This
automatically means that 'Shadow' is a part of the whole triad
(which is, to me, another name for the Ultimate Reality), as
Form and Thought are.  In other words, the Ultimate Reality is
neither Form nor Shadow nor Thought individually but all of them
together, since they constitute an irreducible triad.  This
idea was expressed in 1995 in another way: the Ultimate
Reality is the /complementary union/ of the /Visible/ and the
/Invisible World/ (see *Table 1* attached).  Apparently a
similar idea underlies the philosophy of Maurice Merleau-Ponty
(1908-1961), according to my son, Douglas Sayer Ji (see his
senior research thesis submitted in 1996 to the Department of
Philosophy at Rutgers University under the guidance of B.
Wilshire, attached).


All the best.


Sung



*From:* Fis <fis-boun...@listas.unizar.es> on behalf of John
Collier <ag...@ncf.ca>
*Sent:* Sunday, February 25, 2018 2:51 PM
*To:* fis@listas.unizar.es
*Subject:* Re: [Fis] The shadows are real !!!
Dear Krassimir, List

I basically support what you are saying. I understand the
mathematics you presented, I am good at mathematics and
studied logic with some of the best. However, and this is a
big however, giving a mathematical or logical proof by itself,
in its formalism, does not show anything at all. One has to be
able to connect the mathematics to experience in a
comprehensible way. This was partly the topic of my
dissertation, and I take a basically Peircean approach, though
there are others that are pretty strong as well.

I generally skip over the mathematics and look for the
empirical connections. If I find them, then generally all
becomes clear. Without this, the formalism is nothing more
than formalism. It does not help to give formal names to
things and assume that this identifies things. Often trying to
follow up approaches like this is a profound waste of time. I
try to, and often am able to, express my ideas in a nonformal
way. Some mathematically oriented colleagues see this as
automatically defective, since they think that formal
representation is all that really rigorously explains things.
This sort of thinking (in Logical Positivism) eventually led
to its own destruction as people started to ask the meaning of
theoretical terms and their relation to observations. It is a
defunct and self-destructive metaphysics. It leads nowhere --
my PhD thesis was about this problem. It hurts me to see
people making the same mistake, especially when it leads them
to bizarre conclusions that are compatible

Re: [Fis] The shadows are real !!!

2018-02-25 Thread John Collier

Dear Krassimir, List

I basically support what you are saying. I understand the mathematics 
you presented, I am good at mathematics and studied logic with some of 
the best. However, and this is a big however, giving a mathematical or 
logical proof by itself, in its formalism, does not show anything at 
all. One has to be able to connect the mathematics to experience in a 
comprehensible way. This was partly the topic of my dissertation, and I 
take a basically Peircean approach, though there are others that are 
pretty strong as well.


I generally skip over the mathematics and look for the empirical 
connections. If I find them, then generally all becomes clear. Without 
this, the formalism is nothing more than formalism. It does not help to 
give formal names to things and assume that this identifies things. 
Often trying to follow up approaches like this is a profound waste of 
time. I try to, and often am able to, express my ideas in a nonformal 
way. Some mathematically oriented colleagues see this as automatically 
defective, since they think that formal representation is all that 
really rigorously explains things. This sort of thinking (in Logical 
Positivism) eventually led to its own destruction as people started to 
ask the meaning of theoretical terms and their relation to observations. 
It is a defunct and self-destructive metaphysics. It leads nowhere -- 
my PhD thesis was about this problem. It hurts me to see people making 
the same mistake, especially when it leads them to bizarre conclusions 
that are compatible with the formalism (actually, it is provable that 
almost anything is compatible with a specific formalism, up to numerosity).


I don't like to waste my time with such emptiness,

John

On 2018/02/25 6:22 PM, Krassimir Markov wrote:

Dear Sung,
I like your approach but I think it is only a part of the whole.
1. */The shadows are real/* but only a part of the whole. What is 
needed is systematic research into that of which they are a part.
2. About the whole now I will use the category theory I have seen you 
like:

CAT_A => F => CAT_B => G => CAT_C

CAT_A => H => CAT_C

F ○ G = H

where
F, G, and H are *functors*;
CAT_II ∈ CAT is the category of *information interaction categories*;
CAT_A ∈ CAT_II and CAT_C ∈ CAT_II are the categories of
*mental models’ categories*;

CAT_B ∈ CAT_II is the category of *models’ categories*.
Of course, I will explain this in natural language (English) in 
further posts.
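
As a rough illustrative sketch (not part of Krassimir's own formalism), the
composite F ○ G = H can be mimicked in Python with the functors reduced to
their action on objects only (real functors also act on morphisms); the labels
percept_1, model_1, concept_1, etc. are purely hypothetical placeholders:

# Functors modelled as plain mappings between small labelled sets of models.
CAT_A = {"percept_1", "percept_2"}          # mental models on the sending side
CAT_B = {"model_1", "model_2"}              # models
CAT_C = {"concept_1", "concept_2"}          # mental models on the receiving side

F = {"percept_1": "model_1", "percept_2": "model_2"}   # F : CAT_A -> CAT_B
G = {"model_1": "concept_1", "model_2": "concept_2"}   # G : CAT_B -> CAT_C

# Composite in diagrammatic order (first F, then G), written F ○ G in the post.
H = {a: G[F[a]] for a in CAT_A}                        # H : CAT_A -> CAT_C

assert all(H[a] == G[F[a]] for a in CAT_A)             # F ○ G = H on objects
print(H)   # e.g. {'percept_1': 'concept_1', 'percept_2': 'concept_2'}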

Smile
;
Dear  Karl,
Thank you for your post – it is very useful and I will discus it in 
further posts.

;
Dear Pedro,
Thank you for your nice words.
Mathematics is very good to use when everyone knows the mathematical 
languages.
Unfortunately, only a few scientists are involved in mathematical 
reasoning, on the one hand, and, as the Bourbaki experiment has shown, not 
everything is ready to be formalized.

How many of the FIS members understood what I had written above?
The way starts from philosophical reasoning and only sometimes ends 
in formal mathematical explanations.

Friendly greetings
Krassimir




--
John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal, Durban
Collier web page <http://web.ncf.ca/collier>


Re: [Fis] Idealism and Materialism

2017-11-05 Thread John Collier
Loet, I have no disagreement with this, at least in the detailed summary 
you give. In fact I would argue that the notion of information as used 
in physics is empirically based just as it is in the cognitive sciences. 
Our problem is to find what underlies both.


My mention of the Scholastics was to Peirce's version, not the common 
interpretation, which rests on a deep misunderstanding about what they were 
up to. I recommend a serious study of Peirce on the issues of meaning and 
metaphysics. He was deeply indebted to their work in logic.


Of course there may be no common ground, but then our project is 
hopeless. Other things you have said on this group lead me to think it 
is not a dead end of confused notions; otherwise we would be wasting our time.


John


On 2017/11/05 7:58 PM, Loet Leydesdorff wrote:

Dear Krassimir and colleagues,

The Scientific Revolution of the 17th century was precisely about the 
differentiation between scholarly discourse and scholastic disputatio. 
A belief system is an attribute of agents and/or of a community. The 
sciences, however, develop also as systems of rationalized 
expectations. These are based on communications as units of analysis 
and not agents (communicators). This is Luhmann's point, isn't it?


Of course, individual scientists can be religious and groups like 
Jesuits can do science. At the level of (institutional) agency or 
organizations, one has both options. However, the communication 
dynamics is very different. In religious communication, there is an 
original (e.g., the Bible) which is copied. Textbooks are updated and 
errors are removed, whereas errors were added by the monks' transcriptions. 
The origins of the invention of the printing press are relevant here: 
Galilei could not publish the Discorsi in Italy, but it could be 
published by Louis Elsevier in Leiden!


In science studies, we have learned to distinguish between social and 
intellectual organization. While at the level of social organization, 
scientific and religious structures are comparable, the intellectual 
organization is very different. For example, the notion of "truth" is 
preliminary in science, while it is sacrosanct in religious 
philosophy. Thus, we can elaborate the functional differentiation 
between these two codes of communication. Scientific discourse is 
validated using criteria that are coded in communication; religious 
disputatio is about a given truth.


Best,
Loet



Loet Leydesdorff

Professor emeritus, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)

l...@leydesdorff.net <mailto:l...@leydesdorff.net>; 
http://www.leydesdorff.net/
Associate Faculty, SPRU, <http://www.sussex.ac.uk/spru/>University of 
Sussex;


Guest Professor Zhejiang Univ. <http://www.zju.edu.cn/english/>, 
Hangzhou; Visiting Professor, ISTIC, 
<http://www.istic.ac.cn/Eng/brief_en.html>Beijing;


Visiting Fellow, Birkbeck <http://www.bbk.ac.uk/>, University of London;

http://scholar.google.com/citations?user=ych9gNYJ=en


------ Original Message --
From: "John Collier" <ag...@ncf.ca <mailto:ag...@ncf.ca>>
To: fis@listas.unizar.es <mailto:fis@listas.unizar.es>
Sent: 11/5/2017 4:28:31 PM
Subject: Re: [Fis] Idealism and Materialism


Krassimir,

What if, like me, you see materialism and idealism as both incorrect, 
and adopt something like Russell's neutral monism? I mention this 
because I believe information to be neutral between material and 
ideal. It is a false dichotomy in my view.


I disagree that information cannot be given by concrete examples. 
There are examples in both physics and of course in cognition that 
are used in consistent and, I think, compatible ways.


I would go so far as to say that the division has been a sad one for 
sound philosophy, and that in some respects we should start over 
again from Aristotle (to whom the division did not seem to even 
occur, in line with general Greek thinking) and the later Scholasticism.


Regards,

John


On 2017/11/05 3:07 PM, Krassimir Markov wrote:

Dear Bruno and FIS Colleagues,

Thank you very much for your useful remarks!

This week I was ill and couldn’t work.
Hope, the next week will be better for work.

Now I want only to paraphrase my post about Idealism and Materialism:

The first is founded on believing that the Intelligent Creation exists.

The second is founded on believing that the Intelligent Creation does not
exist.

Both are kinds of religion because they cannot prove their foundations
by experiments and real examples.

The scientific approach does not believe in anything in advance. The
primary concepts have to be illustrated by series of real examples. After
that the secondary concepts have to be defined and all propositions have
to be proved.

Are the mathematicians materialists or idealists?
Of course neither the first nor the second!

Mathematics 

Re: [Fis] TR: Principles of IS

2017-10-03 Thread John Collier
on 'ambiguity' that was recently published in
Progr Biophys Mol Biol July 22, 2017, FYI.
Cell-cell communication is the basis for molecular
embryology/morphogenesis. This may seem tangential at best to your
discussion of Information Science, but if you'll bear with me I will get
to the point. In my (humble) opinion, information is the 'language' of
evolution, but communication of information as a process is the mechanism.
In my reduction of evolution as communication, it comes down to the
interface between physics and biology, which was formed when the first
cell delineated its internal environment (Claude Bernard, Walter B Cannon)
from the outside environment. From that point on, the dialog between the
environment and the organism has been on-going, the organism internalizing
the external environment and compartmentalizing it to form what we
recognize as physiology (Endosymbiosis Theory). Much of this thinking has
come from new scientific evidence for Lamarckian epigenetic inheritance
from my laboratory and that of many others- how the organism internalizes
information from the environment by chemically changing the information in
DNA in the egg and sperm, and then in the zygote and offspring, across
generations. So here we have a fundamental reason to reconsider what
'information' actually means biologically. If you are interested in any of
my publications on this subject please let me know (jtor...@ucla.edu).
Thank you for any interest you may have in this alternative way of
thinking about information, communication and evolution.







Dear FIS Colleagues,

As promised, herewith the "10 principles of information science". A couple
of preliminary comments may be in order.
First, what is in general the role of principles in science? I was
motivated by the unfinished work of philosopher Ortega y Gasset, "The idea
of principle in Leibniz and the evolution of deductive theory"
(posthumously published in 1958). Our tentative information science seems
to be very different from other sciences, rather multifarious in
appearance and concepts, and cavalierly moving from scale to scale. What
could be the specific role of principles herein? Rather than opening
homogeneous realms for conceptual development, these information
principles would appear as a sort of "portals" that connect with essential
topics of other disciplines in the different organization layers, but at
the same time they should try to be consistent with each other and provide
a coherent vision of the information world.
And second, about organizing the present discussion, I bet I was too
optimistic with the commentators' scheme. In any case, for having a first
glance on the whole scheme, the opinions of philosophers would be very
interesting. In order to warm up the discussion, may I ask John Collier,
Joseph Brenner and Rafael Capurro to send some initial comments /
criticisms? Later on, if the commentators idea flies, Koichiro Matsuno and
Wolfgang Hofkirchner would be very valuable voices to put a perspectival
end to this info principles discussion (both attended the Madrid bygone
FIS 1994 conference)...
But this is the FIS list, unpredictable between the frozen states and the
chaotic states! So, everybody is invited to go ahead on his own, with the
only customary limitation of two messages per week.

Best wishes, have a good weekend --Pedro


10 PRINCIPLES OF INFORMATION SCIENCE

1. Information is information, neither matter nor energy.

2. Information is comprehended into structures, patterns, messages, or flows.

3. Information can be recognized, can be measured, and can be  processed
(either computationally or non-computationally).

4. Information flows are essential organizers of life's self-production
processes--anticipating, shaping, and mixing up with the accompanying
energy flows.

5. Communication/information exchanges among adaptive life-cycles underlie
the complexity of biological organizations at all scales.

6. It is symbolic language that conveys the essential communication
exchanges of the human species--and constitutes the core of its "social
nature."

7. Human information may be systematically converted into efficient
knowledge, by following the "knowledge instinct" and further up by
applying rigorous methodologies.

8. Human cognitive limitations on knowledge accumulation are partially
overcome via the social organization of "knowledge ecologies."


9. Knowledge circulates and recombines socially, in a continuous
actualization that involves "creative destruction" of fields and
disciplines: the intellectual Ars Magna.


10. Information science proposes a new, radical vision on the information
and knowledge flows that support individual lives, with profound
consequences for scientific-philosophical practice and for social
governance.





Re: [Fis] Causation is transfer of information

2017-03-30 Thread John Collier
Interesting papers. I have a few remarks, but no time right now. I heartily 
agree with your general point.

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Hector Zenil [mailto:hzen...@gmail.com]
Sent: Wednesday, 29 March 2017 11:00 AM
To: Terrence W. DEACON <dea...@berkeley.edu>
Cc: fis <fis@listas.unizar.es>
Subject: Re: [Fis] Causation is transfer of information

With all due respect, I am still amazed at how much the science and math around 
information developed in the last 50-60 years is ignored and neglected, with 
most people here citing at best only Shannon entropy while completely 
neglecting algorithmic complexity, logical depth, quantum information and so 
on. Your philosophical discussions are quite empty if most people ignore the 
progress that computer science and math have made in the last 60 years! Please 
take this constructively. It should be a shame for the whole field of 
Philosophy of Information and FIS.

Perhaps I can help alleviate this a little, even if it feels wrong to point you 
to my own papers on subjects relevant to the philosophical discussion:

http://www.hectorzenil.net/publications.html

They do deal with the meaning and value of information beyond Shannon entropy. 
For example, paper J21:

- Natural Scene Statistics Mediate the Perception of Image Complexity 
(available online at 
http://www.tandfonline.com/doi/abs/10.1080/13506285.2014.950365 also available 
pdf preprint in the arxiv)

and

- Rare Speed-up in Automatic Theorem Proving Reveals Tradeoff Between 
Computational Time and Information Value (https://arxiv.org/abs/1506.04349).

And we even show how Entropy fails at the most basic level:

Low Algorithmic Complexity Entropy-deceiving Graphs 
(https://arxiv.org/abs/1608.05972)
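
As a small illustration of the point (an example constructed here, not taken
from these papers): two binary strings with identical symbol frequencies have
identical empirical Shannon entropy, yet very different algorithmic structure,
which even a crude compression-based proxy for algorithmic complexity already
distinguishes.

import random, zlib
from math import log2

def empirical_entropy(s):
    # Per-symbol Shannon entropy estimated from symbol frequencies.
    return -sum((s.count(c) / len(s)) * log2(s.count(c) / len(s)) for c in set(s))

random.seed(0)
periodic = "01" * 500                                        # highly structured
scrambled = "".join(random.sample(periodic, len(periodic)))  # same frequencies, no structure

print(empirical_entropy(periodic), empirical_entropy(scrambled))  # both exactly 1.0 bit/symbol
print(len(zlib.compress(periodic.encode())),    # compresses to a handful of bytes
      len(zlib.compress(scrambled.encode())))   # compresses far less well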

Best Regards,

Hector Zenil

---
This email and any files transmitted with it are confidential and intended 
solely for the use of the individual or entity to whom they are addressed. If 
you have received this email in error please notify the sender and delete the 
message.

On Tue, Mar 28, 2017 at 10:14 PM, Terrence W. DEACON 
<dea...@berkeley.edu<mailto:dea...@berkeley.edu>> wrote:
>
> Dear FIS colleagues,
>
> I agree with John Collier that we should not assume to restrict the concept 
> of information to only one subset of its potential applications. But to work 
> with this breadth of usage we need to recognize that 'information' can refer 
> to intrinsic statistical properties of a physical medium, extrinsic 
> referential properties of that medium (i.e. content), and the significance or 
> use value of that content, depending on the context.  A problem arises when 
> we demand that only one of these uses should be given legitimacy. As I have 
> repeatedly suggested on this listserve, it will be a source of constant 
> useless argument to make the assertion that someone is wrong in their 
> understanding of information if they use it in one of these non-formal ways. 
> But to fail to mark which conception of information is being considered, or 
> worse, to use equivocal conceptions of the term in the same argument, will 
> ultimately undermine our efforts to understand one another and develop a 
> complete general theory of information.
>
> This nominalization of 'inform' has been in use for hundreds of years in 
> legal and literary contexts, in all of these variant forms. But there has 
> been a slowly increasing tendency to use it to refer to the 
> information-bearing medium itself, in substantial terms. This reached its 
> greatest extreme with the restricted technical usage formalized by Claude 
> Shannon. Remember, however, that this was only introduced a little over a 
> half century ago. When one of his mentors (Hartley) initially introduced a 
> logarithmic measure of signal capacity he called it 'intelligence' — as in 
> the gathering of intelligence by a spy organization. So had Shannon chosen to 
> stay with that usage the confusions could have been worse (think about how 
> confusing it would have been to talk about the entropy of intelligence). Even 
> so, Shannon himself was to later caution against assuming that his use of the 
> term 'information' applied beyond its technical domain.
>
> So despite the precision and breadth of application that was achieved by 
> setting aside the extrinsic relational features that characterize the more 
> colloquial uses of the term, this does not mean that these other uses are in 
> some sense non-scientific. And I am not alone in the belief that these 
> non-intrinsic properties can also (eventually) be strictly formalized and 
> thereby contribute insights to such technical fields as molecular biology and 
> cognitive neuroscience.
>
> As a result I think that it is

[Fis] Causation is transfer of information

2017-03-28 Thread John Collier
I wrote a paper some time ago arguing that causal processes are the transfer of 
information. Therefore I think that physical processes can and do convey 
information. Cause can be dispensed with.


  *   There is a copy at Causation is the Transfer of 
Information<http://web.ncf.ca/collier/papers/causinf.pdf> In Howard Sankey (ed) 
Causation, Natural Laws and Explanation (Dordrecht: Kluwer, 1999)

Information is a very powerful concept. It is a shame to restrict oneself to 
only a part of its possible applications.

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier



Re: [Fis] non-living objects COULD NOT “exchange information”

2017-03-28 Thread John Collier


John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier

From: John Collier
Sent: Tuesday, 28 March 2017 9:39 AM
To: 'darvasg' <darv...@iif.hu>
Subject: RE: [Fis] non-living objects COULD NOT “exchange information”

I wrote this a few days ago, but it is still worth posting. I might add that 
the making of choices by biological entities grades off into cases where there 
is only one choice. If determinism is true, then there are no real choices. If 
it is false, that doesn’t help either.

There are cases, to which I have given references on this list, in which 
information but no energy leads to step climbing, indicating a transformation 
of information into energy. Though the example was constructed by experimenters, 
I see nothing that could not result from a fortuitous set of physical 
circumstances. The movement could be used to trigger an informational event 
(turn a switch, for example, or select a quantum state), thus turning 
information into information.

I suspect there are simpler examples, and leave the list to come up with them. 
All I wanted to do was to demonstrate the principle. We tend to give almost 
magical properties to life. That violates my understanding of General Systems 
Theory, which applies the same principles to all systems from top to bottom, 
rather than trying to find everything in the lowest levels, as in physicalism.

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of darvasg
Sent: Saturday, 25 March 2017 11:40 AM
To: fis@listas.unizar.es<mailto:fis@listas.unizar.es>; Krassimir Markov 
<mar...@foibg.com<mailto:mar...@foibg.com>>
Subject: Re: [Fis] non-living objects COULD NOT “exchange information”


Dear Krassimir,

They can

For details, see my contribution to the 2015 Vienna IS4IS meeting and the 
subsequent publications of the proceedings!

Best, Gyuri




On 24.03.2017 at 16:25, Krassimir Markov wrote:
Dear Arturo and FIS Colleagues,
Let me remember that:
The basic misunderstanding that non-living objects could "exchange 
information" leads to many principal theoretical as well as psychological 
faults.
For instance, a photon could exchange only energy and/or reflections!
Sorry for this, my n-th remark ...
Friendly greetings
Krassimir




From: tozziart...@libero.it<mailto:tozziart...@libero.it>
Sent: Friday, March 24, 2017 4:52 PM
To: fis@listas.unizar.es<mailto:fis@listas.unizar.es>
Subject: [Fis] I: Re: Is information truly important?


Dear  Lars-Göran,
I prefer to use asap my second FIS bullet, therefore it will be my last FIS 
mail for the next days.

First of all, in special relativity, an observer is NOT by definition a 
material object that can receive and store incoming energy from other objects.
In special relativity, an observer is a frame of reference from which a set of 
objects or events are being measured.  Speaking of an observer is not 
specifically hypothesizing an individual person who is experiencing events, but 
rather it is a particular mathematical context which objects and events are to 
be evaluated from. The effects of special relativity occur whether or not there 
is a "material object that can recieve and store incoming energy from other 
objects" within the inertial reference frame to witness them.

Furthermore, take a photon (traveling at the speed of light) that crosses a 
cosmic zone close to the sun. The photon "detects" (and therefore can interact with) 
a huge sun surface (because of its high speed), while we humans on the Earth 
"detect" (and can interact with) a much smaller sun surface.
Therefore, the photon may exchange more information with the sun than the 
humans on the Earth: both the photon and the humans interact with the same sun, 
but they "detect" different surfaces, and therefore they may exchange with the 
sun a different information content.
If we also take into account that the photon detects an almost infinite, fixed 
time, this means once again that it can exchange much more information with the 
sun than we humans can.


In sum, once again, information does not seem to be a physical quantity, rather 
just a very subjective measure, depending on the speed and on the time of the 
"observer".


Arturo Tozzi
AA Professor Physics, University North Texas
Pediatrician ASL Na2Nord, Italy
Comput Intell Lab, University Manitoba
http://arturotozzi.webnode.it/

Messaggio originale
Da: "Lars-Göran Johansson" 
<lars-goran.johans...@filosofi.uu.se<mailto:lars-goran.johans...@filosofi.uu.se>>
Data: 24/03/2017 14.50
A: 
"tozziart...@libero.it<mailto:tozziart...@libero.it>"<tozziart...@libero.it<mailto:tozziart...@libero.it>>
Ogg: Re: [Fis] Is information truly important?
24 mars 2017 k

Re: [Fis] What is information? and What is life?

2017-01-10 Thread John Collier
Dear List,

I agree with Terry that we should not be bound by our own partial theories. We 
need an integrated view of information that shows its relations in all of its 
various forms. There is a family resemblance in the ways it is used, and some 
sort of taxonomy can be constructed. I recommend that of Luciano Floridi. His 
approach is not unified (unlike my own, reported on this list), but compatible 
with it, and is a place to start, though it needs expansion and perhaps 
modification. There may be some unifying concept of information, but its 
application to all the various ways it has been used will not be obvious, and a 
sufficiently general formulation may well seem trivial, especially to those 
interested in the vital communicative and meaningful aspects of information. I 
also agree with Loet that pessimism, however justified, is not the real 
problem. To some extent it is a matter of maturity, which takes both time and 
development, not to mention giving up cherished juvenile enthusiasms.

I might add that constructivism, with its positivist underpinnings, tends to 
lead to nominalism and relativism about whatever is out there. I believe that 
this is a major hindrance to a unified understanding. I understand that it 
appeared in reaction to an overzealous and simplistic realism about science and 
other areas, but I think it threw the baby out with the bathwater.

I have been really ill, hence my lack of communication. I am pleased to see this 
discussion, which is necessary for the field to develop maturity. I thought I 
should add my bit, and wish everyone a Happy New Year, with all its 
possibilities.

Warmest regards to everyone,
John

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Loet Leydesdorff
Sent: December 31, 2016 12:16 AM
To: 'Terrence W. DEACON' ; 'Dai Griffiths' 
; 'Foundations of Information Science Information 
Science' 
Subject: Re: [Fis] What is information? and What is life?

We agree that such a theory is a ways off, though some are far more 
pessimistic about its possibility than I am. I believe that we would do best to 
focus on the hole that needs filling in rather than assuming that it is an 
unfillable given.

Dear Terrence and colleagues,

It is not a matter of pessimism. We have the example of “General Systems 
Theory” of the 1930s (von Bertalanffy and others). Only gradually did one 
realize the biological metaphor driving it. In my opinion, we have become 
reflexively skeptical about claims of “generality” because we know the 
statements are framed within paradigms. Translations are needed in this 
fractional manifold.

I agree that we are moving in a fruitful direction. Your books “Incomplete 
Nature” and “The Symbolic Species” have been important. The failing options 
cannot be observed, but have to be constructed culturally, that is, in 
discourse. It seems to me that we need a kind of calculus of redundancy. 
Perspectives which are reflexively aware of this need and do not assume an 
unproblematic “given” or “natural” are perhaps to be privileged nonetheless. 
The unobservable options have first to be specified, and we need theory 
(hypotheses) for this. Perhaps this epistemological privilege can be used as a 
vantage point.

There is an interesting relation to Husserl’s Crisis of the European Sciences 
(1935): The failing (or forgotten) dimension is grounded in “intersubjective 
intentionality.” Nowadays, we would call this “discourse”. How are discourses 
structured and how can they be translated for the purpose of offering this 
“foundation”?

Happy New Year,
Loet

My modest suggestion is only that in the absence of a unifying theory we should 
not privilege one partial theory over others and that in the absence of a 
global general theory we need to find terminology that clearly identifies the 
level at which the concept is being used. Lacking this, we end up debating 
incompatible definitions, and defending our favored one that either excludes or 
includes issues of reference and significance or else assumes or denies the 
relevance of human interpreters. With different participants interested in 
different levels and applications of the information concept—from physics, to 
computation, to neuroscience, to biosemiotics, to language, to art, 
etc.—failure to mark this diversity will inevitably lead us in circles.

I urge humility with precision and an eye toward synthesis.

Happy new year to all.

— Terry

On Thu, Dec 29, 2016 at 12:30 PM, Dai Griffiths 
> wrote:

Thanks Stan,

Yes, it's a powerful and useful process.
My problem is that in this list, and in other places where such matters are 
discussed, we don't seem to be able to agree on the big picture, and the higher 
up the generalisations we go, the less we agree.

I'd like to keep open the possibility that we might be yoking ideas together 
which it may 

[Fis] BBC Documentaries 2016: The Joy of Data [FULL BBC SCIENCE DOCUMENTARY]

2016-12-16 Thread John Collier
Not bad. Certainly entertaining. I got this link through Luciano Floridi, who 
is one of the interviewees. I think it is pretty high quality, though I doubt 
that anyone here will be surprised by anything in it.

https://www.youtube.com/watch?v=Xgp7BIBtPhk

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier



Re: [Fis] A provocative issue

2016-12-11 Thread John Collier
Shannon declared in his original book that constraints are information. I don’t 
get the distinction you are trying to make. Also, Shannon information applies 
to continuous systems. If they have a form (are constrained), then they have 
finite information. Infinite information applies only if there are no 
constraints. I don’t see how that could be true in a world that has 
regularities.
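
A toy numerical gloss on the point about constraints (an illustration added
here, not Shannon's or Collier's own example): a continuous quantity confined
to a finite range and distinguishable only down to some finite resolution
carries a finite number of bits, and the bit count diverges only in the limit
where that constraint on distinguishable states is removed. The interval width
and resolutions below are arbitrary choices.

from math import log2

def bits_for_interval(width, resolution):
    # Bits needed to single out one of width/resolution distinguishable states.
    return log2(width / resolution)

for res in (1e-1, 1e-3, 1e-6, 1e-12):
    print(res, bits_for_interval(1.0, res))
# ~3.3, ~10.0, ~19.9, ~39.9 bits: finite for any finite resolution,
# unbounded only in the limit of no constraint at all.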

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Bob Logan
Sent: Sunday, 11 December 2016 10:21 PM
To: tozziart...@libero.it
Cc: fis <fis@listas.unizar.es>
Subject: Re: [Fis] A provocative issue

Bravo Arturo - I totally agree - in a paper I co-authored with Stuart Kauffman 
and others we talked about the relativity of
information and the fact that information is not an absolute. Here is the 
abstract of the paper and an excerpt from the paper that discusses the 
relativity of information. The full paper is available at: 
https://www.academia.edu/783503/Propagating_organization_an_enquiry

Best wishes - Bob Logan


Kauffman, Stuart, Robert K. Logan, Robert Este, Randy Goebel, David Hobill and 
Ilya Smulevich. 2007. Propagating Organization: An Inquiry. Biology and 
Philosophy 23: 27-45.

Propagating Organization: An Enquiry - 
Stuart Kauffman, Robert K. Logan, Robert Este, Randy Goebel, David Hobill and 
Ilya Shmulevich

Institute for Systems Biology, Seattle Washington

 Abstract: Our aim in this article is to attempt to discuss propagating 
organization of process, a poorly articulated union of matter, energy, work, 
constraints and that vexed concept, “information”, which unite in far from 
equilibrium living physical systems. Our hope is to stimulate discussions by 
philosophers of biology and biologists to further clarify the concepts we 
discuss here. We place our discussion in the broad context of a “general 
biology”, properties that might well be found in life anywhere in the cosmos, 
freed from the specific examples of terrestrial life after 3.8 billion years of 
evolution. By placing the discussion in this wider, if still hypothetical, 
context, we also try to place in context some of the extant discussion of 
information as intimately related to DNA, RNA and protein transcription and 
translation processes. While characteristic of current terrestrial life, there 
are no compelling grounds to suppose the same mechanisms would be involved in 
any life form able to evolve by heritable variation and natural selection. In 
turn, this allows us to discuss at least briefly, the focus of much of the 
philosophy of biology on population genetics, which, of course, assumes DNA, 
RNA, proteins, and other features of terrestrial life. Presumably, evolution by 
natural selection – and perhaps self-organization - could occur on many worlds 
via different causal mechanisms.
Here we seek a non-reductionist explanation for the synthesis, accumulation, 
and propagation of information, work, and constraint, which we hope will 
provide some insight into both the biotic and abiotic universe, in terms of 
both molecular self reproduction and the basic work energy cycle where work is 
the constrained release of energy into a few degrees of freedom. The typical 
requirement for work itself is to construct those very constraints on the 
release of energy that then constitute further work. Information creation, we 
argue, arises in two ways: first information as natural selection assembling 
the very constraints on the release of energy that then constitutes work and 
the propagation of organization. Second, information in a more extended sense 
is “semiotic”, that is about the world or internal state of the organism and 
requires appropriate response. The idea is to combine ideas from biology, 
physics, and computer science, to formulate explanatory hypotheses on how 
information can be captured and rendered in the expected physical 
manifestation, which can then participate in the propagation of the 
organization of process in the expected biological work cycles to create the 
diversity in our observable biosphere.
Our conclusions, to date, of this enquiry suggest a foundation which views 
information as the construction of constraints, which, in their physical 
manifestation, partially underlie the processes of evolution to dynamically 
determine the fitness of organisms within the context of a biotic universe.

Section 4. The Relativity of Information
 In Section 2 we have argued that the Shannon conception of information is 
not directly suited to describe the information of autonomous agents that 
propagate their organization. In Section 3 we have defined a new form of 
information, instructional or biotic information as the constraints that direct 
the flow of free energy to do work.
The reader may legitimately ask the question “isn’t informatio

Re: [Fis] A provocative issue

2016-12-11 Thread John Collier
Arturo, List:

This is a view that was fairly common, especially associated with Edwin Jaynes, 
but the other view has also been put forward by people like Brillouin and, more 
recently, John Wheeler, Murray Gell-Mann and Seth Lloyd, for example. 
Cosmologist David Layzer is another example. Interesting that they are all 
physicists.

My PhD student, Scott Muller, published a book based on his dissertation, 
Asymmetry: The Foundation of Information, (Springer 2007) that uses Jaynes’ 
notion of an IGUS together with group theory to define the amount of 
information in an object (I have a different way of doing that). Jaynes held 
that each IGUS had its own measure of information in something, and there was 
no common measure. Scott argued that you can combine the information measured 
by all possible IGUSs (sort of like observers or interactors, but more strictly 
defined) to get the information in the object. I define it as the minimal 
number of yes-no questions required to completely describe the thing. The two 
should be equivalent. So you are siding with Jaynes, I think. I think Scott 
nailed the idea of objective intrinsic information on solid ground.

By the way, Shannon’s measure is of the information capacity of a channel. 
There are better ways to define the information in a real situation (e.g., the 
computational notion of information), but Shannon’s approach can be adapted to 
give the same result with some relatively intuitive assumptions.
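
A toy probabilistic rendering of the yes/no-question picture (a sketch added
here, not Muller's group-theoretic construction or Collier's formal
definition): for a known distribution over outcomes, the Shannon entropy
lower-bounds the average number of binary questions needed to identify the
outcome, and an optimal (Huffman-style) questioning scheme gets within one
question of it. The example distribution is arbitrary.

import heapq
from math import log2

def entropy(p):
    # Shannon entropy, in bits, of a finite probability distribution.
    return -sum(q * log2(q) for q in p.values() if q > 0)

def avg_questions(p):
    # Average number of yes/no questions under a Huffman questioning strategy.
    heap = [(q, i, (sym,)) for i, (sym, q) in enumerate(p.items())]
    heapq.heapify(heap)
    depth = {sym: 0 for sym in p}
    counter = len(heap)
    while len(heap) > 1:
        q1, _, grp1 = heapq.heappop(heap)
        q2, _, grp2 = heapq.heappop(heap)
        for sym in grp1 + grp2:
            depth[sym] += 1      # each merge costs these outcomes one more question
        heapq.heappush(heap, (q1 + q2, counter, grp1 + grp2))
        counter += 1
    return sum(p[sym] * depth[sym] for sym in p)

dist = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
print(entropy(dist))         # 1.75 bits
print(avg_questions(dist))   # 1.75 questions on average (H <= average < H + 1)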

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of 
tozziart...@libero.it
Sent: Sunday, 11 December 2016 5:57 PM
To: fis@listas.unizar.es
Subject: [Fis] A provocative issue


Dear FISers,
I know that some of you are going to kill me, but there’s something that I must 
confess.
I notice, from the nice issued raised by Francesco Rizzo, Joseph Brenner, John 
Collier, that the main concerns are always energetic/informational arguments 
and accounts.
Indeed, the current tenets state that all is information, information being a 
real quantity that can be measured through informational entropies.
But… I ask to myself, is such a tenet true?
When I cook the pasta, I realize that, from my point of view, the cooked pasta 
encompasses more information than the uncooked one, because it acquires the 
role of something that I can eat in order to increase my possibility of 
preserving myself in the hostile environment that wants to destroy me.  However, 
from the point of view of the bug that eats the uncooked pasta, my cooked pasta 
displays less information for sure.  Therefore, information is a very 
subjective measure that, apart from its relationship with the observer, does 
not mean very much…  Who can state that an event or a fact displays more 
information than another one?
And, please, do not counter that information is a quantifiable, objective 
reality because it can be measured through informational entropy… 
Informational entropy, in Shannon’s original formulation, assumes an 
ergodic process (page 8 of Shannon’s seminal 1948 paper), i.e., 
every sequence produced by the process is the same in statistical properties, 
or, in other words, a traveling particle always crosses all the points of its 
phase space.  However, in physics and biology, facts and events are never 
ergodic.  Statistical homogeneity is just a fiction, if we evaluate the world 
around us and our brain/mind.
Therefore, the role of information could not be as fundamental as currently 
believed.
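
A toy numerical gloss on the ergodicity worry (an example added here, not
Tozzi's): for a non-stationary source, the entropy computed from pooled symbol
frequencies is not the entropy of either regime, so a single "objective"
entropy figure already reflects the stationarity assumption. The two regimes
below are arbitrary.

from math import log2

def H(p):
    # Shannon entropy, in bits, of a list of probabilities.
    return -sum(q * log2(q) for q in p if q > 0)

regime_1 = [1.0, 0.0]     # first half of the record: symbol 'A' only
regime_2 = [0.5, 0.5]     # second half: 'A' and 'B' equally likely
pooled   = [0.75, 0.25]   # symbol frequencies over the whole non-stationary record

print(H(regime_1), H(regime_2))           # 0.0 and 1.0 bits
print((H(regime_1) + H(regime_2)) / 2)    # 0.5 bits: average within-regime entropy
print(H(pooled))                          # ~0.81 bits: what the pooled estimate reports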

P.S.: topology analyzes information from another point of view, but it’s an issue 
for the next time, I think…




Arturo Tozzi

AA Professor Physics, University North Texas

Pediatrician ASL Na2Nord, Italy

Comput Intell Lab, University Manitoba

http://arturotozzi.webnode.it/



Re: [Fis] Fwd: about consciousness an Euclidean n-space

2016-12-10 Thread John Collier
Some remarks on Arturo’s comment below.

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier


From: Fis [fis-boun...@listas.unizar.es] on behalf of 
tozziart...@libero.it [tozziart...@libero.it]
Sent: December 6, 2016 4:17 AM
To: Jerry LR Chandler; fis@listas.unizar.es
Subject: [Fis] R: Re: Who may proof that consciousness is an Euclidean n-space 
???

Dear Jerry,
thanks a lot for your interesting comments.
I like very much the logical approach, a topic that is generally despised by 
scientists for its intrinsic difficulty.
We also published something about logic and the brain (currently under review), 
and therefore we hold it in high regard:
http://biorxiv.org/content/early/2016/11/15/087874

However, there is a severe problem that prevents logic from being useful in 
the description of scientific theories, explanans/explanandum, and so on. The 
severe problem has been raised by three foremost discoveries of the last 
century: quantum entanglement, nonlinear dynamics and the quantum vacuum.
Quantum entanglement, although experimentally proven by countless scientific 
procedures, is against any common sense and any possibility of logical inquiry. 
The concepts of locality and of cause/effect disappear in front of the puzzling 
phenomenon of quantum entanglement, which is intractable in terms of logic, 
whether using the successful and advanced approaches of Lesniewski-Tarski or 
of Zermelo-Fraenkel.
The same holds for nonlinear chaotic phenomena, widespread in nature, from 
sand piles, to bird flocks and to brain function. When bifurcations occur in 
logistic plots and chaotic behaviours take place, the final system 
outputs are no longer causally predictable.
The quantum vacuum predicts particle or field interactions occurring through 
breaks in CPT symmetries: this means that, illogically, the arrow of time 
can be reversed (!) in quantum systems.

[John Collier] I believe the problems here can be resolved by adopting an 
information-theoretic account of causality. I have not yet shown how it applies 
in QM or in complexly organised systems, but I see no special problems. The 
basic idea is that causal connection between two things is that the same 
information is carried by both. It is a development of Reichenbach’s 
markability account of causation, but without the questionable invocation of 
counterfactuals. You can find accounts in the two papers below. The second 
gives a brief account of how it should be applied to complexly organized 
systems. The papers are very condensed, I warn readers, but several people have 
got the idea on the first read. The second paper uses the Barwise-Seligman 
notion of information flow explicitly. It helps to know that first, but I give 
a brief description.

  *   Causation is the Transfer of 
Information<http://web.ncf.ca/collier/papers/causinf.pdf> In Howard Sankey (ed) 
Causation, Natural Laws and Explanation (Dordrecht: Kluwer, 1999)

  *   Information, causation and 
computation<http://web.ncf.ca/collier/papers/CollierJohn%20formatted.pdf> 
(2012. Information and 
Computation:<http://astore.amazon.co.uk/books-books-21/detail/9814295477> 
Essays on Scientific and Philosophical Understanding of Foundations of 
Information and Computation, Ed by Gordana Dodig Crnkovic and Mark Burgin, 
World Scientific)
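
A minimal numerical sketch of the intuition behind this account (an
illustration added here, not the formal treatment in the papers above): when Y
is produced from X, part of X's information is literally carried by Y, and the
mutual information I(X;Y) measures that shared portion. The 10% flip
probability below is an arbitrary choice.

from math import log2

def mutual_information(joint):
    # I(X;Y) in bits from a joint distribution given as {(x, y): p}.
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * log2(p / (px[x] * py[y])) for (x, y), p in joint.items() if p > 0)

# X is a fair bit; Y copies X but is flipped 10% of the time (a noisy "effect").
eps = 0.1
joint = {(0, 0): 0.5 * (1 - eps), (0, 1): 0.5 * eps,
         (1, 0): 0.5 * eps, (1, 1): 0.5 * (1 - eps)}
print(mutual_information(joint))   # ~0.53 of X's 1 bit is carried into Y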

John


Re: [Fis] Is quantum information the basis of spacetime?

2016-11-15 Thread John Collier
I don't think that Bell's inequality shows indeterminacy, meaning randomness 
or chance. It does show entanglement. There are quantum systems that are 
reversible (some are macroscopic). In most measurements there is quantum 
decoherence, which breaks up entanglement and has been compared to thermodynamic 
dissipation. In my review of Time's Arrows Today: Recent Physical and 
Philosophical Work on the Direction of Time, edited by Steven F. Savitt, 
Cambridge University Press, 1995, I wrote:

"The chapters by physicists James Leggett and Phil Stamp deal with the 
distinction between quantum decoherence and dissipation. Although it has been 
widely remarked that quantum mechanics is formally reversible, many have 
thought that the "collapse of the wave packet" implies that measurement imposes 
a direction on time. Leggett and Stamp thoroughly refute this position by 
distinguishing between decoherence and the usual statistical mechanical 
dissipation. Although they are not essential to the basic argument, 
"macroscopic" quantum systems demonstrate that decoherence is reversible. The 
so-called collapse of the wave packet introduces nothing new to the problem of 
the direction of time."

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier

> -Original Message-
> From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Bruno Marchal
> Sent: Tuesday, 15 November 2016 5:21 PM
> To: FIS Webinar <Fis@listas.unizar.es>
> Subject: Re: [Fis] Is quantum information the basis of spacetime?
> 
> 
> On 13 Nov 2016, at 10:48, Andrei Khrennikov wrote:
> 
> > Dear all,
> > I make the last remark about "physical information". The main problem
> > of quantum physics is to justify so called IRREDUCIBLE QUANTUM
> > RANDOMNESS (IQR). It was invented  by von Neumann. Quantum
> randomness,
> > in contrast to classical, cannot be reduced to variations in an
> > ensemble. One single electron is irreducibly random.
> >
> > The operational Copenhagen interpretation cannot "explain" the origin
> > of  IQR, since it does not even try to explain anything, "Shut up and
> > calculate!" (R. Feynman to his students). Nevertheless, many  top
> > experts in QM want some kind of "explanation". The informational
> > approach to QM is one of such attempts. Roughly speaking, one tries to
> > get IQR from fundamental  notion of "physical information" as the
> > basic blocks of Nature.
> >
> > This is very important activity, since nowadays IQR has huge
> > technological value, the quantum random generators are justified
> > through IQR. And this is billion Euro project.
> >
> > Finally, to check experimentally the presence of IQR, we have to
> > appeal to violation  of Bell's inequality. And here (!!!) to proceed
> > we  have to accept the existence of FREE WILL. Thus finally the
> > cognitive elements appears, but in  very surprisingly setting
> 
> 
> Bell's inequality shows only indeterminacy and non-locality in the Mono-world
> QM theory. I have shown that local and deterministic Mechanism
> (the simple Descartes Mechanist hypothesis in cognitive science) implies the
> *appearance* of non-locality and indeterminacy, and this before I knew
> anything about QM. QM without collapse (non-Copenhagen
> theory) confirms Descartes' Mechanism (in cognitive science, not in physics).
> The indeterminacy and non-locality are an appearance emerging from our
> abstraction with respect to the many computations, which can be proved to
> exist from the universally accepted assumption of elementary arithmetic.
> 
> You are logically valid in QM + the assumption of a unique reality, which
> needs the assumption that brain are not Turing emulable. But that seems to
> me quite speculative and almost like an ad hoc assumption to avoid the
> computationalist solution of the mind-body problem. Better to continue the
> testing and abandon Mechanism only when we find good evidences against
> it, I think.
> 
> Bruno
> 
> 
> 
> 
> 
> >
> > Yours, andrei
> >
> > Andrei Khrennikov, Professor of Applied Mathematics, Int. Center Math
> > Modeling: Physics, Engineering, Economics, and Cognitive Sc.
> > Linnaeus University, Växjö, Sweden
> > My RECENT BOOKS:
> > http://www.worldscientific.com/worldscibooks/10.1142/p1036
> > http://www.springer.com/in/book/9789401798181
> > http://www.panstanford.com/books/9789814411738.html
> > http://www.cambridge.org/cr/academic/subjects/physics/econophysics-
> and
> > -financial-physics/quantum-social-science
> > http://www.springer.com/us/book/978

Re: [Fis] Is quantum information the basis of spacetime?

2016-11-12 Thread John Collier
More on Quantum information and emergent spacetime, this time by Erik P. 
Verlinde:
Emergent Gravity and the Dark Universe<https://arxiv.org/abs/1611.02269>

There is a less formal review at
http://m.phys.org/news/2016-11-theory-gravity-dark.html

I consider the idea very speculative, as I have seen no work on information 
within a spacetime boundary except for this sort of work.

Of course, meaning need not apply. I doubt that it is bounded by language, but 
it at least has to be representational. Perhaps more is also required. I am 
reluctant to talk of meaning when discussing the semiotics of biological 
chemicals, for example, but could not find a better word. A made up word like 
Deacon’s “entention” might work best, but it still would not apply to the 
physics cases, even though the information in the boundaries, in all cases but 
the internal-information one, can tell you about the spacetime structure within 
the boundary. That seems to me to be like smoke and fire: smoke doesn’t 
mean fire, despite the connection.

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Loet Leydesdorff
Sent: Saturday, 12 November 2016 9:29 PM
To: 'Alex Hankey' <alexhan...@gmail.com>; 'FIS Webinar' <Fis@listas.unizar.es>
Subject: Re: [Fis] Is quantum information the basis of spacetime?

Dear Alex and colleagues,

Thank you for the reference; but my argument was about “meaning”. “Meaning” can 
only be considered as constructed in language. Other uses of the word are 
metaphorical. For example, the citation to Maturana.

Information, in my opinion, can be defined content-free (a la Shannon, etc.) 
and then be provided with meaning in (scholarly) discourses. I consider physics 
as one among other scholarly discourses. Specific about physics is perhaps the 
universalistic character of the knowledge claims. For example: “Frieden's 
points apply to quantum physics
as well as classical physics.“ So what? This seems to me a debate within 
physics without much relevance for non-physicists (e.g., economists or 
linguists).

Best,
Loet


Loet Leydesdorff
Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)
l...@leydesdorff.net <mailto:l...@leydesdorff.net> ; http://www.leydesdorff.net/
Associate Faculty, SPRU, <http://www.sussex.ac.uk/spru/> University of Sussex;
Guest Professor Zhejiang Univ.<http://www.zju.edu.cn/english/>, Hangzhou; 
Visiting Professor, ISTIC, <http://www.istic.ac.cn/Eng/brief_en.html> Beijing;
Visiting Professor, Birkbeck<http://www.bbk.ac.uk/>, University of London;
http://scholar.google.com/citations?user=ych9gNYJ=en

From: Alex Hankey [mailto:alexhan...@gmail.com]
Sent: Saturday, November 12, 2016 8:07 PM
To: Loet Leydesdorff; FIS Webinar
Subject: Re: [Fis] Is quantum information the basis of spacetime?

Dear Loet and Fis Colleagues,

Are you aware of Roy Frieden's
'Physics from Fisher Information'?
His book was published in the 1990s.
I consider it a very powerful statement.

Ultimately everything we can detect at
both macroscopic and microscopic levels
depends on information production from
a quantum level that forms Fisher Information.

Frieden's points apply to quantum physics
as well as classical physics.
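
A small numerical sketch of Fisher information itself (an illustration added
here, not Frieden's derivation): for a Gaussian with known sigma, the Fisher
information about the mean is 1/sigma^2, and a Monte Carlo estimate of
E[(d/dmu log f)^2] recovers that value. The parameter values are arbitrary.

import random

random.seed(1)
mu, sigma, n = 0.0, 2.0, 200_000

# For a Gaussian f(x; mu), the score is d/dmu log f(x; mu) = (x - mu) / sigma**2.
scores_sq = [((random.gauss(mu, sigma) - mu) / sigma**2) ** 2 for _ in range(n)]
fisher_estimate = sum(scores_sq) / n

print(fisher_estimate)   # ~0.25, close to the exact value 1/sigma**2 = 0.25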

Best wishes,

Alex Hankey


On 12 November 2016 at 18:56, Loet Leydesdorff 
<l...@leydesdorff.net<mailto:l...@leydesdorff.net>> wrote:
Dear Marcus,

When considering things in terms of "functional significance" one must confront 
the need to address "meaning" in terms of both the living and the physical . . 
. and their necessarily entangled nature.

“Meaning” is first a linguistic construct; its construction requires interhuman 
communication. However, its use in terms of the living and/or the physical is 
metaphorical. Instead of a discourse, one can then consider this (with Maturana) as 
a “second-order consensual domain” that functions AS a semantic domain without 
being one; Maturana (1978, p. 50):

“In still other words, if an organism is observed in its operation within a 
second-order consensual domain, it appears to the observer as if its nervous 
system interacted with internal representations of the circumstances of its 
interactions, and as if the changes of state of the organism were determined by 
the semantic value of these representations. Yet all that takes place in the 
operation of the nervous system is the structure-determined dynamics of 
changing relations of relative neuronal activity proper to a closed neuronal 
network.”

Failing to "make that connection" simply leaves one with an explanatory gap. 
And then, once connected, a further link to "space-time" is also easily located 
. . .

Yes, indeed: limiting the discussion to the metaphors instead of going to the 
phore (

[Fis] Is quantum information the basis of spacetime?

2016-11-03 Thread John Collier
Apparently some physicists think so.

https://www.scientificamerican.com/article/tangled-up-in-spacetime/?WT.mc_id=SA_WR_20161102

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier



Re: [Fis] Scientific communication (from Mark)

2016-10-14 Thread John Collier
Peirce's answer is a definite "yes", and is a form pf realism. The idea that 
patterns require an observer is the basis for nominalism, which was adopted by 
most empiricists like Locke and Hume. Plato, though, was also a nominalist, 
though the reasoning is not so straight-forward. The empiricist Berkeley, with 
his requirement of God's observation, is an objective idealism, but 
nominalistic nonetheless, in line with the other British Empiricists of his era.

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier

> -Original Message-
> From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Dai Griffiths
> Sent: Friday, 14 October 2016 4:16 PM
> To: fis@listas.unizar.es
> Subject: Re: [Fis] Scientific communication (from Mark)
> 
> In trying to answer this question, I find myself asking "Do patterns exist
> without an observer?".
> 
> A number of familiar problems then re-emerge, which blur my ability to
> distinguish between foreground and background.
> 
> Dai
> 
> On 13/10/16 11:32, Karl Javorszky wrote:
> > Do patterns contain information?
> 
> --
> -
> 
> Professor David (Dai) Griffiths
> Professor of Education
> School of Education and Psychology
> The University of Bolton
> Deane Road
> Bolton, BL3 5AB
> 
> Office: T3 02
> http://www.bolton.ac.uk/IEC
> 
> SKYPE: daigriffiths
> UK Mobile +44 (0)749151559
> Spanish Mobile: + 34 687955912
> Work: + 44 (0)7826917705
> (Please don't leave voicemail)
> email:
> d.e.griffi...@bolton.ac.uk
> dai.griffith...@gmail.com
> 


Re: [Fis] Essential Core?

2016-07-08 Thread John Collier
Comment inserted below yours.

John Collier
Professor Emeritus and Senior Research Associate
University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Michel Godron
Sent: Friday, 08 July 2016 4:52 PM
To: fis@listas.unizar.es; l...@leydesdorff.net
Subject: Re: [Fis] Essential Core?


My responses are in red
Your message was received. THANK YOU. Regards. M. Godron
Le 08/07/2016 à 14:42, Pedro C. Marijuan a écrit :
Dear FIS Colleagues,

Some brief responses to the different parties:

Marcus: there were several sessions dealing with info physics, where I remember 
some historical connotations with mechanics emerged. Mostly 1998 and 2002 
chaired by Koichiro Matsuno and 2004 by Michel Petitjean. Afterwards the theme 
has surfaced relatively often. About the present possibilities for a UTI, my 
opinion is that strictly remaining within Shannon's and anthropocentric 
discourse boundaries there is no way out.
Yes, but it is not the same with Brillouin's information: I could send you a 
text in French which demonstrates the convergence between that information and 
thermodynamic negentropy. For twenty years I have not found an English journal 
interested in this problem, because I am a biologist and the biological journals 
were not interested.
[John Collier] I agree. I have read only an English translation of Science and 
Information Theory. I read it as an undergrad, and it has strongly influenced 
my views. It is unfortunate, I think, that it hasn't influenced English 
speaking scientists much. I have also seen some bad misreadings of what he was 
saying.



Re: [Fis] Progress on black hole information paradox

2016-07-01 Thread John Collier
That is insulting. Please be more careful in the future.

John Collier
Professor Emeritus and Senior Research Associate
University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Krassimir Markov [mailto:mar...@foibg.com]
Sent: Tuesday, 28 June 2016 7:00 PM
To: John Collier <colli...@ukzn.ac.za>; fis <fis@listas.unizar.es>
Subject: Re: [Fis] Progress on black hole information paradox

Dear John and FIS Colleagues,
The main paradox of the “black hole information paradox” is that someone may know 
what a “black hole” is but at the same time have no idea what “information” is.
Friendly regards
Krassimir






From: John Collier<mailto:colli...@ukzn.ac.za>
Sent: Tuesday, June 28, 2016 2:01 PM
To: fis<mailto:fis@listas.unizar.es>
Subject: [Fis] Progress on black hole information paradox

Not solved yet, as the method applies only to EM radiation, and not to gravity (where 
the real problem lies in any case).

I note that the problem can be stated properly only by using information theory 
(or something that is equivalent – same models).

http://physicsworld.com/cws/article/news/2016/jun/08/soft-hairs-help-resolve-the-black-hole-information-paradox

John Collier
Professor Emeritus and Senior Research Associate
University of KwaZulu-Natal
http://web.ncf.ca/collier




Re: [Fis] Progress on black hole information paradox

2016-07-01 Thread John Collier
That is one limited way to think of information. It is reasonably precise, 
which is an advantage. But it ignores, and in fact rules out, other usages that 
share important basic properties, suggesting a unified notion that goes well 
beyond the narrow usage you prefer, Krassimir.

John Collier
Professor Emeritus and Senior Research Associate
University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Krassimir Markov
Sent: Tuesday, 28 June 2016 8:58 PM
To: FIS <fis@listas.unizar.es>
Subject: [Fis] Progress on black hole information paradox

Dear Francesco,
Thank you for the polite words!
In addition to your explanation, I have to point out that, from my point of view, 
we have fundamentally opposite understandings of the concept of information.
Your position is that information is primary and matter and energy are 
secondary, i.e. information created both of them.
My understanding is that information is a kind of reflection in material 
entities, but not every reflection is information.
The “reflection” is an internal structural or functional difference which has been 
created by an interaction between entities.
Only living creatures may operate with reflections in their consciousness.
In other words, “information” is a reflection in consciousness for which, in the 
same consciousness, there exists evidence of what the reflection reflects.
Friendly regards
Krassimir

PS: This is my second post for this week.
I will spend the next half month at the Summer Session of the ITHEA International 
Conferences (http://www.ithea.org/conferences/itaf2016.htm).
Because of this I shall be silent until the middle of July.
Have a nice and happy summer!

From: Francesco Rizzo<mailto:13francesco.ri...@gmail.com>
Sent: Tuesday, June 28, 2016 8:29 PM
To: Krassimir Markov<mailto:mar...@foibg.com>
Subject: Re: [Fis] Progress on black hole information paradox

Dear John, Krassimir and All,
Information is an infinite or manifold way of taking form (neg-entropy); 
dis-information is an infinite or manifold way of losing form (entropy). With my 
process of trans-in-formation, the heart of the "New Economics", consisting of the 
intake (input) of matter, energy and information and the emission (output) of matter, 
energy and information in different states, I understood 20 years before S. Hawking, 
even though I am an economist, that his theory did not work. He arrived at the same 
modest conclusions as mine in 2004-2005. Moreover, energy and matter are nothing but 
two kinds of information, so the single, fundamental law of life and of science is 
INFORMATION. Many are now beginning to recognize or admit this, but I have always 
thought it, written it and proposed it to economists, who are often as stiff-necked 
as the apostle Peter. I will not dwell on the details of this problem, which are 
contained in at least a dozen of my books, above all concerning quantum indeterminacy 
and gravitational indeterminacy.
To tell the truth, I was prompted to send this very (indeed too) concise e-mail by 
the black-hole information paradox presented and suggested so magnificently by John 
Collier, and by Krassimir Markov's equally remarkable observation that someone may 
know what a "black hole" is while having no idea what "information" is. Many thanks 
to both of them and to all of you who put up with my (being) Italian.
A truly affectionate and grateful embrace.
Francesco

2016-06-28 19:00 GMT+02:00 Krassimir Markov 
<mar...@foibg.com<mailto:mar...@foibg.com>>:
Dear John and FIS Colleagues,
The main paradox of the “black hole information paradox” is that someone may know 
what a “black hole” is but at the same time have no idea what “information” is.
Friendly regards
Krassimir






From: John Collier<mailto:colli...@ukzn.ac.za>
Sent: Tuesday, June 28, 2016 2:01 PM
To: fis<mailto:fis@listas.unizar.es>
Subject: [Fis] Progress on black hole information paradox

Not solved yet, as the method applies only to EM radiation, and not to gravity (where 
the real problem lies in any case).

I note that the problem can be stated properly only by using information theory 
(or something that is equivalent – same models).

http://physicsworld.com/cws/article/news/2016/jun/08/soft-hairs-help-resolve-the-black-hole-information-paradox

John Collier
Professor Emeritus and Senior Research Associate
University of KwaZulu-Natal
http://web.ncf.ca/collier



Re: [Fis] Fw: "Mechanical Information" in DNA

2016-06-09 Thread John Collier
I am inclined to agree with Joseph. That is why I put “mechanical information” 
in shudder quotes in my Subject line.

On the other hand, one of the benefits of an information approach is that one 
can add together information (taking care to subtract effects of common 
information – also describable as correlations). So I don’t think that the 
reductionist perspective follows immediately from describing the target 
information in the paper as “mechanical”. “Mechanical”, “mechanism” and similar 
terms can be used (and have been used) to refer to processes that are not 
reducible. “Mechanicism” and “mechanicist” can be used to capture reducible 
dynamics that we get from any conservative system (what I call Hamiltonian 
systems in my papers on the dynamics of emergence – such systems don’t show 
emergent properties except in a trivial sense of being unanticipated). I think 
it is doubtful at best that the mechanical information referred to is 
mechanicist.

John Collier
Professor Emeritus and Senior Research Associate
University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Joseph Brenner
Sent: Thursday, 09 June 2016 11:10 AM
To: fis <fis@listas.unizar.es>
Subject: [Fis] Fw: "Mechanical Information" in DNA

Dear Folks,

In my humble opinion, "Mechanical Information" is a contradiction in terms when 
applied to biological processes as described, among others, by Bob L. and his 
colleagues. When applied to isolated DNA, it gives at best a reductionist 
perspective. In the reference cited by Hector, the word 'mechanical' could be 
dropped or replaced by spatial without affecting the meaning.

Best,

Joseph

- Original Message -
From: Bob Logan<mailto:lo...@physics.utoronto.ca>
To: Moisés André Nisenbaum<mailto:moises.nisenb...@ifrj.edu.br>
Cc: fis<mailto:fis@listas.unizar.es>
Sent: Thursday, June 09, 2016 4:04 AM
Subject: Re: [Fis] "Mechanical Information" in DNA

Thanks to Moises for the mention of my paper with Stuart Kauffman. If anyone is 
interested in reading it one can find it at the following Web site:

https://www.academia.edu/783503/Propagating_organization_an_enquiry

Here is the abstract:

Propagating Organization: An Inquiry.

Stuart Kauffman, Robert K. Logan, Robert Este, Randy Goebel, David Hobill and 
Ilya Smulevich.

2007. Biology and Philosophy 23: 27-45.
Abstract
Our aim in this article is to attempt to discuss propagating organization of 
process, a poorly articulated union of matter, energy, work, constraints and 
that vexed concept, “information”, which unite in far from equilibrium living 
physical systems. Our hope is to stimulate discussions by philosophers of 
biology and biologists to further clarify the concepts we discuss here. We 
place our discussion in the broad context of a “general biology”, properties 
that might well be found in life anywhere in the cosmos, freed from the 
specific examples of terrestrial life after 3.8 billion years of evolution. By 
placing the discussion in this wider, if still hypothetical, context, we also 
try to place in context some of the extant discussion of information as 
intimately related to DNA, RNA and protein transcription and translation 
processes. While characteristic of current terrestrial life, there are no 
compelling grounds to suppose the same mechanisms would be involved in any life 
form able to evolve by heritable variation and natural selection. In turn, this 
allows us to discuss at least briefly, the focus of much of the philosophy of 
biology on population genetics, which, of course, assumes DNA, RNA, proteins, 
and other features of terrestrial life. Presumably, evolution by natural 
selection – and perhaps self-organization - could occur on many worlds via 
different causal mechanisms.
Here we seek a non-reductionist explanation for the synthesis, accumulation, 
and propagation of information, work, and constraint, which we hope will 
provide some insight into both the biotic and abiotic universe, in terms of 
both molecular self reproduction and the basic work energy cycle where work is 
the constrained release of energy into a few degrees of freedom. The typical 
requirement for work itself is to construct those very constraints on the 
release of energy that then constitute further work. Information creation, we 
argue, arises in two ways: first information as natural selection assembling 
the very constraints on the release of energy that then constitutes work and 
the propagation of organization. Second, information in a more extended sense 
is “semiotic”, that is about the world or internal state of the organism and 
requires appropriate response. The idea is to combine ideas from biology, 
physics, and computer science, to formulate explanatory hypotheses on how 
information can be captured and rendered in the expected physical 
manifestation, which can then participate in the propagation of the 
o

Re: [Fis] Fw: Clarifying Posting. Speculative Realism

2016-05-08 Thread John Collier
Stan, Joseph,

I don’t see any general advantage of a process philosophy over a philosophy of 
things, though Every Thing Must Go argues that things are misleading in modern 
physics, and aren’t needed anyway. We argue that in many cases processes work 
better, but we don’t argue solely in favour of processes, either, since they 
have their own problems. Instead we argue for the more inclusive idea of 
structures, which are definitionally relational. They are more accessible than 
things, but don’t rule out an entire metaphysics that includes things, qualia and 
much else. Structuralism merely imposes some discipline on the chaos. It does 
not propose oppositions (though many are constructed in its name).
Processes come in subsets, but only as types. This follows from the definition 
of set. Interlinking of processes spatiotemporally produces networks.

Retaining two-valued logic in some cases at least seems to me to be an advantage. 
Logic is an apparatus, a tool, and to predefine which tools are to be useful is 
as fallacious on one side as on the other, especially when the subject matter 
appears to be confused. It becomes much too easy to give in too quickly. In 
particular, applying logic to itself seems to require a two-valued approach to 
avoid degenerating into Babylonian nihilism (Zinoviev). The most appropriate 
application of two-valued logic is to logic itself. It illuminates logic in a 
way that nothing else is able to.

Two-valued logic gives birth to a myriad of logics. I am not a big fan of 
pluralism, preferring simplicity if it can be effective, but sometimes it is 
the best we can do, given our mental limitations and the inherent complexity of 
some of the things we study (see both Bill Wimsatt’s methods for finite minds 
and Paul Cilliers’ positive postmodernism here). There is room for, nay need 
for, at least four basic foundational but complexly inter-related metaphysical 
attitudes going back at least to the Greeks, and found in many other cultures 
as well. (See William Irwin Thompson’s excellent At the Edge of History and/or 
Cosmography in the Review of Metaphysics starting in the mid-50s.)

John Collier
Professor Emeritus and Senior Research Associate
University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Stanley N Salthe
Sent: Sunday, 08 May 2016 4:13 PM
To: Joseph Brenner <joe.bren...@bluewin.ch>; fis <fis@listas.unizar.es>
Subject: Re: [Fis] Fw: Clarifying Posting. Speculative Realism

Joseph -- Regarding:

As it turns out, however, Speculative Realism possesses its own set of 
weaknesses which can be ascribed in a general way to its retention of concepts 
embodying classical binary, truth-functional logic. These include an ontology 
of 'things' rather than processes as the furniture of the world, a logic of 
non-contradiction and a ground of existence that has reason and value, but 
excludes the possibility of a ground of existence which includes incoherence 
and contradiction.
S: Well, why cannot processes be described by subsetting? As in: {energy 
dissipation {work {building a box}}}
and
{energy dissipation {finds quickest route around an obstruction {fails to win 
the race}}}
STAN

On Fri, May 6, 2016 at 9:32 PM, Joseph Brenner 
<joe.bren...@bluewin.ch<mailto:joe.bren...@bluewin.ch>> wrote:
Dear Friends and Colleagues,

The last couple of postings have opened the discussion in a direction their 
authors may not have intended. Bob's felt personal plea for a phenomenological 
approach to biology, and hence to other sciences, and as the foundation of a 
philosophy, begs the question of non-phenomenological approaches which may be 
equally or more valid.

We all agree the mind is capable of phenomenal experience and is not a machine, 
but the (correct) arguments being made seem to me expressions, in various 
styles, of the non-fundamentality of matter and energy. Unless I am wrong, this 
is at least a still open question. Further, Terry's (again correct) statements 
about the importance of the Liar and Goedel paradoxes perhaps overlook one 
aspect of them: they (the paradoxes) themselves are only relatively simple 
binary cases that can be considered reduced versions of some more fundamental, 
underlying principle governing relationships in the real, physical world. These 
relationships are crucial to an understanding of the non-binary properties of 
information.

A recent book by Tom Sparrow is entitled "The End of Phenomenology". It 
proposes a new science-free doctrine, Speculative Realism, to provide a link 
between phenomena and reality which in my opinion also fails, but may be of 
interest to some of you. I wrote about this doctrine:

As it turns out, however, Speculative Realism possesses its own set of 
weaknesses which can be ascribed in a general way to its retention of concepts 
embodying classical binary, truth-functional logic. These include an ontology 

Re: [Fis] Meaning in neurosciences

2016-04-15 Thread John Collier
A short comment on one of Pedro’s suggestions.

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Pedro C. Marijuan
Sent: Friday, April 15, 2016 11:01 AM
To: 'fis' <fis@listas.unizar.es>
Subject: [Fis] Meaning in neurosciences

Dear FIS colleagues,

[John Collier] … clip

The suggestion (to all) is to explore whether phi, rather than relating to the 
emergence of consciousness, would relate to the emergence of meaning. When all 
the fast-circulating activations and inhibitions between neural mappings, 
usually involving opposing flows of neuronal "energy" and informational 
"entropy", finally "click" and achieve convergence on an optimized state, this 
represents the collective achievement of meaning. Thus, phi would be a highly 
dynamic, fluctuating indicator showing the evolution of the cascades of meaning. 
Let us imagine the thresholds pointed out by Bob in ecological networks, but 
circulating at a fiendish speed (could values of phi and resilience indexes have 
a similar nature?)

[John Collier] Interesting suggestion, Pedro. I have read a bit about phi, and 
it seems to me to be sound, but I really need to investigate it at greater 
depth. Assuming it is sound, I have been unclear what it has to do with 
consciousness. Consciousness doesn’t seem to me to be a property that admits of 
degrees (one can be conscious of more or less, but not more or less conscious, 
is my worry here). However, the suggestion that it has to do with meaning seems 
to me to be more appealing, since meaning can come in degrees I would think – 
my objection above to degrees of consciousness does not seem to apply so 
readily. Certainly some works of art (poetry, especially) are more meaningful 
than others, and I would think that applies to representations in general, 
e.g., of the colour red compared to being coloured.

If we think that meaning requires an interpretant (I do, though I am not sure 
that anything with an interpretant is meaningful), then the interpretant can 
vary both in scope and specificity. A very general interpretant has a broad 
scope (think, for example, of the final interpretant of a functional trait, 
which is in the preservation of the autonomy of whatever bears it compared to 
the immediate interpretant of the trait, which will be a specific goal or end). 
I think that specificity is related, but not on the same dimension: functions 
might be more or less specific, but might well have a common final 
interpretant, with the same scope resulting. I am pretty sure it is easy to 
come up with linguistic examples as well (e.g., mass considered in the scope of 
physical theory compared to mass as something measured, and mass-energy and the 
more specific mass alone).

This fits fairly well with my understanding of Bob’s work as well, though I am 
not so ready to use the notion of meaning there, but there is something similar.



John


[Fis] Information Conservation in black holes

2016-01-30 Thread John Collier
List,

Sorry I haven't been able to respond to the interesting remarks on my last 
post, but it took a while to digest them, and my current health concerns take 
up a lot of my time, so I haven't had time to come up with responses that are 
properly thought out.

In the meantime, here is an interesting Nature news report about Hawking's (and 
Strominger's) recent proposal for how information can be preserved in black 
holes (a problem his 1976 paper set up for the laws of physics, which imply 
information conservation at the most basic level). The solution involves a way 
empty space can carry information in QM via "soft particles". The answer is 
apparently not completely worked out as yet, and there are critics.

http://www.nature.com/news/hawking-s-latest-black-hole-paper-splits-physicists-1.19236?WT.ec_id=NEWS-20160128=50572206=MTc2NjY1MTQ2NQS2=843774519=ODQzNzc0NTE5S0

Seth Lloyd described a different possible explanation in his book Programming 
the Universe: A Quantum Computer Scientist Takes On the 
Cosmos<https://www.wikiwand.com/en/Programming_the_Universe>, 
Knopf<https://www.wikiwand.com/en/Alfred_A._Knopf> (2000) that involves taking 
into consideration the information in boundaries, which I found plausible, 
since the information preservation in physics follows from consideration of 
basic laws together with the constraints of boundary conditions, neither alone.

Perhaps the two approaches are not really distinct. They may eventually cast 
light on each other. For the time being the Hawking/Strominger proposal also 
looks like it can solve the "firewall" problem as well, which has the Black 
Hole boundary being very hot (again, contrary to physical expectations), 
because information can be transferred into radiation instead of energy, so the 
information transfer doesn't require a high temperature at the black hole 
boundary, unlike other forms of radiation production.  All of these 
explanations, and even stating the problem, require information notions, not 
just energy as in classical physics.

John Collier
Professor Emeritus and Senior Research Associate
University of KwaZulu-Natal
http://web.ncf.ca/collier



[Fis] Toyabe 2010 [ Information converted to energy ] / Van den Broeck 2010 Thermodynamics of Information / Cartlidge 2010 Information converted to energy

2016-01-14 Thread John Collier
Stan Salthe sent the item below to Pedro and myself, but not to the list, as he 
had used up his posting allotment. With the permission of both of them, who 
think that this is an important issue, I am posting some brief comments I made 
back to Stan, as well as Stan’s email content, in the hope that the issue will 
get more discussion this time. (I posted a link to the 2010 article when it came 
out.) The relevant material starts below the line, and Stan's email forwarded 
from Malcolm Dean is below that. It concerns the use of changed boundary 
conditions to move things rather than energy differences, suggesting that 
information can be used instead of energy to cause changes in a system (another 
way of looking at this is that information can be a force in itself, not merely 
a constraint on other actions). In particular, the final state has greater free 
energy than the initial state (it is in the end-state potential energy of the 
manipulated particles in an electric field), the energy arising from the 
manipulation of the boundary conditions based on the particle's location. The 
original authors described this as information-to-energy conversion.
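
As a minimal numerical sketch of the scale involved (my own illustration, not taken from the paper; it assumes only the textbook Szilard/Landauer bound of k_B T ln 2 of work per bit and an illustrative room temperature):

    # Minimal sketch: the maximum work obtainable from one bit of information
    # at temperature T, via the Szilard/Landauer bound W <= k_B * T * ln 2.
    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300.0            # illustrative room temperature, K (an assumed value, not from the paper)

    work_per_bit = k_B * T * math.log(2)
    print("Maximum work per bit at %.0f K: %.3e J" % (T, work_per_bit))   # ~2.87e-21 J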



I posted a different pointer to this to fis some time ago, but the reaction 
from the list was almost nothing, or skeptical, though the main objection was 
that we could understand what was going on without using the information 
concept. My response to that was that not  using the word does not mean that 
the concept is not being used.

Of course, if you think that information is always meaningful to some 
interpreter (alternatively, always a coding of something that has had meaning 
to some mind, or the like) then the argument in the paper is a nonstarter. I 
would argue that this puts unnecessary obstacles in the way of a unified 
approach to information, and that the issue of the interpretation of 
information gets obscured by presupposing information is carried only by 
meaningful communication.

John Collier
Professor Emeritus and Senior Research Associate
University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Stanley N Salthe [mailto:ssal...@binghamton.edu]
Sent: Thursday, 14 January 2016 4:56 PM
To: Pedro Marijuan; John Collier
Subject: Fwd: Toyabe 2010 [ Information converted to energy ] / Van den Broeck 
2010 Thermodynamics of Information / Cartlidge 2010 Information converted to 
energy


-- Forwarded message --
From: Malcolm Dean <malcolmd...@gmail.com<mailto:malcolmd...@gmail.com>>
Date: Thu, Jan 14, 2016 at 6:13 AM
Subject: Toyabe 2010 [ Information converted to energy ] / Van den Broeck 2010 
Thermodynamics of Information / Cartlidge 2010 Information converted to energy
To:
http://www.nature.com/nphys/journal/v6/n12/full/nphys1821.html
Nature Physics 6, 988–992 (2010) doi:10.1038/nphys1821
Experimental demonstration of information-to-energy conversion and validation 
of the generalized Jarzynski equality
Shoichi Toyabe, Takahiro Sagawa, Masahito Ueda, Eiro Muneyuki & Masaki Sano

In 1929, Leó Szilárd invented a feedback protocol [1] in which a hypothetical 
intelligence—dubbed Maxwell’s demon—pumps heat from an isothermal environment 
and transforms it into work. After a long-lasting and intense controversy it 
was finally clarified that the demon’s role does not contradict the second law 
of thermodynamics, implying that we can, in principle, convert information to 
free energy [2–6]. An experimental demonstration of this 
information-to-energy conversion, however, has been elusive. Here we 
demonstrate that a non-equilibrium feedback manipulation of a Brownian particle 
on the basis of information about its location achieves a Szilárd-type 
information-to-energy conversion. Using real-time feedback control, the 
particle is made to climb up a spiral-staircase-like potential exerted by an 
electric field and gains free energy larger than the amount of work done on it. 
This enables us to verify the generalized Jarzynski equality [7], and suggests a 
new fundamental principle of an ‘information-to-heat engine’ that converts 
information into energy by feedback control.


http://www.nature.com/nphys/journal/v6/n12/full/nphys1834.html
[ <--- Please send this PDF if you have access.  -- M. ]

Nature Physics 6, 937–938 (2010) doi:10.1038/nphys1834
Thermodynamics of information: Bits for less or more for bits?
Christian Van den Broeck
Recent advances in the formulation of the second law of thermodynamics have 
rekindled interest in the connections between statistical mechanics and 
information processing. Now a 'Brownian computer' has approached the 
theoretical limits set by the rejuvenated second law. Or has it?




http://physicsworld.com/cws/article/news/2010/nov/19/information-converted-to-energy
Physics World, 19 November 2010
Information converted to energy

Physicists in Japan have shown experimentally that a particle can be made

Re: [Fis] Sustainability through multilevel research: The Lifel, Deep Society Build-A-Thon - 1

2015-12-17 Thread John Collier
Interesting post, Nikhil. One of my PhD students is doing his thesis on 
egalitarian (living system) centred morality. He is not aiming to draw moral 
conclusions, but to lay out a coherent position based in complexity theory, 
especially in the work of Paul Cilliers (who he studied with for his MA) and 
myself.

Extending economics to include the values of all living systems is a natural 
outgrowth of my student's work, though he has enough on his plate right 
now.

John Collier
Professor Emeritus, UKZN
http://web.ncf.ca/collier

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Nikhil Joshi
Sent: Thursday, 17 December 2015 10:53
To: FIS Group
Cc: Nikhil Joshi
Subject: [Fis] Sustainability through multilevel research: The Lifel, Deep 
Society Build-A-Thon - 1

Dear All,
The research presented here is focused on gleaning insights leading to new 
solutions to the economics vs ecosystem conflict. The roots of many of our 
problems in ecological sustainability lie in the fact that our socio-economic 
systems are largely focused on fulfilling only human needs and the needs of 
human organizations. In doing so, as pointed out by Pedro, Bob, Francesco and 
others in this group our economics largely ignores the productive value of our 
ecosystems and the true costs of our development on our life supporting living 
systems.


I term such a society a "shallow society", a society that is focused on the 
development of a single species and largely ignores the value of its own 
life-supporting living systems. With global population predicted to grow to 9 
billion people, the next level of human development requires a transition of 
human society from being a "shallow society" focused only on human needs to 
what I call a "deep society". A deep society is a society that 
includes all living systems in its development.


In this view, a deep society is not only focused on needs of human beings and 
their organizations but its development models also include development of the 
entire gamut of life-supporting living systems. Such a society grows not by 
exploiting the resources of a living planet, but also it possesses the 
capability to nurture, grow and actively manage a “living planet” (and perhaps 
seed life on other planets as well). Human development in the future will 
require the creation of new capabilities to develop models leading to a deep 
society. The question then is- can we develop systems that will enable a 
fair-value reciprocity and exchange between living ecosystems and economic 
systems?

While the notion that economics does not adequately value natural systems has 
been highlighted by many researchers in the field of ecological economics, 
ideas on how natural systems can be understood, valued and integrated into 
economics have remained elusive. A multilevel view (like the one presented 
here) allows one to compare socio-economic organizations with natural 
organizations and could also provide new insights into how the dynamics of 
natural ecosystems could be synergised with economic systems.
The model presented in the kick-off session shows two levels of energetically 
and materially coupled exchange networks in ecosystems. At the first level of 
exchange networks geochemical molecules are organized into different 
autotrophic species, and modulated by Mycorrhiza (level 1). Different 
autotrophic species then become food for the different heterotrophic species 
hence giving rise to the next higher level of exchange networks in ecosystems, 
modulated by gut bacterial networks (Level 2). The question then is- how does 
nature organize to build-in synergies between these two levels?
At level 1, Mycorrhiza networks are known to modulate growth rates across 
different autotrophic species by providing phosphorous to different autotrophic 
species in quantitative exchange for carbohydrates. Autotrophic species (or 
groups of autotrophic species) that provide more carbohydrate hence get more 
phosphorous. Hence carbohydrates play a role in influencing phosphorous 
allocation across different autotrophic species connected to a Mycorrhiza 
network. At the next higher level in the exchange networks between different 
autotrophic species and different heterotrophic species gut bacteria use 
carbohydrates to modulate growth rates in heterotrophic species. Hence 
carbohydrates seem to play a role both in influencing dynamics in exchange 
networks at level 1, as well as in influencing dynamics in exchange networks at 
level 2.
Could such an organization where carbohydrates are a common influencing factor 
in exchanges at both levels serve to align both levels towards increasing 
overall carbohydrate production in ecosystems (hence increasing the overall 
primary production in ecosystems) by synergizing dynamics across both levels 
(and two different modulator networks)?
Could this two-level role of carbohydrates provide new insights on aligning the 
third level of exchange

[Fis] The Measurement Problem from the Perspective of an Information-Theoretic Interpretation of Quantum Mechanics

2015-11-26 Thread John Collier
A paper by my former graduate advisor, Jeff Bub, who was a student of David 
Bohm's.
http://www.mdpi.com/1099-4300/17/11/7374

The Measurement Problem from the Perspective of an Information-Theoretic 
Interpretation of Quantum Mechanics

The aim of this paper is to consider the consequences of an 
information-theoretic interpretation of quantum mechanics for the measurement 
problem. The motivating idea of the interpretation is that the relation between 
quantum mechanics and the structure of information is analogous to the relation 
between special relativity and the structure of space-time. Insofar as quantum 
mechanics deals with a class of probabilistic correlations that includes 
correlations structurally different from classical correlations, the theory is 
about the structure of information: the possibilities for representing, 
manipulating, and communicating information in a genuinely indeterministic 
quantum world in which measurement outcomes are intrinsically random are 
different than we thought. Part of the measurement problem is deflated as a 
pseudo-problem on this view, and the theory has the resources to deal with the 
remaining part, given certain idealizations in the treatment of macrosystems.

John Collier
Senior Research Associate and Professor Emeritus,
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier



[Fis] Short article by Chaitin on information incompleteness

2015-10-27 Thread John Collier
A particularly clear statement of basic results of incompleteness, randomness, 
creativity, and a proposal for application to metabiology.

http://inference-review.com/article/an-algorithmic-god

John Collier
Professor Emeritus, UKZN
http://web.ncf.ca/collier



Re: [Fis] Information Foundation of the Act--F.Flores L.deMarcos

2015-07-27 Thread John Collier
Dear folks,

I think that Koichiro is right. I would say more, though: that the loops just 
have to be non-reducible to look a lot like biological things. This is 
basically Robert Rosen's position. The sort of loops required aren't just 
iterations (that can be decomposed). Rather they are the sort of loop that 
logicians (and Rosen) call impredicative. Such loopy things have no computable 
model. As Rosen points out there are far more functions of this sort than 
merely iterative kind.

John

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Koichiro Matsuno
Sent: July 27, 2015 4:13 AM
To: 'Marcos Ortega Luis de'; 'fis'
Subject: Re: [Fis] Information Foundation of the Act--F.Flores  L.deMarcos

At 4:13 AM 07/27/2015, Luis de Marcos Ortega wrote:

a) cycles can imply infinite loops that in our opinion are not appropriate to 
model human actions
b) even considering cycles, a set of actions can still be modeled as a tree, 
so we consider that loops add unnecessary complexity to the model

Loops are clumsy, to be sure. Nonetheless, loops look indispensable in 
implementing the cohesion for making an organization. An organization 
maintaining itself through the exchange of component elements has recourse to 
the cohesion acting between the individual elements incumbent in the organized 
body and the de novo individuals to be recruited from nearby for replacemt. In 
fact, a loop can be the cohesive factor of a structural nature emerging from 
the participating individuals.

Koichiro





Re: [Fis] Answer to the comments made by Joseph

2015-07-27 Thread John Collier
Folks,

Doing dimensional analysis, entropy is heat difference divided by temperature. 
Heat is energy, and temperature is energy per degree of freedom. Dividing, we 
get units of inverse degrees of freedom. I submit that information has the same 
fundamental measure (this is a consequence of Scott Muller’s asymmetry 
principle of information). So fundamentally we are talking about the same basic 
thing with information and entropy.
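
In symbols, the quantities invoked here are, dimensionally (a minimal restatement of the bookkeeping, using only the textbook definitions already named in the paragraph):

    [S] = \frac{[\delta Q]}{[T]}, \qquad [\delta Q] = \text{energy}, \qquad [T] = \frac{\text{energy}}{\text{degree of freedom}}

so the energy dimensions cancel and only the degree-of-freedom bookkeeping survives in the units of S.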

I agree, though, that it is viewed from different perspectives and they have 
differing conventions for measurement.

I agree with Loet’s other points.

John

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Loet Leydesdorff
Sent: July 26, 2015 8:50 AM
To: 'Joseph Brenner'; 'Fernando Flores'; fis@listas.unizar.es
Subject: Re: [Fis] Answer to the comments made by Joseph

Dear Joe,


a) information is more than order; there is information in absence 
(Deacon), in disorder, in incoherence as well as coherence;

The absent options provide the redundancy; that is, the complement of the 
information to the maximal information [H(max)].

See also my recent communication (in Vienna) or at 
http://arxiv.org/abs/1507.05251


b) information is not the same as matter-energy, but it is inseparable from 
it and reflects its dualistic properties;

Information is dimensionless. It is coupled to the physics of matter-energy 
because S = k(B) * H.
k(B) provides the dimensionality (Joule/Kelvin) and thus the physics. In other 
domains of application (e.g., economics), this coupling [via k(B)] is not 
meaningful.
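
A minimal numerical sketch of that coupling (my own illustration; it assumes H is expressed in nats, so that S = k_B * H holds without an extra ln 2 factor):

    # Minimal sketch: coupling Shannon entropy H (in nats) to thermodynamic
    # entropy S via the Boltzmann constant, S = k_B * H.
    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K

    def shannon_entropy_nats(probs):
        # Shannon entropy of a discrete probability distribution, in nats
        return -sum(p * math.log(p) for p in probs if p > 0)

    H = shannon_entropy_nats([0.5, 0.5])   # a fair coin: ln 2 ~ 0.693 nats
    S = k_B * H                            # now carries physical units, J/K
    print(H, S)                            # ~0.693 nats, ~9.57e-24 J/K

Dropping k_B leaves the dimensionless H, which is the form that travels to other domains of application.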


c) information is both energy and a carrier of meaning, which is not, in my 
humble opinion, a hard physicalist approach;



Meaning provides more options to the information and thus increases the 
redundancy. In the case of reflexivity and further codification of meanings, 
the generation of redundancy can auto-catalytically be reinforced (Ulanowicz).



Best,

Loet


d) it remains to be shown that digitalism or computationalism is or could be 
the natural language for the description of the non-digital world, that is, of 
the complexity of the world that is of interest. Rafael Capurro has talked 
about the 'digital casting' of the world that we (or most of us) use in our 
daily lives, but this philosophical concept, with which I agree, is not a 
scientific description of the physics of informational processes as such. The 
best synthesis here of which I am aware is the Informational-Computationalism 
of Gordana Dodig-Crnkovic and even that is a framework, not an ontology.
e) it is possible to use probabilities to describe the evolution of real 
processes, as well as a mathematical language for describing acts;
f) your presentation of a parameter designated as 'freedom' is indeed original, 
but it is a classificatory system, based on bits. It will miss the 
non-algorithmic aspects of values. I am suspicious of things that have infinite 
levels and represent 'pure' anything;
g) I do not feel you have added value to human acts by designating them as 
∞-free. This may not be intended as doctrine but it looks like it.
h) your conclusions about informational value are correct from what I will call 
a hard neo-capitalist ;-) standpoint, but I am sure you agree there are other 
ones.

In trying to learn through association with this FIS group, I have come to 
believe that Informational Science is unique in that it can capture some of the 
complexity of nature, culture and society. It is not a 'hard simplification' as 
you suggest some sciences are.  The concept of (its) foundations is very broad, 
and it can and should include careful binary analyses such as the one you have 
made. However, I am pleading for a more directed positioning of your approach 
with respect to others. Is this an acceptable basis for you for continuing the 
debate?

Thank you again,

Joseph
- Original Message -
From: Fernando Flores <fernando.flo...@kultur.lu.se>
To: fis@listas.unizar.es
Sent: Thursday, July 23, 2015 3:58 PM
Subject: [Fis] Answer to the comments made by Joseph

Hello everybody:


I will answer to the comments made by Joseph and Luis will answer to the 
comments made by Moisés.

Dear Joseph:

Thank you for your comments. We are not sure about the usefulness of 
identifying “information” (order) with “mater”. In this sense we are very 
carefully to avoid any hard physicalist approach. In this sense we believe with 
Norbert Wiener:
The mechanical brain does not secrete thought “as the liver does bile”, as the 
earlier materialist claimed, nor does it put it out in the form of energy, as 
the muscle puts out its activity. Information is information, not matter nor 
energy. No materialism, which does not admit this, can survive at the present 
day.
An informational description of the world must stand as a new branch of science 
in which “digitalism” will be the natural language.  Of course as any other 
science, it is a simplification of the 

Re: [Fis] Answer to the comments made by Joseph

2015-07-27 Thread John Collier
Loet,

I think that is consistent with what I said. Different ways of measuring and 
perspectives. I prefer to see the unity that comes out of the dimensional 
analysis approach, but I was always taught that if you wanted to really 
understand something, absorb that first. But my background is in applied 
physics. Research, but on applied issues in business and government. The 
advantage is that you see through the basic physical values (or parameters in 
general), and then you can apply it to the results of measurements. Always 
worked for me. One tricky problem I solved was a model for how the values I was 
getting were possible. Turned out that not enough dimensions were being taken 
into consideration in the text book solutions. So relevant information was 
being ignored. It might seem that dimensionality is given for physics, but not 
when you use generalized coordinate systems. The Boltzmann equation doesn't 
hold very well in some cases like that - he explicitly assumes a 6N dimensional 
system in his derivations. Not always true.

I will shut up now. These are the first posts I have had in weeks.

John

From: l...@leydesdorff.net [mailto:leydesdo...@gmail.com] On Behalf Of Loet 
Leydesdorff
Sent: July 27, 2015 7:10 PM
To: John Collier; 'Joseph Brenner'; 'Fernando Flores'; fis@listas.unizar.es
Subject: RE: [Fis] Answer to the comments made by Joseph

Dear John and colleagues,

So fundamentally we are talking about the same basic thing with information and 
entropy.

The problem is fundamentally: the two are the same except for a constant. 
Most authors attribute the dimensionality to this constant (kB).

From the perspective of probability calculus, they are the same.

Best,
Loet



Re: [Fis] It-from-Bit and information interpretation of QM

2015-06-27 Thread John Collier
Sorry Loet, but I just don't see the need for an observer. I do think the 
difference must be by something to something (perhaps the same thing) but 
Koichiro's formulation implies this.  Again, I warn against unneeded 
complication.


Sent from Samsung Mobile


 Original message 
From: Loet Leydesdorff
Date:27/06/2015 10:00 (GMT+02:00)
To: 'Koichiro Matsuno' ,John Collier ,'fis'
Subject: RE: [Fis] It-from-Bit and information interpretation of QM

Koichiro: In order to make them decidable or meaningful, some qualifier must 
definitely be needed. A popular example of such a qualifier is a subjective 
observer.

A difference that makes a difference for a qualifier, thus requires 
specification of:

1.  The first difference;

2.  The second difference;

3.  The qualifier (e.g., the observer).

The first difference can be measured using Shannon-type information, since a 
probability distribution can be considered as a set of (first-order) 
differences. Brillouin tried to specify the second difference as a ΔH. ΔH can 
also be negative (negentropy). But how does one proceed to the measurement?
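
As a minimal sketch of one way to proceed (my own illustration, not Brillouin's procedure): take ΔH as the change in Shannon entropy between a distribution measured before and after, which can indeed come out negative:

    # Minimal sketch: Delta H as the change in Shannon entropy (in bits)
    # between a "before" and an "after" distribution; negative values = negentropy.
    import math

    def entropy_bits(probs):
        # Shannon entropy in bits of a discrete probability distribution
        return -sum(p * math.log2(p) for p in probs if p > 0)

    before = [0.5, 0.5]    # maximal uncertainty over two outcomes: 1.0 bit
    after = [0.9, 0.1]     # a sharper distribution: ~0.469 bits
    delta_H = entropy_bits(after) - entropy_bits(before)
    print(delta_H)         # ~ -0.531 bits: uncertainty has been reduced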

Best,
Loet


Loet Leydesdorff
Emeritus University of Amsterdam
Amsterdam School of Communication Research (ASCoR)
l...@leydesdorff.net ; http://www.leydesdorff.net/
Honorary Professor, SPRU (http://www.sussex.ac.uk/spru/), University of Sussex;
Guest Professor, Zhejiang Univ. (http://www.zju.edu.cn/english/), Hangzhou;
Visiting Professor, ISTIC (http://www.istic.ac.cn/Eng/brief_en.html), Beijing;
Visiting Professor, Birkbeck (http://www.bbk.ac.uk/), University of London;
http://scholar.google.com/citations?user=ych9gNYJhl=en

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Koichiro Matsuno
Sent: Saturday, June 27, 2015 9:04 AM
To: 'John Collier'; 'fis'
Subject: Re: [Fis] It-from-Bit and information interpretation of QM

At 4:00 AM 06/27/2015, John Collier wrote:

I also see no reason that Bateson's difference that makes a difference needs to 
involve meaning at either end.

[KM] Right.  The phrase saying a difference that makes a difference must be a 
prototypical example of second-order logic in that the difference appearing 
both in the subject and predicate can accept quantification. Most statements 
framed in second-order logic  are not decidable. In order to make them 
decidable or meaningful, some qualifier must definitely be needed. A popular 
example of such a qualifier is a subjective observer. However, the point is 
that the subjective observer is not limited to Alice or Bob in the QBist 
parlance.

   Koichiro





Re: [Fis] It-from-Bit and information interpretation of QM

2015-06-26 Thread John Collier
Dear folks,

I believe that information in itself must be interpreted, and is not, therefore 
intrinsically meaningful. The addition requires, I think, semiotics. Without 
that there are mere mechanical relations, and at best codes that translate one 
domain to another without understanding or integration required. I also see no 
reason that Bateson’s difference that makes a difference needs to involve 
meaning at either end. He did not add makes a difference “to something about 
something”. He just talked about making a difference. Best not to 
over-interpret.

I think that to ignore this distinction does a great disservice to information 
theory by glossing over a problem that any information processing system needs 
to deal with if it is to achieve meaning.

John

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Loet Leydesdorff
Sent: June 26, 2015 7:34 PM
To: 'Marcus Abundis'; 'fis'
Subject: Re: [Fis] It-from-Bit and information interpretation of QM

Dear Marcus and colleagues,

Katherine Hayles (1990, pp. 59f.) compared this discussion about the definition 
of “information” with asking whether a glass is half empty or half full. 
Shannon-type information is a measure of the variation or uncertainty, whereas 
Bateson’s “difference which makes a difference” presumes a system of reference 
for which the information can make a difference and thus be meaningful.

In my opinion, the advantage of measuring uncertainty in bits cannot be 
overestimated, since the operationalization and the measurement provide 
avenues to hypothesis testing and thus control of speculation (Theil, 1972). 
However, the semantic confusion can also be solved by using the words 
“uncertainty” or “probabilistic entropy” when Shannon-type information is meant.

I note that “a difference which makes a difference” cannot so easily be 
measured. ☺ I agree that it is more precise to speak of “meaningful 
information” in that case. The meaning has to be specified in the system of 
reference (e.g., physics and/or biology).

Best,
Loet


References:

Hayles, N. K. (1990). Chaos Bound; Orderly Disorder in Contemporary Literature 
and Science Ithaca, etc.: Cornell University.
Theil, H. (1972). Statistical Decomposition Analysis. Amsterdam/ London: 
North-Holland.


Loet Leydesdorff
Emeritus University of Amsterdam
Amsterdam School of Communication Research (ASCoR)
l...@leydesdorff.net ; http://www.leydesdorff.net/
Honorary Professor, SPRU (http://www.sussex.ac.uk/spru/), University of Sussex;
Guest Professor, Zhejiang Univ. (http://www.zju.edu.cn/english/), Hangzhou;
Visiting Professor, ISTIC (http://www.istic.ac.cn/Eng/brief_en.html), Beijing;
Visiting Professor, Birkbeck (http://www.bbk.ac.uk/), University of London;
http://scholar.google.com/citations?user=ych9gNYJhl=en

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Marcus Abundis
Sent: Friday, June 26, 2015 7:02 PM
To: fis@listas.unizar.esmailto:fis@listas.unizar.es
Subject: [Fis] It-from-Bit and information interpretation of QM

Dear Andrei,

I would ask for clarification on whether you speak of information in your 
examples as something that has innate meaning or something that is innately 
meaningless . . . which has been a core issue in earlier exchanges. If this 
issue of meaning versus meaningless in the use of the term information is 
not resolved (for the group?) it seems hard (to me) to have truly meaningful 
exchanges . . . without having to put a meaningful or meaningless qualifier 
in front of information every time it is used.

Thanks.



Marcus Abundis
about.me/marcus.abundis









Re: [Fis] Philosophy, Computing, and Information - apologies!

2015-06-13 Thread John Collier
Paul Davies believes in something like that. The other “it from bit”ers, no. So 
I don’t know why you say that, Krassimir. I took the structure below directly 
from uses that appear in scientific sources, not from some a priori 
consideration. Each nesting generates hypotheses that can be tested (and has). 
I find the unification, which involves similar methods at each nesting, 
attractive methodologically. Not everyone does. But I don’t think it is more 
than the sort of usual abductive inference that is common in science. The 
proof, of course, is in the productivity in producing testable and eventually 
tested hypotheses, not in any a priori belief.

John

From: Krassimir Markov [mailto:mar...@foibg.com]
Sent: June 12, 2015 11:19 PM
To: John Collier; Stanley N Salthe; fis
Subject: Re: [Fis] Philosophy, Computing, and Information - apologies!

Dear John and Stan,
Both your hierarchies are good only if you believe in God.
But this is belief, not science.
Sorry, nothing personal!
Friendly regards
Krassimir




From: John Collier <colli...@ukzn.ac.za>
Sent: Friday, June 12, 2015 5:02 PM
To: Stanley N Salthe <ssal...@binghamton.edu> ; fis <fis@listas.unizar.es>
Subject: Re: [Fis] Philosophy, Computing, and Information - apologies!

Not quite the same hierarchy, but similar:

[attached image: hierarchy diagram, not reproduced here]

It from bit is just information, which is fundamental, on Seth Lloyd’s 
computational view of nature. Paul Davies and some other physicists agree with 
this.
Chemical information is negentropic, and hierarchical in most physiological 
systems.

John

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Stanley N Salthe
Sent: Friday, June 12, 2015 3:40 PM
To: fis
Subject: Re: [Fis] Philosophy, Computing, and Information - apologies!

Pedro -- Your list:

physical, biological, social, and Informational

is implicitly a hierarchy -- in fact, a subsumptive hierarchy, with the 
physical subsuming the biological and the biological subsuming the social.  But 
where should information appear?  Following Wheeler, we should have:

{informational {physicochemical {biological {social}}}}

STAN

On Fri, Jun 12, 2015 at 5:34 AM, Pedro C. Marijuan 
<pcmarijuan.i...@aragon.es> wrote:
Thanks, Ken. I think your previous message and this one are drawing sort of the 
border-lines of the discussion. Achieving a comprehensive view on the 
interrelationship between computation and information is an essential matter. 
In my opinion, and following the Vienna discussions, whenever life cycles are 
involved and meaningfully touched, there is info; while the mere info 
circulation according to fixed rules and not impinging on any life-cycle 
relevant aspect, may be taken as computation. The distinction between both may 
help to consider more clearly the relationship between the four great domains 
of science: physical, biological, social, and Informational. If we adopt a 
pan-computationalist stance, the information turn of societies, of 
bioinformation, neuroinformation, etc. merely reduces to applying computer 
technologies. I think this would be a painful error, repeating the big mistake 
of the 60s-70s, when people band-wagoned to develop the sciences of the artificial 
and reduced the nascent info science to library science. People like Alex 
Pentland (his Social Physics, 2014) are again taking the wrong way... Anyhow, 
it was nicer talking face to face as we did in the past conference!

best ---Pedro

Ken Herold wrote:
FIS:

Sorry to have been too disruptive in my discussion-restarting post--I did not 
intend to substitute an alternative line of philosophy or computing for the 
Information Science thread.  The references I listed are indicative of some bad 
thinking as well as good ideas to reflect upon.  Our focus is information, and I 
would like to hear how you think the formal relational scheme of 
Rosenbloom could be helpful.

Ken

--
Ken Herold
Director, Library Information Systems
Hamilton College
198 College Hill Road
Clinton, NY 13323
315-859-4487
kher...@hamilton.edu


--
-
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Centro de Investigación Biomédica de Aragón (CIBA)
Avda. San Juan Bosco, 13, planta X
50009 Zaragoza, Spain
Tfno. +34 976 71 3526 ( 6818)
pcmarijuan.i...@aragon.es
http://sites.google.com/site/pedrocmarijuan/
-

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


___
Fis mailing list
Fis@listas.unizar.es
http

Re: [Fis] Philosophy, Computing, and Information - apologies!

2015-06-13 Thread John Collier
Dear Joseph, List,

I am running past my allotment, so I will shut up after this for a while. (I 
have to go to California for a workshop in any case, and won’t have much 
internet access for the two days I am traveling.)

The “it from bit” view was developed (after its origins for other reasons I 
will come to) partly to pose questions about black holes that cannot be posed 
in terms of energy. It also applies to any horizon, including event and 
particle horizons. Whatever the answer, it permits well-posed questions that 
could not previously be posed in other terms, at least so far.

The “it from bit” view is independent of, but strongly recommends a 
computational view. I have argued for a transfer of information view of 
causation on independent philosophical grounds as a development of Russell’s 
at-at view of causation. The two approaches converge nicely.

My understanding of the “it from bit” view does not require a binary logic of 
causation, but emergence of information comes from bifurcations (Layzer, 
Frautschi, Collier, among others). So that is another happy convergence of two 
approaches. I see no reason why trifurcations and other higher order splits 
might not be possible, if unlikely. This is an empirical question, but makes no 
difference to the underlying mathematics, which takes base 2 logarithms by 
convention, for convenience. I don’t see this issue as empirical in itself, but 
the convenience has some empirical force.
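
To make the convention point concrete (a standard fact about entropy measures, not
a claim from any of the papers cited): changing the base of the logarithm only
rescales every information quantity by a constant,

    H_b(X) = -\sum_i p_i \log_b p_i = H_2(X) / \log_2 b ,

so a genuine trifurcation, counted as a single three-way choice, carries
\log_2 3 \approx 1.585 bits rather than 1 bit, but the bookkeeping is otherwise
unchanged.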

The stronger “it from bit” view that applies to everything was due originally 
to Wheeler, not any of the physicists mentioned so far, and supported by 
Gell-Mann. Their reason is that empirical values in quantum mechanics often 
have been shown to arise from asymmetries, and they assume this will continue 
(proton spin is one notable current problem, but the problem is being pursued 
by this method, to the best of my understanding). My former student Scott 
Muller was able to show that asymmetries in a system assign a unique 
information content in the it from bit sense. In any case, the view has an 
empirical motivation, and has produced empirically satisfying results, if not 
universally so far.

With all due respect, Joseph, the scientists I have mentioned have been 
motivated by empirical issues (problems), not dogma, but you are not working on 
empirical problems. I have argued that the approach is motivated primarily by 
empirical issues, and it is simply wrong to attribute it to “authority”, since 
anyone in principle has access to the empirical issues and can make their own 
proposals. I have not seen these forthcoming for the issues involved.

I will shut up now.

Regards,
John

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Joseph Brenner
Sent: June 13, 2015 10:16 AM
To: fis
Subject: [Fis] Philosophy, Computing, and Information - apologies!


- Original Message -
From: Joseph Brenner <joe.bren...@bluewin.ch>
To: fis <fis@listas.unizar.es>
Sent: Saturday, June 13, 2015 10:13 AM
Subject: Fw: [Fis] Philosophy, Computing, and Information - apologies!

Dear Colleagues,

I completely agree with Krassimir's position and on the importance of the issue 
on which it is taken. Neither he nor I wish to say that there cannot be models and 
insights for science in religious beliefs, such as the Kabbala, but then John's 
diagram would be more appropriate if it had En Sof at the center rather than 
It-from-Bit.

The statement "It from bit is just information", further, requires analysis: do 
we 1) accept this as dogma, including the implied limitation of information to 
separable binary entities? or 2) assume that the universe is constituted by 
complex informational processes, in which case the term 'It-from-Bit' is misleading 
at best and should be avoided?

I feel particularly uncomfortable when dogmatic computational views such as 
those of Lloyd and Davies are presented as authoritative without comment, 
except by appeal to the authority of 'some physicists'. Those FISers who would 
like to see a reasonably considered rebuttal might look at my article in 
Information: The Logic of the Physics of Information.

Best wishes,

Joseph


- Original Message -
From: Krassimir Markov <mar...@foibg.com>
To: John Collier <colli...@ukzn.ac.za>; Stanley N
Salthe <ssal...@binghamton.edu>; fis <fis@listas.unizar.es>
Sent: Friday, June 12, 2015 11:18 PM
Subject: Re: [Fis] Philosophy, Computing, and Information - apologies!

Dear John and Stan,
Your two hierarchies are good only if you believe in God.
But this is belief, not science.
Sorry, nothing personal!
Friendly regards
Krassimir




From: John Collier <colli...@ukzn.ac.za>
Sent: Friday, June 12, 2015 5:02 PM
To: Stanley N Salthe <ssal...@binghamton.edu>;
fis <fis@listas.unizar.es>
Subject: Re: [Fis] Philosophy, Computing, and Information - apologies!

Not quite the same hierarchy, but similar:

[embedded image: hierarchy diagram]

It from bit is just information, 

Re: [Fis] Philosophy, Computing, and Information - apologies!

2015-06-12 Thread John Collier
Not quite the same hierarchy, but similar:

[embedded image: hierarchy diagram]

It from bit is just information, which is fundamental, on Seth Lloyd’s 
computational view of nature. Paul Davies and some other physicists agree with 
this.
Chemical information is negentropic, and hierarchical in most physiological 
systems.

John

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Stanley N Salthe
Sent: Friday, June 12, 2015 3:40 PM
To: fis
Subject: Re: [Fis] Philosophy, Computing, and Information - apologies!

Pedro -- Your list:

 physical, biological, social, and Informational

is implicitly a hierarchy -- in fact, a subsumptive hierarchy, with the 
physical subsuming the biological and the biological subsuming the social.  But 
where should information appear?  Following Wheeler, we should have:

{informational {physicochemical {biological {social}}}}

STAN

On Fri, Jun 12, 2015 at 5:34 AM, Pedro C. Marijuan
<pcmarijuan.i...@aragon.es> wrote:
Thanks, Ken. I think your previous message and this one are drawing sort of the 
border-lines of the discussion. Achieving a comprehensive view of the 
interrelationship between computation and information is an essential matter. 
In my opinion, and following the Vienna discussions, whenever life cycles are 
involved and meaningfully affected, there is information; whereas the mere 
circulation of information according to fixed rules, not impinging on any 
life-cycle-relevant aspect, may be taken as computation. The distinction between 
the two may help to consider more clearly the relationship between the four great 
domains of science: physical, biological, social, and Informational. If we adopt a 
pan-computationalist stance, the information turn of societies, of 
bioinformation, neuroinformation, etc. merely reduces to applying computer 
technologies. I think this would be a painful error, repeating the big mistake 
of the 60s-70s, when people jumped on the bandwagon of developing the sciences of 
the artificial and reduced the nascent info science to library science. People 
like Alex Pentland (his 'social physics', 2014) are again heading the wrong 
way... Anyhow, it was nicer talking face to face as we did in the past conference!

best ---Pedro

Ken Herold wrote:
FIS:

Sorry to have been too disruptive in my discussion-restarting post--I did not 
intend to substitute an alternative line of philosophy or computing for the 
Information Science thread.  The references I listed are indicative of some bad 
thinking as well as good ideas to reflect upon.  Our focus is information, and I 
would like to hear how you think the formal relational scheme of 
Rosenbloom could be helpful.

Ken

--
Ken Herold
Director, Library Information Systems
Hamilton College
198 College Hill Road
Clinton, NY 13323
315-859-4487
kher...@hamilton.edu


--
-
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Centro de Investigación Biomédica de Aragón (CIBA)
Avda. San Juan Bosco, 13, planta X
50009 Zaragoza, Spain
Tfno. +34 976 71 3526 ( 6818)
pcmarijuan.i...@aragon.es
http://sites.google.com/site/pedrocmarijuan/
-

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] It From Bit video

2015-05-29 Thread John Collier
I certainly agree, Marcus. One would do much better spending the time reading 
Seth Lloyd's book, Programming the Universe: A Quantum Computer Scientist Takes 
On the Cosmos (https://www.wikiwand.com/en/Programming_the_Universe), Knopf 
(https://www.wikiwand.com/en/Alfred_A._Knopf), March 14, 2006, 240 p., 
ISBN 1-4000-4092-2 (https://www.wikiwand.com/en/Special:BookSources/1400040922). 
He does not get to meaning, though. Some people have tried to infer that physics is 
based on information, information involves meaning, so physics involves 
meaning. It usually (but not always) ends up with "The universe is meaningful" 
or something like that, or stronger (intelligent design of some sort). A book 
that certainly does not do this is David Layzer's Cosmogenesis, which does work 
its way through to meaning, though meaning is a late arrival there. The book that 
most takes the view of a meaningful universe as a consequence of information being 
fundamental is Paul Davies and Niels Henrik Gregersen (Editors), Information and 
the Nature of Reality: From Physics to Metaphysics, Cambridge University Press, 
2010, 398 pp., $30.00, ISBN: 9780521762250. The last chapters try to show this 
supports particularly Christian beliefs.

John

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Marcus Abundis
Sent: May 28, 2015 3:10 AM
To: fis@listas.unizar.es
Subject: Re: [Fis] It From Bit video

While the interviews on the video are interesting, in general, I also find them 
a bit annoying. I never hear information actually described in a specific 
way. They could as easily be discussing raw data as far as I can tell. For 
example, when is meaning associated with information (or data) and how does 
that meaning arise, who/what is ascribing meaning, etc..? The interview could 
have gone much further, and did not seem particularly well thought out in 
advance. Without a clear sense of how information is being used here, 
subsequent thoughts would seem to be equally unclear or confused (to my mind).
___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] It From Bit video

2015-05-27 Thread John Collier
That is most interesting, Francesco. It agrees with my understanding, but there 
are people reluctant to call it 'information'. I don't know what else to call 
it.
Cheers,
John

From: Francesco Rizzo [mailto:13francesco.ri...@gmail.com]
Sent: May 27, 2015 8:27 AM
To: John Collier
Cc: Srinandan Dasmahapatra; u...@umces.edu; fis
Subject: Re: [Fis] It From Bit video

Dear John and dear colleagues,
In 1975 Stephen Hawking held that black holes swallow up everything that comes 
near them, within a region called the event horizon. Even then it became evident 
that this property leads to a paradox: if black holes swallow everything, then 
they should also swallow and destroy information, losing every trace of what they 
ingest. According to quantum mechanics, however, the information contained in 
matter cannot be entirely lost. About thirty years later Hawking stated that he 
had been wrong about black holes. Revising his theory, he maintains that black 
holes do not merely lose mass through a radiation of energy, but evaporate, or 
release, information. They con-tain information about the matter of which they 
are made that makes it possible to pre-dict their future. In this way black holes 
do not evaporate or radiate an invisible or enigmatic energy devoid of 
information, as if they were elusive and indecipherable cosmic entities, and they 
do not escape (my) super-law of the creative (even if sometimes astonishing) 
combination of energy and in-formation. Black holes can therefore be regarded as 
special black boxes, or as magical, productive and prospective processes of 
trans-in-formation (whose inputs and outputs are matter, energy and information).
This means that, as an economist, I have:
-elaborated a law that also holds for astronomy and the whole of physics;
-anticipated by about twenty years what Hawking found in 1998 (Gravitational 
entropy) and in 2005 (Information loss in black holes, Physical Review D 72).
Hence INSIDE black holes there would be less entropy (or more neg-entropy) than 
the greater entropy (or lesser neg-entropy) OUTSIDE. The formation of greater 
OUTSIDE entropy (corresponding to less information) would necessarily have to be 
balanced by a greater INSIDE information (corresponding to less entropy). On the 
basis of this reasoning, or balancing -- consistent with the logic of the New 
Economy -- black holes should produce and emit net information, just like any 
productive process. This OUTSIDE-INSIDE asymmetry makes a difference that is 
precisely information. I have devoted more than a few essays to the creative 
capacity of asymmetry in any process of scientific advance (see above all 
Incontro d'amore tra il cuore della fede e l'intelligenza della scienza, Aracne, 
Roma, 2014).
What I have described schematically and synthetically -- for which I apologize -- 
de-monstrates the admirable and marvelous harmony that governs the world.
Thank you.
Francesco Rizzo.


2015-05-26 23:19 GMT+02:00 John Collier
<colli...@ukzn.ac.za>:
Dear Srinandan,

The relation of geometry to information theory (and also of particle theory in 
the Standard Theory) is by way of group theory. Groups describe symmetries, 
which are reversible. What is left over are the asymmetries, which are the 
differences that can be identified as information. This is worked out in some 
detail by my former student, Scott Muller, in Asymmetry: The Foundation of 
Information. Springer: Berlin. 2007. Seth Lloyd relates the information concept 
to quantum mechanics via group theory and other means in his Programming the 
Universe: A Quantum Computer Scientist Takes on the Cosmos. More direct 
connections can be made via the entropy concept where the information is the 
difference between the entropy of a system and its entropy with all internal 
constraints relaxed, but it comes to the same thing in the end. There are 
several convergent ways to relate information to form, then, in contemporary 
physics. But basically it is in the asymmetries.

As far as the relation between the asymmetries and symmetries go, I think this 
is still a bit open, since the symmetries represent the laws. Some physicists 
like Paul Davies talk as if the symmetries add nothing once you have all the 
asymmetries, so the laws are a result of information as well. I don’t see 
through this adequately myself as yet, though.

John


From: Srinandan Dasmahapatra 
[mailto:s...@ecs.soton.ac.uk]
Sent: May 26, 2015 10:20 PM
To: u...@umces.edu; John Collier

Cc: fis
Subject: Re: [Fis] It From Bit video

Re: boundary conditions, etc.

I struggle to understand many/most of the posts on this list, and the 
references to boundary conditions, geometry and information leave me quite 
befuddled as well. Is it being claimed

Re: [Fis] It From Bit video

2015-05-26 Thread John Collier
Dear Srinandan,

The relation of geometry to information theory (and also of particle theory in 
the Standard Theory) is by way of group theory. Groups describe symmetries, 
which are reversible. What is left over are the asymmetries, which are the 
differences that can be identified as information. This is worked out in some 
detail by my former student, Scott Muller, in Asymmetry: The Foundation of 
Information. Springer: Berlin. 2007. Seth Lloyd relates the information concept 
to quantum mechanics via group theory and other means in his Programming the 
Universe: A Quantum Computer Scientist Takes on the Cosmos. More direct 
connections can be made via the entropy concept where the information is the 
difference between the entropy of a system and its entropy with all internal 
constraints relaxed, but it comes to the same thing in the end. There are 
several convergent ways to relate information to form, then, in contemporary 
physics. But basically it is in the asymmetries.
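
A minimal way to write the entropy route just mentioned (my gloss in standard
notation, not a formula quoted from Muller or Lloyd): if S is the entropy of the
system as it actually is, and S_max is its entropy with all internal constraints
relaxed, then the information carried by the system is

    I = S_max - S ,

and it is precisely the asymmetries -- the broken symmetries left over from the
group-theoretic description -- that hold S below S_max.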

As far as the relation between the asymmetries and symmetries go, I think this 
is still a bit open, since the symmetries represent the laws. Some physicists 
like Paul Davies talk as if the symmetries add nothing once you have all the 
asymmetries, so the laws are a result of information as well. I don’t see 
through this adequately myself as yet, though.

John


From: Srinandan Dasmahapatra [mailto:s...@ecs.soton.ac.uk]
Sent: May 26, 2015 10:20 PM
To: u...@umces.edu; John Collier
Cc: fis
Subject: Re: [Fis] It From Bit video

Re: boundary conditions, etc.

I struggle to understand many/most of the posts on this list, and the 
references to boundary conditions, geometry and information leave me quite 
befuddled as well. Is it being claimed that geometry is the same as information? 
That the requirement of predictions makes the focus on physical laws irrelevant 
unless the boundary conditions are specified? Or even that the continuum is at 
odds with the speed of light, considering that classical electromagnetism is a 
well-defined continuum field theory? As for galactic distances, the only 
scientific basis upon which we conceive of the large-scale structure of the 
universe is the field equations of gravity, which bring a coherent package 
of causal thinking built into them. I did understand the bit on Noether, as 
energy conservation is indeed a consequence of time-translation invariance, but 
that comes embedded in a continuum description, typically.
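
In textbook form: for a Lagrangian L(q, \dot{q}) with no explicit time dependence,
the Hamiltonian H = \sum_i p_i \dot{q}_i - L satisfies dH/dt = -\partial L/\partial t = 0
along solutions, so time-translation invariance yields energy conservation -- and,
as noted, the derivation presupposes a differentiable (continuum) description.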

In biological systems, energy input makes the picture specific to the system 
one cordons off for study, and often it is hard to adequately describe 
phenomena by scalar potentials alone due to the currents in the system. And 
Noether cannot deliver reversibility.

The message of Sean Carroll in the YouTube video -- that an equivalent 
redescription of physics (or biology) in terms of information is not enough -- 
strikes me as sane.

Cheers,
Srinandan


 Original message 
From: Robert E. Ulanowicz
Date:26/05/2015 16:16 (GMT+00:00)
To: John Collier
Cc: fis
Subject: Re: [Fis] It From Bit video

I would like to strongly reinforce John's comments about boundary
conditions. We tend to obsess over the laws and ignore the boundary
statements. (Sort of a shell game, IMHO.) If boundary conditions cannot be
stated in closed form, the physical problem remains indeterminate! (The
aphorism from computer science, Garbage in, garbage out! is appropriate
to reversible laws as well.)

Then there is the issue of the continuum assumption, which was the work of
Euler and Leibniz, not Newton. Newton argued vociferously against it,
because it equated cause with effect. The assumption works quite well,
however, whenever cause and effect are almost simultaneous, as with a
force impacting an object, where the force is transmitted over small
distances at the speed of light. It doesn't work as well when large
velocities are at play (relativity) or very small distances and times
(quantum phenomena) -- whence the need arose to develop the exceptional
sciences, thermodynamics, relativity and quantum physics.

I would suggest it doesn't work well at very large distances, either.
Consider galaxies, which are on the order of 100,000 or more light years
in diameter. (I was surprised to learn recently that we really don't have
decent models for the dynamics of galaxies.) Gravitational effects are
relatively slow to traverse those distances, so that cause and effect are
not immediate. (Sorry, I don't think quantum entanglement is going to
solve this conundrum.) If cause and effect are widely separated, then the
continuum assumption becomes questionable and by implication,
reversibility as well. Now Noether demonstrated that reversibility and
conservation are two sides of the same coin. So I see it as no great
mystery that we encounter problems with conservation of matter and energy
at galactic scales or higher -- witness dark matter and dark energy.

Of course, I am neither a particle physicist nor

Re: [Fis] RV: THE FOURTH GREAT DOMAIN OF SCIENCE: INFORMATIONAL? (R.Capurro)

2015-05-19 Thread John Collier
Rafael, Joseph, list members,

That is an interesting way of putting it, but I think the answer is yes. C.S. 
Peirce's pragmaticism is aimed at doing exactly that. Mathematical structures 
and other structural models have no implication of reality in the sense that 
reality is contingent, so we need a way to test applications. For Peirce, this 
is against our expectations of reality, which give meaning to the models in 
particular applications (pragmatic maxim).

This goes some way to responding to Joseph, who says:
When John C. talks about references crossing ecology, management and political 
science, what is of interest to me and perhaps others is the 'substance' so to 
speak of the crossing. To make things difficult (rather than easy for a 
change), let us assume that this substance includes, but is not limited to 
common assumptions and common attitudes. (My informational exchanges today are 
more interdisciplinary because I am paying more attention to the way in which 
information is processed in the different disciplines.)

Peirce's maxim goes a long way towards getting at the substance (you don't need 
his categories to apply his pragmatic maxim), and should be sufficient, but I 
would agree that it would be easier if there are shared presuppositions, domain 
specific (or not so domain specific) paradigms in Kuhn's sense. Because we 
can't fully express our presuppositions (Polanyi, Quine, Wittgenstein, Barwise 
and Perry), our ideas can never be made fully clear without losing all but 
tautological sense. So common ground is not always easy to find, 
and it requires a fair degree of cooperation and willingness to compromise, 
especially on what seem to be certainties.

Joseph also says:
The task then becomes to express the 'substance' in informational terms. What 
informational terms are possible that are not numbers or ad hoc Peircean 
categories? The first thing I see is that the corresponding logic and category 
theory must be non-standard or it will miss the interactions and overlaps 
between disciplines. The next thing might be to change to a process 
perspective, looking at the way in which the disciplines, considered as 
informational entities, influence one another, and find some formal but 
non-mathematical language for referring to this. Are there any suggestions for 
such a language?

I think that nonstandard here requires at least that noncomputability is 
allowed. I have written about this in my discussion of an informational view 
of causal connection (or transfer of causation - a version of Russell's 'at-at' 
approach) in "Information, causation and computation" 
(http://web.ncf.ca/collier/papers/CollierJohn%20formatted.pdf), in Information 
and Computation: Essays on Scientific and Philosophical Understanding of 
Foundations of Information and Computation 
(http://astore.amazon.co.uk/books-books-21/detail/9814295477), ed. Gordana Dodig 
Crnkovic and Mark Burgin, World Scientific, 2012. It probably requires more as 
well, depending on what we mean by 'nonstandard'. I think of nonstandard 
analysis as an example, but perhaps Joseph has more in mind, or something 
different.

Cheers,
John


From: Rafael Capurro [mailto:raf...@capurro.de]
Sent: May 19, 2015 3:15 AM
To: John Collier; Joseph Brenner; PEDRO CLEMENTE MARIJUAN FERNANDEZ; 
fis@listas.unizar.es
Subject: Re: [Fis] RV: THE FOURTH GREAT DOMAIN OF SCIENCE: INFORMATIONAL? 
(R.Capurro)

then the problem is, how can a 'realist' detach theoretical problems from the 
real problems of the real world.
Rafael
An earlier version was blocked due to the large set of earlier messages. 
Usually I delete them if they are not relevant. I have done that this time.

Cheers,
John

Dear fis list,
List,

Popper is famous for his Three Worlds model, in which ideas sit out there in 
their own world (the others are material and mental, roughly). The problems 
approach, I think, is directed at this world. However I think that systems 
theorists should agree at least that there are general problems that involve 
many different disciplines (Rosen sometimes calls them metaphors, but he means 
mathematical or structural formalisms that have wide generality). By solving 
some of these general problems we can facilitate the generation of solutions to 
more specific problems, both theoretical and practical. That is what systems 
theory is about.

Popper considered himself a realist, but thought that the object of theory 
(problem solutions) was verisimilitude. Exactly what that means is still a 
matter of debate.

I agree with Joseph about the usefulness of the bibliometric work. I found it 
interesting, working in ecology right now, that despite many ecologists 
accepting that there is a socio-ecological system that requires study to solve 
ecological problems, there were few if any references crossing ecology, 
management, and political science. That reflects my reading in the fields.

John



From: Fis [mailto:fis-boun...@listas.unizar.es

Re: [Fis] RV: THE FOURTH GREAT DOMAIN OF SCIENCE: INFORMATIONAL? (R.Capurro)

2015-05-18 Thread John Collier
An earlier version was blocked due to the large set of earlier messages. 
Usually I delete them if they are not relevant. I have done that this time.

Cheers,
John

Dear fis list,
List,

Popper is famous for his Three Worlds model, in which ideas sit out there in 
their own world (the others are material and mental, roughly). The problems 
approach, I think, is directed at this world. However I think that systems 
theorists should agree at least that there are general problems that involve 
many different disciplines (Rosen sometimes calls them metaphors, but he means 
mathematical or structural formalisms that have wide generality). By solving 
some of these general problems we can facilitate the generation of solutions to 
more specific problems, both theoretical and practical. That is what systems 
theory is about.

Popper considered himself a realist, but thought that the object of theory 
(problem solutions) was verisimilitude. Exactly what that means is still a 
matter of debate.

I agree with Joseph about the usefulness of the bibliometric work. I found it 
interesting, working in ecology right now, that despite many ecologists 
accepting that there is a socio-ecological system that requires study to solve 
ecological problems, there were few if any references crossing ecology, 
management, and political science. That reflects my reading in the fields.

John



From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Joseph Brenner
Sent: May 17, 2015 11:14 AM
To: PEDRO CLEMENTE MARIJUAN FERNANDEZ; fis@listas.unizar.es
Subject: Re: [Fis] RV: THE FOURTH GREAT DOMAIN OF SCIENCE: INFORMATIONAL? 
(R.Capurro)

Dear All,

I agree with Rafael that there is an anti-realist flavor to Popper's concept of 
problems. However, it indicates to me an intuition that there is something 
important going on between disciplines. This is a dynamic aspect which I feel 
is not captured by diagrams such as Loet's :-) in which the connections between 
disciplines are represented by sets of lines.

I would not be as hard as Dino on bibliometrics as such, but I think that once 
classifications and maps have been established, it is important to talk about 
where to go next.

Best wishes,

Joseph
- Original Message -
From: PEDRO CLEMENTE MARIJUAN FERNANDEZ <pcmarijuan.i...@aragon.es>
To: fis@listas.unizar.es
Sent: Sunday, May 17, 2015 1:17 PM
Subject: [Fis] RV: THE FOURTH GREAT DOMAIN OF SCIENCE: INFORMATIONAL? 
(R.Capurro)



From: Rafael Capurro [raf...@capurro.de]
Sent: Saturday, 16 May 2015 9:34
To: PEDRO CLEMENTE MARIJUAN FERNANDEZ
Subject: Re: [Fis] THE FOURTH GREAT DOMAIN OF SCIENCE: INFORMATIONAL?
Karl Popper once suggested (Conjectures and Refutations, p. 67) that we should 
not think in terms of subject matter(s) or disciplines but in terms of 
problems. Problems do not arise within a fixed definition of a discipline 
(essentialism) but within a tradition where a theory is being discussed. In 
this sense, theories are in some sense disciplines, or disciplines can be 
conceived as loose clusters of theories. But Popper speaks about a world of 
problems in themselves, which is a kind of Platonism, not only because it 
separates such problems in themselves from their connection to the world _as_ 
perceived (i.e. interpreted) by humans, but also because it creates a knowledge 
hierarchy by giving theoretical knowledge a higher status than practical 
knowledge. Thirty years ago (sic) I wrote some thoughts on this issue. See: 
http://www.capurro.de/trita.htm

Rafael

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] THE FRONTIERS OF INTELLIGENCE SCIENCE--Zhao Chuan

2015-03-18 Thread John Collier
List,

I find that it works well to use Google Translate. It is hardly perfect, but 
much better than Bing, which gives laughable translations. I have used it here 
in Brazil on both my computer and cell phone, as well as having my bank use it 
when there were communications problems. Here is the translation I got this 
time:

Dear Yixin Zhong and Dear All,
I'm sorry that my words are not understood. On the other hand I do not want to 
miss out on anyone. Who can understand it is free to do or not to use as I 
want. The world turns the same, including the field of intelligence, regardless 
of my words. Anyway, thank you and best wishes for a well-deserved success.
Francesco Rizzo.

Best,
John


From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Francesco Rizzo
Sent: March 18, 2015 7:21 AM
To: 钟义信
Cc: fis
Subject: Re: [Fis] THE FRONTIERS OF INTELLIGENCE SCIENCE--Zhao Chuan

Caro Yixin Zhong e Cari Tutti,
mi dispiace che le mie parole non siano capite. D'altra parte non voglio 
mancare di riguardo a nessuno. Chi le può comprendere è libero di farne o non 
farne l'uso che vuole. Il mondo gira lo stesso, compreso il campo 
dell'intelligenza, a prescindere dalle mie parole. Comunque, grazie e auguri di 
un meritato successo.
Francesco Rizzo.

2015-03-15 12:12 GMT+01:00 钟义信 <z...@bupt.edu.cn>:

Dear Francesco,



Thank you for your e-mail.

I am sorry not to give you a reply because I am unable to understand your 
language.



Best regards,



Yixin ZHONG



- Reply message -
From: Francesco Rizzo <13francesco.ri...@gmail.com>
To: 钟义信 <z...@bupt.edu.cn>
Cc: John Prpic <pr...@sfu.ca>, fis <fis@listas.unizar.es>
Date: 2015-03-15 18:01:07
Subject: Re: [Fis] THE FRONTIERS OF INTELLIGENCE SCIENCE--Zhao Chuan



Dear All,
following, as far as I can understand it, the discussion that has flared up about 
the intelligence of science, or the science of intelligence, I like to recall that 
the concept of chaos shows its importance when it guides researchers to create 
new ideas. Chaotic systems are creative. Without this creativity the legislation 
of our intellect could not confer form (trans-in-form) on, and give meaning to, 
the otherwise disconnected data of experience. Intellectual transcendences and 
empirical intuitions serve to build the concordance, or connection, between the 
laws of the brain and the laws of nature or of society, which inter-penetrate, 
exalt and ennoble one another.
Grateful greetings and best wishes.
Francesco Rizzo.

2015-03-12 10:57 GMT+01:00 钟义信 <z...@bupt.edu.cn>:



Dear John,



Thank you very much for the comments you made, which are very useful for me to 
think about.



May I just say a few words as my simple responses to the two points you wrote 
in your mail.



-- To my understanding, context and goals, among others, are necessary 
elements for an intelligence science system. Otherwise it would be unable to 
know where to go, what to do and how to do it. In the latter case, it cannot be 
regarded as an intelligent system.



--  As an intelligent system, it would usually be self-organized under certain 
conditions. This means that the system has clear goal(s), is able to acquire 
information about the changes in the environment, and is able to learn the strategy 
for adjusting the structures of the system so as to adapt the system to the 
changed environment. This is the capability of self-organizing. If the change 
of the environment is sufficiently complex and the system is able to adapt 
itself to the change, then the system can be said to be a complex system.



Do you think so? Or you have different understanding?





Best regards,





Yixin ZHONG





- Reply message -
From: John Prpic <pr...@sfu.ca>
To: 钟义信 <z...@bupt.edu.cn>
Cc: fis <fis@listas.unizar.es>
Date: 2015-03-12 11:43:09
Subject: Re: [Fis] THE FRONTIERS OF INTELLIGENCE SCIENCE--Zhao Chuan



Dear Professor Zhong & Colleagues,

Unsurprisingly, some very rich food for thought in the FIS group so far this 
year!
Here's a few comments that I hope are useful in some respect:

- As I think about the idea of intelligence science as put forward, would it be 
useful to say that context and goals (as constructs) would always be 
antecedents to intelligence science outcomes?
Said another way, must intelligence science systems always include these two 
elements (among others) in a particular system configuration?

- Also, when I look at the list of elementary abilities of intelligence 
science (ie A-M), it strikes me that more than a few of them can currently be 
considered to be core knowledge management techniques (storing, retrieving, 
transferring, transforming of information etc)... therefore, is there a 
difference between intelligence science in systems that are self-organized (ie 
complexity science), compared to intelligence 

Re: [Fis] The Travelers

2014-10-30 Thread John Collier





John Collier
colli...@ukzn.ac.za
Philosophy, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292  F: +27 (31) 260 3031
Http://web.ncf.ca/collier



___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] The Travellers

2014-10-30 Thread John Collier
 regards
Krassimir




-Original Message- 
From: Pedro C. Marijuan 
Sent: Wednesday, October 29, 2014 3:45 PM 
To: fis@listas.unizar.es 
Subject: Re: [Fis] The Travellers 

Dear FIS colleagues,

Quite interesting exchanges, really. The discussion reminds me of the times
when behaviorism and ethology were at odds on how to focus the study of
human/animal behavior. (Maybe I already talked about that some months ago.)
On the one side, a rigorous theory and a strongly reductionist point of view
were advanced --about learning, conditioned & unconditioned stimuli, responses,
observation standards, laboratory-exclusive scenarios, etc. On the other side,
it was observing behavior in nature, approaching without preconceptions and
tentatively characterizing the situations and results; it was the naturalistic
strategy, apprehending from nature before forming any theoretical scheme
(of course, later on Tinbergen, Lorenz, Eibl-Eibesfeldt, etc. were to
develop ad hoc theoretical schemes).

How can we develop a theory on signals without the previous naturalistic
approach to the involved phenomena? Particularly when the panorama has
dramatically changed after the information-biomolecular revolution. We
have a rich background of cellular signaling systems, both prokaryotic
and eukaryotic, to explore and cohere. We have important neuroscientific
ideas (although not so well developed). We have social physics and
social network approaches to the social dynamics of information. We
should travel to all of those camps, not to stay there, but to advance a
soft all-encompassing perspective, later on to be confronted with the
new ideas from physics too. The intertwining between self-production and
communication is a promising general aspect to explore, in my opinion...
socially and biologically it makes a lot of sense.

Semiotics could be OK for the previous generation--something attuned to
our scientific times is needed now.

best ---Pedro

-
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Centro de Investigación Biomédica de Aragón (CIBA)
Avda. San Juan Bosco, 13, planta X
50009 Zaragoza, Spain
Tfno. +34 976 71 3526 ( 6818)
pcmarijuan.i...@aragon.es

http://sites.google.com/site/pedrocmarijuan/
-

___
Fis mailing list
Fis@listas.unizar.es

http://listas.unizar.es/cgi-bin/mailman/listinfo/fis





John Collier
colli...@ukzn.ac.za
Philosophy, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292  F: +27 (31) 260 3031
Http://web.ncf.ca/collier



___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] The Travellers

2014-10-27 Thread John Collier

Folks,

I agree with Pedro that the meaning issue is 
important. After trying to give a coherent 
account within established information theory for 
a number of years (starting with Intrinsic 
Information in 1990) I came to the conclusion 
that information theory was not enough, and 
admitted that at the Biosemiotics Gathering in 
Tartu about ten years ago. I now believe that 
semiotics is the way to go to understand meaning, 
and that information theory alone is inadequate to the task.


Of course information theory could be extended, 
but I think the correct extension is semiotics. 
As Pedro said, we have not got agreement in many 
years. I think it is time to give it up and move 
into semiotics if we want to fully understand 
information. In direct opposition to Pedro's 
appeal to the Travellers metaphor, I think that 
history has shown that semiotics is distinct from 
information theory, and that information theory 
should restrict itself to the grounds that it has 
already accomplished. Oddly, Pedro seems to be 
saying that information theory includes meaning 
in exactly the opposite way to the way that 
gypsies do not historically include Travellers. So I don't get his argument.


I believe that without an explicit theory of 
signs, we cannot hope to get a theory of meaning 
from the idea of information alone. I would not 
be upset if I were proven wrong.


My best,
John

At 02:35 PM 2014-10-23, Pedro C. Marijuan wrote:

Dear FIS colleagues,

Regarding the theme of physical information raised by Igor and Joseph,
the main problematic aspect of information (meaning) is missing there.
One can imagine that as two physical systems interact, each one may be
metaphorically attributed with meaning with respect to the changes undergone.
But it is an empty attribution that does not bring in any further
interesting aspect. Conversely, we see real elaboration of meaning in
the cellular structures of life, particularly in brains, and we see in
our societies how scientific, technological, and economic advancements
are bringing together more and more flows of information around (social
complexity and information completely dovetail, and that's a very
important feature). Together with physical information (information
theory, logics, symmetry, etc.) each one of those realms has something
important to tell us regarding the unifying perspective necessary to
make sense of the different approaches to information: we have to
carefully listen to all of them. Thus, for the time being, the mission of
information science --or FIS at least-- would resemble The Travellers,
those people in the UK and Ireland, often taken for gypsies, who live a
nomadic life camping from site to site...  It may look unfortunate for
the disciplinarily specialized parties, but we cannot settle any
permanent info camp --seemingly for quite a long time.

best --Pedro

-
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Centro de Investigación Biomédica de Aragón (CIBA)
Avda. San Juan Bosco, 13, planta X
50009 Zaragoza, Spain
Tfno. +34 976 71 3526 ( 6818)
pcmarijuan.i...@aragon.es
http://sites.google.com/site/pedrocmarijuan/
-

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis



--
John Collier colli...@ukzn.ac.za
Philosophy, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
Http://web.ncf.ca/collier


___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] MAXENT applied to ecology

2014-09-27 Thread John Collier
List,

I am curious what people think of this.

http://www.wired.com/2014/09/information-theory-hold-key-quantifying-nature/

From the article:

MaxEnt is based on principles of simplicity and consistency, but it has 
additional assumptions baked into it, starting with the fact that researchers 
must choose just a few variables to feed into the procedure. In 2008, when 
Harte first considered the idea, he decided to try it out using the size of an 
area, the number of species there, the number of individuals, and the total 
metabolic rate of all those organisms. He didn't pick these characteristics at 
random; he had an inkling, from reading work on metabolic theory, that these 
had promise for describing biological systems. In some cases, they do very well.

The simplification of a complex ecosystem into just a handful of variables has 
fueled criticisms of MaxEnt, because it assumes that those numbers and whatever 
processes generate them are the only things shaping the environment. In 
essence, it generates predictions of biodiversity without taking into account 
how that diversity arises. It implies that the details many ecologists focus on 
might not matter if you want to understand the larger patterns of an ecosystem. 
Harte said he usually gets two responses: "You've opened up a whole new theory," 
and "You're an idiot, because we all know that mechanism matters in ecology."
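
As a toy illustration of the recipe described above -- a sketch only, and not
Harte's actual METE machinery -- the hypothetical Python function below recovers a
maximum-entropy abundance distribution for S species and N individuals by solving
for the single Lagrange multiplier that enforces the mean-abundance constraint N/S:

import numpy as np

def maxent_abundance(n_individuals, s_species, n_max=None):
    # MaxEnt distribution over abundances 1..n_max subject to one constraint:
    # mean abundance = n_individuals / s_species.
    if n_max is None:
        n_max = n_individuals
    n = np.arange(1, n_max + 1, dtype=float)
    target = n_individuals / s_species

    def mean_for(lam):
        # Boltzmann-form weights exp(-lam * n), shifted for numerical stability.
        logw = -lam * n
        w = np.exp(logw - logw.max())
        p = w / w.sum()
        return float((p * n).sum())

    # mean_for(lam) decreases as lam grows; bracket the root, then bisect.
    lo, hi = -1.0, 1.0
    while mean_for(lo) < target:
        lo *= 2.0
    while mean_for(hi) > target:
        hi *= 2.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    logw = -lam * n
    w = np.exp(logw - logw.max())
    return n, w / w.sum()

if __name__ == "__main__":
    n, p = maxent_abundance(n_individuals=1000, s_species=50)
    print("P(abundance = 1):", round(float(p[0]), 3))         # about 0.05 here
    print("mean abundance:", round(float((n * p).sum()), 1))  # about 20.0

With these numbers the sketch predicts that roughly one species in twenty is a
singleton -- the kind of pattern-level prediction MaxEnt makes without any
mechanistic input.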


Other extrapolation methods are mentioned in the article that I am also curious 
about.

John



___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] [Fwd: information.energy] Joseph Brenner

2014-09-27 Thread John Collier
Catching up on old mail since I have been dealing with visa and banking issues 
(someone got into my account the old way with phone calls and faxes and stole 
$650). Nothing is resolved yet, but I have some spare time from these grueling 
necessities.

First: All energy has form. Without differences energy would just be a uniform 
0. So matter and energy do not differ in this respect.

Second: In sufi (Islamic mystics) tradition the first name of God is Hu. It is 
an aspiration cutting off silence from noise. All the names of God have an 
iconic sound (Allah is a downthrust from the head to the heart and back to the 
head). A friend of mine who had studied with sufis for some time and was also a 
mathematician familiar with information theory suggested that this was the 
first distinction from which all others emerge, basically the distinction 
between something and nothing. This is in line with the sufi tradition. We can 
find sufi influences in the rationalist philosophy of both Spinoza and Leibniz. 
Leibniz, of course, is known for his attempt to found existence on distinction.

Third: I suspect that information and energy are the same at a very basic 
level, but they can become separated. Information is more closely tied to 
boundary conditions, which guide energy (and all change, as it turns out). 
These can be decoupled to a greater or lesser degree. Once we get to biology 
there is a strong decoupling, as I have argued numerous times elsewhere in 
connection with my work with Brooks and Wiley in the 80s, and the energy and 
information budgets are thus also decoupled, though never entirely separated. 
To prove this, the common dimensional grounds of information and energy need to 
be established. I think that dimensional analysis gives us an equivalence by 
way of temperature, which is average kinetic energy per degree of freedom. 
Using Brillouin's characterization of information in terms of the complement of 
entropy, it works out that information has dimensions of degrees of freedom, 
which makes some sense.
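
Spelling out that dimensional bookkeeping (my reconstruction of the argument, in
standard notation): equipartition gives a mean energy of (1/2) k_B T per degree of
freedom, so temperature has dimensions of energy per degree of freedom; Brillouin's
negentropy reading gives information as I = (S_max - S) / (k_B \ln 2) in bits; and
since thermodynamic entropy S has dimensions of energy/temperature, S / k_B -- and
hence I -- is dimensionally a bare count of degrees of freedom.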

John


At 10:13 AM 2014-09-09, Pedro C. Marijuan wrote:


- Original Message -
From: Joseph Brenner <joe.bren...@bluewin.ch>
To: Stanley N Salthe <ssal...@binghamton.edu>;
fis <fis@listas.unizar.es>; Robert Ulanowicz <u...@umces.edu>
Sent: Monday, September 08, 2014 6:01 PM
Subject: Re: [Fis] information.energy

Dear Stan, Bob and All,

This was a very interesting thread which I feel is worth coming back to. First 
of all, I see the attitudes of Stan and Bob as not mutually exclusive but 
complementary. What 'history' means in the 'dim region' where it all began is 
pretty dim. Second, I agree with Stan's formulation that information implies 
more than one entity. This suggests to me that it, like energy, is a dualism, 
sharing some of the dualistic properties of that dim region, somewhere between 
what is and, to use Arthur Eddington's phrase, what is not.

Please do not ask me if and how the above idea can be proven. I consider it 
worth mentioning in the context of the foundations of information science 
because it leaves the door open to the complexities and contradictions of 
information that you, much earlier, and I, later, have been struggling with.

It is even possible that Peirce's notions of Firstness and Secondness could be 
related to the above. The problems with these notions would be, then, a 
consequence of his trying to keep them separate to avoid contradictions, which 
he did not like.

Best regards,

Joseph

- Original Message -
From: Stanley N Salthe <ssal...@binghamton.edu>
To: fis <fis@listas.unizar.es>
Sent: Monday, August 04, 2014 4:21 PM
Subject: Re: [Fis] information.energy
Bob -- Note that I was pointing out a sense in which information implies 
something different from energy -- especially in the context of dialectics, 
which is the basis of Joseph's approach. There can be no 'precipitated' energy 
(matter) without some kind of form, realizing one or some constraints, but the 
concept of information (its history) tends to imply interaction.
STAN
On Sun, Aug 3, 2014 at 11:13 PM, Robert E. Ulanowicz
<u...@umces.edu> wrote:
 Stanley N Salthe <ssal...@binghamton.edu>
 9:32 AM (0 minutes ago)
 to Joseph
 Joseph -- Commenting on:
 ...
 Is there not also a sense that information implies more than one entity
 (sender-receiver, object-interpreter)? That too would tend to align with
 the idea of energy being primary.
But Stan, you were one of the first to recognize the broader nature of
information as constraint. It is also inherent in structure (Collier's
enformation). Hence, wherever inhomogeneities exist, so does information
-- an argument for a common origin. Bob



___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] Japanese physicists convert information into energy

2014-09-21 Thread John Collier

Folks,

This is a step further than the experiment I posted on previously 
(mentioned in the article) that used information to do work. Here 
information is converted into energy. The story is at

http://www.technologyreview.com/view/428670/entangled-particles-break-classical-law-of-thermodynamics-say-physicists/

You can follow the references to the original paper. It is at 
http://arxiv.org/abs/1207.6872


From the news article:

Imagine two boxes of particles with a trap door between them. You want 
to use the trap door to guide the faster particles into one box and 
the slower particles into the other. In a classical experiment you 
would have to measure the particles in both boxes to do this experiment.


But things are different if the particles in one box are entangled 
with the particles in the other. In that case, measurements on the 
particles in one box give you info about both sets of particles.


In essence, you're getting information for nothing. And since you can 
convert that information into energy, there is clear advantage when 
entanglement plays a role.


That's hugely significant. It means that the laws of thermodynamics 
depend not only on classical phenomena and information but on 
quantum effects too. The breakthrough that Funo and co make is to 
extend the theory to take this into account. "We show that entangled 
states can be used to extract thermodynamic work beyond classical 
correlation," they say.


That will have important implications for all kinds of phenomena, 
from black holes and astrobiology to quantum chemistry and nanomachines.


Now the race will be on to see who can measure it first.
---

The result is not surprising, if you accept that information can 
exist as a purely physical phenomenon, and also accept quantum 
information (see work by Seth Lloyd, e.g.). Both assumptions are 
common in basic physics. If you think that information must have 
meaning, or that it must at least be representational, you are going 
to have trouble understanding this work.
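
For scale, the classical benchmark behind converting information into energy is the
Szilard/Landauer relation: a memory register holding I bits, in contact with a heat
bath at temperature T, can be used to extract at most

    W \le k_B T \ln 2 \cdot I

of work (about 3 x 10^-21 J per bit at room temperature). The result above concerns
how entanglement allows correlated measurements to yield work beyond what classical
correlations alone would permit, which is the claim quoted from Funo and
co-workers.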


Cheers,
John

--
Professor John Collier colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
Http://web.ncf.ca/collier

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Fw: Krassimir's Information Quadruple and GIT. Quintuples?

2014-09-05 Thread John Collier


I stand corrected. They produce more work for the same input. I think my
point stands, Bob.
John
At 12:21 AM 2014-09-05, Guy A Hoelzer wrote:
John, 
I think you are misreading Stan’s comments a little. [Stan:
please correct me if I am wrong about that.] I think it would be
fair to say that older car engines were less well "fit between the
energy gradient and the system attempting to utilize it". Another
way of saying this is that the older car engine mechanism was less
efficient in dissipating that gradient, which translated into low gas
mileage. Those engines had to work harder in delivering the same
outcome (say driving 1 mile) than the newer, more efficient
engines. The capacity of the new engines to work harder than old
engines does not mean they work harder to produce the same outcome.
I don’t see the flaw in saying that working harder to achieve a constant
outcome degrades more energy. Clever design and selection can
indeed utilize information to yield greater efficiencies, which can only
approach the limit imposed by the 2nd law. It looks to me like you
and Stan are really in agreement here. Am I missing
something?
Cheers,
Guy
On Sep 4, 2014, at 1:06 PM, John Collier
colli...@ukzn.ac.za
wrote:
S: In decline in the actual
material world that we inhabit. That is, the local world -- the
world of input and dissipation. I think the information problem may
be advanced if we try to explain why the energy efficiency of any work is
so poor, and gets worse the harder we work. This is the key local
phenomenon that needs to be understood.

JC: Information can be used to improve efficiency.

SS: That is not the same question. Which is: Why is any work
constitutively poor in energy efficiency? I wrote a little essay
("Entropy: what does it really mean?" General Systems
Bulletin 32: 5-12) suggesting that it results from a lack of
fittingness between the energy gradient and the system attempting to utilize
it -- that is, that it is an information problem.

Actually, it is part of the same question. As I have said many times, you
trivialize the idea of maximum entropy production if you relativize it to
all constraints. Howard has made this sort of point over and over as
well.

But you are right that the important factor is an information
problem.

I was once asked to referee a paper that argued that we could get around
2nd law degradation by using the exhaust heat in a clever way,
and keep doing this ad infinitum. I pointed out (sarcastically) that we
could do this, but only if we could make smaller and smaller people to
use the energy (apologies to Kurt Vonnegut).

We get much more work out of gasoline engines than we used to, even
though most are smaller and work harder. So, no, it is not in general
true that harder work degrades more energy. Clever design (and selection)
can make a difference that is more significant.

John


___
Fis mailing list
Fis@listas.unizar.es

http://listas.unizar.es/cgi-bin/mailman/listinfo/fis





Professor John Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292  F: +27 (31) 260 3031
Http://web.ncf.ca/collier



___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Fw: Krassimir's Information Quadruple and GIT. Quintuples?

2014-09-04 Thread John Collier


Catching up after a myriad of distracting problems.
At 03:51 PM 2014-08-25, Stanley N Salthe wrote:
Bob wrote: 
Recall that some thermodynamic variables, especially work functions
like
Helmholtz & Gibbs free energies and exergy all are tightly related
to
information measures. In statistical mechanical analogs, for example,
the
exergy becomes RT times the mutual information among the
molecules
S: So, the more organized, the more potential available energy.

I think not, Stan. Organization requires a middling degree of complexity.
Exergy is maximized when the mutual information is 1, like in a crystal.
Crystals are not highly organized. See Collier and Hooker
Complexly Organised
Dynamical Systems (1999) for discussion.
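
For reference, one standard statistical-mechanical form of the relation Bob invokes
(whether it is exactly the one he intends is my assumption): the exergy, or maximum
extractable work, of a distribution p relative to the equilibrium distribution p^eq
at temperature T is

    Ex = k_B T \sum_i p_i \ln(p_i / p_i^eq) = k_B T D_KL(p || p^eq) ,

which is non-negative and vanishes only at equilibrium; per mole the prefactor
becomes RT, matching the "RT times an information measure" form.
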
I happen to be a radical who
feels that the term energy is a construct
with little ontological depth.
S: I believe it has instead ontological breadth!
It is a bookkeeping device (a nice one, of course, but bookkeeping
nonetheless). 
It was devised to maintain the Platonic worldview. Messrs. Mayer &
Joule simply 
gave us the conversion factors to make it look like energy is
constant.
S: It IS constant in the adiabatic boxes used to measure it.
*Real* energy is always in decline -- witness what happens to the
work functions I 
just mentioned.
S: In decline in the actual material world that we inhabit. That
is, the local world -- the world of input and dissipation. I think
the information problem may be advanced if we try to explain why the
energy efficiency of any work is so poor, and gets worse the harder we
work. This is the key local phenomenon that needs to be understood.

Information can be used to improve efficiency.
John





Professor John Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292  F: +27 (31) 260 3031
Http://web.ncf.ca/collier



___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Fw: Krassimir's Information Quadruple and GIT. Quintuples?

2014-08-25 Thread John Collier
 is a thing itself.  That is, R generates S.

 E as evidence is a vague term which infers an observer (2nd Order
 Cybernetics?) that both receives and evaluates the signal (S) from the
 thing (R).  CSP categorizes evidence as icon, index or symbol with
 respect to the entity of observation.

 I  as Krassimirian information is a personal judgment about the
 evidence.  (Correspondence with CSP's notion of argument is
 conceivable.)

 Krassimir's assertion that:
  For different I , information may be different because of subjects'
 finite memory and reflection possibilities.
  Because of this, a physical event with an infinite bandwidth may
 have finite information content (for concrete information subject) .


 moves these 'definitions' of individual symbols into the subjective
 realm. (CSP's notion of interpretation?)
 Different researchers have the freedom to interpret the evidence as
 they choose, including the relationships to engineering terms such as
 bandwidth.


 Pridi's post appropriately recognizes the tension between objective
 scientific theories and subjective judgments about evidence by
 different  individuals with different professional backgrounds and
 different symbolic processing powers.

 The challenge for Krassimirian information, it appears to me, is to
 show that these definitions of symbols motivate a coherent symbol
 system that can be used to transfer information contained in the
 signal from symbolic representations of entities. It may work for
 engineering purposes, but is it extendable to life?

 (For me, of course, this requires the use of multiple symbol systems
 and multiple forms of logic in order to gain the functionality of
 transfer of in-form between individuals or machines.)

 Pridi writes:
   How can we really quantify meaningful (semantic) information beyond
 Shannon (that disregards semantics) and his purely statistical
 framework?

 One aspect of this conundrum was solved by chemists over the past
 two centuries by developing a unique symbol system that is restricted
 by physical constraints, yet functions as an exact mode of
 communication.

 Chemical notation, as symbol system, along with mathematics and data,
 achieves this end purpose (entelechy) of communication, for some
 entities, such as the meaning of an atomic number as a relational
 term and hence the meaning of a particular integer as both quantity
 and quality.

 This requires a dyadic mathematics and synductive logic for
 sublations.


 Pridi writes:

  It does give me a better understanding of how information (beyond
 Shannon) can be formalized!

 Can you communicate how this better understanding ... formalized works?

 It is not readily apparent to me how Krassimirian information can be
 formalized.

 Anybody have any suggestions on how this quadruple of symbols can be
 formalized into a quantitative coherent form of communication?
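
One tentative way to begin, offered only as a sketch (the field names, the Peircean tag, and the barometer example are invented here, not drawn from Krassimir's definitions), is to write the quadruple down as a typed record and make the observer-relativity of I explicit:

```python
from dataclasses import dataclass
from typing import Literal

@dataclass
class Quadruple:
    source: str                                        # R: the thing itself
    signal: str                                        # S: what R generates
    evidence_kind: Literal["icon", "index", "symbol"]  # E: how the signal is taken as evidence
    interpretation: str                                # I: a judgement about the evidence
    observer: str                                      # I is indexed to a concrete subject

# Same R and S, different I for subjects with different memories and backgrounds.
q1 = Quadruple("falling barometer", "needle drop", "index", "storm coming", "sailor")
q2 = Quadruple("falling barometer", "needle drop", "index", "instrument fault", "novice")
print(q1.interpretation, "|", q2.interpretation)
```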

 Cheers

 Jerry








 


 ___
 Fis mailing list
 Fis@listas.unizar.es
 http://listas.unizar.es/cgi-bin/mailman/listinfo/fis




 ___
 Fis mailing list
 Fis@listas.unizar.es
 http://listas.unizar.es/cgi-bin/mailman/listinfo/fis

 ___
 Fis mailing list
 Fis@listas.unizar.es
 http://listas.unizar.es/cgi-bin/mailman/listinfo/fis




--
Professor John Collier colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
Http://web.ncf.ca/collier


___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] The remote Maxwell demon as energy down-converter

2014-08-20 Thread John Collier

http://arxiv.org/abs/1408.3797

This article describes an interesting implementation of the use of 
information instead of energy, something I have been arguing for on 
this list for some time (see also my 1990 paper on Maxwell's demon).
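
For orientation only: the generic bookkeeping behind information-for-energy schemes is the Szilard/Landauer figure of kT ln 2 of work per bit. The snippet below simply evaluates that bound at an assumed ambient temperature; it says nothing about the particular down-converter described in the paper.

```python
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # assumed ambient temperature, K

work_per_bit = k_B * T * math.log(2)   # maximum work obtainable per bit of information
print(f"~{work_per_bit:.2e} J of work per bit at {T:.0f} K")
```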


John

--
Professor John Collier colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
Http://web.ncf.ca/collier

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] FIS in Varna. Analogue Computation

2014-07-15 Thread John Collier


At 03:14 PM 2014-07-15, you wrote:
Dear John,
Thank you for this interesting perspective. Regarding the origin of the
limited band width of physical processes, could this have its origin in
some regularity other than circularity? For example, the continuous going
back and forth (the phrase is Botero's) between opposing attitudes or
states, alternately predominantly actual and potential?
My understanding of waves is that that is how they work, as do similar
phenomena like pendula and oscillating springs, not to mention orbits.

All natural processes, then, have a capacity for continuous information
bearing. The problem is then the origin of /discreteness/, not only in your
countercase, which involves quantum particles, but at higher levels of
interactions between complex entities! For me, the only solution is that
continuity and discontinuity are properties of information which are not
totally separate from one another.
I was thinking more of billiard ball collisions, not ones that depend on
quantum states. In my article, Causation is the transfer of information
(available on my web site), expanded in Information, causation and
computation (2012, in Information and Computation: Essays on Scientific and
Philosophical Understanding of Foundations of Information and Computation,
ed. by Gordana Dodig Crnkovic and Mark Burgin, World Scientific),
I use a formal notion of an information channel to deal with
information transmission in classical systems. There are special problems
when the dynamics are not computable, but I explain how the idea can work
there as well. I do, however, still need more formal proofs of sufficiency
at this time. Fortunately, my approach does not require computation
of the amount of information transferred, so I suppose it could be
infinite and still work, but I doubt it is infinite in real processes. I
suppose I will have to work that out at some point, one way or the
other.
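
For readers who want something runnable, the simplest textbook stand-in for an information channel is Shannon's binary symmetric channel; the sketch below computes its capacity. It is only a toy: the Barwise-Seligman channels invoked above are far more general and are not reproduced here.

```python
import math

def h2(p: float) -> float:
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(flip_prob: float) -> float:
    """Shannon capacity of a binary symmetric channel, in bits per use."""
    return 1.0 - h2(flip_prob)

for p in (0.0, 0.05, 0.11, 0.5):
    print(f"flip probability {p:.2f} -> capacity {bsc_capacity(p):.3f} bits/use")
```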
John




Professor John
Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South
Africa
T: +27 (31) 260 3248 / 260 2292 F:
+27 (31) 260 3031

Http://web.ncf.ca/collier



___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] COLLECTIVE INTELLIGENCE

2014-03-09 Thread John Collier


Loet, List,
I think that codification is important, but we can get emergent social
phenomena even when the information channels are not codified or even
fully recognized. I would prefer to focus on the quality of the
information channels, which are aided (sometimes -- also in some ways
hindered) by codification. I don't see the problem with collective
intelligence if reference is made to information channels in a
strong sense like in Barwise and Seligman, Information Flow.

Nonetheless, I take your point about the quality of information flow
being important. Moving this one step up to communication (opening the
coding and decoding boxes in communications theory) and its quality is
also often important, I think, but not necessary in all cases. I agree
that looking at the individual representations/actions and the sum
total of reflections distracts from what is important to the issue.
That is an important point.
Regards,
John
At 01:49 PM 2014-03-08, Loet Leydesdorff wrote:
Dear John, 

Beyond the case of pyramids, one can think of more abstract forms of
social organization such as the rule of law as a supra-individual
coordination mechanism. 

I doubt that “collective intelligence” is the fruitful category. As in
the rule of law, it seems to me that codification of the communication
(e.g., legislation and jurisprudence) are the vehicles. In other words,
the quality of the communication is more important than the individual or
sum total of reflections.

Best,
Loet



Loet Leydesdorff 
Professor Emeritus, University of Amsterdam
Amsterdam School of Communications Research (ASCoR)
l...@leydesdorff.net ;

http://www.leydesdorff.net/ 
Honorary Professor, SPRU,
University of Sussex; Visiting Professor,
ISTIC,
Beijing;
Visiting Professor,
Birkbeck, University of London.

http://scholar.google.com/citations?user=ych9gNYJhl=en


From: fis-boun...@listas.unizar.es
[
mailto:fis-boun...@listas.unizar.es] On Behalf Of John
Collier
Sent: Saturday, March 08, 2014 11:26 AM
To: Foundations of Information Science Information Science
Subject: Re: [Fis] COLLECTIVE INTELLIGENCE

Guy, 
This looks fruitful, but it might be argued that the exchanges of
information in a colony can be reduced to individual exchanges and
interactions, and thus there is not really any activity that is holistic.
This is what Steven is doing with his example of pyramid
building.
On the other hand, with ants, for example, it has been shown by de
Neuberg and others that in ant colonies the interactions cannot be
reduced, but produce complex organization that only makes sense at a
higher level of behaviour. Examples are nest building and bridge
building, among others. I assume the same is true for humans.
For example, in the pyramid case, why is it being built, why are people
so motivated to cooperate on such a ridiculous project? Contrary to
widespread opinion the workers were not slaves, but they were individual
people. I doubt this can be explained at the individual level. If ants
have complexly organized behaviour, then surely humans do as well -- we
are far more complex, and our social interactions are far more
complex.
John
At 10:33 PM 2014-03-07, Guy A Hoelzer wrote:
I think of ‘collective intelligence’ as synonymous with collective
‘information processing’. I would not test for its existence by
asking if group-level action is smart or adaptive, nor do I think it is
relevant to ask whether ‘collective intelligence’ informed or misinformed
individuals. I would say that in the classic example of eusocial
insect colonies (like honey bees, for example) there is no reasonable
doubt that information is processed at the level of the full colony,
which can be detected by the coordination of individual activities into
coherent colony-level behavior. Synchronization and complementarity
of individual actions reflect the top-down influences of colony-level
information processing. 
It is the existential question that I think is key here, and I hope our
conversation includes objective ways to detect the existence or absence
of instances where a ‘collective intelligence’ has manifested as a way to
keep this concept more tangible and less metaphorical.
Cheers,
Guy
On Mar 6, 2014, at 9:22 PM, Steven Ericsson-Zenith
ste...@iase.us
wrote:

 Is there such a thing as Collective Intelligence?
I am concerned that the methods of the Harvard paper demonstrate nothing
at all and, however well intended, they appear to be insufficiently
rigorous and one might say unscientific. 
If the question were: are there things that a group of individuals may
achieve that an individual may not, build the Pyramids or go to the Moon,
for example, then manifestly this is the case. 
However, can we measure the objective efficiency of a group by
considering the problems solved by individuals working together in groups
such that we may identify whether there is an environment independent
quantifiable addition or loss of efficiency in all cases? Perhaps, but
one suspects not.
Bottomline: I

Re: [Fis] COLLECTIVE INTELLIGENCE

2014-03-08 Thread John Collier
, innovation may indeed benefit from this new info-crowd turn, and other
societal changes are occurring (from new forms of social uprising and
revolt, to the detriment of the natural info flows --conversation--, an
increase of individual isolation, diminished happiness indicators, etc.)

Brave New World? Not yet, but who knows...

best ---Pedro


Prpic wrote:

 ON COLLECTIVE INTELLIGENCE: The Future of IT-Mediated
Crowds

 John Prpić

 Beedie School of Business

 Simon Fraser University

 pr...@sfu.ca





 Software (including web pages and mobile applications etc) is
the key building block of the IT field in terms of human interaction, and
can be construed as an artifact that codifies organizational process “…in
the form of software embedded “routines” (Straub and Del Guidice 2012).
These organizational processes are frozen into the artifact, though not
fossilized, since the explicit codification that executes an artifact can
be readily updated when desired (Orlikowski and Iacono 2001, Yoo et al.
2012).



 A software artifact always includes “a setting of interaction”
or a user interface, for example a GUI or a DOS prompt (Rogers 2004),
where human beings employ the embedded routines codified within the
artifact (including data) for various purposes, providing input, and
receiving programmed output in return. The setting of interaction
provides both the limits and possibilities of the interaction between a
human being and the artifact, and in turn this “dual-enablement”
facilitates the functionality available to the employ of a human being or
an organization (Del Giudice 2008). This structural view of artifacts
(Orlikowski and Iacono 2001) informs us that “IT artifacts are, by
definition, not natural, neutral, universal, or given” (Orlikowski and
Iacono 2001), and that “IT artifacts are always embedded in some time,
place, discourse, and community” (Orlikowski and Iacono 2001).



 Emerging research and our observation of developments in
Industry and in the Governance context signals that organizations are
increasingly engaging Crowds through IT artifacts to fulfill their
idiosyncratic needs. This new and rapidly emerging paradigm of
socio-technical systems can be found in Crowdsourcing (Brabham 2008),
Prediction Markets (Arrow et al. 2008), Wikis (Majchrzak et al. 2013),
Crowdfunding (Mollick 2013), Social Media (Kietzmann et al 2011), and
Citizen Science techniques (Crowston & Prestopnik 2013).
Acknowledging and incorporating these trends, research has emerged
conceptualizing a parsimonious model detailing how and why organizations
are engaging Crowds through IT in these various substantive domains
(Prpić & Shukla 2013, 2014). The model considers Hayek's (1945)
construct of dispersed knowledge in society, as the antecedent condition
(and thus the impetus too) driving the increasing configuration of IT to
engage Crowds, and further details that organizations are doing so for
the purposes of capital creation (knowledge & financial).



 However, as might be expected, many questions remain in this
growing domain, and thus I would like to present the following questions
to the FIS group, to canvas your very wise and diverse views.





 Is there such a thing as Collective Intelligence?

 How does IT effect the existence or non-existence of Collective
Intelligence?

 -

http://www.wjh.harvard.edu/~cfc/Woolley2010a.pdf

 -

http://papers.ssrn.com/sol3/cf_dev/AbsByAuth.cfm?per_id=1919614

 -

http://www.collectiveintelligence2014.org/



 How do national innovation systems (and thus policy too) change
when we consider IT-mediated crowds as the 4th Helix of innovation
systems?

 -

http://www.springer.com/business+%26+management/book/978-1-4614-2061-3




 Does the changing historical perception of crowds signal other
societal changes?

 -

http://www.emeraldinsight.com/journals.htm?articleid=1907199





--

-

Pedro C. Marijuán

Grupo de Bioinformación / Bioinformation Group

Instituto Aragonés de Ciencias de la Salud

Centro de Investigación Biomédica de Aragón (CIBA)

Avda. San Juan Bosco, 13, planta X

50009 Zaragoza, Spain

Tfno. +34 976 71 3526 ( 6818)

pcmarijuan.i...@aragon.es



http://sites.google.com/site/pedrocmarijuan/

-


___

fis mailing list

fis@listas.unizar.es



https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis



Steven

___
fis mailing list
fis@listas.unizar.es

https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis






Professor John
Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South
Africa
T: +27 (31) 260 3248 / 260 2292 F:
+27 (31) 260 3031

Http://web.ncf.ca/collier



___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] [Feedforward II and Anticipation] Joseph Brenner

2014-02-18 Thread John Collier
/cgi-bin/mailman/listinfo/fis



___

fis mailing list

fis@listas.unizar.es



https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis



-- 
Professor, University of Amsterdam
Amsterdam School of Communications Research (ASCoR)
Honorary Professor, SPRU,
University of Sussex; Visiting Professor,
ISTIC, Beijing;


http://scholar.google.com/citations?user=ych9gNYJhl=en





Professor John
Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South
Africa
T: +27 (31) 260 3248 / 260 2292 F:
+27 (31) 260 3031

Http://web.ncf.ca/collier



___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] RV: Encoding and Decoding information-- From Jerry Chandler

2014-02-04 Thread John Collier



From: Jerry LR Chandler

jerry_lr_chand...@mac.com
Subject: Encoding and Decoding information
Date: February 1, 2014 11:30:44 PM CST
To:
fis@listas.unizar.es

List
John writes:
Sometimes ignored in the
mathematics of Shannon's approach are the coding and decoding steps,
which he does not put in mathematical form, but appear in his
diagrams.
John, I think your remark goes to the very heart of the problems of
foundations of information sciences. 
I heartily concur.
I would add a couple of brief comments on why this is such a profoundly
difficult problem. Over the years, I have attempted to induce a
conversation here on FIS on the coding problems, to no avail. I am
delighted to learn of your interest in it. Problems of this depth strain
our individual and collective resources.
At the root of the problem, from my perspective, is the very notion of
codes. In the absence of direct sensory communication, all
human communication is by artifacts, symbol systems invented and used by
individuals. A priori, all symbol systems, as human artifacts, must
be learned anew by each passing generation. As human inventions, no
necessity for consistency exists. They are intrinsically unstable. Every
human being tends to adapt their own perspectives on the meaning, if any,
of a particular code.
The two exceptions are the codes for mathematics and chemistry. The rigid
structure of number systems and arithmetic operations is sufficient to
preserve the foundation codes of arithmetic for millennia, since the
Sumerians, yet flexible enough to allow steady expansions of meanings of
new symbols. The code of chemistry is grounded in physical atomism.
Natural elements are rigidly defined in terms of properties that appear
to be stable for millions/billions of years.
Thus, as social communities, the mathematicians and the chemists
communicate very effectively within their own symbol systems. But no
formal logic exists which matches the meanings of these two coding systems.

Other communities, for example, philosophy and political and economic and
music and religion and ... have deep problems in establishing consistent
encoding and decoding pathways. The nature of encoding and decoding
severely limits the discourse in bio-semiotics and makes communication
extremely difficult. The many conundrums in bio-semiotics are often
merely mis-codings of natural processes.
In my own lifelong work on biological mutations as changes of the
biological encoding of information, I have encountered conundrums of
encoding and decoding in its many molecular biological forms. It appears
to involve many forms of differential equations.
IMO, An understanding of the processes of encoding and decoding is
essential to the understanding of the foundations of information
sciences. 
A trivial example of the perplexities of encoding and decoding is the
relationships among computer languages, an area on which Ted Gorenson has
focused a lot of attention and from whom I have learned much.
Ted was the source of my information about what was going on at Stanford.
I haven't seen any concrete results, though. Ted hasn't been on the fis
list for some time now.
My PhD thesis was basically about the problems of communication across
different paradigms, hence my interest in informal approaches to
pragmatics. The Barwise-Seligman program seems to me to be a formal
structure in which I can put my ideas about informal pragmatics required
to establish communication as outlined in my dissertation. This what I am
developing at the Cape Town meeting in August on scientific realism. My
approach has some similarities to some approaches to conflict resolution,
but it, like them, requires both sides to be looking for a resolution.

An example from my thesis is that affine geometry permitted relativity
and Newtonian theories to be put within a common framework. I would like
to see the same happen with information in its various guises. I don't
think that arguing the merits of various interpretations of the idea help
much compared to getting clear what the positions and their relations
are. But arguing the merits can serve the purpose of revealing the
positions more clearly, perhaps ironically.
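
Since the exchange opened with the observation that Shannon draws the coding and decoding steps as boxes without putting them in mathematical form, a minimal executable toy may help fix ideas. The code table below is arbitrary and invented here; it bears no relation to the chemical or mathematical symbol systems discussed above.

```python
CODE = {"A": "0", "B": "10", "C": "110", "D": "111"}   # an arbitrary prefix code
DECODE = {v: k for k, v in CODE.items()}

def encode(message: str) -> str:
    """The encoding step: symbols to bits."""
    return "".join(CODE[ch] for ch in message)

def decode(bits: str) -> str:
    """The decoding step: the prefix property gives unique decodability."""
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in DECODE:
            out.append(DECODE[buf])
            buf = ""
    return "".join(out)

msg = "ABACAD"
assert decode(encode(msg)) == msg
print(msg, "->", encode(msg))
```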
John





Professor John
Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South
Africa
T: +27 (31) 260 3248 / 260 2292 F:
+27 (31) 260 3031

Http://web.ncf.ca/collier



___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Fw: Responses

2014-01-22 Thread John Collier


At 09:45 PM 2014-01-21, Robert E. Ulanowicz wrote:
 The reason of being of
information, whatever its content or quantity, is
 to be used by an agent (biological or artificial).
Dear Christophe,
In making this restriction you are limiting the domain of information
to
communication and excluding all information that inheres in
structure
per-se. John Collier has called the latter manifestation
enformation,
and the calculus of IT is quite effective in quantifying its extent.
Perhaps John would like to comment?
I developed this concept in order to reply to Jeff Wicken's complaint
that Brooks and Wiley did not distinguish properly between the complement
of entropy and structural information, but I used it in print to discuss,
in the context of cognitive science and especially John Perry's use of
information (see Barwise and Perry Situations and Attitudes and
his What is information?, as well as Dretske's book on information and
perception) what the world must be like in order to make sense of
information coming from the world into our brains. The article can be
found at
Intrinsic
Information (1990) In P. P. Hanson (ed) Information, Language and
Cognition: Vancouver Studies in Cognitive Science, Vol. 1 (originally
University of British Columbia Press, now Oxford University Press, 1990):
390-409. Details about information are there, but the gist of it is that it
can be measured, is unique, and depends on time scale to distinguish it
from informational entropy in information systems. The uniqueness
hypothesis was developed very carefully in my former student, Scott
Muller's PhD thesis, published as Asymmetry: The Foundation of
Information (The Frontiers Collection) by Springer in 2007.
I am rather busy now at a conference, or else I would say more
here.
John




Professor John
Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South
Africa
T: +27 (31) 260 3248 / 260 2292 F:
+27 (31) 260 3031

Http://web.ncf.ca/collier



___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Information, signals and data.

2014-01-13 Thread John Collier
 
 President, International Center for Information Ethics
(ICIE) 

(http://icie.zkm.de)
Editor in Chief, International Review
of 
 Information Ethics (IRIE)
(
http://www.i-r-i-e.net) Postal 
 Address: Redtenbacherstr. 9, 76133 Karlsruhe, Germany
 E-Mail: raf...@capurro.de
 Voice: + 49 - 721 - 98 22 9 - 22 (Fax: -21)
 Homepage:
www.capurro.de
 
 
 --
 
 ___
 fis mailing list
 fis@listas.unizar.es


https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis
 
 
 End of fis Digest, Vol 579, Issue 18
 
 

___
fis mailing list
fis@listas.unizar.es

https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis





Professor John
Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South
Africa
T: +27 (31) 260 3248 / 260 2292 F:
+27 (31) 260 3031

Http://web.ncf.ca/collier



___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] New Year Lecture

2014-01-03 Thread John Collier


At 02:55 AM 2014/01/03, Joseph Brenner wrote:
Happy New Year and
Goodwill to all FIS'ers and distinguished guests!

I found the concept of Quantum Bayesianism as presented by
Professor von Baeyer most interesting. From the point of view of bringing
the subject-object balance back into physics it is very congenial to
Logic in Reality (LIR). I have several criticisms of this approach,
however, which I will try to make clear in the absence of any real skills
in quantum mechanics:

1. QBism seems not to consider the option of using
non-standard, non-Kolmogorovian probabilities to describe quantum and
non-quantum nature, that is, with values 0 but 1.

2. It excludes the case, impossible by classical logic, but
basic to physics and LIR, of a dynamic interaction between the subject
and the object which allows both views (belief and
facts) to be partly true or better operative at the same time
or at different times.

3. Since the QBism interpretation does not deal with points
1. and 2. above (also in the Fuchs, Mermin, Shack paper), it leaves the
door open to an anti-realist interpretation not only of quantum
mechanical reality, but of reality /tout court/ which must be based on
and reflect the quantum 'situation'. 
Sorry Joseph, but I don't understand your point 1. Could you
expand?
On 3, I think all forms of Bayesianism not only leave the door open to
antirealist interpretations, but are antirealist by their nature that
subjective probabilities are what probabilities are (Hume was the first
to declare this point, to the best of my knowledge). Bayes Theorem itself
is not antirealist, and can be applied to systems both internally and
externally. It is also a theorem of information theory that applies
whether you take information to be a subjective interpretation or an
objective intrinsic property of systems. But Bayesianism is subjective by
tradition and largely (there are exceptions in applications of
algorithmic information theory along Wallace's lines) by usage. I find
that people get a visceral reaction to Bayesianism much like they do to
generalized antirealism (as opposed to antirealism about a class of
things, which everyone accepts). Before an examination of a (realist)
thesis, another (antirealist) member of the examining committee joked to
me that being a realist or antirealist must be genetic. It is certainly
deep seated.
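
For concreteness, the theorem itself is a one-liner; the numbers below are invented purely to show the mechanics of an update and carry no weight in the realism debate.

```python
def bayes_posterior(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """P(H|E) for a binary hypothesis via Bayes' theorem."""
    evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / evidence

# Invented figures: a 1% prior, a fairly diagnostic piece of evidence.
print(f"posterior = {bayes_posterior(0.01, 0.95, 0.05):.3f}")
```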
John





Professor John
Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South
Africa
T: +27 (31) 260 3248 / 260 2292 F:
+27 (31) 260 3031

Http://web.ncf.ca/collier



___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] Article on panpsychism

2014-01-03 Thread John Collier


Folks,
The article on the Scientific American site at

http://www.scientificamerican.com/article.cfm?id=is-consciousness-universalprint=true
 might be of interest to this group. It discusses an information based
measure of consciousness.


Is Consciousness Universal?Panpsychism, the ancient doctrine
that consciousness is universal, offers some lessons in how to think
about subjective experience today
By Christof
Koch | Wednesday, January 1, 2014 |

I am not a panpsychist, but this is the most reasonable version I have
seen (barring, perhaps, Leibniz', with its distinction between confused
and clear perceptions, which takes a similar route). I think the measure
is of interest independently of panpsychism.
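
As the crudest possible gesture at what an information-based measure of integration looks like (this is emphatically not Tononi's phi, only the mutual information between the two halves of a two-bit toy system):

```python
import math
from collections import Counter
from itertools import product

def mutual_information(joint):
    """I(X;Y) in bits from a dict {(x, y): probability}."""
    px, py = Counter(), Counter()
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

independent = {(x, y): 0.25 for x, y in product([0, 1], repeat=2)}   # parts ignore each other
coupled = {(0, 0): 0.5, (1, 1): 0.5, (0, 1): 0.0, (1, 0): 0.0}       # parts perfectly correlated

print("independent parts:", round(mutual_information(independent), 3), "bits")
print("coupled parts    :", round(mutual_information(coupled), 3), "bits")
```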
John 




Professor John
Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South
Africa
T: +27 (31) 260 3248 / 260 2292 F:
+27 (31) 260 3031

Http://web.ncf.ca/collier



___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] FW: 2nd CFP: Is computation observer-relative? (The 7th AISB Symposium on Computing and Philosophy, AISB-50, Goldsmiths, London, 1-4 April 2014)

2013-12-06 Thread John Collier
This might be of interest to some members of the list.

John

From: owner-philo...@marnier.louisiana.edu 
[mailto:owner-philo...@marnier.louisiana.edu] On Behalf Of John Preston
Sent: 06 December 2013 11:21 AM
To: philo...@liverpool.ac.uk; philo...@louisiana.edu
Subject: 2nd CFP: Is computation observer-relative? (The 7th AISB Symposium on 
Computing and Philosophy, AISB-50, Goldsmiths, London, 1-4 April 2014)

***full paper submission deadline: 3 January 2014***

The 7th AISB Symposium on Computing and Philosophy:
Is computation observer-relative?

AISB-50, Goldsmiths, London, 1-4 April 2014

As part of the AISB-50 Annual Convention 2014 to be held at Goldsmiths, 
University of London
http://www.aisb.org.uk/events/aisb14

The convention is organised by the Society for the Study of Artificial 
Intelligence and Simulation of Behaviour (AISB)
http://www.aisb.org.uk/

OVERVIEW:

One of the claims integral to John Searle's critique of computational cognitive 
science and 'Strong AI' was that computation is 'observer-relative' or 
'observer-dependent' (Searle, The Rediscovery of the Mind, 1992). This claim 
has already proven to be very controversial in cognitive science and AI 
(Endicott 1996; Coulter  Sharrock, Rey, and Haugeland in Preston  Bishop 
(eds.), Views into the Chinese Room, 2002).

Those who come to the subject of computation via physics, for example, often 
argue that computational properties are physical properties, that is, that 
computation is 'intrinsic to physics'. On such views, computation is comparable 
to the flow of information, where information is conceived of in statistical 
terms, and thus computation is both observer-independent and (perhaps) 
ubiquitous. Connected with this are related issues about causality and identity 
(including continuity of), as well as the question of alternative formulations 
of information.

This symposium seeks to evaluate arguments, such as (but not limited to) 
Searle's, which bear directly on the question of what kind of processes and 
properties computational processes and properties are. It thus seeks to address 
the general question 'What is computation?' in a somewhat indirect way. 
Questions that might be tackled include: Are computational properties syntactic 
properties? Are syntactic properties discovered, or assigned? If they must be 
assigned, as Searle argues, does this mean they are or can be assigned 
arbitrarily? Might computational properties be universally realized? Would such 
universal realizability be objectionable, or trivialise computationalism? Is 
syntax observer-relative? What kinds of properties (if any) are 
observer-relative or observer-dependent? Is observer-relativity a matter of 
degree? Might the question of whether computation is observer-relative have 
different answers depending on what is carrying out the computation in 
question? Might the answer to this question be affected by the advent of new 
computing technologies, such as biologically- and physically-inspired models of 
computation? Is it time to start distinguishing between different meanings of 
'computation', or is there still mileage in the idea that some single notion of 
computation is both thin enough to cover all the kinds of activities we call 
computational, and yet still informative (non-trivial)? Does Searle's idea that 
syntax is observer-relative serve to support, or instead to undermine, his 
famous 'Chinese Room argument'?

TOPICS OF INTEREST:

1.  COMPUTATIONAL-PHILOSOPHICAL ISSUES

Questions of ontology and epistemology
i.  COMPUTATION AS OBSERVER RELATIVE

Is computation an observer relative phenomenon? What implications do answers to 
this question have for the doctrine of computationalism?

ii.  WHAT IS COMPUTATION?

Does computation (the unfolding process of a computational system) define a 
natural kind? If so, how do we differentiate the computational from the 
non-computational?

iii. IMPLICATIONS FOR COMPUTATIONAL ONTOLOGY, and PAN-COMPUTATIONALISM

To what extent and in what ways can we say that computation is taking place in 
natural systems? Are the laws of natural processes computational? Does a rock 
implement every input-less FSA (Putnam, Chalmers)? Is the evolution of the 
universe computable as the output of an algorithm? I.e. is the temporal 
evolution of a state of the universe a digital informational process akin to 
what goes on in the circuitry of a computer? Digital ontology' (Zuse), the 
nature of the physical universe is ultimately discrete; cf. Kant's distinction 
- from the antinomies of pure reason - of simple parts and no simple parts; 
the discrete and the analogue.


2.  SOME COMPUTATIONAL-PHILOSOPHICAL ISSUES

Computation in machines and computation in nature; Turing versus non-Turing 
computation
i.  COMPUTATION IN NATURE

Investigating the difference between formal models of physical and biological 
systems and physical/biological reality-in-itself and the implication(s) for 
theory of 

Re: [Fis] FW: social flow

2013-11-25 Thread John Collier
 a lot of work to be done in these essential matters?

best ---Pedro

From: Joseph Brenner
[joe.bren...@bluewin.ch]
Sent: Thursday, 21 November 2013 20:22
To: Roly Belfer; PEDRO CLEMENTE MARIJUAN FERNANDEZ
Cc: fis@listas.unizar.es
Subject: Re: [Fis] social flow

Dear Roly, Dear Pedro,

Thank you for taking this thread in a for me very
interesting direction. As you know, interesting means what I find my
logical system can confirm, improve, validate, etc. The two notes share
one feature that one might criticize, namely, that they deal essentially
with present, conscious material, whereas information flow
almost by definition seems to involve components that are absent,
potential, unconscious, etc.

Similarly, the application of the Square of Opposition in
Roly's reference would at first sight appear to be explanatory, but on
closer inspection, I find everything reduced back to binary logic, arrows
in a box. What has to be added, pace Jakobson, is some notion of
the actual dynamics of what Roly calls a mutually relateable
framework. And let's not be too greedy: let's get the pairwise
interactions right and then see where we can go with more complex
ones.

Cheers,

Joseph





- Original Message - 

From: Roly Belfer 

To: Pedro C.
Marijuan 

Cc:
fis@listas.unizar.es 

Sent: Thursday, November 21, 2013 4:44 PM

Subject: Re: [Fis] social flow

Dear Pedro 

Thank you! there is some sort of synchronicity here: I was just
recently thinking about Roman Jakobson and his 6 levels of semiotic
analysis. Especially the phatic expression, as some kind of white
noise that is necessary for the interpersonal informational
handshake. That is, an infosphere - be it organic or more
like artificial info networks - would need to have actants operate in a
mutually relateable framework (even if it is only pairwise).

The meaningless/senseless datum is important for establishing the
lines of communication, and perhaps some emergent properties (such as
intimacy, grouping, pre-communicative acceptance). 

Do you know of any quantified work re Jakobson? (I keep

this around for different purposes) 

Best

Roly


On Thu, Nov 21, 2013 at 1:50 PM, Pedro C. Marijuan

pcmarijuan.i...@aragon.es wrote:


Dear FIS colleagues,

Just a wandering thought, in part motivated by the highly formal

contents of the other discussion track. What are the major
contents,

topics, and styles in our social, spontaneous exchanges? Seemingly
the

response is that most of those exchanges are just casual,
irrelevant,

performed for their own sake. There are scholarly references
about

that---though our own perusal of social life may quite agree.
The

information flow, the circulation of social information, becomes
the

message itself (echoing McLuhan), amorphously gluing the
different

networks of the social structure... Flowing naturally in
spontaneous

exchanges and also fabricated and recirculated by the media. Our

talkative species needs the daily dose --otherwise mental health
resents

quite easily.

I am these days reading Robert Trivers (2011) on self-deception and
how

the info flow we are conscious of becomes a highly self-centered

concoction for for our own social self-promotion. I think it
partially

dovetails with the above: we are the content.

best ---Pedro

--

-

Pedro C. Marijuán

Grupo de Bioinformación / Bioinformation Group

Instituto Aragonés de Ciencias de la Salud

Centro de Investigación Biomédica de Aragón (CIBA)

Avda. San Juan Bosco, 13, planta X

50009 Zaragoza, Spain

Tfno. +34 976 71 3526 ( 6818)


pcmarijuan.i...@aragon.es



http://sites.google.com/site/pedrocmarijuan/

-

___

fis mailing list

fis@listas.unizar.es



https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis





___

fis mailing list

fis@listas.unizar.es



https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


___ fis mailing list
fis@listas.unizar.es

https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis





Professor John
Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South
Africa
T: +27 (31) 260 3248 / 260 2292 F:
+27 (31) 260 3031

Http://web.ncf.ca/collier



___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] social flow

2013-11-21 Thread John Collier
Interesting point, Pedro. Robin Dunbar's work is 
closer to a pure social bonding role. He argues, 
and has some evidence for, oral communication 
playing a similar role in our cultures as 
grooming does in chimpanzees and other species. 
He uses this to explain how we can have larger 
group sizes. To my knowledge neither he nor 
others have applied the idea to the implications of 
writing, though I have read some speculation 
about internet communication on group sizes, but 
none of it seemed very scientific to me.

I have some further things I can say about roles 
of communication with respect to bonding, content 
and meaning prescriptions, but I will keep them 
for now as I am way behind in a number of things 
I must do. Basically, though, verbal 
communication plays multiple roles at the same time.

John


At 01:50 PM 2013/11/21, Pedro C. Marijuan wrote:
Dear FIS colleagues,

Just a wandering thought, in part motivated by the highly formal
contents of the other discussion track. What are the major contents,
topics, and styles in our social, spontaneous exchanges? Seemingly the
response is that most of those exchanges are just casual, irrelevant,
performed for their own sake. There are scholarly references about
that---though our own perusal of social life may quite agree. The
information flow, the circulation of social information, becomes the
message itself (echoing McLuhan), amorphously gluing the different
networks of the social structure... Flowing naturally in spontaneous
exchanges and also fabricated and recirculated by the media. Our
talkative species needs the daily dose --otherwise mental health resents
quite easily.
I am these days reading Robert Trivers (2011) on self-deception and how
the info flow we are conscious of becomes a highly self-centered
concoction for for our own social self-promotion. I think it partially
dovetails with the above: we are the content.

best ---Pedro

--
-
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Centro de Investigación Biomédica de Aragón (CIBA)
Avda. San Juan Bosco, 13, planta X
50009 Zaragoza, Spain
Tfno. +34 976 71 3526 ( 6818)
pcmarijuan.i...@aragon.es
http://sites.google.com/site/pedrocmarijuan/
-

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


--
Professor John Collier colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
Http://web.ncf.ca/collier


___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] Praxotype

2013-10-15 Thread John Collier
This term might be useful in the context of the present discussion, especially 
in the contest of coordinated practice(s). Cognotype might also be useful. I 
think these might lead to a more fine-grained analysis of the more integrative 
sociotype.

http://blogs.scientificamerican.com/guest-blog/2013/09/27/words-are-thinking-tools-praxotype/

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Collier's Metaphysics

2013-05-27 Thread John Collier
Another vapid criticism with no argument. Give me an idea, Jerry, 
give me an idea. You obviously think I don't have it, so it would be 
rude of you to just say this sort of thing and refrain. List some 
things that are involved with metaphysics that I have missed.

Otherwise I will have to assume that you cannot do this.

John

At 05:27 AM 2013/05/27, Jerry LR Chandler wrote:

On May 26, 2013, at 10:46 AM, John Collier wrote:

I don't have much idea.


I concur.

Jerry


--
Professor John Collier colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
Http://web.ncf.ca/collier

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] About FIS 2005

2013-04-15 Thread John Collier
 information. The argument can be found in the paper
Propagating of Organization: An Inquiry by Stuart Kauffman, Robert K.
Logan, Robert Este, Randy Goebel, David Hobill and Ilya Smulevich.
published in 2007 in Biology and Philosophy 23: 27-45. I am happy to
share this paper with anyone requesting it. 
Bob Logan
On 2013-04-14, at 9:59 PM, Xueshan Yan wrote:

Dear Michel,
Thank you!
I am very familiar with your FIS 2005 website long before.
Have you read the Polish chemist Nalewajski's book:
Information theory of molecular systems (Elsevier, 2006), I
really want to know if there are INFORMATON that play a role
between two atoms, or two molecules, or two supramolecules
as Jean-Marie Lehn said.
As to FIS 2005, I need every review about all four FIS
conferences held in Madrid, Vienna, Paris, and Beijing, but
only a general review about FIS 2005 not be given by people
so far.
Best regards,
Xueshan
9:59, April 15, 2013 Peking
University

-Original Message-
From: Michel Petitjean
[
mailto:petitjean.chi...@gmail.com]
Sent: Sunday, April 14, 2013
6:19 PM
To: Yan Xueshan
Subject: Re: About FIS 2005
Dear Xueshan,
As far as I know, there is no longer report, but I am at your
disposal if you wish to get more: please feel free to ask me.
Also you may have a look at the programme, the proceedings,
and all that is available from the main welcome page: 
http://www.mdpi.org/fis2005/
Best, Michel.

2013/4/14 Xueshan Yan
y...@pku.edu.cn:

Dear Michel,
May I ask you a favor?
Do you have any more detailed review about FIS
2005, except 
your FIS 
2005 brief conference report
published in 

http://www.mdpi.org/entropy/htm/e7030188.htm?
Best regards,
Xueshan
17:47, April 14, 2013

__ 
Robert K. Logan
Chief Scientist - sLab at OCAD
Prof. Emeritus - Physics - U. of Toronto 

www.physics.utoronto.ca/Members/logan







Professor John
Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South
Africa
T: +27 (31) 260 3248 / 260 2292 F:
+27 (31) 260 3031

Http://web.ncf.ca/collier



___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] FW: fis Digest, Vol 570, Issue 2

2013-04-14 Thread John Collier
I am afraid that it was my fault. I thought I recalled a quote, but actually it 
is an interpretation of Mackay. Sorry about that. Amazing that it spread so 
much, but that probably reflects that it is endemic in Mackay's work.

John

-Original Message-
From: fis-boun...@listas.unizar.es [mailto:fis-boun...@listas.unizar.es] On 
Behalf Of Xueshan Yan
Sent: 14 April 2013 10:53 AM
To: fis@listas.unizar.es
Subject: Re: [Fis] fis Digest, Vol 570, Issue 2

Dear Pedro, Dear Joseph,

About the Milton Keynes Conference, i.e., about DTMD definition, we saw this 
quote long long ago, but there two different sayings: One is Information is a 
distinction that makes a difference from Donald M. MacKay in his Information, 
Mechanism and Meaning (1969), and another is Information is a difference that 
makes a difference from Gregory Bateson in his Steps to an Ecology of Mind 
(1972).

Although I have checked it page by page in Donald M.
MacKay's book, I can't find it, whereas it is easy to find Information is a 
difference that makes a difference in Gregory Bateson's Steps to an Ecology 
of Mind at page 230, 361, 339, etc., who can tell the accurate priority about 
DTMD?

Best wishes,

Xueshan
16:49, April 14, 2013   Peking University

 -Original Message-
 From: fis-boun...@listas.unizar.es
 [mailto:fis-boun...@listas.unizar.es] On Behalf Of
 fis-requ...@listas.unizar.es
 Sent: Sunday, April 14, 2013 12:00 AM
 To: fis@listas.unizar.es
 Subject: fis Digest, Vol 570, Issue 2

 Send fis mailing list submissions to
   fis@listas.unizar.es

 To subscribe or unsubscribe via the World Wide Web, visit

https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis
 or, via email, send a message with subject or body 'help'
to
   fis-requ...@listas.unizar.es

 You can reach the person managing the list at
   fis-ow...@listas.unizar.es

 When replying, please edit your Subject line so it is more

 specific than Re: Contents of fis digest...


 Today's Topics:

1. Re: FIS News (Moscow 2013) (joe.bren...@bluewin.ch)
2. Re: FIS News (Moscow 2013) (PEDRO CLEMENTE MARIJUAN
FERNANDEZ)
3. Re: FIS News (Moscow 2013) (Gyorgy Darvas)




--

 Message: 1
 Date: Fri, 12 Apr 2013 17:11:58 + (GMT+00:00)
 From: joe.bren...@bluewin.ch joe.bren...@bluewin.ch
 Subject: Re: [Fis] FIS News (Moscow 2013)
 To: pcmarijuan.i...@aragon.es, fis@listas.unizar.es
 Message-ID:
15776686.90091365786718476.javamail.webm...@bluewin.ch
 Content-Type: text/plain; charset=utf-8





 Dear Pedro,

 Glad to hear from you. Your silence was, of course, expressive,
 containing much information . . .

 Now all of us will be waiting impatiently to learn about
the
 the new, exciting themes that were discussed at the Milton

 Keynes Conference.

 Best wishes,

 Joseph

 Original message
 From: pcmarijuan.i...@aragon.es
 Date: 12.04.2013 11:02
 To: fis@listas.unizar.es
 Subject: [Fis] FIS News (Moscow 2013)

 Dear FIS Friends,

 Apologies for my long silence. As I have already said
several
 times, my science management duties are killing not only
my
 time but also my nerve (well, not completely!). Imagine
what
 is happening with the financing and organization of
Spanish
 science these years...

 Anyhow, a couple of good news about our common Information

 Science endeavor. First, there has been an excellent conference in
 Milton Keynes, organized by the Open University, about Information
 (the difference that makes
a
 difference). Quite exciting discussions on our most dear themes, and
 some new ones that we have rarely addressed
here.
 The organizers, a very active team indeed, are cordially invited to
 lead a discussion session in our FIS list to continue with the
 conceptual explorations addressed in
their
 conference.

 And the second news is about an imminent FIS CONFERENCE, MOSCOW 2013,
 the Sixth FIS, and the 1st of the ISIS organization. It will be held
 this May, from 21 to 24 in Moscow. This time the Russian organizers
 have followed a singular procedure, a relatively closed conference
centered
 in the diffusion of information science in the Russian scientific
 community.  At the time being, to my knowledge
(I
 could not follow very well the process), only the members
of
 the ISIS board have been enlisted as foreign participants.

 But given that there will be several absences, interested
FIS
 parties might ask about their possible participation.
 The schedule is too tight for travels, visas etc, and
again I
 have to apologize for not having posted this info before (info glut!).
 In any case, am sure that our colleague Konstantin  Kolin (
 koli...@mail.ru ), leading organizer,
and
 member of the Russian Academy of Science, will be happy to

 respond to interested parties and help them to accelerate
the process.

 Best wishes to all

 ---Pedro

 -
 Pedro C. Marijuán
 Grupo de Bioinformación / Bioinformation Group 

Re: [Fis] [Fwd: SV: Science, Philosophy and Information. An Alternative Relation] S.Brier

2013-02-11 Thread John Collier
I guess I am at a loss to see them as separate 
discourses.  Especially in the domain of Information.

Contrary to what Stan said, I think that many of 
the major advances in science from Statistical 
Mechanics, to Relativity Theory to Quantum 
Mechanics did and continue to have a major 
philosophical component, and professional 
philosophers work with scientists directly in 
each of these fields, It used to be true in 
Computer Science, but is less so now. In 
Cognitive Science there is currently virtually 
no separation. In Biology there are many 
philosophers who work with biologists, and vice 
versa, but far too many who do not.

I think that technology is much more linked to 
industry than it is to the sciences above.

John

At 06:03 PM 2013/02/11, Loet Leydesdorff wrote:
How does one measure the synergy among three discourses?
That is an interesting question within information theory (as part of both
science and philosophy).

Best,
Loet


-Original Message-
From: fis-boun...@listas.unizar.es [mailto:fis-boun...@listas.unizar.es] On
Behalf Of Pedro C. Marijuan
Sent: Monday, February 11, 2013 4:29 PM
To: fis@listas.unizar.es
Subject: [Fis] [Fwd: SV: Science, Philosophy and Information. An Alternative
Relation] S.Brier

 Original Message 
Subject:SV: [Fis] Science, Philosophy and Information. An
Alternative
Relation
Date:   Thu, 07 Feb 2013 20:32:04 +0100
From:   Søren Brier sb@cbs.dk
To: joe.bren...@bluewin.ch joe.bren...@bluewin.ch, Pedro Clemente
Marijuan Fernandez pcmarijuan.i...@aragon.es, fis@listas.unizar.es
fis@listas.unizar.es, John Collier colli...@ukzn.ac.za
References: 6043399.89641360255002322.javamail.webm...@bluewin.ch



Dear Joseph



I go for each of the three nominally independent disciplines are not
independent, but that each provides a dynamic ontological and
epistemological link to the other two, more or less strong or actual
depending on the extent to which one wishes to emphasize certain aspects
of knowledge. Science without philosophy is stupid but philosophy
without science is blind. I am for a synergetic interaction.





Best wishes



   Søren Brier



Professor in the semiotics of information, cognition and commmunication
science,

department of International Business Communication, Copenhagen Business
School,

Dalgas Have 15, 2000 Frederiksberg, Denmark,









*From:* fis-boun...@listas.unizar.es
[mailto:fis-boun...@listas.unizar.es] *On behalf of* joe.bren...@bluewin.ch
*Sent:* 7 February 2013 17:37
*To:* Pedro Clemente Marijuan Fernandez; fis@listas.unizar.es; John Collier
*Subject:* [Fis] Science, Philosophy and Information. An Alternative Relation



Dear FIS Colleagues,

The formation of the the Society for the Philosophy of Information at
the University of Hertfordshire is announced in the link in John's note.
It includes the announcement and Call for Papers of the International
Conference on the Philosophy of Information to be held in Xi'An, China
in October, 2013, sponsored by both the above Society, led by Professor
Luciano Floridi and the Institute for the Philosophy of Information in
Xi'An under the direction of Professor Wu Kun.

This increased activity in the area of the philosophy of information
(another major Workshop is planned this Spring) raises the issue of the
relation between the science and philosophy of information as well as of
the philosophy of science. I am aware of and agree with the position
expressed by Pedro that information science in the FIS framework should
emphasize scientific research in the sense of knowledge that is
quantifiable and/or provable. However, I do not believe that either he
or others of you intend to exclude rigorous qualitative knowledge,
especially as it concerns the dual nature of information.

The ubiquitous presence of information in all disciplines, as emphasized
by Wu, suggests an alternative relation linking philosophy, science and
information that is NOT one of simple hierarchical inclusion or
possession (of). One possibility is to say that it is information that
links philosophy and science, but this formulation perhaps fails to
recognize the general properties of the latter two.

Another possibility is to say that each of the three nominally
independent disciplines are not independent, but that each provides a
dynamic ontological and epistemological link to the other two, more or
less strong or actual depending on the extent to which one wishes to
emphasize certain aspects of knowledge.

I look forward to your comments regarding the pros and cons of such a
conception. Thank you.

Best wishes,

Joseph

Original message
From: colli...@ukzn.ac.za mailto:colli...@ukzn.ac.za
Date: 04.02.2013 18:57
To: fisfis@listas.unizar.es mailto:fis@listas.unizar.es
Subject: [Fis] Society for the Philosophy of Information

http://www.socphilinfo.org/


--
Professor John Collier
colli...@ukzn.ac.za mailto:colli

[Fis] Society for the Philosophy of Information

2013-02-04 Thread John Collier
http://www.socphilinfo.org/


--
Professor John Collier colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://web.ncf.ca/collier



Re: [Fis] John Collier's Large Correction. And Information?

2012-12-04 Thread John Collier


Dear folks,
I have been rather busy recently, but I replied to Joseph's comments
privately. I now post my responses to the list. Joseph's remarks were of
course first published there.
Cheers,
John

Hi Joseph,
Sorry to take so long to reply, but I have been pretty busy. I finally
got my South African taxes together, and when I submitted them they
immediately said they were auditing me. So I had to put together all my
documentation and send it in. As a lot of it was in Durban this was not
easy.

At 05:50 PM 2012/11/26, you wrote:
John,

Have you seen this note? I don't know when Pedro will get
around to circulating it.

It is your comment that I would welcome most.

Cheers,

Joseph


- Original Message -
From: Joseph Brenner
To: John Collier; fis
Sent: Friday, November 23, 2012 10:13 AM
Subject: John Collier's Large Correction. And Information?
Dear John, Dear Colleagues,



In his detailed note of
November 19, John made a series of points which, taken together, add up
to a good foundation for a non-computationalist view of real processes. I
have identified my glosses by initials.



1. Specific reference is made
to non-computable processes in biological development, with implications
for evolution and phylogeny.



2. Reaction diffusion systems
cannot be solved because dissipation is an essential part of their
dynamics.

JEB: This opens the door to
the existence of systems in which some process other than diffusion is
also at work.
I am not quite sure what you mean here, but the result can be larger
scale patterns or structures. For example, in development of animals the
formation of the gastrula in the blastula (the gut from the round oocyte)
is a reaction-diffusion process. This leads to differentiation that can
lead to further differentiation, and so on.
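To make the pattern-forming mechanism concrete, here is a minimal numerical sketch (my illustration, not from the original note) of a two-species Gray-Scott reaction-diffusion system in Python; the parameter values are conventional pattern-forming choices, not anything specific to gastrulation:

    # Minimal sketch of a Gray-Scott reaction-diffusion system (illustrative only).
    # A small local perturbation of a uniform field grows into larger-scale
    # spatial structure, the kind of differentiation discussed above.
    import numpy as np

    n = 128
    Du, Dv, F, k = 0.16, 0.08, 0.035, 0.060   # diffusion rates, feed and kill rates
    u = np.ones((n, n))
    v = np.zeros((n, n))
    u[n//2-5:n//2+5, n//2-5:n//2+5] = 0.50    # seed a perturbation in the centre
    v[n//2-5:n//2+5, n//2-5:n//2+5] = 0.25

    def laplacian(a):
        # discrete Laplacian with periodic boundaries
        return (np.roll(a, 1, 0) + np.roll(a, -1, 0) +
                np.roll(a, 1, 1) + np.roll(a, -1, 1) - 4 * a)

    for _ in range(5000):                     # explicit Euler integration
        uvv = u * v * v
        u += Du * laplacian(u) - uvv + F * (1 - u)
        v += Dv * laplacian(v) + uvv - (F + k) * v

    print("u now varies between", u.min(), "and", u.max())  # structure has formed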




3. Analytical models that are
not synthetic are non-computable for infinite possible data.

JEB: I suggest also for
transfinite data, data that are infinite to all intents and
purposes. I agree that synthetic models may (or may not) be reducible to
their data, but only if their mereology is classical, which it seems to
be in John’s note. Rosen’s distinction is itself too
classical.
I am not sure what you mean by classical here, Joseph. They are the only
logically possible types of models if we take a model to be a logical
structure. Synthetic models are reducible from their definition;
analytical models are not. Behaviorist models are synthetic, for example.
Of course they don't work in most cases (they do work in a restricted
range of behaviors). Computational models of mind can have the same
reducibility, but need not be if the computations are only partial
recursive (which is equivalent to saying that they are not algorithms in
the usual sense of Knuth -- that is always terminating).
If I am missing something here, please expand.
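For what it is worth, the distinction being drawn here can be made concrete in a few lines of Python (a sketch of mine, with made-up example predicates): an algorithm in Knuth's sense always terminates, whereas a partial recursive function defined by unbounded search terminates only if a witness exists.

    # Sketch: total (always terminating) function vs partial recursive function.
    def digit_sum(n: int) -> int:
        # An algorithm in Knuth's sense: terminates for every input.
        return sum(int(d) for d in str(abs(n)))

    def mu_search(predicate):
        # Unbounded minimization (the mu-operator): returns the least k with
        # predicate(k) true, but never returns if no such k exists.
        k = 0
        while not predicate(k):
            k += 1
        return k

    print(digit_sum(1234))                    # -> 10
    print(mu_search(lambda k: k * k > 50))    # -> 8 (a witness exists, so it halts)
    # mu_search(lambda k: False) would loop forever: partial, not an algorithm.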



4. Irrationality is not a
property of a machine, and as such would indeed be unimaginable and not
understandable. 

JEB: Irrationality in human
behavior is complex but not illogical in my terms. We are all
rational and irrational to different extents at different
times.
That would be a consequence of partial recursive functions being a correct analytical model of the processes, I would think. There would be
a range of predictability (and hence controllability) but it would be
limited.



5. Again “if my body is a
machine”; “if the world is a quantum computer”, but I (John) worry about
decoherence.

JEB: The hedging and the
restrictions placed by Lloyd and Tegmark on their theories of the
universe amount to intellectual dishonesty. By the time they are applied,
you have a computational model all right, but it is a caricature of the
real world.
I think the problem arises because of the failure to recognize that
decoherence is a form of entropy production, and to further recognize
that this is part of the dynamical process that can involve the system
working on itself, particularly on its boundary conditions (constraints).
This is not a typically computational process (though it can fit the
weaker notion of computation as a recursively enumerable process, but not
necessarily recursive). Recursive functions are all computable. A
recursive function is one whose range and its complement are both
recursively enumerable, so you can tell of any value whether or not it is
in the range of the function. It might seem that this approach ignores
the possibility that some processes are many-many relations rather than
many-one, but I think this would violate some basic conditions on
causality that don't have any empirical counter-examples that I know
about. I am willing to allow that there might be non-causal connection
principles, but I would want empirical evidence of some sort for those as
well.
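A small illustration of the recursive/recursively enumerable point (my toy example, using trivially enumerable sets): when both a set and its complement can be enumerated, membership becomes decidable by dovetailing the two enumerations.

    # Sketch: if a set S and its complement are both recursively enumerable,
    # membership in S is decidable by interleaving the two enumerations.
    # The enumerators below are toy stand-ins (evens and odds).
    from itertools import count

    def enum_S():                  # enumerates S
        for n in count():
            yield 2 * n

    def enum_complement():         # enumerates the complement of S
        for n in count():
            yield 2 * n + 1

    def decide(x: int) -> bool:
        s, c = enum_S(), enum_complement()
        while True:                # x must eventually appear on one side
            if next(s) == x:
                return True
            if next(c) == x:
                return False

    print(decide(10), decide(7))   # -> True False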



6. To avoid reductionism in
reality, as opposed to it in logic and mathematics, I think we need the
additional condition of dissipation.

JEB: The point of my
logic in reality is that it is non-reductionist. It gives

Re: [Fis] FW: The Information Flow

2012-11-19 Thread John Collier
 harmonic of the
Moon's rotation to revolution speed, and the 3-2 harmonic of Mercury's
rotation to revolution speed. The dynamical equations in these cases are
not solvable.
So to Gordana's argument, I would add dissipation (energy coming into the
system from within would work as well, but we don't know of this sort of
case with any certainty -- it would violate conservation of energy, or
the non-decrease of entropy, or both). In other words, the system has to
a) produce entropy within, and b) dissipate it outwards (as heat or other
form of lesser order).


If Gordana writes "models" then there is a world that is modeled and its logic and rules can be quite different. This is the world whose processes I am trying to describe. A computational theory of models is fine.
Similarly, when Bruno writes: "But 'I am not Turing emulable' would be as hypothetical, and, in my opinion much more speculative, especially with the weak form of computationalism I use as a working hypothesis", why not let the two flowers bloom?

I am open to that idea, but I have never seen a non-comp theory, and for
good reason: indeed, to assume non comp, you have to diagonalize against
the partial computable functions. Strictly speaking this does not work,
so you have to assume something irrational in the picture, and why not,
but this seems premature to me, given the range and power of the comp
hypothesis, when well understood and not reduced to its total functions
parts. 
I disagree here as well. I won't go into this in depth here, as I have to
do my taxes urgently, and this sort of thing takes up a lot of time for
me. Robert Rosen distinguishes between what he calls synthetic and
analytic models. Synthetic models can be broken down into parts (like
inputs and outputs, or the dynamics of pairwise parts) and summed to get
a total model of the system (they can be reduced logically to their data,
if we have enough of it). Not all analytic models are synthetic -- given
data the number of possible synthetic models is in the set of logical
sums of the data sets. The possible models are the in crossproduct
(logical product) of the data set with itself. This is bigger than the
first set, so their are more possible analytic models than synthetic
models in general. Analytic models that are not synthetic are
noncomputable for infinite possible data, but I won't prove that here; it
is either obvious or you need to learn more logic. (Rosen's account isn't
much better.)
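On one simplified reading of the counting argument (my gloss: take the 'logical sums' to be subsets of the data set and the crossproduct to give binary relations on it), the asymmetry is easy to see numerically:

    # Toy count: for n data points there are 2**n subsets ("synthetic" combinations)
    # but 2**(n*n) binary relations on the crossproduct ("analytic" combinations),
    # so the second class grows far faster. A gloss on the argument, not a proof.
    for n in (2, 3, 5, 10):
        subsets = 2 ** n
        relations = 2 ** (n * n)
        print(f"n={n:2d}  subsets={subsets:6d}  relations={relations}")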

Assuming something irrational at the start can only hide the irrationality which somehow already exists for the machine, and deprive comp of its explanatory power in some arbitrary way. If comp is refuted, then we will have a way to localize where something irrational, and not comp derivable, occurs.

I am not sure what you mean by irrational, Gordana. It could mean
effectively noncomputable in the Turing sense, in which case it would be
effectively random (within constraints on the whole model). Or it could
mean something else, in which case it is beyond my current imagining
abilities to understand. I don't see analytic but not synthetic models as
irrational, though some data would be unpredictable, and not fully
understandable.


All of his remarks
apply to that part of the world to which his weak form of
computationalism applies, but not to the entire
world.
But what is the world? Computationalism can answer that, and can guarantee that the world (whatever it is) is NOT computable or emulable by a computer. Comp is a hypothesis on me (or you), not on the world, which, assuming comp, is not a computable object, if it is an object at all. A summing-up slogan: if my body is a machine, then consciousness and matter are not Turing emulable. Or: if my body is a machine, my soul cannot be a machine.
Seth Lloyd argues rather convincingly that the world is a quantum
computer. I worry about the role of decoherence in making it really like
a computer nonetheless. But Newman's cases are not Turing emulable if you mean by that a halting machine (a Knuth algorithm); if you mean a programme in general (or a Rosen analytic model), the issue is different.

If someone wants me to explain more about why the Church thesis protects us against reductionism, and why there is no total universal machine, please just ask. Keep also in mind that we can send only two posts per week.
As I have tried to argue above, to avoid reductionism in reality, as opposed to in logic and mathematics, I think we need the additional condition of dissipation (what I call non-Hamiltonian mechanics elsewhere -- the usual condition of conservation breaks down due to the loss of free energy to the system).
Best,
John





Professor John Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://web.ncf.ca/collier





[Fis] Some of you will find this of interest

2012-10-27 Thread John Collier



http://www.edge.org/conversation/constructor-theory
CONSTRUCTOR THEORY: A Conversation With David Deutsch [10.22.12]


There's a notorious problem with defining information within physics,
namely that on the one hand information is purely abstract, and the
original theory of computation as developed by Alan Turing and others
regarded computers and the information they manipulate purely abstractly
as mathematical objects. Many mathematicians to this day don't realize
that information is physical and that there is no such thing as an
abstract computer. Only a physical object can compute things.
~~
I think it's important to regard science not as an enterprise for the
purpose of making predictions, but as an enterprise for the purpose of
discovering what the world is really like, what is really there, how it
behaves and why.

DAVID DEUTSCH is a Physicist at the University of Oxford. His research in
quantum physics has been influential and highly acclaimed. His papers on
quantum computation laid the foundations for that field, breaking new
ground in the theory of computation as well as physics, and have
triggered an explosion of research efforts worldwide. He is the recipient
of the $100,000 Edge of Computation Prize, and he is the author of
The Beginning of Infinity and The Fabric of Reality.





Professor John Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://web.ncf.ca/collier





Re: [Fis] The Information Flow

2012-10-16 Thread John Collier


Good point, Stan. I think that it can be used to create a notion of
'knowing that', but it will require at least another level. I review some
ways to do this in

Explaining Biological Functionality: Is Control Theory Enough? South
African Journal of Philosophy. 2011, 30(4): 53-62. The main references
are more directly related to 'knowing that', but I would see 'knowing
that' as fulfilling a particular functional role, and requiring
something like explicit representations, both of which I deal with in the
paper. I can see that there is a further paper to be written that takes
the step to the specific case of 'knowing that'.
Cheers,
John
At 03:38 PM 2012/10/15, Stanley N Salthe wrote:
On that curious definition
of knowledge, it looks like 'knowing how' rather than 'knowing
that'. 
STAN 
On Mon, Oct 15, 2012 at 11:56 AM, Pedro C. Marijuan pcmarijuan.i...@aragon.es wrote:

Dear FIS Colleagues,

Thanks to Zhao Chuan for the Computer Poem/Song. It is a soft way to retake our discussions. These weeks there have been a couple of important achievements in the bio-information field. On the one side, the first 'complete' model of a prokaryotic cell (A Whole-Cell Computational Model Predicts Phenotype from Genotype, by Karr et al., Cell, 150, 389-401, 2012). On the other, there was the report of another 'complete' scheme, that of the C. elegans nervous system, now at the level of individual synaptic contacts, which was able to explain the mating behavior of the worm (The Connectome of a Decision-Making Neural Network, by Jarrell et al., Science, 337, 437-444, 2012). It contained several references to the information flow through interneurons and sensorimotor circuits, and a very curious definition of knowledge (as the set of activity weights in an adjacency matrix of a neural network, upon which the network's input-output function in part depends...).

Both papers are very interesting, relatively consistent with each other, and I think both represent symbolic milestones in the bio-information field. The point on information flows left me thinking on the larger perspective beyond single information items that we rarely focus on. Actually the first Shannonian information metaphor was about sources and channels --wasn't it? Particularly thinking on social information matters, how many aspects of contemporary life relate to the maintenance of the information flows intertwining and directing the economic flows. No doubt that the forces of communication have definitely won the upper hand upon the forces of production.

Somehow, Zhao Chuan's poem is but a celebration of the central role that computers have come to play in the gigantic information flows of our time.

best wishes

--Pedro

--
-
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Centro de Investigación Biomédica de Aragón (CIBA)
Avda. San Juan Bosco, 13, planta X
50009 Zaragoza, Spain
Tfno. +34 976 71 3526 ( 6818)
pcmarijuan.i...@aragon.es
http://sites.google.com/site/pedrocmarijuan/
-








Professor John Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://web.ncf.ca/collier





Re: [Fis] Good (clear) article on information and physics

2012-06-05 Thread John Collier


 
 
Professor John Collier  
Philosophy, University of KwaZulu-Natal
Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292
F: +27 (31) 260 3031
email: colli...@ukzn.ac.za

On 2012/06/01 at 03:30 PM, in message 8031ab93-e68f-4760-bba9-a1b18bfee...@ulb.ac.be, Bruno Marchal marc...@ulb.ac.be wrote:




but offhand it seems to me to depend on a sort of idealism that I do not accept.




It does not. It does rely on the Church thesis, which relies on arithmetical realism, that is, the idea that elementary arithmetical truths are NOT a creation of the mind, which is a form of anti-idealism.

 
Ok, we are thinking in similar ways. Thanks for the link to your paper.
 
John
 
 



[Fis] Good (clear) article on information and physics

2012-06-01 Thread John Collier
http://physicsworld.com/cws/article/print/2012/may/31/the-quantum-game-of-life

Sample excerpt:
Hopes that digital physics might be resurrected in some form rose in 
the early 1980s, when Richard Feynman proposed that the blatant gap 
between the power and information content of quantum theory and that 
of classical computers might be bridged by a new type of computer. 
His idea was born out of frustration at seeing classical computers 
take weeks to simulate quantum-physics experiments that happen faster 
than a blink of an eye. Intuitively, he felt that the job of 
simulating quantum systems could be done better by a computer that 
was itself a quantum system.

Cheers,
John



--
Professor John Collier colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031



Re: [Fis] Good (clear) article on information and physics

2012-06-01 Thread John Collier
Hi all. In order to access the article I am pretty sure you need to establish an account on Physics World. It is free. I did it so long ago I had forgotten.
 
Bruno, I am not sure exactly what you mean by the existence of the first person indeterminacy in arithmetic, but offhand it seems to me to depend on a sort of idealism that I do not accept.
 
Incidentally, quantum decoherence is best seen as a sort of thermodynamic 
effect. There are quantum measurements that can be reversed. I can give some 
references if anyone wants.
 
John


 
 
Professor John Collier  
Philosophy, University of KwaZulu-Natal
Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292
F: +27 (31) 260 3031
email: colli...@ukzn.ac.za

On 2012/06/01 at 02:45 PM, in message cc50a53f-b07a-4c24-a602-d02c7c891...@ulb.ac.be, Bruno Marchal marc...@ulb.ac.be wrote:

Hi John,

On 01 Jun 2012, at 13:02, John Collier wrote:

 http://physicsworld.com/cws/article/print/2012/may/31/the-quantum-game-of-life

 Sample excerpt:
 Hopes that digital physics might be resurrected in some form rose in
 the early 1980s, when Richard Feynman proposed that the blatant gap
 between the power and information content of quantum theory and that
 of classical computers might be bridged by a new type of computer.
 His idea was born out of frustration at seeing classical computers
 take weeks to simulate quantum-physics experiments that happen faster
 than a blink of an eye. Intuitively, he felt that the job of
 simulating quantum systems could be done better by a computer that
 was itself a quantum system.

He was of course right on that. Actually I don't succeed in getting  
the paper from the link above.

About quantum information, here is an interesting talk by Ron Garrett,  
quite coherent with the (classical) computationalist theory of mind,  
on quantum information, seen as information theory on the complex  
numbers:

http://www.youtube.com/watch?v=dEaecUuEqfc

Personally I am not (yet?) entirely sure that quantum information is just classical information on the complex numbers; I think this is partially true, and theorems like Gleason's theorem make me believe that this is very plausible. Ron Garrett gives a pretty picture of Everett QM (QM without collapse). His account of measurement is rather illuminating (close to the work of Adami and Cerf).

Ron Garrett is information theoretic minded, and, with respect to  
computationalism (comp), has a coherent view of physics. Of course he  
does not seem aware of the necessity of such a view once we postulate  
comp, and the fact that this necessitates taking all computations (the ones done below our classical comp substitution level) into account (not just the quantum ones), and justifying the quantum interferences from the first person perspective of any self-justifying universal number.

Comp shows that the qubit --- bit road (decoherence) is two sided.

Technically, due to diagonalization used to make the self-reference,  
you get the split between truth and justifiable, which provides a tool  
to distinguish the qualia and the quanta, as different but related  
mode of information, on the inverse road bit -- qubit.

I think Ron Garrett explains (very briefly but rightly) the qubit - bit justification. Comp provides a reverse of that justification, and this is doubled by the communicable/non-communicable (G/G*) splitting: the bit - quantum-bit, and the bit - quale-bit*, with the explanation of the fact that the quale bit* can't be quantified nor described (provably so in the ideal case of an arithmetically self-referentially correct machine).

Comp forces us, just to remain coherent, to extend Everett's way of embedding the observer into the physical wave to his embedding in all arithmetical relations, by first person indeterminacy, with the advantage of giving a fundamental role to the (universal) person points of view, and hopefully so, to justify QM, or refute comp, or weaken or constrain it.

To be sure, computationalism is incompatible with digital physics. If *we* are machines (classical or quantum) then neither the fundamental reality, nor its physical part, can be Turing emulable, even though a quantum machine can be Turing emulated. This is more or less a direct consequence of the existence of the first person indeterminacy in arithmetic.

Bruno Marchal

http://iridia.ulb.ac.be/~marchal/





Re: [Fis] Stephen Wolfram discussing his ANKS in Reddit this Monday

2012-05-16 Thread John Collier
Dear folks,
 
I think there is a bit of confusion here due to an ambiguity in the idea of 
computation. A function is computable for a given input only if it has an 
equivalent Turing machine that halts. A function is a computation if it is 
representable by a Turing machine. (I assume the Church-Turing thesis in both 
cases. However there are lots of Turing machines that do not halt (more than 
that do halt). So it is quite possible for a function that is noncomputable to 
be representable by a Turing machine. Wolfram, for example, is fairly clear on 
this. If you know Rosen's work, the computable cases are what he calls 
synthetic models. The noncomputable cases are what he calls analytic but not 
synthetic models. Krivine showed a long time ago that Newtonian mechanics 
allows noncomputable functions that are nontrivial. This is not surprising, 
really, since it is possible to model any Turing machine with a mechanical 
(colliding spheres, say) system. Interestingly, Turing left some work on 
computer models that are not Turing computable.
 
In any case, the natural computations (to allow Gordana her sense of this idea) 
need not be computable. These cases are nonreducible in the sense of not 
computable from boundary conditions and the combinatorics of lower level 
interactions.  See my A dynamical account of emergence ( 
http://web.ncf.ca/collier/papers/A%20Dynamical%20Account%20of%20Emergence.pdf ) 
(Cybernetics and Human Knowing, 15, no 3-4 2008: 75-100), 
http://web.ncf.ca/collier/papers/A%20Dynamical%20Account%20of%20Emergence.pdf 
for some more detail on the reduction and boundary condtions issue. 
Incidentally, to the best of my knowledge it was Conrad, Michael and Koichiro 
Matsuno (1990). The boundary condition paradox: a limit to the university of 
differential equations. Applied Mathematics and Computation. 37: 67-74 that 
first analyzed the boundary system problem. For some even more rigourous 
detail, also C.A. Hooker's chapter on emergence in  C. A. Hooker, Philosophy of 
Complex Systems. Handbook of the Philosophy of Science, Volume 10. 20011: 
Elsevier pp. 195ff.
 
Cheers,
John


 
 
Professor John Collier  
Philosophy, University of KwaZulu-Natal
Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292
F: +27 (31) 260 3031
email: colli...@ukzn.ac.za

On 2012/05/15 at 03:35 PM, in message 20120515093552.322364lbu120x...@www.cbl.umces.edu, Robert Ulanowicz u...@umces.edu wrote:

Quoting Gordana Dodig-Crnkovic gordana.dodig-crnko...@mdh.se:


 2.   Whatever changes in the states of the physical world there  
 are, we understand them as computation.

Dear Gordana,

I'm not sure I agree here. For much of what transpires in nature (not  
just in the living realm), the metaphor of the dialectic seems more  
appropriate than the computational. As you are probably aware,  
dialectics are not computable, mainly because their boundary value  
statements are combinatorically intractable (sensu Kauffman).

It is important to note that evolution (which, as Chaisson contends,  
applies as well to the history of the cosmos [and even the symmetrical  
laws of force]) is driven by contingencies, not by laws. Laws are  
necessary and they enable, but they cannot entail.

Regards,
Bob



Re: [Fis] [Fwd: Re: Physics of computing]--Plamen S.

2012-04-19 Thread John Collier
Thanks Steven. I think this makes our source of disagreement quite
clear. Unlike you, I take a naturalistic and realist approach to both
information and knowledge, and think that their extent and relations are
an empirical matter on which we can be radically incorrect.
 
I agree that future productivity is hard to assess (I wrote my
dissertation on Kuhn, mostly in agreement, and also with Lakatos). Past
productivity is another issue, though, and that is all I was claiming.
Basically my perspective is pretty similar to Bill Wimsatt's in
Re-Engineering Philosophy for Limited Beings: Piecewise
Approximations to Reality (Harvard, 2007)
http://www.amazon.com/Re-Engineering-Philosophy-Limited-Beings-Approximations/dp/0674015452.
He and I have had similar views since we started in Philosophy, with me
mostly following. I recall back in 1977 when I declared that Philosophy
was basically an engineering problem that I did not get a very good
response. My previous career was in geotechnical engineering using
innovative methods that I developed by taking a multifaceted approach.
Agreement in approaches is a good indication of reality, and failure
means you have made something up that isn't right. Frankly, blocking
potential consiliences a priori I find revolting.
 
John


 
 
Professor John Collier  
Philosophy, University of KwaZulu-Natal
Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292
F: +27 (31) 260 3031
email: colli...@ukzn.ac.za

On 2012/04/18 at 09:39 PM, in message d7be50e5-e64d-4df3-be36-c75fc8106...@iase.us, Steven Ericsson-Zenith ste...@iase.us wrote:

Dear John,

Since Locke, established usage and appeals to authority have rarely been criteria of right definition or sufficient to deny refinement.
To state one's own usage clearly, in order to disclose its flaws and
intention, seems hardly a cause for criticism, unless that criticism be
simply to state your own usage with the same or greater clarity. I do
not believe we can speak for the usage of others, nor can we appeal to
dictionaries of any kind. It is our individual responsibility to take
charge of our definitions (a position, for example, that I call
definitionism).

I am with you in believing that to assert real meaning, independent
of our own usage, of any term is silly, indeed to do so is
unscientific.

For me, terms like information and knowledge are simply ways of speaking about the world; they are notions that we force upon the world, not necessary distinctions forced upon us by the world. Without an epistemology of this kind in the development of ideas it's hard to project whether the usage of any given term will be productive.


Of course, we must allow for the vagaries of fortune and perception,
the road to clarity is paved with many corrections.

With respect,
Steven


--
Dr. Steven Ericsson-Zenith
Institute for Advanced Science  Engineering
http://iase.info







On Apr 18, 2012, at 4:54 AM, John Collier wrote:

 Steven,
  
 You are free to use information as you wish; however, physicists,
especially cosmologists, have been using it in ways that involve meaning
in no direct way at all. They do computations on it, and explain
cosmological and astronomical phenomena in terms in which it (or an
equivalent) is essential. See, for example, Smolin, Three Roads to
Quantum Gravity, earlier work by Wheeler and Gell Mann, more recent work
by Seth Lloyd. It is an established usage.
  
 The idea of talking in terms of the real meaning of x, where x is
some term is really a bit silly. The important thing is whether some
idea for which x is a sign can be used productively and scientifically.
  
 John
 
  
 Professor John Collier  
 Philosophy, University of KwaZulu-Natal
 Durban 4041 South Africa
 T: +27 (31) 260 3248 / 260 2292
 F: +27 (31) 260 3031
email: colli...@ukzn.ac.za

On 2012/03/18 at 07:24 PM, in message 4ad9379c-fb4a-40f8-826e-52f5978ff...@iase.us, Steven Ericsson-Zenith ste...@iase.us wrote:
 
 I'm with Bob on this to a point. 
 
 Too often I see people giving information an existential status that
it is not due. As you will recall, in my terms, information is simply a
way of speaking about that which identifies cause and adds to knowledge, and knowledge is simply a way of speaking about that which
determines subsequent action. 
 
 However, this does allow me to identify a rock as the source of
information and to speak about its behavior in terms of its knowledge,
that about its structure and dynamics that determine its subsequent
action.
 
 I do not use semeiosis in the universal way that I use knowledge.
I could see it being so used only if it excludes sensory operation,
since I argue for a role that sense plays in the behavior of living
systems, and I include that role as distinguishing semeiosis; the term for me refers only to the sign processing of living systems.
 
 With respect,
 Steven
 
 
 --
 Dr. Steven Ericsson-Zenith
 Institute for Advanced Science  Engineering
 http://iase.info
 
 
 
 
 
 
 
 On Mar 18, 2012

Re: [Fis] Physics of Computing

2012-04-10 Thread John Collier
Hi Gavin and others. 
 
Try "Information in biological systems" (Handbook of Philosophy of Science, vol. 8, Philosophy of Information, http://www.elsevier.com/wps/find/bookdescription.cws_home/716648/description#description, 2008, Chapter 5f): http://web.ncf.ca/collier/papers/Information%20in%20Biological%20Systems.pdf. It isn't complete (you need some of my other papers to get the quantity of information innate, transmitted (causally) and received, as well as its effects).

"Information, causation and computation" (in Information and Computation: Essays on Scientific and Philosophical Understanding of Foundations of Information and Computation, ed. by Gordana Dodig Crnkovic and Mark Burgin, World Scientific, http://astore.amazon.co.uk/books-books-21/detail/9814295477): http://web.ncf.ca/collier/papers/CollierJohn%20formatted.pdf

"Causation is the Transfer of Information" (1999): http://web.ncf.ca/collier/papers/causinf.pdf

"Complexly Organised Dynamical Systems", with C.A. Hooker (1999): http://web.ncf.ca/collier/papers/Cods.pdf

"Hierarchical dynamical information systems with a focus on biology" (Entropy 2003): http://www.mdpi.org/entropy/papers/e5020100.pdf

There are others that might be relevant on my web page: http://web.ncf.ca/collier/papers.html
 
John
 
 
 

 
 
Professor John Collier  
Philosophy, University of KwaZulu-Natal
Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292
F: +27 (31) 260 3031
email: colli...@ukzn.ac.za

On 2012/03/16 at 11:14 PM, in message 1331932479.81758.yahoomail...@web96106.mail.aue.yahoo.com, Gavin Ritz garr...@xtra.co.nz wrote:

Hi FISers
Can anyone show me a calculus for Information relating to biological systems?

And if so show me the relationship with conceptual mathematics?

Regards
Gavin




Dear FISers:
 
Pedro and Plamen raise good and welcomed points regarding the nature of 
physics, information, and biology. Although I believe in a strong relationship 
between information and physics in biology, there are striking examples where 
direct correspondences between information, physics, and biology seem to 
depart. Scientists are only beginning to tease out these discrepancies, which will undoubtedly give us a better understanding of information.
 
For example, in the study of cognition by A. Khrennikov and colleagues and J. 
Busemyer and colleagues, decisional processes may conform to quantum statistics 
and computation without necessarily being mediated by quantum mechanical 
phenomena at a biological level of description. I found this to be true in 
ciliates as well, where social strategy search speeds and decision rates may 
produce quantum computational phases that obey quantum statistics. In such 
cases, a changing classical diffusion term of response regulator 
reaction-diffusion parsimoniously accounts for the transition from classical to 
quantum information processing. Thus, there is no direct correspondence between 
quantum physicochemistry and quantum computation. Because the particular 
reaction-diffusion biochemistry is not unique to ciliates (i.e., the same 
phenomenon is observed in plants, animals, and possibly bacteria), this
incongruity may be widespread across life.
 
Best regards,
 
Kevin Clark



[Fis] Physics of computing

2012-03-14 Thread John Collier


Dear folks,
This is a further article demonstrating that information is physical. It
is nice to be getting some empirical results.

http://www.nature.com/news/the-unavoidable-cost-of-computation-revealed-1.10186?WT.ec_id=NEWS-20120313

The previous article, which I mentioned on this list, is Toyabe, S., Sagawa, T., Ueda, M., Muneyuki, E. & Sano, M., Nature Phys. 6, 988–992 (2010). It demonstrated that information could be converted to energy, which I consider a no-brainer on first principles, but many people have been sceptical.
The new article is Bérut, A. et al. Nature 483, 187–189 (2012).
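For scale, a back-of-the-envelope figure (my addition, not from either paper): Landauer's bound puts the minimum heat dissipated per erased bit at k_B T ln 2, a few zeptojoules at room temperature.

    # Back-of-the-envelope Landauer bound: minimum dissipation per erased bit.
    import math

    k_B = 1.380649e-23                 # Boltzmann constant, J/K
    T = 300.0                          # room temperature, K
    E_bit = k_B * T * math.log(2)
    print(f"Landauer limit at {T} K: {E_bit:.3e} J per bit")   # ~2.9e-21 J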

John





Professor John Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031





[Fis] ISSS 2012 meeting

2012-02-13 Thread John Collier


[Apologies for duplicate mailings]
Dear Colleagues,
The 56th meeting of the International Society for 
the Systems Sciences (ISSS) will be held at San 
Jose State University July 15-20, 2012, in San 
Jose, California.
This meeting is designed at an interactive and 
collegial scale of 200 to 250 thinkers with 
diverse backgrounds and interests in the general 
theory of systems and the arts and sciences of 
systems. These fields provide platforms of 
concepts and language that enable communities of 
interest to transcend disciplinary boundaries 
towards developing new knowledge and perspectives.
The ISSS 2012 theme of Service Systems, Natural 
Systems draws attention to complex issues in 
today's world, where dialogue amongst the learned 
may lead to better futures.
The service systems sciences focus on the value 
cooperatively created and shared in human 
activities. Service systems support basic needs 
such as food and water, develop social potential 
through education and healthcare, and advance our 
societies through businesses, governments and 
social enterprises working in a globalized, 
networked world.
The natural systems sciences focus on the 
sustainability and diversity of life on our 
planet. Social ecological systems balance 
competing interests of human well-being, social 
development and economic progress. Maintaining 
resilience of natural capital and resources 
across temporal and spatial scales challenges 
policies, governance and stewardship. Ways to 
participate include:
Engaging with plenary speakers, discussants and groups in
reflections
Leading conversations on research in progress and early findings
Presenting pre-published works for commentary and refinement
Sharing experiences and knowledge sketched onto posters and outlines
Building personal insights in diverse dialogues about systems
Featured plenary speakers:
Rafael Ramirez, Director, Oxford Scenarios 
Programme; Fellow in Strategy at the Saïd 
Business School and Green-Templeton College; 
James Martin Senior Fellow at the Oxford Martin 
School.
Jim Spohrer, Director of Global University Programs, IBM
Timothy F. H. Allen, Professor Emeritus of Botany 
and Environmental Studies, University of 
Wisconsin Madison
Garry Peterson, Professor in Environmental Studies, Stockholm Resilience
Centre
Discussants invited from the ISSS community
Full details of the conference can be found at 

http://isss.org/world/sanjose-2012. A copy of a 
printable conference flyer can be downloaded from 
this site, and we invite you to share this letter 
and display the flyer for the benefit of other 
interested colleagues and/or students.
Please email isssoff...@dsl.pipex.com any 
questions about this conference. We hope that you 
will be able to submit your current work to the 
conference and look forward to hearing from you. 





Professor John Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031





Re: [Fis] The State of the Art - Discussion of Information Science Education

2011-12-08 Thread John Collier


Good to see that fis perspectives are used in teaching. I use information
ideas fundamentally in our second year Cognitive Science course, and also
in some postgrad courses I teach.
John

At 03:03 PM 2011/12/07, Pedro C. Marijuan wrote:
Thanks a lot, Gordana. It is a
very good idea. Unfortunately I could not participate in the opening of
the session, well, at least I can say now that I had the experience
of teaching for Engineering graduate students two neatly informational
(FIS) disciplines. One of them, Bioinformation:
informational analysis of living systems; and the other Science,
Technology and Society: an introduction to the informational history of
societies. Both of them in Spanish. They were very successful,
particularly the latter. The FIS perspective is ideal not only for
breaking down on impossible topics (our familiar demons) but
also for promoting a new, highly original way of analysis --of knowledge recombination processes-- on topics of our time and of the most
contentious past. 
missing a lot the direct involvement in the discussions!
yours,
---Pedro
Gordana Dodig-Crnkovic escribió: 
Hi All,

One way of looking at the question of curriculum would be from the point
of view of what already exists
of education in the Foundations of Information.

Are there any courses which might be a part of such a curriculum?

To start with I can tell about the course I have, which does not cover
much of Science of information, but there are several connections.
As I work at the computer science department, my perspective is
computational.
For me computing is information processing and information is that which
is processed, and that which is a result of processing. 
Processing may be done by a machine or by an organism or anything else –
the whole of nature computes (processes information) in different
ways.
As an info-computationalist, I believe that information is unthinkable
without computation.
So the course is on Computing and Philosophy but addresses Philosophy of
Information and Science of Information as well and topics on evolution of
life, intelligence (natural and artificial), consciousness, etc.

http://www.idt.mdh.se/kurser/comphil

I believe it would be good to have a course on the foundations of
information science for people in the computing.
Information and computation are completely entangled! And this gives also
an opportunity to introduce other fields into computing, to contribute to
building bridges and 
facilitating inter-disciplinary/ cross-disciplinary/
trans-disciplinary learning.

This is not as ambitious as the original question, but can help
understanding where we are now and where we want to be.

Best wishes,
Gordana

http://www.mrtc.mdh.se/~gdc/



From: fis-boun...@listas.unizar.es [mailto:fis-boun...@listas.unizar.es] On Behalf Of Stanley N Salthe
Sent: 5 December 2011 20:53
To: fis
Subject: Re: [Fis] Discussion of Information Science Education

And it could feature in 'Science for Non-Majors' courses as well.

STAN
On Mon, Dec 5, 2011 at 12:44 PM, Guy A Hoelzer
hoel...@unr.edu wrote:
Hi All,
I agree with those who are suggesting that Information Science makes
sense
as a widely useful way to think about different scientific
disciplines
even if we don't have a strong consensus on how to define
'information'.
I think there is enough coherence among views of 'information' to
underpin
the unity and universality of the approach. Perhaps Information
Science
is less a discipline of its own and more of a common approach to
understanding that can be applied across disciplines. While I can
imagine
good courses focusing on Information Science, it might be most
productive
to include a common framework for information-based
models/viewpoints
across the curriculum.
Guy Hoelzer







-- 
-
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Avda. Gómez Laguna, 25, Pl. 11ª
50009 Zaragoza, Spain
Telf: 34 976 71 3526 ( 6818) Fax: 34 976 71 5554
pcmarijuan.i...@aragon.es

http://sites.google.com/site/pedrocmarijuan/
-






Professor John Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031





Re: [Fis] Info Theory

2011-01-24 Thread John Collier


At 02:05 PM 1/24/2011, Loet Leydesdorff wrote:
Dear colleagues, 

It seems to me that the relation between information and energy provides
the special case that the entropy is thermodynamic entropy. The relation
is S = k_B H. H is dimensionless, but S is not, because k_B adds the dimensionality of Joule/Kelvin. H can also be considered as probabilistic
entropy. S is relevant in the case that the system of reference is the
chemico-physical one based on collisions among particles. This level –
the exchange of momenta and energy – is always involved in higher-order
exchange processes, but the next-order ones emerge on top of the
lower-order ones. 
I think that joules are energy, and temperature is energy per degree of freedom. If you cancel out the energy part you get degrees of freedom, which is dimensionless (a number), and is also a good measure for things like information in physical terms. So I don't see a problem.
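A quick numerical illustration of the dimensional point (my sketch, not part of the original exchange): converting a dimensionless probabilistic entropy H into thermodynamic units is just multiplication by the Boltzmann constant.

    # Sketch: S = k_B * H, with H expressed in nats (bits times ln 2).
    import math

    k_B = 1.380649e-23               # J/K
    H_bits = 1.0                     # one bit of uncertainty
    H_nats = H_bits * math.log(2)    # dimensionless
    S = k_B * H_nats                 # thermodynamic entropy in J/K
    print(f"S for one bit: {S:.3e} J/K")   # ~9.6e-24 J/K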

For example, when specifically molecules are exchanged, life can emerge
(Maturana). The self-organization may also reduce the uncertainty locally
(“negentropy”). The system of reference, however, then is different
from the chemico-physical one. The information exchange is provided with
meaning.
Only because it is interpreted (biosemiotics).

Things change dramatically when meaning can again be communicated because
then models can be entertained at a more rapid speed than the underlying
(that is, modeled) systems. The redundancy generation can then prevail
over the entropy generation, and a knowledge-based economy, for example, be maintained. The discursive models proliferate options other than the ones
which occurred historically. This cultural system incurs on the
historical manifestations and thus counteracts upon their following of
the entropy law. The social system, for example, can be based on other
premises than the lower-order ones. For example, the “survival of the
fittest” can be replaced by universal human rights.

In other words: the specification of the system of reference provides the
information exchanges with meaning. 
Quite.
This meaning can again be
communicated reflexively in the respective disciplines. The systems can
be expected to gain in their capacity to process complexity insofar as
these different layers become more nearly decomposable. This expansion
spans the different dimensionalities and thus can be expected to enlarge
the space for knowledge-based interventions. 
Dimensionalities add degrees of freedom, and thus information capacity.
So information capacity can emerge (or even be created) by the sort of
process you mention. It is all physically grounded, though.
One more thing:
GR: That's thermodynamically impossible. Any organic system needs to convert and transduce energy, so you may think it does no work, but the relationship is like Ostwald ripening: there is always an energy cost.
Sorry Gavin, but you are mistaken. The entropy budget is made at the
expense of information loss in these cases.
Incidentally, you are posting too often. The rules say two a week. This
allows people to check sources, etc. Google is good, plus archives of the
fis list.
Schroedinger, What is Life? (1945). The connection is via
negentropy, and then to biological information in the DNA (he called it a
nonperiodic crystal).
The 1929 Szilard paper (Szilard, L., Z. Phys., 53 (1929): 840-856) is in German. An English translation can be found in Leff
and Rex, Maxwell's Demon (1990, Princeton University Press). It is
generally regarded as the first explicit connection between information
and physics.
John





Professor John Collier, Acting HoS and Acting Deputy HoS
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://collier.ukzn.ac.za/





Re: [Fis] fis Digest, Vol 543, Issue 19

2010-11-19 Thread John Collier
 pass through membranes through 
successive conformational changes that remove 
energy barriers to the transfer, much like the 
simple experiment reported in the article. This 
has been known for at least 15 years, I think. 
Inasmuch as there is functionality here, semiotic 
considerations may be relevant in this case. But 
not in the case in the article. Intelligence is a 
special case of the biological (so far). 
Conformational change is even more important and 
less dependent on the energetic substrate, and 
more on other conformations and their changes (e.g., in inference).


The intelligent systems mainly do the same.

Everything does the same. It is how it is done that is important.

My best,
John


--
Professor John Collier, Acting HoS  and Acting Deputy HoS
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://collier.ukzn.ac.za/




[Fis] Demonic device converts information to energy

2010-11-16 Thread John Collier
http://www.nature.com/news/2010/101114/full/news.2010.606.html?WT.ec_id=NEWS-20101116

Not really surprising, but an interesting demonstration.

John

--
Professor John Collier, Acting HoS  and Acting Deputy HoS
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://collier.ukzn.ac.za/



Re: [Fis] Tactilizing processing. Resonance

2010-11-01 Thread John Collier


Mark Burch and I call a similar idea rhythmic entrainment. There was an article in Symmetry Vol. 9, Nos. 2-4, 1998:

Order From Rhythmic Entrainment and the Origin of Levels Through
Dissipation 
It would be on my web site, but that is currently not functioning due to
University attempts at imposing uniformity and control.
John
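As a toy picture of rhythmic entrainment (my sketch, not the model in the Symmetry article): two Kuramoto-type oscillators with different natural frequencies phase-lock once their coupling exceeds the detuning.

    # Toy Kuramoto pair: weak coupling lets the phases drift apart; strong
    # coupling entrains them to a fixed phase lag. Parameter values are
    # illustrative only.
    import math

    def entrain(K, steps=20000, dt=0.001):
        w1, w2 = 1.0, 1.3            # natural frequencies (rad/s)
        th1, th2 = 0.0, 2.0          # initial phases
        for _ in range(steps):
            d = th2 - th1
            th1 += dt * (w1 + K * math.sin(d))
            th2 += dt * (w2 - K * math.sin(d))
        return (th2 - th1) % (2 * math.pi)

    print("weak coupling  (K=0.05):", entrain(0.05))  # no lock; phase keeps drifting
    print("strong coupling (K=1.0):", entrain(1.0))   # settles near a small phase lag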
At 09:02 AM 01/11/2010, Joseph Brenner wrote:
Dear Bob and Stan,
I also find myself in agreement with you. Resonance is a very good term
for 
a form of reciprocal interaction that defines the entities capable of it.

This is what Lupasco called an adequately contradictorial
relation that is 
possible intra- as well as inter-level, intra-level for example in
organic 
molecules of certain types, or people. This is thus a general principle:

people resonate with some other people but not all . . .
Best,
Joseph
- Original Message - 
From: Robert Ulanowicz u...@umces.edu
To: Stanley N Salthe ssal...@binghamton.edu
Cc: fis@listas.unizar.es
Sent: Monday, November 01, 2010 12:16 AM
Subject: Re: [Fis] Tactilizing processing

Quoting Stanley N Salthe ssal...@binghamton.edu:
 Bob -- I think that 'coupling over such a disparity in scale' is not

 really
 going on differently in biology either. The only messages that
could
 'percolate upwards' in a material system would be those the higher

 level(s)
 are prepared to receive, in all cases. This might allow
information from
 smaller populations of lower scale entities to be detected.
But it would
 always be the larger scale system constructing some kind of
ensemble
 information, or it would be ... magic! Biology manages to get
a greater
 uniformity (via genetic controls) of smaller scale populations,
thus
 increasing the precision or definiteness of the lower scale
'messages',
 which are still a kind of 'mass action', but with clearer, more
reliable 
 and
 less muddy, 'colors'.

 STAN
Stan,
We agree 100% on this one. I have always qualified Prigogine's
order
through fluctuations by pointing out that not just *any*
perturbation
will change the dynamics of the system. (In the Prigogine scenario,
all perturbations are generic and homogeneous.) The system will only
respond to those perturbations (for better or worse) that resonate
with the configuration of the larger system.
Cheers,
Bob





Professor John Collier, Acting HoS and Acting Deputy HoS
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://www.ukzn.ac.za/undphil/collier/index.html





Re: [Fis] Tactilizing processing

2010-11-01 Thread John Collier
At 09:13 AM 01/11/2010, Loet Leydesdorff wrote:
Dear colleagues,

It seems to me that we have a more elaborated apparatus for discussing the
distances of a perturbation across a number of interfaces.

Two information processing systems can be considered as structurally
coupled when the one cannot process without the other. A single
(system/environment) interface is then involved. If two interfaces are
involved, the in-between system mediates; the coupling can then be
considered as operational since the mediating system has to operate before
transfer can take place across the interfaces. When more than two interfaces
are involved, the coupling becomes increasingly loose, and another mechanism
thus has to be specified for the explanation.

There is an attempt to deal with this sort of thing at
http://complex.unizar.es/~yamir/papers/phys_rep_08.pdf

It is quite a bit more general. With my administrative load right now
I haven't had time to read the paper, just to glance over it. Their
central interest is not information or information processing,
but it is mentioned in several places.

Synchronization is another term like harmonization and
rhythmic entrainment.

A friend who sent me the reference says that the mathematics
starts off well, but gets shakier through the paper.

Now back to the budget.

John


--
Professor John Collier, Acting HoS  and Acting Deputy HoS
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://www.ukzn.ac.za/undphil/collier/index.html



Re: [Fis] Beijing FIS Group

2010-09-20 Thread John Collier


At 03:26 PM 20/09/2010, Stanley N Salthe wrote:

On Mon, Sep 20, 2010 at 9:26 AM, Stanley N Salthe ssal...@binghamton.edu wrote:


Regarding the question: What is your opinion about Leroy E. Hood's words: "Biology Is an Informational Science"?

In a general sense the meaning is that, although every locale in the world is mediated by history -- requiring information to be understood beyond knowledge of physical and material laws -- biological systems have internalized and replicate the results of historical accident as preserved in the information in the genetic system. In general, history passes away, but biological systems capture some of it in the form of species and variety differences.


I would add to Stan's correct remarks that unlike physics, in which the
laws tend to dominate, and boundary conditions are pretty irregular (but
not always!), in biology the boundary conditions are very important,
especially their regularities both in individual biological entities,
within kinds of biological entities, and across kinds of biological
entities. For example, most kinds of biological entities are cohesive
levels or nestings in information hierarchies, which allows application
of statistical mechanics to their information dynamics ("Hierarchical dynamical information systems with a focus on biology", Entropy 2003, 5, 100-124). Furthermore, inasmuch as biological systems are emergent, boundary conditions are not separable from their dynamical principles, so issues of form (which require information theory for full analysis, or as full as we can expect) are wound up with the system dynamics, or laws ("A dynamical account of emergence", Cybernetics and Human Knowing, 15, no. 3-4, 2008: 75-100). The last point was made some time ago by Conrad and Matsuno, but has not been appreciated as much as it should be (much lip service, perhaps, but not enough precise application).
Cheers,
John 




Professor John Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://www.ukzn.ac.za/undphil/collier/index.html




Re: [Fis] The Inventor of Information as Asymmetry

2009-11-15 Thread John Collier
At 11:53 PM 2009/11/15, James Rose wrote:
May I make some comments for clarification of the notion that
(differential) information is equatable (Identical?) with asymmetry,?

Bateson did indeed bring in the distilled meme that information is the
difference that makes a difference.  Asymmetry indeed is a differential
domain that meets Bateson's 'definition', so on the face of it it might
qualify as an alternate (if not more explicitly computation-related)
definition.

The antithesis meme for asymmetry vis a vis information would be:
perfect symmetry ergo no information.   But what about the simple
geometry of Descartes?!  A symmetric domain absolutely filled with
'information'.

I don't see this, sorry.


 From the initial description and from posting remarks, it is clear to me that
the information-distinction is not 'asymmetry' at all, but optional alternate
frames of reference, which by extension includes alternate measuring
systems - templates - models.  This indicates that 'information' would more
correctly be defined as 'pattern differentials', under which
symmetry/asymmetry is only -one- sub-alternate pattern.

My student, Scott Muller, in the book I previously mentioned, Asymmetry:
The Foundation of Information (Springer, 2007), was able to show that information
can be defined intrinsically to objects independently of such considerations.
Like Leyton, he used group theory. His unique contribution was to show that
a value can be defined irrespective of perspective, and then different
relative values can be recovered using specific perspectives.
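
A minimal sketch of the group-theoretic idea (my illustration, not Muller's
actual formalism): take the cyclic rotation group acting on short strings. By
the orbit-stabilizer theorem the orbit size is n / |stabilizer|, and log2 of
the orbit size measures how much information a string carries relative to
that symmetry group; a fully symmetric string carries none.

from math import log2

def rotations(s):
    # the orbit of s under the cyclic rotation group
    return {s[i:] + s[:i] for i in range(len(s))}

def intrinsic_bits(s):
    # bits needed to single out one arrangement among those the group relates
    return log2(len(rotations(s)))

for s in ["0000", "0101", "0111"]:
    print(s, intrinsic_bits(s))   # 0.0, 1.0, 2.0 bits respectively

Roughly, different observers' perspectives would correspond to different
subgroups, while the orbit count relative to the full group stays fixed;
that is the flavour of a value defined irrespective of perspective.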



Some thinkers may be more comfortable with the word 'relationship' instead of
'pattern', but the essential umbrella notion here is the same, even with that
word/meme substitution.

Pattern can be defined using group theory in terms of symmetries and
asymmetries.


Symmetry/asymmetry is only one form of differential distinction, and is not a
universal model form, applicable to all existential 
'differences'.  Additionally,
as noted above, 'symmetric systems' are innately constructed/organized with
information content.

Information as asymmetry is a myopic and limited notion.

It is the most general and foundational notion. Your versions are merely
special cases.

John


--
Professor John Collier colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://www.ukzn.ac.za/undphil/collier/index.html  

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Information states

2009-11-14 Thread John Collier


At 02:12 PM 2009/11/10, Pedro C. Marijuan wrote:
Dear FIS colleagues,
The comments, days ago, by John H on information states were intriguing.
In my view, the differences he addresses between physical states and
informational states could be compacted as the primacy of the intrinsic
regarding informational entities. The physical state (in my limited
understanding) contraposes the extrinsic (boundary conditions) and the
intrinsic (state variables and identity parameters), and reunites them by
means of a set of dynamic equations that express the laws of nature
pertinent to the whole context. In the information state, the intrinsic
and the extrinsic cannot be separated so easily (only some selected parts
of the extrinsic become external information, those upon which the info
entity will perform distinctional operations), but the intrinsic is not
really reducible to a collection of variables and parameters; it is a life
cycle in progress. Then, how can we express a life cycle in a compact way
so as to interact lawfully with the extrinsic? Socially we consider this
new kind of informational-subject-happenstances as biographies, and refer
to their coupling with the extrinsic as events.
Echoing Koichiro Matsuno (as we wrote together in 1996, after the second
FIS event in Washington 1995, in Symmetry: Culture and Science, 7(3),
229-30): this mutual upholding between symmetry and information in
theoretical science suggests a unique perspective addressing how the
description of both 'states' and 'events' could be integrated in a unified
manner.
Or, in other words, the very need of a new abstraction procedure about the
social process of knowledge accretion and recombination...
I could not agree more. For an excellent review and expansion of the
notion of intrinsic information and how it is viewed extrinsically, see
the published PhD thesis of my student Scott Muller: Asymmetry: The
Foundation of Information. Springer: Berlin, 2007. VIII, 165 p., 33 illus.,
hardcover. CHF 139.50. ISBN: 978-3-540-69883-8.
I do not agree with Lin's assessment, but there are questions of priority
here that are always difficult to resolve. Scott should have been careful,
and I told him this, to be clear about what was original to his thesis. I
claim the asymmetry principle from a 1996 paper, "Information Originates
in Symmetry Breaking",
http://www.ukzn.ac.za/undphil/collier/papers/infsym.pdf
in the journal Symmetry. Scott added substantially to the justification of
my basic idea. The ideas, however, are implicit in MacKay, Donald M.,
Information, Mechanism and Meaning, Cambridge, MA: MIT Press, 1969, and
Bateson, G. (1973), Steps to an Ecology of Mind (Paladin, Frogmore, St.
Albans). The first calls information a distinction that makes a
difference, and the second a difference that makes a difference. Both
permit the physical interpretation. I really wish we could get beyond
this, and deal with more substantive issues. It has already been decided:
information and interpretation of information are different from each
other.
Regards,
John





--
Professor John Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://www.ukzn.ac.za/undphil/collier/index.html


___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Information states

2009-11-14 Thread John Collier
At 05:33 PM 2009/11/14, you wrote:
While not suggesting a discussion on this, I note that John says --
information and the interpretation of information are different from
each other.

I think this is not as clear cut as that.  Beginning all the way back to
von Uexkull's Theoretical Biology, the constructivist perspective takes a
different view.  The 'epistemic cut' is created by the observer.

The observer is part of the universe and deserves no special status 
except as a representer. That must be understood in terms of the 
basic conditions of the universe. This sort of dualism of epistemic 
cuts is doomed to self-destruction as it removes the observer from 
the universe.

John

--
Professor John Collier colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://www.ukzn.ac.za/undphil/collier/index.html  

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] The Inventor of Information as Asymmetry

2009-11-14 Thread John Collier


Thanks. I still maintain my student carried this idea much further than
anyone before.
As I said before, priority in such issues is very hard to
establish. I think that Michael Scriven was well ahead on these ideas. He
is now known as Tal Scriven. His ideas date much earlier than 1992, to
say the least. I first encountered them in 1971 at MIT.
John

At 05:49 PM 2009/11/14, David Weiss wrote:
The inventor of the concept of Information as Asymmetry is Michael Leyton,
in his enormous 640-page book Symmetry, Causality, Mind (MIT Press, 1992).

Furthermore: Leyton invented the concept of the causal basis of information.

In addition, Leyton's book A Generative Theory of Shape, Springer (2001),
invents an enormous mathematical theory of information as asymmetry.

Leyton's work is used by scientists in over 40 disciplines.
His theorems are used 1000s of times a minute all around the world.

Also, because of the importance of his work he was awarded a major prize
from the President of the United States.


Symmetry, Causality, Mind. By Michael Leyton. MIT Press, 1992.
A Generative Theory of Shape. By Michael Leyton. Springer, 2001.


best wishes
David Weiss







--
Professor John Collier
colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://www.ukzn.ac.za/undphil/collier/index.html


___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] FW: Fw: Definition of Knowledge?Chrysippus's dog

2009-10-07 Thread John Collier
I would second that. There are some relevant papers on my home page:
http://www.ukzn.ac.za/undphil/collier/papers/InformationCausationComputation.pdf
http://logica.ugent.be/philosophica/fulltexts/75-4.pdf
http://www.ukzn.ac.za/undphil/collier/papers/Information%20in%20Biological%20Systems.pdf

John

At 07:46 PM 2009/10/07, Jacob Lee wrote:
Why not situation theory, or Barwise and Seligman's channel theory?

Jacob

john.holg...@ozemail.com.au wrote:
  Stanley, Christophe
 
 
 
  IMO we need to develop a comprehensive Grammar of Information which
  embraces not only semantics and syntax but also modality, case, aspect,
  tense, etc., and looks at the language of informational states,
  objects, events, experiences and processes throughout the biosphere,
  physiosphere, sociosphere, etc.
 


___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


--
Professor John Collier colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://www.ukzn.ac.za/undphil/collier/index.html  

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] informational economics?

2008-10-29 Thread John Collier
 in out from some sector (eg, housing in some
strategic countries), amplified in the global complexity, now have the
potential to destabilize the whole financial layers and bring the real
economy to havoc.

Again, the most likely reason is that the assumptions of neoclassical
economics do not hold due to non-ideal market conditions. The housing
bubble in the US caused trouble mostly because the risks were not
properly represented in the financial markets for derivatives. This
allowed the amplification you mention. This process was aided by
'quant' programs that were set up on the basis of normal markets, and
were not suited to the abnormal conditions (this has happened in
other crashes like the one in 1987 -- bigger percentagewise over one
day -- and the Asian and dot-com crashes), but the obscurity of
market information was not as extreme in these cases, so there
was not as much chance for the problem to spread. I suspect that
panic was the bigger factor there. In the 2007-08 panic the information
was simply not available, and rating houses like Moody's had
helped obscure the real situation by rating collections of low-rated
derivatives as highly rated because they were distributed by banks
with high ratings (that presumably would not fail), and had real estate
to back them up, so the bottom was far from $0. This did not take
into consideration that the financial companies might have a failure
of cash flow, and have to sell off blue-chip stocks, whose value
can drop to 0 for at least some companies (including major banks
and insurance companies). It wasn't so much a scam as self-delusion
made possible by the obscuring of information (and deregulation,
such as the 2007 abolition of the uptick rule on short
selling that was put in back in 1933 or so -- I suspect the last, by
the way, invalidated the quants in one act, since it changed
market conditions so much).


5. The economy is an informational system in crucial aspects, not well
explained yet... advancing an info economics would be quite timely.

Quite. However, having tried it with a very good economist, it is not
very easy. Definitely worth trying again now that neoclassical economics
does not hold quite so much sway, though.

Would it be interesting to argue on some of these very roughly penned
aspects (while our pockets get emptier and emptier)?

It is definitely going to be a while before I can retire. Anybody know
of a nice job where the retirement age is over 60, as it is here? It's
coming up a bit too soon for the markets to recover, I fear.

Cheers,
John


--
Professor John Collier [EMAIL PROTECTED]
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://www.ukzn.ac.za/undphil/collier/index.html  

___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Neuroscience of art

2008-09-18 Thread John Collier


At 05:30 PM 2008/09/18, Sonu Bhaskar wrote:
Dear FIS Colleagues,
The connection between art and cognitive neuroscience has been relatively
ignored by the scientific fraternity. The recent proposition regarding the
ten laws of art, as Dr. V. S. Ramachandran puts it, has ignited a new
debate among philosophers and neuroscientists about the neural correlates
of art in its different forms.
Professor Ramachandran's suggested 10 universal laws of art:

Peak shift
Grouping
Contrast
Isolation
Perception problem solving
Symmetry
Abhorrence of coincidence/generic viewpoint
Repetition, rhythm and orderliness
Balance
Metaphor

Ref:
http://www.bbc.co.uk/radio4/reith2003/lecture3.shtml
The tenets of the above 10 laws draw their profound inspiration from the
theory of information flow and the conceptualisation of perception in
humans. Interestingly, some of these points dovetail with Lanham's
proposal that Pedro mentions; but others are very different…
This is my first posting to FIS. I am an Indian neuroscientist pursuing
doctoral research in Spain (land of Cajal!!!).

Welcome, Sonu.
Ramachandran is a brilliant psychologist. I use him in my 2nd year
Cognitive Science course. The way he demolishes some a priori views of
psychology (I mention Dennett and Fodor in particular) by specific
examples of experimental results is a model that others would be well
advised to pay close attention to.
Cheers,
John





--
Professor John Collier
[EMAIL PROTECTED]
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://www.ukzn.ac.za/undphil/collier/index.html


___
fis mailing list
fis@listas.unizar.es
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] info meaning

2007-10-12 Thread John Collier

At 09:01 PM 12/10/2007, bob logan wrote:

Loet et al - I guess I am not convinced that information and entropy
are connected. Entropy in physics has the dimension of energy divided
by temperature. Shannon entropy has no physical dimension - it is
missing the Boltzmann constant. Therefore how can entropy and Shannon
entropy be compared, let alone connected?


Bob,

Temperature has the dimensions of energy per degree of freedom.
Do the dimensional analysis, and you end up with a measure in
degrees of freedom. This is a very reasonable dimensionality
for information.


I am talking about information not entropy - an organized collection
of organic chemicals must have more meaningful info than an
unorganized collection of the same chemicals.


I am planning to make some general comments on meaning, but
I am too busy right now. They will have to wait for later. There are
some very tricky issues involved, but I will say right now that
information is not meaningful, but has only a potential for meaning.
All information must be interpreted to be meaningful, and the same
information can have very different meanings depending on how it
is interpreted. Information, on the other hand, has an objective
measure independent of interpretation, and that depends on
the measure of asymmetry within a system. See the recent book
by my student Scott Muller for details: Asymmetry: The Foundation
of Information, Springer 2007.

http://www.amazon.ca/Asymmetry-Foundation-Information-Scott-Muller/dp/3540698833

This whole discussion on meaning needs far more precision and a
lot of garbage collecting.

Cheers,
John




--
Professor John Collier [EMAIL PROTECTED]
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://www.ukzn.ac.za/undphil/collier/index.html  


___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


Re: [Fis] about fis discussions

2007-06-13 Thread John Collier

At 06:22 AM 13/06/2007, Pedro Marijuan wrote:

Dear FIS colleagues,

About the approaches to the information concept commented on by Karl,
Loet, John, and Stan, let me argue that some of them have a rather
narrow conceptual domain of applicability. In Karl's approach I have
already argued that his highly suggestive conflation of the
sequential vs. the simultaneous in order to define information formally
should be accompanied by an agreement (an in-depth
discussion) on the technical problem of how to count
multidimensional partitions. Morris, Pastor, and I found
years ago some discrepancy regarding the heuristic formula he has
developed ...a few things might be different, and perhaps even more
interesting. Well, it may seem strange, but Michael Leyton's
approach based on group theory could be in close vicinity of the
formal structures in Karl's. Anyhow, the pity is that discussing
this on the Internet is a pain in the neck (we should have had a
small ad hoc seminar during the Paris conference!).


My student Scott Muller, who completed his PhD recently on just this
topic, was at the Paris meeting. His work was praised by examiners
Larry Sklar, Phil Hanson and Louis Kaufmann, all of whom have a long
history dealing with information theory and statistical mechanics. It
is being published by Springer, I believe. Too bad he didn't get a
chance to speak up more.


My 1986 paper, Entropy in Evolution, in the first issue of Biology 
and Philosophy, shows a way to define information in multidimensional 
physical systems I called 'arrays' to capture the statistical 
fluctuations of information at lower levels. I define a physical 
information system in terms of these arrays. I've had some minor 
criticism (Sarkar), but he backed off when I explained in more detail.



My own track is based on the need to accommodate quite many new
observations, mostly in molecular biology & neuroscience, that
cannot be situated within the existing conceptualizations, apart
from leaving the immediate problem of meaning in the dark,
concerning its biological-material underpinning. So I proposed last
year, in this list, exploring the scope of an alternative
conceptualization of information as distinction on the adjacent...
given that both terms are too heavily loaded, I stop here and leave
the matter for future discussions (of course, the underlying
reflection is that it is far more than a single concept that we are
trying to clarify during all these years in this list: the quest for
a consistent new perspective or disciplinary body around information).


Meaning is a really tricky problem, and I now believe it requires 
semiotics to resolve.


Cheers,
John


--
"We're just fighting at a number of levels here against a number of
different enemies."
U.S. Ambassador to Iraq Ryan Crocker

Professor John Collier [EMAIL PROTECTED]
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://www.nu.ac.za/undphil/collier/index.html
http://www.kli.ac.at/research.html?personal/collier
Cybernetics & Human Knowing http://www.imprint-academic.com/CHK
Subscriptions [EMAIL PROTECTED]


___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


Re: [Fis] Mind, matter, meaning and information

2007-03-16 Thread John Collier


Welcome to the list, Robin!
At 06:30 PM 3/13/2007, Robin Faichney wrote:
I'm new to this list, so I will give a brief description of my
background, then a brief account of my understanding of information,
in the hope of eliciting some comments.
I have a BA (Hons) in philosophy and psychology and obtained an MSc in
information technology in 1986. I worked in academic research, first in
computing, then computer modelling in environmental economics. I left
academia in 1998 to start my own computer maintenance business, but
health problems over the last 2-3 years have obstructed that, and I
have instead been pursuing my long-standing interest in philosophy of
mind. Although I've only recently had much time to devote to such
studies, my ideas have been developing over the last 25-30 years, and
even 3 years ago I had several tens of thousands of words, though
none of it had ever been published (or even submitted).
I have only quite recently become aware of the new field of philosophy
of information, but I've given a great deal of thought to the place of
information in phil of mind, and have come to some quite firm
conclusions, on which I'd like to get some feedback. I have already
submitted a conference paper abstract but it hasn't yet been accepted,
so I guess I could retract it if you people manage to convince me
there's a serious error of some sort. What follows is almost identical
to the abstract.
In this paper I combine and extend some ideas of Daniel Dennett with
one from Wittgenstein and another from physics. Dennett introduced the
concepts of the physical, design and intentional stances (1987), and
has suggested (with John Haugeland) that "some concept of INFORMATION
could serve eventually to unify mind, matter, and meaning in a single
theory." (Dennett and Haugeland, 1987, emphasis in the original)
The most Wittgensteinian approach to intentionality is, in my opinion, in
Situations and Attitudes by Jon Barwise and John Perry. I think it is flawed,
as it does not properly incorporate standard logic (this is a problem that
Jerry Fodor harps on, a bit excessively perhaps, and to the wrong effect,
but basically he is right). I come more from a Peircean direction; Peirce
takes standard logic much more seriously in his account of meaning. There is
an attempt to criticise and integrate the various positions, including formal
pragmatics, in "Pragmatist Pragmatics", Collier and Talmont-Kaminski,
Philosophica 75 (2006), available on my website. You might find it
interesting, as it uses information as a central primitive. I find Dennett's
stances too nominalist for my taste. Dennett might now too, under the
influence of my colleague and coauthor Don Ross. There is a book coming
out from Oxford before too long, Every Thing Must Go: Metaphysics
Naturalised, by James Ladyman and Don Ross with David Spurrett and John
Collier, that takes a more realist position on some of Dennett's work.
It also gives an information-theoretic structural realism that does away
with objects as fundamental metaphysical entities. I can send you a
preprint if you like.
There is a nice, accessible account of Barwise and Perry in Keith Devlin,
Logic and Information.

The concept of physical information is now very well established. The famous
bet between physicists Stephen Hawking and John Preskill that Hawking conceded
he'd lost in July 2004 concerned whether physical information is conserved in
black holes. (Preskill, 2004) Physical information is basically material form.
The concept derives from C.E. Shannon's information theory (1948) and has no
semantic component. When this concept is taken to its logical conclusion, an
energy flow becomes an information flow and an object becomes its own
description. The crucial distinction is between form and substance. Dennett's
physical stance could be renamed the "substantial stance," while I introduce an
additional stance to account for information, called the "formal stance," in
which we attend to form rather than substance.
The book mentioned above talks about the material and formal modes, which
date back to the early logical empiricists (but I would argue the distinction
can be found in Hertz's philosophy of science -- thanks to Howard Pattee for
that). On energy flow being information flow, see my "Causation is the
Transfer of Information", also on my website. I am revising it now to
incorporate Barwise and Seligman, Information Flow. My approach is a special
case of their formalism restricted to dynamical classes and particulars
(types and tokens). After the restriction, the rest of my view follows
trivially from their formalism. Incidentally, I don't think that Shannon's
theory is general enough to do the job you require, but I won't go into the
reasons now, since they would require a rather extended development.
You can find my website below, or by googling John Collier complexity.
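
For readers unfamiliar with the Barwise-Seligman machinery, here is a toy
rendering of a classification and an infomorphism (my illustration, with
made-up tokens and types; it is not Collier's restriction of the formalism).
A classification pairs tokens with types; an infomorphism from A to B sends
types of A forward and tokens of B back, so that f_tok(b) |=_A t exactly when
b |=_B f_typ(t):

# Classification A: hypothetical readings classified by coarse types
A_holds = {("reading1", "hot"): True,  ("reading1", "cold"): False,
           ("reading2", "hot"): False, ("reading2", "cold"): True}

# Classification B: the same situations classified by numeric types
B_holds = {("state1", ">=30C"): True,  ("state1", "<30C"): False,
           ("state2", ">=30C"): False, ("state2", "<30C"): True}

f_typ = {"hot": ">=30C", "cold": "<30C"}               # types of A -> types of B
f_tok = {"state1": "reading1", "state2": "reading2"}   # tokens of B -> tokens of A

def is_infomorphism():
    # the fundamental biconditional must hold for every token of B and type of A
    return all(A_holds[(f_tok[b], t)] == B_holds[(b, f_typ[t])]
               for b in f_tok for t in f_typ)

print(is_infomorphism())   # True: the two descriptions respect each other's classifications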

The common concept of
information is intentional. Intentional information is
encoded in physical

Re: [Fis] Re: fis Digest, Vol 501, Issue 5

2007-02-05 Thread John Collier
 have the propensity to function in relatively small
groups bound by strong cultural bonds.
To:
I was referring to the hypothesis that genetic networks have the
creative capacity to function in very large associations that are linked
together by very weak bonds.
There is no difference between the two statements -- the scope in the
'from' case is the Yang side of things, but in the 'to' case it is the
Yin side. One pays attention to the Yang aspects, and the other to the
Yin aspects. Both propensities are there, and the stronger the Yang
propensity the more it transforms into the Yin, and vice versa. Given a
finite information capacity, these are the only two possible dynamics,
and they trade off against each other. Now, if we have an expanding
information capacity (phase space), as Kauffman, Brooks and Wiley, Layzer,
Landsberg, Frautschi, Davies and other notables have seen, we can get both
together, though they still trade off one against the other.
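
A minimal numerical sketch of that last point (my numbers, not from Brooks
and Wiley, Layzer, or the others): if order is read as I = H_max - H_obs,
then with a fixed capacity H_max the two terms can only trade off, but if
the capacity grows faster than the realized entropy, both can increase
together.

from math import log2

def capacity(t):     # assumed expanding phase space: available states grow with t
    return log2(10 + 4 * t)

def realized(t):     # assumed slower growth of the entropy actually realized
    return log2(10 + 2 * t)

for t in (0, 5, 10):
    h_max, h_obs = capacity(t), realized(t)
    print(t, round(h_obs, 2), round(h_max - h_obs, 2))
# t=0: 3.32, 0.00   t=5: 4.32, 0.58   t=10: 4.91, 0.74  -- both columns grow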
Ted's comment seems to be based on some recent innovations in the
mathematics of hierarchies. The issue of how we select the meaning for our
symbols or representations of the world can be a very complicated one. The
profound limitations that linear and quasi-linear mathematics places on
the symbolic carrying capacity of signs may be relevant to Ted's
statement. But I am not certain of the origins of his views.
Jerry, I think the way this is worded is not quite consistent with the
perspective you are promoting. We don't select the meaning of
our symbols, except perhaps in fairly formal contexts. If we did, it would
be very hard to be usefully creative, I am sure you agree -- we could
only select what we already have a template for -- see my "Dealing with
the Unexpected" from the CASYS meetings for examples.
Stan's comment deserves to be attended to.
The many complexities facing us as society can be parsed as follows, using
a specification hierarchy:
{physical constraints {material/chemical constraints {biological
constraints {sociocultural constraints}}}}.
As I search for the substance in this comment, I focus on what might be
the potentially misleading usage of the term parsed. Nor do I understand
why brackets, signifiers of separations, are used in this context.
I have no idea what it would mean to parse a material /
chemical constraint in this context.
See note on W.E. Johnson above. That is the standard source for the logic
here, and it is universally accepted among those who know it.
Indeed, chemical
logic functions in exactly the opposite direction. 
The creative relations grow with the complexity of the system. Is
this not what we mean by evolution? 
But so do the constraints or restrictions, as Stan has been arguing for
years now. There is no inconsistency in both happening.
On a personal note
to Stan: We have been discussing similar concepts since the inception of
WESS more than 20 years ago and it does not appear that we are
converging! :-) :-) :-) Unless you
choose to embrace the creative capacities of chemical logic, I fear your
mind is doomed to the purgatory of unending chaotic cycles, searching for
a few elusive or perhaps imaginary fixed points.
;-) :-) :-(
!!!

And there is no convergence. There are fixed points -- there have to be
or all we can have is mush -- but they are not where the action is. On
the other hand, the 'action' occurs only because of receptivity to
being worked on or guided by constraints that must be relatively fixed. The
divergence is there in reality, and the place where there is convergence
is beyond our ability to grasp with an argument. I am sure that Stan
knows this.
John





--
Professor John Collier
[EMAIL PROTECTED]
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://www.ukzn.ac.za/undphil/collier/index.html


___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis

