[Fis] FW: Something positive (From Arturo Tozzi)

2016-12-22 Thread PEDRO CLEMENTE MARIJUAN FERNANDEZ
From: tozziart...@libero.it [tozziart...@libero.it]
Sent: Thursday, 22 December 2016 14:08
To: fis@listas.unizar.es; PEDRO CLEMENTE MARIJUAN FERNANDEZ
Subject: Something positive

Dear FISers,

it's excruciating...
We did not even find a unique definition of information, life, brain activity,
consciousness...
How can science improve if it lacks definitions of what it is talking
about?
And the old problem of science: from above, or from below?  Which is the best 
approach?

It seems that we have depicted a rather dark, hopeless picture...  However,
there is, I think, a light in front of us.
The only way to pursue our common goal, I think, is to be free.
Free from our own beliefs.
Enlarge our horizons to other fields of science, apart from our own.
Forget metaphysics, of course.
Look at other disciplines, such as physics, medicine, engineering, biology, 
math...

Voltaire said: "Il faut cultiver notre jardin" ("we must cultivate our
garden").  But he was wrong.  We have to take care of more than a garden.
Your own garden is too narrow for your beautiful mind.

Therefore, TANTI AUGURI!
And I hope that by Christmas time next year, in 2017, every one of us will
be an expert in a scientific field different from his own.

Arturo Tozzi

AA Professor Physics, University North Texas

Pediatrician ASL Na2Nord, Italy

Comput Intell Lab, University Manitoba

http://arturotozzi.webnode.it/

-
___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] FW: Something positive

2016-12-22 Thread Christophe
Dear Arturo,
You are right, we need models.
But why not use the natural one within which all this has happened: the
evolution of our universe?
Taking the evolution of the universe as a reference frame allows us to
highlight a few pitfalls, chiefly the natures of life and the human mind,
which are still mysteries for today's science and philosophy. This should
lead us to accept the difficulties encountered with a definition of life.
So we can begin by positioning our investigations between life and human mind to
address the natures of information and meaning, which are realities at that 
level.
But first a preliminary point. I feel that information should not be separated 
from meaning. We care only about meaningful information. Weaver rightly said 
that information should not be confused with meaning because channel capacity
is independent of the meaning of the information going through it. But this does not
mean that information should be separated from meaning. Nobody would care about 
a channel transferring meaningless information.
Now, taking life as a given with its performances allows us to look at 
definitions for information and meaning for living entities, and also can bring 
in a thread for a definition of self-consciousness (part of human mind).
This has been addressed in a 2011 book to which several FISers
contributed, with Gordana Dodig-Crnkovic and Mark Burgin as editors

(http://www.worldscientific.com/worldscibooks/10.1142/7637). The chapter 
defining information and meaning for living entities is at
https://philpapers.org/rec/MENCOI (with an extension to artificial agents).
I would recommend you have a look at it.
All the Best
Christophe


From: Fis, on behalf of tozziart...@libero.it
Sent: Thursday, 22 December 2016 07:51
To: fis@listas.unizar.es
Subject: [Fis] Something positive

Dear FISers,

it's excruciating...
We did not even find a unique definition of information, life, brain activity,
consciousness...
How can science improve if it lacks definitions of what it is talking
about?

It seems that we have depicted a rather dark, hopeless picture...  However,
there is, I think, a light in front of us.
I think that the best way to proceed, at least the most useful in recent
centuries, is the one pursued by Einstein: build an abstract, rather
geometric, mathematical model, make testable predictions and then check
whether it works in the real world.
Therefore, I think, we need novel, fresh models and theories, rather than
experiments aiming to demonstrate scientists' theory-laden, pre-cooked
predictions.
It is the old problem of science: from above, or from below?  Which is the best 
approach?
The knowledge of the most elementary biological and physical issues is so 
scarce, as demonstrated by this FIS discussion involving foremost scientists 
from all over the world, that the right approach, I think, is to start from 
above...
from topology, of course




Arturo Tozzi

AA Professor Physics, University North Texas

Pediatrician ASL Na2Nord, Italy

Comput Intell Lab, University Manitoba

http://arturotozzi.webnode.it/

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] What is information? and What is life?

2016-12-22 Thread Stanley N Salthe
Dai --

{phenomenon 1}

{phenomenon 2}   -->  {Phenomena 1 & 2} ---> {phenomena 1,2,3}

{phenomenon 3}

The process from left to right is generalization.

‘Information’ IS a generalization.

Generalities form the substance of philosophy. Info happens to be a case
of generalization which can be mathematized, which in turn allows
it to be generalized even more.

So, what’s the problem?

STAN

On Wed, Dec 21, 2016 at 7:44 AM, Dai Griffiths wrote:

> >  Information is not “something out there” which “exists” otherwise than
> as our construct.
>
> I agree with this. And I wonder to what extent our problems in discussing
> information come from our desire to shoe-horn many different phenomena into
> the same construct. It would be possible to disaggregate the construct. It
> would be possible to discuss the topics which we address on this list without
> using the word 'information'. We could discuss redundancy, variety,
> constraint, meaning, structural coupling, coordination, expectation,
> language, etc.
>
> In what ways would our explanations be weakened?
>
> In what ways might we gain in clarity?
>
> If we were to go down this road, we would face the danger that our
> discussions might become (even more) remote from everyday human experience.
> But many scientific discussions are remote from everyday human experience.
>
> Dai
> On 20/12/16 08:26, Loet Leydesdorff wrote:
>
> Dear colleagues,
>
>
>
> A distribution contains uncertainty that can be measured in terms of bits
> of information.
>
> Alternatively: the expected information content *H* of a probability
> distribution is H = -Σ_i p_i log2(p_i).
>
> *H* is further defined as probabilistic entropy using Gibbs's formulation
> of the entropy, S = -k_B Σ_i p_i ln(p_i).
>
>
>
> This definition of information is an operational definition. In my
> opinion, we do not need an essentialistic definition by answering the
> question of “what is information?” As the discussion on this list
> demonstrates, one does not easily agree on an essential answer; one can
> answer the question “how is information defined?” Information is not
> “something out there” which “exists” otherwise than as our construct.
>
>
>
> Using essentialistic definitions, the discussion tends not to move
> forward. For example, Stuart Kauffman’s and Bob Logan’s (2007) definition
> of information “as natural selection assembling the very constraints on the
> release of energy that then constitutes work and the propagation of
> organization.” I asked several times what this means and how one can
> measure this information. Hitherto, I only obtained the answer that
> colleagues who disagree with me will be cited. :-) Another answer was that
> “counting” may lead to populism. :-)
>
>
>
> Best,
>
> Loet
>
>
> --
>
> Loet Leydesdorff
>
> Professor, University of Amsterdam
> Amsterdam School of Communication Research (ASCoR)
>
> l...@leydesdorff.net; http://www.leydesdorff.net/
> Associate Faculty, SPRU, University of Sussex;
>
> Guest Professor Zhejiang Univ., Hangzhou; Visiting Professor, ISTIC,
> Beijing;
>
> Visiting Professor, Birkbeck, University of London;
>
> http://scholar.google.com/citations?user=ych9gNYJ&hl=en
>
>
>
> *From:* Dick Stoute [mailto:dick.sto...@gmail.com ]
>
> *Sent:* Monday, December 19, 2016 12:48 PM
> *To:* l...@leydesdorff.net
> *Cc:* James Peters; u...@umces.edu; Alex Hankey; FIS Webinar
> *Subject:* Re: [Fis] What is information? and What is life?
>
>
>
> List,
>
>
>
> Please allow me to respond to Loet about the definition of information
> stated below.
>
>
>
> 1. the definition of information as uncertainty is counter-intuitive
> ("bizarre"); (p. 27)
>
>
>
> I agree.  I struggled with this definition for a long time before
> realising that Shannon was really discussing "amount of information" or the
> number of bits needed to convey a message.  He was looking for a formula
> that would provide an accurate estimate of the number of bits needed to
> convey a message and realised that the amount of information (number of
> bits) needed to convey a message was dependent on the "amount" of
> uncertainty that had to be eliminated and so he equated these.
>
>
>
> It makes sense to do this, but we must distinguish between "amount of
> information" and "information".  For example, we can measure amount of
> water in liters, but this does not tell us what water is and likewise the
> measure we use for "amount of information" does not tell us what
> information is. We can, for example equate the amount of water needed to
> fill a container with the volume of the container, but we should not think
> that water is therefore identical to an empty volume.  Similarly we should
> not think that information is identical to uncertainty.
>
>
>
> By equating the number of bits

[Fis] Something positive

2016-12-22 Thread tozziart...@libero.it

Dear FISers,

it's excruciating...
We did not even find a unique definition of information, life, brain
activity, consciousness...
How can science improve if it lacks definitions of what it is talking
about?
And the old problem of science: from above, or from below?  Which is the
best approach?

It seems that we have depicted a rather dark, hopeless picture...  However,
there is, I think, a light in front of us.  The only way to pursue our
common goal, I think, is to be free.  Free from our own beliefs.  Enlarge
our horizons to other fields of science, apart from our own.  Forget
metaphysics, of course.  Look at other disciplines, such as physics,
medicine, engineering, biology, math...

Voltaire said: "Il faut cultiver notre jardin" ("we must cultivate our
garden").  But he was wrong.  We have to take care of more than a garden.
Your own garden is too narrow for your beautiful mind.

Therefore, TANTI AUGURI!
And I hope that by Christmas time next year, in 2017, every one of us will
be an expert in a scientific field different from his own.

Arturo Tozzi
AA Professor Physics, University North Texas
Pediatrician ASL Na2Nord, Italy
Comput Intell Lab, University Manitoba
http://arturotozzi.webnode.it/

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


[Fis] Something positive

2016-12-22 Thread tozziart...@libero.it




Dear FISers,

it's excruciating...
We did not even find a unique definition of information, life, brain
activity, consciousness...
How can science improve if it lacks definitions of what it is talking
about?

It seems that we have depicted a rather dark, hopeless picture...  However,
there is, I think, a light in front of us.  I think that the best way to
proceed, at least the most useful in recent centuries, is the one pursued
by Einstein: build an abstract, rather geometric, mathematical model, make
testable predictions and then check whether it works in the real world.
Therefore, I think, we need novel, fresh models and theories, rather than
experiments aiming to demonstrate scientists' theory-laden, pre-cooked
predictions.  It is the old problem of science: from above, or from below?
Which is the best approach?  The knowledge of the most elementary
biological and physical issues is so scarce, as demonstrated by this FIS
discussion involving foremost scientists from all over the world, that the
right approach, I think, is to start from above...
from topology, of course

Arturo Tozzi
AA Professor Physics, University North Texas
Pediatrician ASL Na2Nord, Italy
Comput Intell Lab, University Manitoba
http://arturotozzi.webnode.it/
___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] What is information? and What is life?

2016-12-22 Thread Terrence W. DEACON
Against information fundamentalism


Rather than fighting over THE definition of information, I suggest that we
stand back from the polemics for a moment and recognize that the term is
being used in often quite incompatible ways in different domains, and that
there may be value in paying attention to the advantages and costs of each.
If we ignore these differences, fail to explore the links and dependencies
between them, and remain indifferent to the different use values gained or
sacrificed by each, I believe that we end up undermining the very
enterprise we claim to be promoting.


We currently lack broadly accepted terms to unambiguously distinguish these
divergent uses and, even worse, we lack a theoretical framework for
understanding their relationships to one another.

So provisionally I would argue that we at least need to distinguish three
hierarchically related uses of the concept:


1. Physical information: Information as intrinsically measurable medium
properties with respect to their capacity to support 2 or 3 irrespective of
any specific instantiation of 2 or 3.


2. Referential information: information as a non-intrinsic relation to
something other than medium properties (1) that a given medium can provide
(i.e. reference or content) irrespective of any specific instantiation of 3.


3. Normative information: Information as the use value provided by a given
referential relation (2) with respect to an end-directed dynamic that is
susceptible to contextual factors that are not directly accessible (i.e.
functional value or significance).


Unfortunately, because of the history of using the same term in an
unmodified way in each relevant domain, irrespective of the others, there are
often pointless arguments of a purely definitional nature.


In linguistic theory an analogous three-part hierarchic partitioning of
theory IS widely accepted.


1. syntax

2. semantics

3. pragmatics


Thus by analogy some have proposed the distinction between


1. syntactic information (aka Shannon)

2. semantic information (aka meaning)

3. pragmatic information (aka useful information)


This has also often been applied to the philosophy of information (e.g. see
the Stanford Encyclopedia of Philosophy entry for ‘information’).
Unfortunately, the language-centric framing of this distinction can be
somewhat misleading. The metaphoric extension of the terms ‘syntax’ and
‘semantics’ to apply to iconic (e.g. pictorial) or indexical (e.g.
correlational) forms of communication exerts a subtle procrustean influence
that obscures their naturalistic and nondigital features. This language
bias is also often introduced with the term ‘meaning’ because of its
linguistic connotations (i.e. does a sneeze have a meaning? Not in any
standard sense. But it provides information “about” the state of the person who
sneezed.)


So as a first rough terminological distinction I propose using


1. physical information (or perhaps information1)

2. referential information (information2)

3. normative information (information3)


to avoid definitional equivocation and the loss of referential clarity.
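
As a purely illustrative sketch of this nesting (Python; not Deacon's own
formalism, and the class and field names are invented for the example),
information2 presupposes a medium (1) and information3 presupposes a
referential relation (2):

from dataclasses import dataclass

@dataclass
class PhysicalInformation:
    """Information1: intrinsically measurable medium properties."""
    medium: str
    capacity_bits: float

@dataclass
class ReferentialInformation:
    """Information2: what the medium properties are about (reference)."""
    carrier: PhysicalInformation      # presupposes information1
    referent: str

@dataclass
class NormativeInformation:
    """Information3: the use value of a referential relation."""
    content: ReferentialInformation   # presupposes information2 (and thus 1)
    significance: str

# The sneeze example in this shape: measurable acoustic properties, reference
# "about" the state of the person who sneezed, and a possible use value.
signal = PhysicalInformation(medium="sneeze (acoustic signal)", capacity_bits=1.0)
reference = ReferentialInformation(carrier=signal,
                                   referent="state of the person who sneezed")
value = NormativeInformation(content=reference,
                             significance="worth keeping one's distance")
print(value)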


I would argue that we use the term ‘information’ in a prescinded way in
both 1 and 2. That is, considered from the perspective of a potential
interpretation (3) we can bracket consideration of any particular
interpretation to assess the possible relational properties that are
available to provide reference (2); and we can bracket both 3 and 2 to only
consider the medium/signal properties minimally available for 2 and 3
irrespective of using them for these purposes.*


Although 2 and 3 are not quantifiable in the same sense that 1 is, neither
are they unconstrained or merely subjective. The possible referential
content of a given medium or sign vehicle is constrained by the physical
properties of the medium and its relationship to its physical context.
Normative information captures the way that referential content can be
correct or incorrect, accurate or inaccurate, useful or useless, etc.,
depending on the requirements of the interpretive system and its relation
to the context. In both cases there are specific unambiguously identifiable
constraints on reference and normative value.


There has been a prejudice in favor of 1 because of the (mistaken) view
that 2 and 3 are in some deep sense nonphysical and subjective.
Consistent with this view, there have been many efforts to find a way to
reduce 2 and 3 to some expression of 1. Although it is often remarked that
introducing non-reduced concepts of referential content (2) and normative
evaluation (3) into the theory of information risks introducing
non-quantifiable (and by assumption non-scientific) attributes, I think
that this is more a prejudice than a principle that has been rigorously
demonstrated. Even if there is currently no widely accepted
non-reductionistic formalization of reference and significance within the
information sciences, this is not evidence that it cannot be achieved. One
thin


[Fis] Seasons greetings

2016-12-22 Thread Andrew Fingelkurts / BM-Science
Dear members of FIS,

 

A new year is coming, and we hope that 2017 will be a fruitful year, full of
happiness, new exciting ideas, new experiments and nice results, and new
great times for you!

 

Merry Christmas and Happy New Year 2017!

 

Greetings,

Andrew and Alexander

__ 
Dr. Andrew Fingelkurts, Ph.D.
Co-Head of Research 

BM-Science - Brain & Mind Technologies Research Centre
PL 77
FI-02601 Espoo, FINLAND
 
Tel. +358 9 541 4506 
 
andrew.fingelku...@bm-science.com
 
www.bm-science.com/team/fingelkurts.html

 

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] What is information? and What is life?

2016-12-22 Thread Dai Griffiths
>  Information is not “something out there” which “exists” otherwise 
than as our construct.


I agree with this. And I wonder to what extent our problems in 
discussing information come from our desire to shoe-horn many different 
phenomena into the same construct. It would be possible to disaggregate 
the construct. It would be possible to discuss the topics which we address on 
this list without using the word 'information'. We could discuss 
redundancy, variety, constraint, meaning, structural coupling, 
coordination, expectation, language, etc.


In what ways would our explanations be weakened?

In what ways might we gain in clarity?

If we were to go down this road, we would face the danger that our 
discussions might become (even more) remote from everyday human 
experience. But many scientific discussions are remote from everyday 
human experience.


Dai

On 20/12/16 08:26, Loet Leydesdorff wrote:


Dear colleagues,

A distribution contains uncertainty that can be measured in terms of 
bits of information.


Alternatively: the expected information content H of a probability
distribution is H = -Σ_i p_i log2(p_i).

H is further defined as probabilistic entropy using Gibbs's
formulation of the entropy, S = -k_B Σ_i p_i ln(p_i).
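
A minimal sketch of this measure (a Python illustration added here for
concreteness; it is not part of the original post and assumes only the
standard library):

import math

def shannon_entropy(p, base=2):
    """Expected information content H = -sum(p_i * log(p_i)) of a
    probability distribution, in bits when base=2."""
    if not math.isclose(sum(p), 1.0):
        raise ValueError("probabilities must sum to 1")
    return -sum(p_i * math.log(p_i, base) for p_i in p if p_i > 0)

# A uniform distribution over four outcomes carries 2 bits of uncertainty;
# a skewed distribution over the same four outcomes carries less.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
print(shannon_entropy([0.70, 0.10, 0.10, 0.10]))  # ~1.357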


This definition of information is an operational definition. In my 
opinion, we do not need an essentialistic definition by answering the 
question of “what is information?” As the discussion on this list 
demonstrates, one does not easily agree on an essential answer; one 
can answer the question “how is information defined?” Information is 
not “something out there” which “exists” otherwise than as our construct.


Using essentialistic definitions, the discussion tends not to move 
forward. For example, Stuart Kauffman’s and Bob Logan’s (2007) 
definition of information “as natural selection assembling the very 
constraints on the release of energy that then constitutes work and 
the propagation of organization.” I asked several times what this 
means and how one can measure this information. Hitherto, I only 
obtained the answer that colleagues who disagree with me will be 
cited. :-) Another answer was that “counting” may lead to populism. :-)


Best,

Loet



Loet Leydesdorff

Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)

l...@leydesdorff.net; http://www.leydesdorff.net/
Associate Faculty, SPRU, University of Sussex;

Guest Professor Zhejiang Univ., Hangzhou; Visiting Professor, ISTIC,
Beijing;

Visiting Professor, Birkbeck, University of London;

http://scholar.google.com/citations?user=ych9gNYJ&hl=en

*From:* Dick Stoute [mailto:dick.sto...@gmail.com]
*Sent:* Monday, December 19, 2016 12:48 PM
*To:* l...@leydesdorff.net
*Cc:* James Peters; u...@umces.edu; Alex Hankey; FIS Webinar
*Subject:* Re: [Fis] What is information? and What is life?

List,

Please allow me to respond to Loet about the definition of information 
stated below.


1. the definition of information as uncertainty is counter-intuitive 
("bizarre"); (p. 27)


I agree.  I struggled with this definition for a long time before 
realising that Shannon was really discussing "amount of information" 
or the number of bits needed to convey a message.  He was looking for 
a formula that would provide an accurate estimate of the number of 
bits needed to convey a message and realised that the amount of 
information (number of bits) needed to convey a message was dependent 
on the "amount" of uncertainty that had to be eliminated and so he 
equated these.
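
A worked instance of the equation described above (a Python sketch added
for illustration; it is not part of the original exchange): with N equally
likely messages, the bits needed to name one message equal the uncertainty
its receipt eliminates.

import math

# With N equally likely messages, naming one requires log2(N) bits, and
# log2(N) is exactly the entropy of the uniform distribution over them:
# the amount of uncertainty that receiving the message eliminates.
N = 8
bits_to_convey = math.log2(N)                                                # 3.0
uncertainty_eliminated = -sum((1 / N) * math.log2(1 / N) for _ in range(N))  # 3.0
assert math.isclose(bits_to_convey, uncertainty_eliminated)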


It makes sense to do this, but we must distinguish between "amount of 
information" and "information".  For example, we can measure amount of 
water in liters, but this does not tell us what water is and likewise 
the measure we use for "amount of information" does not tell us what 
information is. We can, for example equate the amount of water needed 
to fill a container with the volume of the container, but we should 
not think that water is therefore identical to an empty volume.  
Similarly we should not think that information is identical to 
uncertainty.


By equating the number of bits needed to convey a message with the 
"amount of uncertainty" that has to be eliminated Shannon, in effect, 
equated opposites so that he could get an estimate of the number of 
bits needed to eliminate the uncertainty.  We should not therefore 
consider that this equation establishes what information is.


Dick

On 18 December 2016 at 15:05, Loet Leydesdorff wrote:


Dear James and colleagues,

Weaver (1949) made two major remarks about his coauthor (Shannon)'s 
contribution:


1. the definition of information as uncertainty is counter-intuitive 
("bizarre"); (p. 27)


2. "In particular, information must not be conf