[Fis] TR: Principles of IS

2017-09-30 Thread Christophe Menant
Dear Krassimir,

Thanks for highlighting that aspect of the subject.
However, I'm not sure that “meaning” is enough to separate information from
data. A basic flow of bits can be considered meaningless data, but the same
flow can yield a meaningful sentence once correctly demodulated.
I would say that:
1) The meaning of a signal does not exist per se. It is agent dependent.
- A signal can be meaningful information created by an agent (human voice, ant 
pheromone).
- A signal can be meaningless (thunderstorm noise).
- A meaning can be generated by an agent receiving the signal 
(interpretation/meaning generation).
2) A given signal can generate different meanings when received by different 
agents (a thunderstorm noise generates different meanings for someone walking 
on the beach or for a person in a house).
3) The domain of efficiency of the meaning should be taken into account (human 
beings, ant-hill).
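The demodulation point above can be made concrete with a toy sketch (Python, purely illustrative; the bit string and the two interpretation schemes are assumptions, not part of the original argument): the same bit flow is a bare number to one agent and a greeting to another.

```python
# The same bit stream is "data" or "information" depending on the
# receiving agent's interpretation scheme (an illustrative toy, not a model).
bits = "01001000 01101001".replace(" ", "")  # a raw flow of bits

# Agent A has no demodulation scheme: the flow is just a number.
as_number = int(bits, 2)

# Agent B demodulates as 8-bit ASCII characters: a meaningful greeting.
chars = [chr(int(bits[i:i+8], 2)) for i in range(0, len(bits), 8)]
as_text = "".join(chars)

print(as_number)  # 18537
print(as_text)    # Hi
```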

Regarding your positioning of data, I'm not sure I understand your
"reflections without meaning".
Could you say a bit more?

All the best
Christophe


From: Krassimir Markov
Sent: Saturday 30 September 2017 11:20
To: christophe.men...@hotmail.fr
Cc: Foundation of Information Science
Subject: Re: [Fis] TR: Principles of IS

Dear Christophe and FIS Colleagues,

I agree with the idea of meaning.

The only thing I would add is the following:

There are two types of reflections:

1. Reflections without meaning called DATA;

2. Reflections with meaning called INFORMATION.

Friendly greetings
Krassimir


--
Krassimir Markov
Director
ITHEA Institute of Information Theories and Applications
Sofia, Bulgaria
presid...@ithea.org
www.ithea.org








Dear FISers,


A hot discussion indeed...
We can all agree that perspectives on information depend on the context.
Physics, mathematics, thermodynamics, biology, psychology, philosophy, AI,
...

But these many contexts have a common backbone: They are part of the
evolution of our universe and of its understanding, part of its increasing
complexity from the Big Bang to us humans.
And taking evolution as a reading grid allows us to begin with the simple.
As proposed in a previous post, we care about information ONLY because it
can be meaningful. Take away the concept of meaning, and the concept of
information has no reason to exist.
And our great discussions would just not exist.
Now, Evolution + Meaning => Evolution of meaning. As already highlighted,
this looks to me like an important point for the principles of IS.
As you may remember, there is a presentation on that subject
(http://www.mdpi.com/2504-3900/1/3/211,
https://philpapers.org/rec/MENICA-2).
The evolution of the universe is a great subject where the big questions
concern the transitions: energy => matter => life => self-consciousness =>
...
And I feel that one way to address these transitions is with local
constraints as sources of meaning generation.
Best

Christophe



From: Fis  on behalf of tozziart...@libero.it 
Sent: Friday 29 September 2017 14:01
To: fis
Subject: Re: [Fis] Principles of IS

Dear FISers,
Hi!
...a very hot discussion...
I think that it is not useful to talk about Aristotle, Plato and Ortega y
Gasset in the modern context of information... their philosophical, not
scientific, approach, although marvelous, does not provide insights into a
purely scientific issue such as the information we are talking about...

Once and forever, it must be clear that information is a physical quantity.
Please read (it is not a paper of mine!):
Street S. 2016. Neurobiology as information physics. Frontiers in
Systems Neuroscience.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5108784/

In short, Street shows how information can be clearly defined in terms of
Bekenstein entropy!
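For orientation, the Bekenstein bound referred to here limits the information content of a region of radius R holding energy E to I <= 2*pi*R*E / (hbar*c*ln 2) bits. A back-of-envelope sketch (Python; the radius and mass values are illustrative assumptions, roughly a human-brain-sized system):

```python
import math

# Bekenstein bound: I <= 2*pi*R*E / (hbar * c * ln 2) bits for a system
# of radius R (metres) and total energy E (joules).
HBAR = 1.0545718e-34  # reduced Planck constant, J s
C = 2.998e8           # speed of light, m/s

def bekenstein_bits(radius_m, energy_j):
    """Upper bound on information content, in bits."""
    return 2 * math.pi * radius_m * energy_j / (HBAR * C * math.log(2))

# Illustrative values: R ~ 0.1 m, mass ~ 1.4 kg, E taken as rest-mass energy.
E = 1.4 * C**2
print(f"{bekenstein_bits(0.1, E):.2e}")  # on the order of 1e42 bits
```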

Sorry,
and BW...


Arturo Tozzi

AA Professor Physics, University North Texas

Pediatrician ASL Na2 Nord, Italy

Comput Intell Lab, University Manitoba

http://arturotozzi.webnode.it/


--
Sent from Libero Mail for Android

Friday, 29 September 2017, 01:31 PM +02:00 from Rafael Capurro
raf...@capurro.de:


Dear Pedro,

thanks for food for thought. When talking about communication we should
not forget that Wiener defines cybernetics as "the theory of messages"
(not: as the theory of information) (The Human Use of Human Beings, London
1989, p. 15, p. 77 "cybernetics, or the theory of messages" et passim).
Even for Shannon, the (undefined) concept of message 'as' what is
transmitted (which is not information) is of paramount importance. And so
also at the level of cell-cell communication.

The code or the difference message/messenger is, I think, a key for
interpreting biological processes. In this sense, message/messenger are
'archai' (in the Aristotelian sense) for different sciences (no
reductionism if we want to focus on the differences between the
phenomena). 'Archai' are NOT 'general concepts' (as you suggest) but
originating forces that underlie the phenomena in their manifestations
'as' this or that.

From this perspective, information (following Luhmann) is the process of
interpretation taking place at the receiver. When a cell, excuse me these
thoughts from a non-biologist, receives a message transmitted by a
messenger, then the main issue, from the perspective of the cell, is to
interpret this message (with a special address or 'form' supposed to
'in-form' the cell) 'as' being relevant for it. Suppose this
interpretation is wrong in the sense that the message causes death (to the
cell or the whole organism); then the re-cognition system of the cell (its
immune system also) fails. Biological fake news, so to speak, with
mortal consequences due to failures in communication.

best

Rafael

Dear FISers,

I also agree with Ji and John Torday about the tight relationship between
information and communication. Actually Principle 5 was stating:
"Communication/information exchanges among adaptive life-cycles underlie
the complexity of biological organizations at all scales." However, let me
suggest that we do not enter immediately into the discussion of cell-cell
communication, because it is very important and perhaps demands some more
exchanges on the preliminary info

Re: [Fis] Principles of IS

2017-09-30 Thread Michel Petitjean
Dear Arturo, dear FISers,

Citing Beck (Contemp. Phys. 2009, 50, 495–510. doi:
10.1080/00107510902823517), Street wrote: << information can be
defined as a negation of thermodynamic entropy (Beck, 2009): I ≡ -S >>.
But Beck wrote about information theory (i.e. the probabilistic one):
<< One then defines the entropy S as 'missing information', i.e. S = -I >>.
Thus it is not what Street claimed: (i) Beck referred to probability
theory (no thermodynamics there), and (ii) Beck defined S from I, not
I from S.
So Street's claim is doubtful, if not false.
By the way, the publisher of "Frontiers in Systems Neuroscience" was
classified as predatory in Beall's list, but let us set that aside.

Beck is in agreement with what is said on
https://en.wikipedia.org/wiki/Entropy_(information_theory): << The
inspiration for adopting the word entropy in information theory came
from the close resemblance between Shannon's formula and very similar
known formulae from statistical mechanics. >>
As far as I know, what is related on that Wikipedia page is a historical fact.
Entropy has thus two meanings: a physical quantity in thermodynamics,
and a math quantity in the framework of modeling communication
science.
Information is also a math quantity in the framework of modeling
communication science: it is a modeling concept which is not physical.
Playing again with words, some people introduced the term information
back into thermodynamics, and thus concluded that information is physical.
In my opinion this is not good practice: it adds confusion.
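The "close resemblance" quoted above is purely formal: Shannon's H = -sum_i p_i log2 p_i and Gibbs's S = -k_B sum_i p_i ln p_i share one functional form, differing only in the constant and the log base. A minimal sketch of the two quantities side by side (Python, illustrative):

```python
import math

def shannon_entropy(probs):
    """Information-theoretic entropy: H = -sum p_i log2 p_i, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """Statistical-mechanics entropy: S = -k_B sum p_i ln p_i, in J/K.
    Same functional form as above; only k_B and the log base differ."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

p = [0.5, 0.25, 0.25]
print(shannon_entropy(p))  # 1.5 (bits)
```

Both are functionals of a probability distribution; nothing in the formula itself makes the first one a physical quantity.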

Best regards,

Michel.

Michel Petitjean
MTi, INSERM UMR-S 973, University Paris 7,
35 rue Helene Brion, 75205 Paris Cedex 13, France.
Phone: +331 5727 8434; Fax: +331 5727 8372
E-mail: petitjean.chi...@gmail.com (preferred),
michel.petitj...@univ-paris-diderot.fr
http://petitjeanmichel.free.fr/itoweb.petitjean.html


___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis




