Dear colleagues,

Using the concept of "data", one loads the discussion with an ontology. "Data" is "given" or "revealed" by God. (In antiquity, the holy was hidden and guarded by priests, but Christianity brought the idea of Revelation.) In physics, one talks about "data" and "nature" as given.

It seems to me that we don't need this in a discussion about information. Distributions contain information or, in other words, the expected information content of a distribution can be expressed in bits (dits, nits, etc.) of information. I assume that this is equivalent to Prof. Zhong's object information. The specification of the object ("what is distributed") provides the information with meaning. "In particular, information must not be confused with meaning." (Weaver, 1949, p. 8).
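For concreteness, the expected information content of a distribution is its Shannon entropy, with the unit fixed by the logarithm base; a minimal sketch in Python (the toy distribution is only an illustration of mine, not an example from Prof. Zhong's work):

import math

def expected_information(p, base=2.0):
    # Shannon entropy H(p) = -sum p_i * log(p_i): the expected information
    # content of the distribution p, in units set by the log base
    # (2 -> bits, 10 -> dits, e -> nits/nats).
    return -sum(q * math.log(q, base) for q in p if q > 0)

p = [0.5, 0.25, 0.25]   # a toy distribution; "what is distributed" is left unspecified
print(expected_information(p))            # 1.5 bits
print(expected_information(p, math.e))    # ~1.04 nits
print(expected_information(p, 10))        # ~0.45 dits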

Best,
Loet

PS. When there is no "given," but only constructs, uncertainty (that is, Shannon-type information) prevails. Instead of a cosmology ("given"), one moves to a chaology of different constructs. The constructs differ in terms of "what is distributed," that is, the specification of "the object." L.

--------------------------------------------------------------------------------
Loet Leydesdorff

Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)

l...@leydesdorff.net; http://www.leydesdorff.net/
Associate Faculty, SPRU, University of Sussex (http://www.sussex.ac.uk/spru/);

Guest Professor, Zhejiang University (http://www.zju.edu.cn/english/), Hangzhou; Visiting Professor, ISTIC (http://www.istic.ac.cn/Eng/brief_en.html), Beijing;

Visiting Fellow, Birkbeck, University of London (http://www.bbk.ac.uk/);

http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en


------ Original Message ------
From: "Alex Hankey" <alexhan...@gmail.com>
To: "Krassimir Markov" <mar...@foibg.com>; "FIS Webinar" <fis@listas.unizar.es>
Sent: 10/3/2017 8:08:18 PM
Subject: Re: [Fis] If "data = information", why we need both concepts?

This is a titbit in support of Krassimir Markov.
There was a very interesting paper by Freeman Dyson in about 1970, about which he gave a colloquium at the MIT Department of Physics that I attended. Dyson had analyzed data taken from higher nuclear energy levels in particular bands far above the ground state, probably using the Mössbauer effect if I remember rightly, because it has a very high resolution.

Dyson's question was simple: does the data contain any useful information?
His analysis was that the eigenvalues represented by this selection of data were no different from those of a matrix with random entries.
The data were equivalent to a set of random numbers.

Dyson therefore concluded that 'The Data Contained No Useful Information' for the purpose of understanding the nuclear physics involved.
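A brief numerical sketch of the kind of comparison behind this conclusion (an illustration with surrogate data, not Dyson's actual analysis): if the nearest-neighbour spacing statistics of measured levels are indistinguishable from those of a random matrix, the level data carry no information beyond randomness itself.

import numpy as np

rng = np.random.default_rng(0)

def mean_spacing_ratio(levels):
    # Nearest-neighbour spacing ratios r_n = min(s_n, s_{n+1}) / max(s_n, s_{n+1});
    # their mean is ~0.53 for GOE (random-matrix) spectra and ~0.39 for
    # uncorrelated (Poisson) levels.
    s = np.diff(np.sort(levels))
    r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])
    return r.mean()

n = 2000

# Surrogate "data": eigenvalues of a random real symmetric (GOE) matrix.
a = rng.normal(size=(n, n))
goe_levels = np.linalg.eigvalsh((a + a.T) / 2.0)

# Levels with no correlations at all (a Poisson sequence).
poisson_levels = np.cumsum(rng.exponential(size=n))

print("GOE mean spacing ratio:    ", mean_spacing_ratio(goe_levels))
print("Poisson mean spacing ratio:", mean_spacing_ratio(poisson_levels))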



On 3 October 2017 at 16:46, Krassimir Markov <mar...@foibg.com> wrote:
Dear John and FIS Colleagues,

I am a computer science specialist, and I never take data to be information.

For non-specialists it may be normal that "data is often taken to be information," but this is not scientific reasoning.

A simple question: if "data = information," why do we need both concepts?


Friendly greetings

Krassimir


Dear list,


As Floridi points out in his Information (Oxford: Oxford University Press, 2010), a volume in the Very Short Introduction series, data is often taken to be information. If so, then the below distinction is somewhat arbitrary. It may be useful or not. I think that for some circumstances it is useful, but for others it is misleading, especially if we are trying to come to grips with what meaning is. I am not sure there is ever data without interpretation (it seems to me that it is always assumed to be about something). There are, however, various degrees and depths of interpretation, and we may have data at a more abstract level that is interpreted as meaning something less abstract, such as pointer readings of a barometer and air pressure. The pointer readings are signs of air pressure. Following C.S. Peirce, all signs have an interpretant. We can ignore this (abstraction) and deal with just the pointer readings of a particular design of gauge, and take this to be the data, but even the pointer readings have an important contextual element, being of a particular kind of gauge, and that also determines an interpretant. Pointer readings alone are not data; they are merely numbers (which also, of course, have an interpretant that is even more abstract).

So I think the data/information distinction needs to be made clear in each
case, if it is to be used.

Note that I believe that there is information that is independent of mind, but the above points still hold once we start into issues of observation. My belief is based on an explanatory inference that must be tested (and
also be useful in this context). I believe that the idea of mind
independent information has been tested, and is useful, but I am not going
to go into that further here.


Regards,

John

PS, please note that my university email was inadvertently wiped out, so I am currently using the above email, also the alias coll...@ncf.ca. If anyone has wondered why their mail to me has been returned, this is why.




On 2017/09/30 11:20 AM, Krassimir Markov wrote:

Dear Christophe and FIS Colleagues,

I agree with the idea of meaning.

The only thing I would add is the following:

There are two types of reflections:

1. Reflections without meaning, called DATA;

2. Reflections with meaning, called INFORMATION.

Friendly greetings
Krassimir


------------------------------
Krassimir Markov
Director
ITHEA Institute of Information Theories and Applications
Sofia, Bulgaria
presid...@ithea.org
www.ithea.org





Dear FISers,


A hot discussion indeed...
We can all agree that perspectives on information depend on the context: physics, mathematics, thermodynamics, biology, psychology, philosophy, AI, ...

But these many contexts have a common backbone: they are part of the evolution of our universe and of its understanding, part of its increasing complexity from the Big Bang to us humans.
And taking evolution as a reading grid allows us to begin with the simple. As proposed in a previous post, we care about information ONLY because it can be meaningful. Take away the concept of meaning, and the concept of information has no reason to exist.
And our great discussions would just not exist...
Now, Evolution + Meaning => Evolution of meaning. As already highlighted, this looks to me important among the principles of IS.
As you may remember, there is a presentation on that subject (http://www.mdpi.com/2504-3900/1/3/211, https://philpapers.org/rec/MENICA-2). The evolution of the universe is a great subject where the big questions are with the transitions: energy => matter => life => self-consciousness => ...
And I feel that one way to address these transitions is with local constraints as sources of meaning generation.
Best

Christophe

--------------------------------------------------------------------------------

From: Fis <fis-boun...@listas.unizar.es> on behalf of
tozziart...@libero.it
Sent: Friday, 29 September 2017 14:01
To: fis
Subject: Re: [Fis] Principles of IS

Dear FISers,
Hi!
...a very hot discussion...
I think that it is not useful to talk about Aristotle, Plato and Ortega y Gasset in the modern context of information... their philosophical, not scientific, approach, although marvelous, does not provide insights into a purely scientific issue such as the information we are talking about...

Once and forever, it must be clear that information is a physical quantity.
Please read (it is not a paper of mine!):
Street S. 2016. Neurobiology as information physics. Frontiers in Systems Neuroscience.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5108784/

In short, Street shows how information can be clearly defined in terms of
Bekenstein entropy!
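For reference, the Bekenstein bound that underlies this identification can be written as follows (the standard form of the bound, not a summary of Street's own derivation):

S \le \frac{2\pi k_B R E}{\hbar c},
\qquad \text{equivalently} \qquad
I \le \frac{2\pi R E}{\hbar c \ln 2}\ \text{bits},

where R is the radius of a sphere enclosing the system, E its total energy, k_B Boltzmann's constant, \hbar the reduced Planck constant, and c the speed of light.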

Sorry,
and BW...


Arturo Tozzi

AA Professor Physics, University North Texas

Pediatrician ASL Na2 Nord, Italy

Comput Intell Lab, University Manitoba

http://arturotozzi.webnode.it/


--
Sent from Libero Mail for Android

Friday, 29 September 2017, 01:31 PM +02:00, from Rafael Capurro
raf...@capurro.de:


Dear Pedro,

thanks for the food for thought. When talking about communication we should not forget that Wiener defines cybernetics as "the theory of messages" (not: as the theory of information) (The Human Use of Human Beings, London 1989, p. 15, p. 77 "cybernetics, or the theory of messages," et passim). Even for Shannon, the use of the (undefined) concept of the message 'as' what is transmitted (which is not information) is of paramount importance. And so also at the level of cell-cell communication.

The code, or the difference message/messenger, is, I think, a key for interpreting biological processes. In this sense, message/messenger are 'archai' (in the Aristotelian sense) for different sciences (no reductionism if we want to focus on the differences between the phenomena). 'Archai' are NOT 'general concepts' (as you suggest) but originating forces that underlie the phenomena in their manifestations 'as' this or that.

From this perspective, information (following Luhmann) is the process of interpretation taking place at the receiver. When a cell (excuse these thoughts from a non-biologist) receives a message transmitted by a messenger, then the main issue, from the perspective of the cell, is to interpret this message (with a special address or 'form' supposed to 'in-form' the cell) 'as' being relevant for it. Suppose this interpretation is wrong in the sense that the message causes death (to the cell or the whole organism); then the re-cognition system (its immune system also) of the cell fails. Biological fake news, so to speak, with mortal consequences due to failures in communication.

best

Rafael

Dear FISers,

I also agree with Ji and John Torday about the tight relationship between information and communication. Actually, Principle 5 was stating: "Communication/information exchanges among adaptive life-cycles underlie the complexity of biological organizations at all scales." However, let me suggest that we do not enter immediately into the discussion of cell-cell communication, because it is very important and perhaps demands some more exchanges on the preliminary info matters.

May I return to principles and Aristotle? I think that Rafael and Michel are talking more about principles as general concepts than about principles as those peculiar foundational items that allow the beginning of a new scientific discourse. Communication between the principles of the different disciplines is factually impossible (or utterly irrelevant): think of the connection between Euclidean geometry and politics, biology, etc. I think Ortega makes the right interpretation of that. When Aristotle makes the first classification of the sciences, he is continuing with that very idea: theoretical sciences, experimental or productive sciences, and applied or practical sciences--with an emphasis on the explanatory theoretical power of both physics and mathematics (ehm, Arturo will agree fully with him). I have revisited my old reading notes and I think that the Aristotelian confrontation with the Platonic approach to the unity of knowledge that Ortega comments on is extremely interesting for our current debate on information principles.

There is another important aspect related to the first three principles in my original message (see at the bottom). It would be rather strategic to achieve a consensus on the futility of struggling for a universal information definition. Then, the tautology of the first principle ("info is info") is a way to sidestep that definitional aspect. Nevertheless, it is clear that interesting notions of information may be provided relative to some particular domains or endeavors. For instance, "propagating influence" by our colleague Bob Logan, Stuart Kauffman and others, and many other notions or partial definitions as well--I include my own "distinction on the adjacent" as valuable for the informational approach in biology. Is this "indefinability" an undesirable aspect? To take an example from physics, time appears to be the most undefinable of terms, but it shows up in almost all equations and theories of physics... Principle three means that one can do a lot of things with info without the need of defining it.

As for the subject that is usually coupled to the info term, as our discussion advances further, entering the "information flows" will tend to clarify things. The open-ended relationship with the environment that the "informational entities" maintain via the channeling of those info flows--it is a very special coupling indeed--allows these entities the further channeling of the "energy flows" for self-maintenance. Think of the living cells and their signaling systems, or think of our "info" societies. Harold Morowitz's "energy flow in biology" has not yet been paralleled by a similar "information flow in biology". One is optimistic that the recent incorporation of John Torday, plus Sungchul Ji and others, may lead to a thought-collective capable of illuminating the panorama of biological information.

(shouldn't we make an effort to incorporate other relevant parties, also
interested in biological information, to this discussion?)

Best wishes--Pedro

On 23/09/2017 at 21:27, Sungchul Ji wrote:

Hi Fisers,




I agree.

Communication may be the key concept in developing a theory of information.




Just as it is impossible to define what energy is without defining the thermodynamic system under consideration (e.g., energy is conserved only in an isolated system and not in closed or open systems; the Gibbs free energy content decreases only when a spontaneous process occurs in non-isolated systems at constant temperature and pressure, etc.), so it may be that 'information' cannot be defined rigorously without first defining the "communication system" under consideration. If this analogy is true, we can anticipate that, just as there are many different kinds of energies depending on the characteristics of the thermodynamic systems involved, so there may be many different kinds of 'informations' depending on the nature of the communication systems under consideration.
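For reference, the two thermodynamic statements invoked in this analogy have compact standard forms (added here only as a reminder; these are textbook results, not formulas of Sung's):

\Delta U = 0 \quad \text{(isolated system: energy is conserved)},
\qquad
(\Delta G)_{T,P} = \Delta H - T\,\Delta S \le 0 \quad \text{(spontaneous process at constant } T \text{ and } P\text{)}.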




The properties or behaviors of all thermodynamic systems depend on their environment, and there are three system-environment relations -- (i) isolated (e.g., the Universe, or a thermos bottle), (ii) closed (e.g., a refrigerator), and (iii) open (e.g., the biosphere, living cells).




It is interesting to note that all communication systems (e.g., cells, organs, animals, humans) may embody the ITR (Irreducible Triadic Relation), which I have found convenient to represent diagrammatically as a 3-node network of arrows, as shown below:




          f             g
    A ----------> B ----------> C
    |                           ^
    |                           |
    |___________________________|
                  h




Figure 1. The Irreducible Triadic Relation (ITR) of C. S. Peirce (1839-1914) represented as a 3-node, closed and directed network. The arrows form the commutative triangle of category theory, i.e., operation f followed by operation g leads to the same result as operation h, here denoted as fxg = h.

f = information production; g = information interpretation; h = correspondence or information flow. Please note that processes f and g are driven by exergonic physicochemical processes, and h requires a pre-existing code or language that acts as the rule mapping A to C.
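A toy sketch of the commutativity condition fxg = h in Figure 1, in Python (the mappings are hypothetical stand-ins, loosely labeled after the "Cells" row of Table 1 below):

# f: information production (A -> B); g: information interpretation (B -> C);
# h: correspondence / information flow (A -> C). The triangle commutes when
# applying f and then g gives the same result as applying h directly.
f = {"gene_1": "protein_1", "gene_2": "protein_2"}
g = {"protein_1": "reaction_1", "protein_2": "reaction_2"}
h = {"gene_1": "reaction_1", "gene_2": "reaction_2"}

for a in f:
    assert g[f[a]] == h[a], "the ITR triangle does not commute"
print("f followed by g agrees with h: the triangle commutes")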




Again, just as generations of thermodynamicists in the 19th-20th centuries have defined various kinds of "energies" (enthalpy, Helmholtz free energy, Gibbs free energy) applicable to different kinds of thermodynamic systems, so 'information scientists' of the 21st century may have the golden opportunity to define as many kinds of 'informations' as needed for the different kinds of "communication systems" of their interest, some examples of which are presented in Table 1.




________________________________________________________________________

Table 1. A 'parametric' definition of information based on the values of
the three nodes of the ITR, Figure 1.
________________________________________________________________________

Communication system           A             B               C
(Information)
________________________________________________________________________

Cells                          DNA/RNA       Proteins        Chemical reactions
(Biological informations)                                    or chemical waves
________________________________________________________________________

Humans                         Sender        Message         Receiver
(Linguistic informations)
________________________________________________________________________

Signs                          Object        Representamen   Interpretant
(Semiotic informations, or
'Universal informations' (?))
________________________________________________________________________




With all the best.




Sung





--------------------------------------------------------------------------------

From: Fis <fis-boun...@listas.unizar.es> on behalf of JOHN TORDAY
<jtor...@ucla.edu>
Sent: Saturday, September 23, 2017 10:44:33 AM
To: fis@listas.unizar.es
Subject: [Fis] Principles of IS

Dear Fis, I am a newcomer to this discussion, but suffice it to say that I have spent the last 20 years trying to understand how and why physiology has evolved. I stumbled upon your website because Pedro Marijuan had reviewed a paper of ours on 'ambiguity' that was recently published in Progr Biophys Mol Biol, July 22, 2017, FYI.
Cell-cell communication is the basis for molecular embryology/morphogenesis. This may seem tangential at best to your discussion of Information Science, but if you'll bear with me I will get to the point. In my (humble) opinion, information is the 'language' of evolution, but communication of information as a process is the mechanism. In my reduction of evolution to communication, it comes down to the interface between physics and biology, which was formed when the first cell delineated its internal environment (Claude Bernard, Walter B. Cannon) from the outside environment. From that point on, the dialog between the environment and the organism has been ongoing, the organism internalizing the external environment and compartmentalizing it to form what we recognize as physiology (Endosymbiosis Theory). Much of this thinking has come from new scientific evidence for Lamarckian epigenetic inheritance from my laboratory and those of many others: how the organism internalizes information from the environment by chemically changing the information in DNA in the egg and sperm, and then in the zygote and offspring, across generations. So here we have a fundamental reason to reconsider what 'information' actually means biologically. If you are interested in any of my publications on this subject please let me know (jtor...@ucla.edu). Thank you for any interest you may have in this alternative way of thinking about information, communication and evolution.




_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis



Dear FIS Colleagues,

As promised, herewith the "10 principles of information science". A couple of preliminary comments may be in order.
First, what is in general the role of principles in science? I was
motivated by the unfinished work of philosopher Ortega y Gasset, "The idea
of principle in Leibniz and the evolution of deductive theory"
(posthumously published in 1958). Our tentative information science seems
to be very different from other sciences, rather multifarious in
appearance and concepts, and cavalierly moving from scale to scale. What
could be the specific role of principles herein? Rather than opening
homogeneous realms for conceptual development, these information
principles would appear as a sort of "portals" that connect with essential topics of other disciplines in the different organization layers, but at the same time they should try to be consistent with each other and provide
a coherent vision of the information world.
And second, about organizing the present discussion, I bet I was too optimistic with the commentators scheme. In any case, to have a first glance at the whole scheme, the opinions of philosophers would be very interesting. In order to warm up the discussion, may I ask John Collier, Joseph Brenner and Rafael Capurro to send some initial comments / criticisms? Later on, if the commentators idea flies, Koichiro Matsuno and Wolfgang Hofkirchner would be very valuable voices to put a perspectival end to this info principles discussion (both attended the bygone FIS 1994 Madrid conference)...
But this is the FIS list, unpredictable between the frozen states and the chaotic states! So everybody is invited to go ahead on their own, with only the customary limitation of two messages per week.

Best wishes, have a good weekend --Pedro


10 PRINCIPLES OF INFORMATION SCIENCE

1. Information is information, neither matter nor energy.

2. Information is comprehended into structures, patterns, messages, or flows.

3. Information can be recognized, can be measured, and can be processed
(either computationally or non-computationally).

4. Information flows are essential organizers of life's self-production
processes--anticipating, shaping, and mixing up with the accompanying
energy flows.

5. Communication/information exchanges among adaptive life-cycles underlie
the complexity of biological organizations at all scales.

6. It is symbolic language that conveys the essential communication exchanges of the human species--and constitutes the core of its "social nature."

7. Human information may be systematically converted into efficient
knowledge, by following the "knowledge instinct" and further up by
applying rigorous methodologies.

8. Human cognitive limitations on knowledge accumulation are partially
overcome via the social organization of "knowledge ecologies."


9. Knowledge circulates and recombines socially, in a continuous
actualization that involves "creative destruction" of fields and
disciplines: the intellectual Ars Magna.


10. Information science proposes a new, radical vision on the information
and knowledge flows that support individual lives, with profound
consequences for scientific-philosophical practice and for social
governance.





--
John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal, Durban
Collier web page



--------------------------------------------------------------------------------


_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis



--
Alex Hankey M.A. (Cantab.) PhD (M.I.T.)
Distinguished Professor of Yoga and Physical Science,
SVYASA, Eknath Bhavan, 19 Gavipuram Circle
Bangalore 560019, Karnataka, India
Mobile (Intn'l): +44 7710 534195
Mobile (India) +91 900 800 8789
____________________________________________________________

2015 JPBMB Special Issue on Integral Biomathics: Life Sciences, Mathematics and Phenomenological Philosophy <http://www.sciencedirect.com/science/journal/00796107/119/3>
_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
