Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

2017-10-14 Thread Bob Logan
Hello Sung and Arturo. Entropy is a measure of disorder, and ΔS > 0. If entropy is zero at T = 0 K because there is no disorder at absolute zero, then entropy can only increase from T = 0 K. If that is the case, how can entropy ever be negative?
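
A minimal sketch of the standard textbook reasoning behind this point, assuming a system with positive heat capacity C(T) that is heated reversibly from absolute zero:

    S(T) = S(0) + \int_0^T \frac{C(T')}{T'}\, dT'

With S(0) = 0 (the Third Law) and C(T') > 0, the integral is non-negative, so the equilibrium thermodynamic entropy of such a system can never fall below zero.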

Arturo asked me to share with the group a private email I sent to him about the elephant and the three blind men. Because we are limited to two posts per week, I waited until I had something else to post. So here, at Arturo’s request, is our correspondence:

Dear Bob, 
A nice story!
I think you must share it with FISers.

...even if I think that, while one blind man is touching the elephant, another 
is touching a lion, and another a deer...
In other words, your tale says that a single elephant does exist, while I'm not 
so sure...
However, don't worry, I will not make a public criticism of your nice tale!

--
Sent from Libero Mail for Android

Friday, 6 October 2017, 02:15 PM +02:00, from Bob Logan lo...@physics.utoronto.ca:

Dear Arturo - and thanks for your feedback.

The discussion of info on the FIS list (re what is info) is like the three blind men inspecting an elephant. "It is a rope," said the blind man holding the tail; "no, it is a snake," said the blind man holding the trunk; "no, it is a tree," said the blind man touching the elephant’s leg.

I refrained from using this story on the FIS list as it might come off as insulting - what do you think - should I share it with the group?

best wishes - Bob


__

Robert K. Logan
Prof. Emeritus - Physics - U. of Toronto 
Fellow University of St. Michael's College
Chief Scientist - sLab at OCAD
http://utoronto.academia.edu/RobertKLogan
www.researchgate.net/profile/Robert_Logan5/publications
https://www.physics.utoronto.ca/people/homepages/logan


Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

2017-10-14 Thread Sungchul Ji
Hi Arturo,


I agree.  Entropy can be negative MATHEMATICALLY, as Schroedinger assumed.

But what I am claiming is that this may be a mathematical artifact, since, according to the Third Law of Thermodynamics, there is no negative entropy.


All the best.


Sung



From: tozziart...@libero.it 
Sent: Friday, October 13, 2017 6:02 PM
To: Sungchul Ji
Cc: fis@listas.unizar.es
Subject: Re[2]: Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION


Dear Sung,
I'm sorry, but the "Unreasonable Effectiveness of Mathematics" still holds true.
Forget philosophical concepts like Yin and Yang, because, in some cases and contexts, entropy is negative.
Just to give an example:
"Since the entropy H(S|O) can now become negative, erasing a system can result 
in a net gain of work (and a corresponding cooling of the environment)."

https://www.nature.com/nature/journal/v474/n7349/full/nature10123.html
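
For a rough sense of scale, here is a minimal Python sketch (my own illustration, assuming the standard Landauer rate of kB*T*ln(2) of work per bit at an arbitrarily chosen temperature; it is not a formula quoted from the cited paper). If the conditional entropy H(S|O) is negative, the same per-bit rate read with the opposite sign indicates the order of magnitude of the work gain the quoted sentence refers to:

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300.0            # illustrative temperature, K

    # Landauer rate: minimum work to erase one bit at temperature T (~2.87e-21 J at 300 K)
    landauer_per_bit = k_B * T * math.log(2)

    # Hypothetical conditional entropy of -1 bit: the per-bit erasure "cost" changes sign,
    # i.e. erasure yields a net gain of work, as in the sentence quoted above.
    H_conditional = -1.0
    print(landauer_per_bit)                  # ~2.87e-21 J per bit
    print(H_conditional * landauer_per_bit)  # ~ -2.87e-21 J (negative => work gained)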

--
Sent from Libero Mail for Android

Friday, 13 October 2017, 10:11 PM +02:00, from Sungchul Ji s...@pharmacy.rutgers.edu:


Hi Arturo,


(1)  I don't understand where you got (or how you can justify) S = 1 J/K in 
your statement,


"With the same probability mass function, you can see that H = S/(ln(2)*kB), so 
setting S = 1J/K gives a Shannon entropy of 1.045×1023 bits."


(2) I can see how one can get H = S/(ln(2)*k_B) mathematically, but what does this equality mean physically?

(3) This reminds me of what Schroedinger did when he came up with the conclusion that "negative entropy" is equivalent to "order", which led to Brillouin's so-called "Negentropy Principle of Information" (NPI) [1, 2].


Simply by multiplying both sides of the Boltzmann equation by negative one, Schroedinger obtained the following formula:


 -S = -k ln W = k ln(1/W)


and then equating W with disorder, D, led him to


-S = k ln(1/D).


Since (1/D) can be interpreted as the opposite of "disorder", namely, "order", 
he concluded that


"negative entropy = order".


As you can see, the above derivation is mathematically sound but the result 
violates the Third Law of Thermodynamics, according to which thermodynamic 
entropy cannot be less than zero.


Thus, in 2012 I was led to formulate what I called the "Schroedinger paradox" as follows [3]:


"Schroedinger's paradox refers to the mathematical equations, concepts, or 
general statements that are formally true but physically meaningless."


(4) If my argument in (3) is valid, this may provide an example of what may be 
called


the "Unreasonable Ineffectiveness of Mathematics"


which, together with Wigner's "Unreasonable Effectiveness of Mathematics", may constitute a Yin-Yang pair of mathematics.


All the best.


Sung

References:
   [1] Brillouin, L. (1953). Negentropy Principle of Information. J. Applied Phys. 24(9), 1152-1163.
   [2] Brillouin, L. (1956). Science and Information Theory. Academic Press, Inc., New York, pp. 152-156.
   [3] Ji, S. (2012). The Third Law of Thermodynamics and “Schroedinger’s Paradox”. In: Molecular Theory of the Living Cell: Concepts, Molecular Mechanisms, and Biomedical Applications. Springer, New York, pp. 12-15. PDF at http://www.conformon.net/wp-content/uploads/2014/03/Schroedinger_paradox.pdf

From: tozziart...@libero.it
Sent: Friday, October 13, 2017 4:43 AM
To: Sungchul Ji; fis@listas.unizar.es
Subject: R: Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

Dear Sung,
One J/K corresponds to 1.045×10^23 bits.

Indeed, the Gibbs entropy formula states that thermodynamic entropy S equals kB*sum[pi*ln(1/pi)], with units of J/K, where kB is the Boltzmann constant and pi is the probability of microstate i. On the other hand, the Shannon entropy is defined as H = sum[pi*log2(1/pi)], with units of bits. With the same probability mass function, you can see that H = S/(ln(2)*kB), so setting S = 1 J/K gives a Shannon entropy of 1.045×10^23 bits.
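
To make the conversion easy to check, here is a minimal Python sketch (the probability mass function is an arbitrary illustration, not taken from these posts) of the relation H = S/(kB*ln(2)):

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def entropy_JK_to_bits(S):
        # Convert thermodynamic entropy S in J/K to Shannon entropy in bits.
        return S / (k_B * math.log(2))

    print(entropy_JK_to_bits(1.0))  # ~1.045e+23 bits, the figure quoted above

    # Cross-check with an arbitrary probability mass function:
    p = [0.5, 0.25, 0.25]
    S = k_B * sum(pi * math.log(1.0 / pi) for pi in p)   # Gibbs entropy, J/K
    H = sum(pi * math.log2(1.0 / pi) for pi in p)        # Shannon entropy, bits
    print(abs(H - entropy_JK_to_bits(S)) < 1e-9)         # True: H = S/(kB*ln(2))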

On the other hand, the energy consumption per bit of data on the Internet is around 75 μJ at low access rates and decreases to around 2-4 μJ at an access rate of 100 Mb/s.
see:

[Fis] Learn to Predict the Colour of Garden Peas in Twelve Easy Steps

2017-10-14 Thread Karl Javorszky
   1. A historic parallel: a cultural handicap

We are at Mendel again. There is an unmistakable parallel between a single person in the 19th century trying to draw the attention of members of a learned society to general rules and principles that are discernible on multitudes of objects, and a single person doing the same in the 21st century.

The difficulty is that a) the idea to be raised is new, b) it needs thinking, c) it necessitates a change in perspectives, d) it is told by means of boring numbers, e) it unveils a huge system of logical relations, and f) these logical relations have no established names for their concepts yet.

The similarity is that a) the members of the learned society are not really stupid, b) they only want to be left alone, c) they want to keep congratulating each other on how clever and wise they are, and d) they share a cultural handicap – and this cultural handicap is the subject I wish to say a few words about to you, my dear friends.



   2. The neurotic inhibition of intelligence – a faux pas

Some few months ago, I had the opportunity to speak in person to one of the members of this Society, an opportunity for which I am thankful and which I hope to be able to reciprocate. The faux pas I unintentionally committed was to remark – with guards lowered, because of the friendly and nourishing conversation with a nice and clever fellow – that I see the real difficulty in making people *want to* Learn to Count, as opposed to that chimp-like usage of their digits which they believe to be counting, as being based in the normal neurotic inhibition of intelligence.

Well, this really closed the door in the mind of my conversation partner. It was not enough that I claimed that I can count while all the people have no idea of what counting is above the elementary school level; on top of all that, I had the cheek to call him, to his face, an intellectually inhibited person, inhibited because of being neurotic – this was just about the cue for a very polite, polished and friendly end to the conversation.



   3. The healthy person is a mixture of all psychiatric symptoms

Bear with me if, after comparing myself to Mendel, I now recall the troubles Freud had for calling excitation patterns that spread (with an exponent >1.0) like a viral disease “sexual” excitations. The term “sexual excitation” means, and has meant in the trade, avalanching patterns, also observable in hysteric rages and, in a negative – overly flattening, inhibitory – version, in induced stupor and flexibilitas cerea. This is the name for the avalanching pattern of nervous excitation propagating across the central nervous system. History has it that at the time the concept was introduced, the sexual excitation pattern was the best known and most obvious pattern to name the idea after.

In the trade, one knows that depression is too much serenity, schizophrenia is too much creativity, hysteria is too much emotional expressivity, paranoia is too much caution/foresight, servility is too much empathy, autism is too much concentration, and so forth. The “too much” usually means too frequently and too long and too regularly, not only too intensely.

Names are assigned to observations by taking the extreme grade of a property and using it as a description of the whole range, well into the middle, usual extents.



   4. The neurotic inhibition of intelligence – a cultural asset

The habit of following rules means that one saves a lot of senseless work. We do not have to reinvent the wheel in every generation and for every individual. Once we have learnt that a²+b²=c², we do not have to figure out complicated methods to calculate surfaces or angles or distances.

Members of a subculture recognise each other by means of the beliefs and values they share. If it had been that easy to leave behind the shared knowledge of Aristoteles, Galileo and Copernicus would have had no trouble at all. If it had been that easy to drop the knowledge codified by Newton, Einstein would have met much less resistance. If it had been so easy to set aside all that unordered, messy, belief-based convolute of empty words about what makes garden peas green or yellow, serious work could have begun about 40 years sooner. Social stability is based on the common smell shared by all the chimps which reside on the same tree.



   5. Self-declaration of competence of we the people

A self-made man is a good thing. A self-declared prophet is usually a complicated case. A self-declared innovator can hardly be otherwise. A body of competent persons installed by higher authorities is not really a collection of self-declared persons. In fact, there is a natural opposition between Pilate and the person he judges (“you had no authority, had it not been transferred unto you from higher above” vs. “You say I am a king, though

Re: [Fis] Data - Reflection - Information

2017-10-14 Thread Mark Johnson
Dear Loet, 

When you say "distinguishing between the information content and the meaning of 
a message requires a discourse" this is, I think, a position regarding what 
scientific discourse does. There are, of course, competing descriptions of what 
scientific discourse does.  Does your "meaning" refers to the meaning of 
scientific discovery? Do we want to defend a definition of meaning which is 
tied to scientific practice as we know it? Would that be too narrow? Ours may 
not be the only way of doing science... 

A non-discursive science might be possible - a science based around shared 
musical experience, or meditation, for example. Or even Hesse's 
"Glasperlenspiel"... Higher level coordination need not necessarily occur in 
language. Our communication technologies may one day give us new 
post-linguistic ways of coordinating ourselves. 

Codification is important in our science as we know it. But it should also be said that our science is blind to many things. Its reductionism prevents effective interdisciplinary inquiry, it struggles to reconcile practices, bodies, and egos, and its recent obsession with journal publication has produced the conditions of Babel which have fed the pathology in our institutions. There's less meaning in the academy than there was 50 years ago.

The business of sense and reference which Terry refers to (and which provided a foundation for Husserl) is indeed problematic. Some forms of communication have only sense, and yet there is coordination, emotion and meaning. Peirce saw something different in the underlying symmetry of communication. This is in Bateson too (symmetrical/asymmetrical schismogenesis).

It may be that symmetrical principles underpin quantum mechanical phenomena like entanglement; they certainly pervade biology. Medieval logicians may have seen this: Duns Scotus's ideas on "synchronic contingency", for example, mirror what quantum physicists are describing.

The implication is that our distinguishing between information and meaning in 
science may be an epiphenomenon of something deeper.

Best wishes,

Mark





[Fis] my contribution to the ongoing information / meaning debate

2017-10-14 Thread Steven Bindeman
While I sometimes feel as if I speak a different language than many of you, I 
believe that my initial notes for  a book on creativity may shed some light on 
the ongoing current discussion. Simply put, information is what a machine has 
the capacity to produce. However,  a machine cannot produce meaning -- which is 
essentially the  interpretation of this information.


For what it's worth, here are my notes on creativity:



The Creative Process


Meaning does not exist on its own in the world. Nor does it exist within the 
interiority of human consciousness. Rather it is something we actively create 
when we engage with the world around us. We create meaning — we don’t merely 
find it or discover it. Creativity, then, is an essential part of the process 
of meaning apprehension. This means that it is not  merely a personality trait 
or a personality state;  nor is it something special that some lucky people 
have more of than others. (While talent on the other hand is special, it’s not 
the same thing as creativity.)


Our relationship to our surroundings is, then, necessarily creative. This means 
that we are constantly adjusting our understanding of our environment in a 
creative way at all times. We never merely engage with it in a passive way, but 
are always and actively either accepting or denying the apparent truth of our 
perceptions. This is how we maintain our memory — and with the help of this memory we engage in the process of maintaining our identity, our sense of self.


In this context we can see how a painter never paints the objective truth of 
his subject. He paints what he sees — and depicting the truth of his perception 
of what he sees is his essential challenge. This is true even for “realist” 
painters and for photographers as well. They arrange and organize what they 
have seen until they are satisfied that their art has captured the unique 
characteristics of their own perceptual acts. Their “art” consists in trying to 
make the viewer’s experience of this object seem “real” to them, too — whatever 
this term might mean.


Creativity, then, is a process. It is an ongoing engagement by human 
consciousness with the  gradual depiction, over time, of a particular object. 
As Bergson put it, “reality is that (which) creates itself gradually… that is, 
absolute duration” (Creative Evolution, 385).  This means that for Bergson we 
cannot separate reality from temporality.


Humanity for Bergson  was essentially homo faber — tool-making, pragmatic, 
analytic. Yet life itself is essentially qualitative, and therefore only 
accessible otherwise — meaning, not through mere pragmatic analysis. “We see 
that the intellect, so skillful in dealing with the inert, is awkward the 
moment it touches the living. Whether it wants to treat the life of the body or 
the life of the mind, it proceeds with the rigor, the stiffness and the 
brutality of an instrument not designed for such use” (Selections from Bergson, 
88). Throughout Creative Evolution, Bergson insisted that life must be equated 
with creation, because only creativity can adequately account for both the 
continuity of life and the discontinuity of thought. But if humans only possess 
analytic intelligence, then how are we ever to know the essence of life (which 
Bergson called the “élan vital”)? Bergson's answer was that at the periphery of 
intelligence a fringe of instinct survives, namely intuition,  and because of 
it we are able to have access to the essence of life.  In his view, instinct 
and intelligence are not simply self-contained and mutually exclusive states. 
They are both rooted in, and hence inseparable from, the duration that informs 
all life, all change, all becoming. Thanks to intuition, humanity can turn 
intelligence against itself in order to seize life itself. 


From this point of view, creativity is not just something that only artists can do; it is not their unique province. Even if we include mathematicians and scientists and recognize the important place that creativity plays in their work too, we miss the point if we try to delimit the extent to which creativity figures in our lives. Bergson is arguing here, then, that whenever we relate to the world around us in a meaningful way, we are engaged in a creative act.


For Bergson, though, poets and other artists start from a fuller view of 
reality than the rest of us.  They plumb the depths in such a way that they can 
lay hold of the potential in the real, taking up with what nature has left, 
namely a mere outline or sketch of something, something which remains 
incompletely lodged in the memory, in order to make of it a finished work of 
art. The result enables us to discover, in the things which surround us, more 
qualities and more shades than we would otherwise naturally perceive. Our view 
of reality is thereafter altered, and we begin to realize that it is possible 
to move beyond the limits of our own perceptions.  While art in this 

Re: [Fis] Data - Reflection - Information

2017-10-14 Thread Loet Leydesdorff

Dear Terry and colleagues,

"Language is rather the special case, the most unusual communicative 
adaptation to ever have evolved, and one that grows out of and depends 
on informationa/semiotic capacities shared with other species and with 
biology in general."
Let me try to argue in favor of "meaning", "language", and "discursive 
knowledge", precisely because they provide the "differentia specifica" 
of mankind. "Meaning" can be provided by non-humans such as animals or 
networks, but distinguishing between the information content and the 
meaning of a message requires a discourse. The discourse enables us to 
codify the meaning of the information at the supra-individual level. 
Discursive knowledge is based on further codification of this 
intersubjective meaning. All categories used, for example, in this 
discussion are codified in scholarly discourses. The discourse(s) provide(s) the top of the hierarchy that controls, given the cybernetic principle that construction is bottom-up and control top-down.


Husserl uses "intentionality" and "intersubjective intentionality" 
instead of "meaning". Perhaps, this has advantages; but I am not so sure 
that the difference is more than semantic. In Cartesian Meditations 
(1929) he argues that this intersubjective intentionality provides us 
with the basis of an empirical philosophy of science. The sciences do 
not begin with observations, but with the specification of expectations 
in discourses. A predator also observes his prey, but in scholarly 
discourses, systematic observations serve the update of codified (that 
is, theoretical) expectations.


Best,
Loet

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis