Re: [PEIRCE-L] What is information and how is it related to 'entropy' ?

2015-04-13 Thread Jerry LR Chandler
Dear Professor Mani:

Thank you for your well-formulated response.  Of course, type theory and its 
many variations are critical to establishing correspondence between nature and 
mathematics. But it fails to satisfy the needs of chemists, biologists and 
physicians for a consistent approach to perplex representations of practical 
problems.  It also fails Hilbert's other two criteria of completeness and 
decidability. 

On Apr 12, 2015, at 2:05 AM, A. Mani wrote:

 Arguably many of these fit into the more common axiomatic way of
 defining entropies with associated **intuitions** that we have been
 discussing.

One question arises from your quote above.

What are your views on the roles of intuition in the 150 exact formulations of 
entropy functions?

I presume that IF 150 exact formulations of entropy exist, THEN some sort of 
extension rules exist such that an infinity of formulations of entropy is 
possible.  (Merely an inductive argument. :-))
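
Indeed, one familiar extension rule already yields a continuum of entropy 
functions: the Rényi parameterization, which recovers Shannon's formula in 
the limit alpha -> 1.  A minimal sketch in Python (the distribution p is 
merely illustrative):

import math

def renyi_entropy(p, alpha):
    # H_a(p) = log(sum_i p_i^a) / (1 - a), in nats; tends to the
    # Shannon entropy -sum_i p_i log p_i as a -> 1.
    if abs(alpha - 1.0) < 1e-9:
        return -sum(x * math.log(x) for x in p if x > 0)
    return math.log(sum(x ** alpha for x in p)) / (1.0 - alpha)

p = [0.5, 0.25, 0.25]           # an illustrative distribution
for a in (0.5, 1.0, 2.0, 5.0):  # four members of an infinite family
    print(a, renyi_entropy(p, a))

Each admissible value of alpha is a distinct, axiomatically motivated entropy 
function, so this one family alone is already infinite.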

In particular, can you use your philosophy of rough sets (in their many 
varieties) to categorize or classify or stage or separate the 150 entropy 
formulas into comparable forms?   

If so, is each class of abstract entropic forms extendable to an infinity of 
functions?
If not, why not?

Cheers

jerry






-
PEIRCE-L subscribers: Click on Reply List or Reply All to REPLY ON PEIRCE-L 
to this message. PEIRCE-L posts should go to peirce-L@list.iupui.edu . To 
UNSUBSCRIBE, send a message not to PEIRCE-L but to l...@list.iupui.edu with the 
line UNSubscribe PEIRCE-L in the BODY of the message. More at 
http://www.cspeirce.com/peirce-l/peirce-l.htm .






Re: [biosemiotics:8249] Re: [PEIRCE-L] What is information and how is it related to 'entropy' ?

2015-04-08 Thread Gary Richmond
Lists,

I had sent a version of this note to Steven off-list, but have added a
single sentence at the end because the message is being sent not only to
PEIRCE-L.

Steven,

At this point I think the list moderators/managers of both the biosemiotics
list and PEIRCE-L (we are in communication) feel that what has been
happening on the lists is not what proper academic/scientific discourse
should look like.

I think it is important for all participants to reflect on this and assume
what they consider to be proper personal responsibility. If my recent
remarks seem inappropriate to you, I must note that I have found some
exchanges involving you on both lists problematic. I do apologize, however,
for forgetting that you had apologized for your 'Them that can't, teach'
remark.

As Peirce would say, this seems to me essentially a matter of self-control.
How much self-control is each of us capable of? Joe's ideal was that the
list would be largely self-moderating, and he thought that this happened
during his final years. As Joe grew markedly quieter during those years,
active participants seemed to be more or less on their best behavior.

I usually do not write as moderator when I participate on peirce-l. In
this, I have followed Joe Ransdell's practice. Ben and I have kept
ourselves reined in because we know that, even when I don't write as
moderator and he and I don't write as co-managers, our posts help preserve
the tone that developed in the preceding years, and we intend to continue
to keep ourselves to a generally peaceful tone. Still, the peirce-l forum
has traditionally allowed people to sound off, as Joe sometimes certainly
did. Joe even wrote guidelines to apply to such discussion - see among the
section titles under How the Forum Works at
http://www.iupui.edu/~arisbe/PEIRCE-L/PEIRCE-L.HTM#forum . This is part of
how the peirce-l forum has self-regulated so as to maintain not only
decorum but quality. Different lists sometimes have divergent ways and, of
course, posts also sent to the biosemiotics list should respect that
list's standards as well.

Best,

Gary.

*Gary Richmond*
*Philosophy and Critical Thinking*
*Communication Studies*
*LaGuardia College of the City University of New York*
*C 745*

*718 482-5690*










Re: [biosemiotics:8249] Re: [PEIRCE-L] What is information and how is it related to 'entropy' ?

2015-04-07 Thread Gary Richmond
This from a man who recently insulted anyone and everyone who has taken
teaching/learning seriously by offering that old Shavian quote, 'Those who
can, do; those who can't, teach.'

And I would add, as if it needed to be said, that it is possible both to do
important research *and* to teach well.

I personally agree with John's critique.

Best, Gary


*Gary Richmond*
*Philosophy and Critical Thinking*
*Communication Studies*
*LaGuardia College of the City University of New York*
*C 745*
*718 482-5690*

On Tue, Apr 7, 2015 at 10:42 PM, Steven Ericsson-Zenith ste...@iase.us
wrote:

 Unfortunately, the only person who is really hurt by these claims is
 John. I encourage you to disengage.

 Regards,
 Steven

 On Tue, Apr 7, 2015 at 6:21 PM, Sungchul Ji s...@rci.rutgers.edu wrote:


 -- Forwarded message --
 From: Sungchul Ji s...@rci.rutgers.edu
 Date: Tue, Apr 7, 2015 at 9:18 PM
 Subject: Re: [PEIRCE-L] What is information and how is it related to
 'entropy' ?
 To: John Collier colli...@ukzn.ac.za


 John,

 For the benefit of these lists, my advice to you is simple: let's do
 science, not personal attacks.

 I feel sorry for Steven.

 Sung

 On Tue, Apr 7, 2015 at 8:13 PM, John Collier colli...@ukzn.ac.za wrote:

  This is the sort of thing that I have called 'nuts', Sung. Let better
 people than you and me judge. It is unfortunately similar to your bizarre
 comments about Peirce.

 You keep, as Edwina has said many times, repeating the same nonsensical
 interpretations, and you never admit you are wrong. It is pointless to try to
 communicate with such a strategy. This is much worse than the strategy that
 Steven has adopted in claiming that a certain position does not make sense
 because he does not understand how it could make sense. You go further
 and try to reject a perfectly comprehensible position on the
 grounds that it does not agree with what you have previously said. You are
 an academic hazard, much like a drunken driver is a hazard.



 I have already responded to your objections concerning your interpretation of
 Schrödinger's work, but you seem incapable of digesting them. Like Steven,
 you object to views because they do not correspond to your preconceptions,
 without trying to understand the conceptions involved and thus to interpret
 them properly. I am sad that you have imposed such limitations on your
 intellect, which is obviously large.



 John





 *From:* sji.confor...@gmail.com *On Behalf Of* Sungchul Ji
 *Sent:* April 7, 2015 8:17 PM
 *To:* John Collier
 *Cc:* biosemiotics
 *Subject:* Re: [PEIRCE-L] What is information and how is it related to
 'entropy' ?



 John,



 You wrote: . . . He points out a bit later that a piece of coal is
 ordered with respect to its disordered burnt state, and can thus do work.



 So, you agree with Schroedinger that, whenever a system transitions from
 an ordered state to a disordered state, the system can do work.  In other
 words, you believe that, when a system performs work, its entropy
 increases. Do you?  How about my writing this email, which is a form of
 work: did my entropy increase as a result of my writing activity?  I
 don't think so.  What increased is the entropy of the Universe, not that of
 my body.  What actually happened in my body due to my writing activity is a
 decrease in Gibbs free energy (due to my muscle mitochondria burning NADH),
 which is a function of both entropy and energy, i.e., dG = dE + PdV -
 TdS.  So it is clear that what is responsible for my body performing work
 is not an entropy increase in my body but a Gibbs free energy decrease, i.e.,
 dG < 0, which is always accompanied by an increase in the entropy of the
 Universe, while the entropy change, dS, of my body may be positive, negative
 or zero, depending on the environment.
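
 In standard notation, the same bookkeeping can be stated in one line
 (a compact restatement, assuming constant temperature T and pressure P):

 \[
 \Delta G \;=\; \Delta H - T\,\Delta S_{\mathrm{sys}},
 \qquad
 \Delta S_{\mathrm{univ}} \;=\; \Delta S_{\mathrm{sys}} - \frac{\Delta H}{T}
 \;=\; -\frac{\Delta G}{T},
 \]

 so \(\Delta G < 0\) is equivalent to \(\Delta S_{\mathrm{univ}} > 0\), while
 the sign of \(\Delta S_{\mathrm{sys}}\) itself is left unconstrained.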



 If this analysis is correct, one reason that Schroedinger associated
 order with negative entropy (which violates the Third Law) is his
 preoccupation with 'entropy' and his failure to see that what allows
 non-isolated systems to perform work is not entropy alone but free energies.




 All the best.



 Sung









 On Tue, Apr 7, 2015 at 4:53 PM, John Collier colli...@ukzn.ac.za
 wrote:

 I shouldn't have started looking at these posts again.



 John



 *From:* sji.confor...@gmail.com [mailto:sji.confor...@gmail.com] *On
 Behalf Of *Sungchul Ji
 *Sent:* April 7, 2015 4:38 PM
 *To:* John Collier
 *Cc:* biosemiotics
 *Subject:* Re: [PEIRCE-L] What is information and how is it related to
 'entropy' ?



 John, you wrote:



 . . . the rest of your post seems either commonplace or nuts to me,
 so I will leave things there.



 Can it be that 'commonplace' or 'nuts' are in the eye of the beholder?



 [John Collier] Of course. Though 'commonplace' is fairly easy to show,
 'nuts' is clearly subjective. What I meant is that I don't get much reward
 from trying to figure out what you mean. No doubt that is a failing of mine
 rather than a 

Re: [PEIRCE-L] What is information and how is it related to 'entropy' ?

2015-04-04 Thread Jerry LR Chandler
List, Sung


On Apr 4, 2015, at 12:22 AM, Sungchul Ji wrote:

 (18) The concept of entropy has had a long and interesting history, 
 beginning with its implicit introduction by Carnot to its explicit 
 formalization as a state function by Clausius to its statistical treatment by 
 Boltzmann and Gibbs to its application to communications theory by Shannon 
 (Shannon and Weaver 1949). The latter achievement has seemed to several 
 scientists a true generalization of the entropy conception, its freeing from 
 the particular disciplinary framework of thermodynamics for application to 
 probability distributions generally (Gatlin 1972; Yockey 1977). This mistaken 
 belief is a major impediment to productive communication between 
 thermodynamics and information theory, and we will examine it carefully. [3, 
 p. 177].
 

In my initial response to your request, I mentioned books in the 1970's and 
1980's that had slipped from my memory.

Your response identifies two of them.

The book by Gatlin was critical at the time.
The book by Yockey, a personal friend, was important at the time but was 
seriously flawed and is seldom referenced.

Another book from this period, by Wickens, was studied for many months and 
remains worthy of meditation.

Parenthetically, I would add that the meaning of the term 'entropy', once 
used only in a strict mathematical sense that was supported by physical 
measurements, has been decimated beyond any meaningful utility.  It is a prime 
example of the loss of specific meaning of a scientific term when other 
disciplines seek to extend it to their form/method of inquiry.

Further comment. In view of the vast perplexity of the unity of nature, 
predictive science demands clear and unambiguous usage of language and 
mathematics in order to comprehend and validate empirical evidence.  
Technologists recognize the economic importance of this.  

BTW, in view of the maladroit usage of relational terminology, I recently 
coined the phrase 'economy of relations', motivated by the perplex sortal 
logic of chemistry and, in a minor way, by CSP's trichotomy.  It will play a 
substantial role in my book on Organic Mathematics.

Cheers

Jerry






Re: [PEIRCE-L] What is information and how is it related to 'entropy' ?

2015-04-04 Thread Helmut Raulien
List,

not having read Shannon and Weaver, my concept of entropy now relates only to the physical world, that is, real-world systems whose system space consists of the real dimensions x, y, z, i.e., longitude, breadth and altitude. And I think that Jon's definition is correct. The other kind of systems are virtual or mind systems, whose system spaces are virtual. Just as when you imagine something, the thing does not exist in real space, but in an imagined space. I assume Peirceans call this space the Phaneron. Between these two kinds of systems, I think, events, and only events, can pass to and fro. These passing events are called information, I think. So information is the one thing (better: kind of event) that can pass the border between body and mind, or between the real world and the phaneron, while entropy applies only to the real world. Is that so?

Best,

Helmut



Sent: Saturday, 04 April 2015 at 19:14
From: Jerry LR Chandler jerry_lr_chand...@me.com
To: Peirce List peirce-l@list.iupui.edu
Cc: Sungchul Ji s...@rci.rutgers.edu
Subject: Re: [PEIRCE-L] What is information and how is it related to 'entropy' ?












[PEIRCE-L] What is information and how is it related to 'entropy' ?

2015-04-03 Thread Sungchul Ji
Jerry, Steven, John, Bob, lists,

I want to thank Jerry for bringing to my attention Miller's impressive
book, Living Systems [1], which I thought I had thumbed through once
but had not: I had simply conflated it with another book.
Miller's book is the first biology book I have seen that
provides an extensive discussion of the meaning of information, which I
have collected below as Items (1) through (17). I agree with most of these
items except a few.


(A)  In Item (9), Miller indicates that Schroedinger coined the term
'negentropy', but it was Brillouin who coined the word as an abbreviation of
Schroedinger's expression 'negative entropy'.  As I pointed out in [2], the
concept of negative entropy violates the Third Law of Thermodynamics (but
that of a negative entropy change does not).
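
In Planck's form of the Third Law, S(0) = 0 for a perfect crystal, so for
any temperature T (a one-line restatement under that assumption):

\[
S(T) \;=\; \int_0^{T} \frac{C_p(T')}{T'}\, dT' \;\ge\; 0,
\]

which excludes a negative entropy S < 0 while leaving a negative entropy
change \(\Delta S < 0\) between two states perfectly admissible.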

(B)  In Item (9), Miller assumes that negentropy is the same as
information, an assumption referred to as the Negentropy Principle of
Information (NPI) by Brillouin (1951, 1953, 1956).  I refuted NPI in
[2], in agreement with Wicken [3], based on the thought experiment called
the 'Bible test', which was designed to demonstrate the fundamental
difference between Shannon entropy, S_S (also called informational entropy,
S_I), and thermodynamic entropy, S_T:  the S_T of the Bible increases with
temperature but its S_I does not.  See Item (28). Wicken's argument against NPI
is summarized in Items (18) through (27), extracted from [3].
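
The contrast is easy to make concrete in code.  A minimal sketch in Python
(the sample text is merely illustrative): the Shannon entropy of a fixed
text is a function of its symbol statistics alone, so heating the book
raises its S_T but leaves its S_I untouched.

import math
from collections import Counter

def shannon_entropy_bits(text):
    # H = -sum_i p_i log2 p_i over the symbol frequencies of the text.
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# The same string yields the same S_I at 300 K or at 400 K; only the
# physical book's thermodynamic entropy S_T varies with temperature.
print(shannon_entropy_bits("In the beginning God created the heaven and the earth."))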

(C) In Item (10), Miller assumes that the Second Law of Thermodynamics,
stated as 'a system tends to increase in entropy over time', applies to all
physical systems, but it does not: it applies only to isolated systems, which
cannot exchange any energy or matter with their environment, and not to
closed systems (only energy can be exchanged; e.g., a refrigerator) or open
systems (both energy and matter can be exchanged; e.g., living cells). For
example, the thermodynamic entropy content of a living cell can decrease
while its Shannon entropy increases as the cell grows and differentiates.
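
Prigogine's decomposition states the restriction compactly:

\[
dS \;=\; d_iS + d_eS, \qquad d_iS \ge 0,
\]

where \(d_iS\) is the entropy produced inside the system and \(d_eS\) the
entropy exchanged with the environment.  For an isolated system \(d_eS = 0\),
so \(dS \ge 0\); for closed and open systems \(d_eS\) can be negative enough
that \(dS < 0\).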

(D)  My conclusion would be that it is impossible to define the relation
between information and thermodynamic entropy without knowing whether the
thermodynamic system under consideration is open, closed or isolated.  In
other words, the relation between *Shannon entropy* (also called
*information*) and *thermodynamic entropy* depends on the nature of the
thermodynamic system involved.

(E)  As I indicated in my previous email of today, I believe that
information is an irreducibly triadic relation (as shown in Figure 1
below), whereas entropy is a part of the free energy that drives semiosis, in
which information is processed.  Free energy is a function of both energy
and thermodynamic entropy.  In other words, information is the whole
system of 3 nodes and 3 arrows, whereas entropy is a part of the energy
that drives the processes indicated by the 3 arrows.



            f                 g
VARIETY ----------> MESSAGE ----------> FUNCTION
   |                                        ^
   |                                        |
   |________________________________________|
                      h

f = selection
g = action
h = information flow

Figure 1.  An irreducibly triadic definition of information, an instance
of the irreducibly triadic semiosis and the sign.




(1) The technical sense of information (H) . . . is not the same thing
as meaning or quite the same as information as we usually understand it.
[1, p. 11].

(2) Meaning is the significance of information to a system which possesses
it: it constitutes a change in that system's processes elicited by the
information, often resulting from associations made to it on previous
experience with it.  [1, p. 11].

(3)  Information is . . . the degree of freedom that exists in a given
situation to choose among signals, symbols, messages, or patterns to be
transmitted. [1, p. 11].

(4) The amount of information is measured as the logarithm to the base 2 of
the number of alternative patterns, forms, organizations, or messages.
 [1, p. 11].

(5) The unit of information, the bit, . . . is the amount of information which
relieves the uncertainty when the outcome of a situation with two equally
likely alternatives is known. [1, p. 11].
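
Items (4) and (5) in computational form (a minimal sketch; the alternative
counts are merely illustrative):

import math

# H = log2(n) bits for n equally likely alternatives; n = 2 gives
# exactly one bit, the unit defined in Item (5).
for n in (2, 4, 8, 26):
    print(n, math.log2(n))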

(6)  The term marker was used by von Neumann to refer to those observable
bundles, units, or changes of matter-energy whose patterning bears or
conveys the informational symbols from the ensemble or repertoire.  These
might be the stones of Hammurabi's day which bore cuneiform writings,
parchments, writing paper, Indians' smoke signals, a door key with notches,
punched cards, paper or magnetic tape, a computer's magnetized ferrite-core
memory, an arrangement of nucleotides in a DNA molecule, the molecular
structure of a hormone, pulses on a telegraph wire, or waves emanating
from a radio station.  If a marker can assume n different states of which