Hi,

There should be a negative sign in front of "Sum" in Equations
(091915-1), (091915-2), and (091915-3); this does not affect the content
of this post, i.e., Equations (091915-4) through (091915-8) still hold.

Sorry for the omission.

Sung



---------- Forwarded message ----------
From: Sungchul Ji <s...@rci.rutgers.edu>
Date: Wed, Sep 23, 2015 at 6:04 AM
Subject: Fwd: The First Law of Quantitative Semiotics: Information =
Changes in Shannon Entropy, or I = dH
To: Sungchul Ji <sji.confor...@gmail.com>



---------- Forwarded message ----------
From: Sungchul Ji <s...@rci.rutgers.edu>
Date: Sat, Sep 19, 2015 at 5:07 PM
Subject: The First Law of Quantitative Semiotics: Information = Changes in
Shannon Entropy, or I = dH
To: PEIRCE-L <peirce-l@list.iupui.edu>


Hi,

(*1*) I think semiotics can be divided into two branches, *qualitative* and
*quantitative* semiotics, i.e., the study of *qualitative signs* (e.g.,
words) and *quantitative signs* (e.g., numbers), respectively. Examples of
the former would include classical philosophy, linguistics, literature, the
arts, and molecular biology; examples of the latter include *number-based*
sciences and engineering such as physics, chemistry, quantitative biology,
computer science and engineering, and mathematics. *Quantitative semiotics*
may be identified with (or referred to as) *informatics*, so this post is
about the *First Law of Informatics (FLI)*.

(*2*) Most of the discussions in the semiotics literature, including those
seen on these lists, are almost exclusively concerned with what I would
define as *qualitative semiotics*, since *numbers* and the associated
*mathematical equations* rarely occur in them.

(*3*) Two of the most important 'quantitative signs' (i.e., signs that can
be quantified) are *H* and *I*, the former standing for the well-known
*Shannon entropy* and the latter for *Shannon information*. Because both of
these signs are often defined by the same mathematical equation, known as
the *Shannon formula*, (091915-1), *H* and *I* are viewed as synonymous,
which has caused great confusion in the field of informatics (the
scientific study of information):

                                - Sum(from i = 1 to i = n) pi log pi                      (091915-1)

where pi is the probability of the i^th event (or symbol in a message)
occurring, n is the number of possible events (or symbols) under
consideration, and log is the binary logarithm, i.e., y = log x means that
x = 2^y, or that y is the power to which 2 must be raised to obtain x.
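For readers who want to check these numbers, here is a minimal Python sketch of the Shannon formula as defined above (the function name is mine, not part of the post):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H = - Sum(p_i * log2 p_i), in bits.
    Terms with p_i = 0 contribute nothing, by the usual 0 log 0 := 0 convention."""
    return sum(-pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin (two equally likely symbols) carries 1 bit of entropy:
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A certain outcome carries none:
print(shannon_entropy([1.0]))        # 0.0
```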

(*4*) Strictly speaking, Eq. (091915-1) applies to H, not to I:

                                H = - Sum(from i = 1 to i = n) pi log pi                  (091915-2)

(*5*) In contrast, the Shannon information I involves the difference
between two H values:

                                I = H(final) - H(initial) = - Sum(from i = 1 to i = n) dpi log dpi    (091915-3)

where H(final) and H(initial) are the Shannon entropies of the semiotic
system under consideration in the final state (i.e., after receiving I) and
in the initial state (i.e., before receiving I), respectively, and dpi is
the I-induced change in the probability of the i^th event (or symbol)
occurring.
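The difference between the two H values can be computed directly; below is a small Python sketch of I = H(final) - H(initial) (function names are mine, chosen for illustration):

```python
import math

def shannon_entropy(p):
    # H = - Sum(p_i log2 p_i), in bits; 0 log 0 := 0 by convention
    return sum(-pi * math.log2(pi) for pi in p if pi > 0)

def information(p_initial, p_final):
    # I = H(final) - H(initial), i.e., I = dH
    return shannon_entropy(p_final) - shannon_entropy(p_initial)

# Four equally likely symbols collapse to one certain symbol:
# H drops from 2 bits to 0 bits, so I = dH = -2.0 bits.
print(information([0.25] * 4, [1.0, 0.0, 0.0, 0.0]))
```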

(*6*) Since dH = H(final) - H(initial) in Eq. (091915-3) can be positive,
zero, or negative, the information (or organization) of the system under
consideration can be increased, unchanged, or decreased when it receives
information. In other words, Eq. (091915-3) states that the information I
is equal to the change in the Shannon entropy induced by the reception of I:

                                I = dH                                                    (091915-4)

where d indicates "change in".  I suggest that Eq. (091915-4) be referred
to as the *First Law of Quantitative Semiotics *(FLQS), because violating
it inevitably leads to a paradox as explained in (*7*).

(*7*) As indicated in (*3*), many investigators equate I and H:

                                I = H                                                     (091915-5)

Eq. (091915-5) is invalid because H *maximizes* while I *minimizes* when
the system under consideration becomes completely disordered or
randomized, so the two cannot be equal. Formally speaking, Eq. (091915-5)
is invalid because it conflates H (the value of a state function) with dH
(a difference between two such values).
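The point that H is maximal at complete randomization (while a fully ordered state has H = 0) is easy to verify numerically; a short Python sketch (names are mine):

```python
import math

def shannon_entropy(p):
    # H = - Sum(p_i log2 p_i), in bits; 0 log 0 := 0 by convention
    return sum(-pi * math.log2(pi) for pi in p if pi > 0)

# H is largest when the system is completely randomized (uniform p_i)
# and zero when it is completely ordered (one certain outcome):
random_state  = [0.25, 0.25, 0.25, 0.25]   # H = 2 bits (maximum for n = 4)
ordered_state = [1.0, 0.0, 0.0, 0.0]       # H = 0 bits
print(shannon_entropy(random_state), shannon_entropy(ordered_state))
```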

(*8*) A similar error appears to have been committed by Schroedinger when
he conflated -S and dS and claimed that, since thermodynamic entropy, S,
represents disorder, its negative counterpart, -S, must represent order
[1]:

                                S = disorder                                  (correct)   (091915-6)

                                -S = order                                    (wrong)     (091915-7)

                                dS = S(final) - S(initial) = order if dS < 0, disorder if dS > 0    (correct)   (091915-8)

Eq. (091915-7) is wrong because there cannot be any "negative entropy"
according to the Third Law of Thermodynamics [1].

(*9*) If the FLQS given in Eq. (091915-4) is right, the information I can
be positive (Shannon entropy, and hence uncertainty, increased), zero (no
change in uncertainty), or negative (Shannon entropy, and hence
uncertainty, decreased), whereas the Shannon entropy H itself is always
non-negative.

Any questions or comments would be welcome.

Sung

-
Sungchul Ji, Ph.D.

Associate Professor of Pharmacology and Toxicology
Department of Pharmacology and Toxicology
Ernest Mario School of Pharmacy
Rutgers University
Piscataway, N.J. 08855
732-445-4701

www.conformon.net


References:
   [1] Ji, S. (2012). The Third Law of Thermodynamics and “Schroedinger’s
Paradox” <http://www.conformon.net/?attachment_id=1033>. In: *Molecular
Theory of the Living Cell: Concepts, Molecular Mechanisms, and Biomedical
Applications*. Springer, New York. pp. 12-15. PDF at
http://www.conformon.net under Publications > Book Chapters.
