Hello Jon,

Thanks for the explanation of an uncertainty measure and the link to your work 
on information.  If you have worked on Peirce's general account of measurement 
(e.g., the account laid out in the CP at 7.280), I'd like to see what you 
have to say.  I see that you have a subchapter in Inquiry Driven Systems titled 
"Measure for Measure, but you appear to be working on a question having to do 
with how we might interpret the quantifiers in logic.  Have you written up an 
explanation or have comments somewhere on his general theory of measurement?

--Jeff

Jeff Downard
Associate Professor
Department of Philosophy
NAU
(o) 523-8354
________________________________________
From: Jon Awbrey [jawb...@att.net]
Sent: Saturday, April 04, 2015 5:36 AM
To: Sungchul Ji; Peirce List
Cc: biosemiotics
Subject: [PEIRCE-L] Re: What is information and how is it related to 'entropy' ?

Sung, List,

From a mathematical point of view, an “entropy” or “uncertainty” measure is 
simply a measure on distributions that achieves its maximum when the 
distribution is uniform.  It is thus a measure of dispersion or uniformity.

Measures like these can be applied to distributions that arise in any given 
domain of phenomena, in which case they have various specialized meanings and 
implications.

When it comes to applications in communication and inquiry, the information of 
a sign or message is measured by its power to reduce uncertainty.
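
For concreteness, here is a minimal Python sketch of both points; the 
particular distributions are only illustrative:

    import math

    def shannon_entropy(dist):
        # Shannon entropy in bits of a discrete probability distribution.
        return -sum(p * math.log2(p) for p in dist if p > 0)

    # The uniform distribution over 4 outcomes has maximal uncertainty:
    # log2(4) = 2 bits.
    print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))    # 2.0

    # Any non-uniform distribution over the same 4 outcomes is less uncertain.
    print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))        # about 1.36

    # Information of a message = uncertainty before minus uncertainty after.
    prior = [0.25, 0.25, 0.25, 0.25]
    posterior = [0.5, 0.5, 0.0, 0.0]    # the message rules out two alternatives
    print(shannon_entropy(prior) - shannon_entropy(posterior))    # 1.0 bit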

The following essay may be useful to some listers:

http://intersci.ss.uci.edu/wiki/index.php/Semiotic_Information

Regards,

Jon

http://inquiryintoinquiry.com

On Apr 4, 2015, at 1:22 AM, Sungchul Ji <s...@rci.rutgers.edu> wrote:

Jerry, Steven, John, Bob, lists,

I want to thank Jerry for bringing to my attention Miller's impressive book, 
"Living Systems" [1], which I thought I had "thumbed through" once but had not: 
I had simply conflated it with another book.

Miller's book is the first biology book I have seen that provides an extensive 
discussion of the meaning of "information", which I have collected below as 
Items (1) through (17).  I agree with most of these items, with a few 
exceptions.


(A)  In Item (9), Miller indicates that Schroedinger coined the term 
"negentropy", but it was Brillouin who coined the word, as an abbreviation of 
Schroedinger's expression "negative entropy".  As I pointed out in [2], the 
concept of "negative entropy" violates the Third Law of Thermodynamics (but 
that of "negative entropy change" does not).

(B)  In Item (9), Miller assumes that negentropy is the same as information, an 
assumption Brillouin (1951, 1953, 1956) referred to as the Negentropy Principle 
of Information (NPI).  I refuted NPI in [2], in agreement with Wicken [3], 
based on a thought experiment called the "Bible test", which was designed to 
demonstrate the fundamental difference between Shannon entropy, S_S (also 
called informational entropy, S_I), and thermodynamic entropy, S_T: the S_T of 
the Bible increases with temperature but S_I does not.  See Item (28).  
Wicken's argument against NPI is summarized in Items (18) through (27), 
extracted from [3].
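
A small, purely illustrative Python sketch of that difference (the passage 
below is just an arbitrary string; the point is that no temperature variable 
enters the calculation of S_I):

    from collections import Counter
    from math import log2

    def informational_entropy(text):
        # Shannon entropy, in bits per character, of the text's letter frequencies.
        counts = Counter(text)
        n = len(text)
        return -sum((c / n) * log2(c / n) for c in counts.values())

    # S_I is fixed by the arrangement and frequencies of the symbols alone;
    # heating or cooling the page leaves this number unchanged.
    passage = "in the beginning was the word"
    print(informational_entropy(passage))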

(C) In Item (10), Miller assumes that the Second Law of Thermodynamics, stated 
as "a system tends to increase in entropy over time", applies to all physical 
systems, but it does not.  It applies only to isolated systems, which cannot 
exchange any energy or matter with their environment, and not to closed systems 
(which can exchange only energy; e.g., a refrigerator) or open systems (which 
can exchange both energy and matter; e.g., living cells).  For example, the 
thermodynamic entropy content of a living cell can decrease while its Shannon 
entropy increases as the cell grows and differentiates.

(D)  My conclusion would be that it is impossible to define the relation 
between information and thermodynamic entropy without knowing whether the 
thermodynamic system under consideration is open, closed, or isolated.  In 
other words, the relation between Shannon entropy (also called information) and 
thermodynamic entropy depends on the nature of the thermodynamic system 
involved.

(E)  As I indicated in my previous email of today, I believe that "information" 
is an irreducibly triadic relation (as shown in Figure 1 below), whereas 
"entropy" is a part of the "free energy" that drives the semiosis in which 
"information" is processed.  Free energy is a function of both energy and 
thermodynamic entropy.  In other words, "information" is the whole system of 3 
nodes and 3 arrows, whereas entropy is a part of the "energy" that drives the 
processes indicated by the 3 arrows.
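
(For reference, the standard thermodynamic relations behind the statement that 
free energy is a function of both energy and thermodynamic entropy are, at 
absolute temperature T,

    A = E - TS    (Helmholtz free energy, with E the internal energy and S the 
                   thermodynamic entropy)
    G = H - TS    (Gibbs free energy, where H here denotes enthalpy, not the 
                   information measure H used below)

so a change in free energy combines a change in energy with a change in 
thermodynamic entropy.)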



             f                      g
VARIETY ------------> MESSAGE ------------> FUNCTION
   |                                            ^
   |                                            |
   |____________________________________________|
                        h

f = selection
g = action
h = information flow

Figure 1.  An "irreducibly triadic definition of information", an instance of 
the irreducibly triadic semiosis and the sign.




(1) The technical sense of  "information (H) . . . . is not the same thing as 
meaning or quite the same as information as we usually understand it." [1, p. 
11].

(2) "Meaning is the significance of information to a system which possesses it: 
it constitutes a change in that system's processes elicited by the information, 
often resulting from associates made to it on previous experience with it."  
[1, p. 11].

(3)  Information is ". . . .the degree of freedom that exists in a given 
situation to choose among signals, symbols, messages, or patterns to be 
transmitted." [1, p. 11].

(4) "The amount of information is measured as the logarithm to the base 2 of the 
number of alternative patterns, forms, organizations, or messages."  [1, p. 11].

(5) The unit of information, bit, ". . . is the amount of information which 
relieves the uncertainty when the outcome of a situation with two equally 
likely alternatives is known." [1, p. 11].

(6)  "The term marker was used by von Neumann to refer to those observable 
bundles, units, or changes of matter-energy whose patterning bears or conveys 
the informational symbols from the ensemble or repertoire.  These might be the 
stones of Hammurabi's day which bore cuneiform writings, parchments, writing 
paper, Indian,s smoke signals, a door key with notches, punched cards, paper or 
magnetic tape, a computer's magnetized ferrite core memory, an arrangement of 
nucleotides in a DNA molecule, the molecular structure of a hormone,  pulses on 
a telegraph wire, or waves emanating from  a radio station.  If a marker can 
assume n different states of which only one is present at any given time, it 
can represent at most log_2(n) bits of information."  [1, p. 11].
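
A one-line check of the log_2(n) relation (the marker and the value n = 64 are 
just an invented example):

    from math import log2

    # A hypothetical marker that can assume n = 64 distinct states, only one
    # of which is present at a time, can represent at most log2(64) = 6 bits.
    n = 64
    print(log2(n))    # 6.0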

(7)  "Communication of almost every sort requires that the marker move in 
space, from the transmitting system to the receiving system, and this movement 
follows the same physical laws as the movement of nay other sort of 
matter-energy." [1, pp. 11].

(8)  "The disorder, disorganization, lack of patterning, or randomness of 
organization of a system is known as its entropy (S).  This is the amount of 
progress of a system from improbable to probable states.  The unit in which it 
is measured empirically is ergs or joules per degree absolute." [1, p. 13].

(9) "It was noted by Wiener and by Shannon that the statistical measure for the 
negative of entropy is the same as that for information, which Schroedinger has 
called negentropy."  [1, p. 13].

(10) "Since, according to the second law, a system tends to increase in entropy 
over time, it must tend to decrease in negentropy or information." [1, p. 13].

(11)  "There is therefore no principle of the conservation of information as 
there are principles of the conservation of matter and energy." [1, p. 13].

(12)  "The total information can be decreased in any system without increasing 
it elsewhere, but it cannot be increased without decreasing it elsewhere."  [1, 
p. 13].

(13)   "Making one or more copies of a given information pattern does not 
increase information overall, though it may increase the information
 in the system which receives the copied information."  [1, p. 13].

(14)  "Matter-energy and information always flow together.  Information is 
always borne on a marker." [1, p. 13]


(15)  "Conversely there is no regular movement in a system unless there is a 
difference in potential between two points, which is negative entropy or 
information."  [1, p. 13]

(16)  "Which aspect of the transmission is most important depends upon how it 
is handled by the receiver.  If the receiver responds primarily to the material 
energetic aspect, I shall call it, for brevity, a matter-energy transmission; 
it the response  is primarily to the information, I shall call it an 
information transmission.  For example, the banana eaten by a monkey is a 
nonradndom arrangement of specific molecules, and thus has its informational 
aspect, but its use to the monkey is chiefly to increase the energy available 
to him." [1, p. 13].

(17)  Referring to Table 2-1 in his book, James Miller states "It indicates 
that there are several pairs of antonyms used in the section, one member of 
which is associated with the concept of information (H) and the other member of 
which is associated with its negative, entropy (S)." [1, p. 13].

(18) "The concept of entropy has had a long and interesting history, beginning 
with its implicit introduction by Carnot to its explicit formalization as a 
state function by Clausius to its statistical treatment by Boltzmann and Gibbs 
to its application to communications theory by Shannon (Shannon and Weaver 
1949). The latter achievement has seemed to several scientists a true 
generalization of the entropy conception, its freeing from the particular 
disciplinary framework of thermodynamics for application to probability 
distributions generally (Gatlin 1972; Yockey 1977). This mistaken belief is a 
major impediment to productive communication between thermodynamics and 
information theory, and we will examine it carefully." [3, p. 177].

(19)  "entropy is defined to express constraints on the directions natural 
processes can take, according to the equation dS = dQ/T where dS is the 
differential change in entropy resulting from an infinitesimal flow of heat dQ 
at temperature T. In any spontaneous or irreversible process, entropy always 
increases. Any concept claiming to be a generalization of classical entropy 
would have to share this essential property." [3, p. 177].

(20) "Boltzmann gave thermodynamics its first, rough statistical 
interpretation, introducing the microstate-macrostate distinction on which the 
explanation of irreversibility hangs. Boltzmann used as a model a gas with N 
particles and having a total kinetic energy of E (see Brush 1983). This energy 
was divided into J discrete pieces and assigned to these N particles in all 
possible combinations. Each combination constituted one of W equiprobable 
microstates of the system. Given these assumptions, Boltzmann was able to 
define his entropy function by the equation

H = k ln W or

H = -k ln P

where k is Boltzmann's constant and P = 1/W." [3, p. 177].

(21)  "Natural, entropy producing processes are those that increase the 
microscopic possibilities of the system-environment supersystem. A 
nonequilibrium system can therefore be regarded as compressed in probability 
space (Wicken 1981), accessing only a small fraction of the microstates 
available to it. Irreversible processes are expansions in probability space, 
from macrostates having relatively few microscopic complexions to macrostates 
having relatively many such complexions." [3, p. 178].

(22) "Microstates are only equiprobable if they have the same energy. For 
systems that are open to energy exchanges with their environments, this is not 
usually the case. A more general formula for the entropy of a system, derived 
from Gibbs' systematic development of statistical thermodynamics, is

H = - k Σ_i P_i ln P_i

where P_i refers to the energy-dependent probabilities of the various i 
microstates. Under isolated conditions, or under conditions where kinetic 
barriers to reaction keep the system in a single, nonequilibrium macrostate, 
this reduces to Boltzmann's equation." [3, p. 178].
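
A quick numerical check of that reduction (an illustrative Python sketch only; 
W and k are arbitrary values, not physical ones):

    from math import log

    k = 1.0          # stand-in for Boltzmann's constant (arbitrary units)
    W = 16           # number of equiprobable microstates
    P = [1.0 / W] * W

    gibbs = -k * sum(p * log(p) for p in P)    # H = - k Σ_i P_i ln P_i
    boltzmann = k * log(W)                     # H = k ln W

    print(gibbs, boltzmann)    # both equal k ln 16, about 2.7726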

(23)  "The Shannon equation is

H = - K Σ_i P_i log_2 P_i,

where K is generally taken as unity. Since proportionality constants and 
logarithm bases are more matters of convenience and scaling than of substance, 
the relationships among the variables in the two equations are identical. Gibbs 
circumspectly referred to his statistical formulations as "entropy analogues" 
rather than "entropies" (Denbigh 1982). The question is then whether the 
Shannon equation generalizes the entropy analogues of statistical mechanics." 
[3, p. 178].
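
A quick numerical check that the two formulations differ only by a constant 
factor (the distribution below is arbitrary; log bases and the constants k and 
K are matters of scaling):

    from math import log, log2

    P = [0.5, 0.25, 0.125, 0.125]                  # an arbitrary distribution
    shannon_bits = -sum(p * log2(p) for p in P)    # Shannon form with K = 1
    gibbs_nats = -sum(p * log(p) for p in P)       # Gibbs form with k = 1

    print(gibbs_nats / shannon_bits)    # ln 2, about 0.6931: a pure scaling factor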

(24)  "The answer to this question depends on whether Shannon entropies have 
properties consistent with thermodynamic entropies. Issues relevant to this 
question are: (a) Do both entropies behave the same way? and (b) are they both 
based on the same kinds of probabilistic assumptions? Neither is the case." 
[3, p. 178].

(25)  "The starting point for the Shannon entropy is an alphabet of symbols 
which have the capacity to convey information because they can be transmitted 
in alternative sequences. The entire ensemble of possible sequences can be 
abstractly assigned an "entropy", which measures the uncertainty connected with 
knowing a priori the sequence of elements in any given message. Shannon 
initially suggests (Shannon and Weaver 1949, p. 49) that information is carved 
from that entropic space. This much can at least be borne in a spirit of 
suspended judgment. But he slips immediately (p. 50) to assigning entropies to 
the symbols and messages themselves. Here he permanently parts company with 
statistical entropy." [2, p. 180-181].

(26)  "What allows us to assign a thermodynamic system an entropy is that any 
measurable macrostate in which it resides can be expressed in a variety of 
alternative microstates. Since these are all accessible by the system, there is 
an essential uncertainty in knowing its microstate at any instant. I would 
concur with Denbigh (1982) that there is nothing "subjective" about this 
uncertainty, that it belongs to the macrostate by virtue of its ensemble of 
microstates. A message, in contrast, cannot possess entropy. It is what has 
been said, a fait accompli." [3, p. 181].

(27)  "To appreciate the importance of restricting entropy to thermodynamic 
applications -- or, more broadly to applications in which a 
macrostate-microstate relationship obtains, one need only reflect on Weaver's 
remarks about the Shannon formulation making contact with a universal law. It 
does no such thing. Yet, as long as the term "entropy" buttresses the Shannon 
formula, the second law remains a steady source of justification for ideas that 
must find their own grounds of support. If there is a single generalized 
entropy concept manifoldly expressing itself, one might expect all 
exemplifications of it to have the property of increasing in time " [3, p. 187].

(28)  "When one heats up a book such as the Bible, the thermodynamic entropy 
associated with molecular motions of the paper constituting the pages of the 
Bible will increase but the informational entropy associated with the 
arrangement of letters in the Bible will not be altered until the temperature 
increases high enough to burn the Bible.  This thought experiment may be 
conveniently referred to as the Bible test." [4, Footnote c on p. 100].


 All the best.

Sung

--
Sungchul Ji, Ph.D.

Associate Professor of Pharmacology and Toxicology
Department of Pharmacology and Toxicology
Ernest Mario School of Pharmacy
Rutgers University
Piscataway, N.J. 08855
732-445-4701

www.conformon.net



References:
   [1]  Miller, J. G. (1978).  Living Systems.  McGraw-Hill Book Company, New 
York.
   [2]  Ji, S. (2012).  The Third Law of Thermodynamics and “Schroedinger’s 
Paradox” (http://www.conformon.net/?attachment_id=1033).  In: Molecular Theory 
of the Living Cell: Concepts, Molecular Mechanisms, and Biomedical 
Applications.  Springer, New York.  pp. 12-15.  PDF available at 
http://www.conformon.net under Publications > Book Chapters.
   [3]  Wicken, J. S. (1985).  Entropy and Information: Suggestions for Common 
Language.  Phil. Sci. 54: 176-193.
   [4]  Ji, S. (2012).  The Information-Entropy Relations.  In: Molecular 
Theory of the Living Cell: Concepts, Molecular Mechanisms, and Biomedical 
Applications.  Springer, New York.  pp. 97-101.



