Sent: Tuesday, 14 April 2015 at 21:51
From: "Helmut Raulien" <h.raul...@gmx.de>
To: biosemiot...@lists.ut.ee
Subject: [biosemiotics:8316] Re: A unified theory of the AMOUNT of information
Dear Sung, Stan, Edwina, List,
because it is about systems, information, and mind, I would like to offer my proposal for systematizing all that: I think that there are mainly two kinds of systems: reality systems and mind systems. They are distinguished by their system spaces: reality systems have for their space the real space consisting of the three dimensions x, y, z, while mind systems use an ideal space. Information goes from ideal-space systems to real-space systems, and perhaps again to an ideal (mind) space system.

Example: imagine two hydrogen atoms in an otherwise empty space. When they come into contact, they form a hydrogen molecule. But before they came into contact, they were already similar or even equal. Now consider this relation, "equality": where did it exist in real space? One cannot say. It cannot be located in one of them, because then the other is missing. It cannot be located in both of them, because those are two spots, not one locale. So the location of "equality" is not a locale in real space but in an ideal space. If we agree to replace "ideal" with "mind" (because a thing we have in mind also exists not in real space, but in the ideal space of our imagination), then this would be the proof that the universe has a mind. In a mindless universe there would be no equality, and a lack of virtually everything else too, including signs. I also think that there is no real contradiction between constructivism and realism. Each -ism just puts the emphasis either on reality systems or on mind systems, but both exist.
Best,
Helmut
 
 
Sent: Tuesday, 14 April 2015 at 14:47
From: "Stanley N Salthe" <ssal...@binghamton.edu>
To: biosemiot...@lists.ut.ee
Subject: [biosemiotics:8315] Re: A unified theory of the AMOUNT of information
Sung -- Replying to your:
So, if we are on the shore, we can see porpoises with our naked eye, as compared to through a telescope or a TV monitor.  But in all these cases, aren't we seeing signs of porpoises and not the porpoises themselves, because we don't have any porpoises in our head, do we?
 
S: The porpoises that we conclude are out there in the ocean are a result of various transformations and perceptions/interpretations made in our brains on the basis of sensations at the eye -- PLUS our learned knowledge that there are porpoises who look this way and do this and that.
 
STAN
 
On Mon, Apr 13, 2015 at 4:37 PM, Sungchul Ji <s...@rci.rutgers.edu> wrote:
Stan,
 
Thanks. That is a nice metaphor.
 
So, if we are on the shore, we can see porpoises with our naked eye, as compared to through a telescope or a TV monitor.  But in all these cases, aren't we seeing signs of porpoises and not the porpoises themselves, because we don't have any porpoises in our head, do we?
 
All the best.
 
Sung
 
On Mon, Apr 13, 2015 at 2:42 PM, Stanley N Salthe <ssal...@binghamton.edu> wrote:
Sung -- The submarine metaphor shows that a system transforms its input into its own mode of understanding; in this case, various machine modes.  That is, it doesn't 'see', e.g., porpoises, but instead electrical impulses on a screen.
 
STAN
 
On Mon, Apr 13, 2015 at 1:12 PM, Sungchul Ji <s...@rci.rutgers.edu> wrote:
Stan,
 
 
What is Maturana and Varela's submarine metaphor?
 
Sung
 
On Mon, Apr 13, 2015 at 10:02 AM, Stanley N Salthe <ssal...@binghamton.edu> wrote:

 

Sung -- Replying to your:

What does it mean to say

J: "the global information capacity of a macroscopic system must behave entropically when viewed internally."?

   S: It means that no matter how much information is obtained internally by a searching system, it will always require more, because its searching has itself disturbed, and changed, the environing system.  “Macroscopic” ensures that friction will be in play.  The Second Law states that the physical entropy of an isolated system will never decrease. My statement is that the informational entropy of an environing system will never decrease as a result of a search for it.

J: Is there any mathematical equation to go with this statement?

   S: No, I don’t do maths.

J: Can you give me an example illustrating your point?

   S: A hunter is searching for deer. He gently brushes aside a branch to get a better look.  A deer hears this slight motion and becomes alerted. (You could also recall Maturana & Varela’s submarine metaphor of searching for information in the ocean.)

    (An example could be vitiated by the searching system having a very limited practical goal, requiring little information to be obtained.)

 
On Sun, Apr 12, 2015 at 3:36 PM, Sungchul Ji <s...@rci.rutgers.edu> wrote:
Stan,
 
What does it mean to say

"the global information capacity of a macroscopic system must behave entropically when viewed internally."?

Is there any mathematical equation to go with this statement?

Can you give me an example illustrating your point?
 
Sung
 
 
 
 
On Sun, Apr 12, 2015 at 3:12 PM, Stanley N Salthe <ssal...@binghamton.edu> wrote:
Sung -- Among other things, you wrote:
 
(2)  Although both H and S share the same name "entropy", their meanings are not the same in that, e.g., S in isolated systems increases with time and temperature but H does not.  In other words, S obeys the Second Law of thermodynamics but H does not.  This is demonstrated in the thought experiment called the "Bible test" [2, see Footnote c in Table 4.3].
 
Here is a paper I wrote:
 

Journal of Ideas 1: 54-59, 1990

Sketch of a logical demonstration that the global information capacity of a macroscopic system must behave entropically when viewed internally.

(Journal defunct -- here is a slightly updated version, 2005)

S.N. Salthe

Abstract: This paper attempts to sketch out in what way macroscopic information must be entropic. If this can be shown, a larger science, of infodynamics, could subsume thermodynamics and information theory. It is crucial for these purposes that a finite observer be stipulated for all informational exchanges, and, in order to achieve the desired result, that the observer must be located inside the supersystem that contains the object systems it interprets.

Keywords: dissipative structure, hierarchy, internalism, semiotics, uncertainty

Thus, just as the Second Law holds ONLY under certain stipulations, informational entropy likewise behaves in the same manner -- 'entropically' -- under certain circumstances!

STAN

 
On Sun, Apr 12, 2015 at 2:50 PM, Sungchul Ji <s...@rci.rutgers.edu> wrote:
(If the table below is distorted, please see the PDF file attached.)
 
Hi, 
 
I am in the middle of finishing a paper in which I am proposing that what I call the "Planckian information", I_P, measured in bits, is a new measure of organization or order in atoms, enzymes, cells, brains, human societies, and the cosmos.  This proposal is based on our recent findings that the so-called Planckian distribution equation (see Footnote *** in Table 1 below) fits long-tailed histograms generated in atomic physics, molecular biology, cell biology, brain neuroscience, econophysics, and cosmic microwave background radiation physics.  
 
 

Table 1.  A unified theory of the amount of information (UTAI) carried by a sign:

    I = A log (B/C)

where A = proportionality constant, B = the number of possible messages available at the message source, and C = the number of messages selected.

  Field                      | Symbol | Name                          | A  | B                               | C
  ---------------------------+--------+-------------------------------+----+---------------------------------+---------------------------------
  Statistical mechanics      | S      | entropy (Boltzmann entropy)   | k  | Number of possible complexions* | Number of selected complexions*
  Communication theory       | H      | entropy (Shannon information) | -K | 1                               | P**
  Natural and human sciences | I_P    | Planckian information [1]     | 1  | AUC(PDE)***                     | AUC(GLE)***

*Understood here as the number of possible states at the microscopic (or micro) level of a system.
**The probability of a message being selected.
***AUC = area under the curve of the Planckian distribution equation (PDE), y = (a/(Ax + B)^5)/(exp(b/(Ax + B)) - 1), or of the Gaussian-like equation (GLE), y = A exp(-(x - mu)^2/(2*sigma^2)), where A is a free parameter.  I_P is thought to be a new measure of organization or order.
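
(To make the I_P row concrete, here is a minimal numerical sketch in Python. Only the two equations in footnote *** and the reading I_P = log2(AUC(PDE)/AUC(GLE)), i.e., A = 1 with a base-2 logarithm since I_P is measured in bits, come from the table; all parameter values and the integration range below are invented for illustration, not fitted values from [1].)

```python
# Sketch: computing the Planckian information I_P of Table 1.
# Equations follow footnote ***; every numerical parameter here is an
# illustrative assumption, NOT a fitted value from Ji (2015) [1].
import numpy as np
from scipy.integrate import quad

def pde(x, a=1.0, b=2.0, A=1.0, B=0.5):
    """Planckian distribution equation: y = (a/(Ax+B)^5) / (exp(b/(Ax+B)) - 1)."""
    u = A * x + B
    return (a / u**5) / (np.exp(b / u) - 1.0)

def gle(x, amp=0.05, mu=2.0, sigma=1.0):
    """Gaussian-like equation: y = A exp(-(x-mu)^2 / (2 sigma^2));
    'amp' plays the role of the free parameter A of footnote ***."""
    return amp * np.exp(-(x - mu) ** 2 / (2.0 * sigma**2))

# Areas under the two curves over an arbitrary illustrative range.
auc_pde, _ = quad(pde, 0.01, 20.0)
auc_gle, _ = quad(gle, 0.01, 20.0)

# Table 1, row 3: I = A log(B/C) with A = 1, B = AUC(PDE), C = AUC(GLE).
i_p = np.log2(auc_pde / auc_gle)
print(f"AUC(PDE) = {auc_pde:.4f}, AUC(GLE) = {auc_gle:.4f}, I_P = {i_p:.4f} bits")
```

(In practice one would presumably fit both the PDE and the GLE to the same long-tailed histogram and compare the two areas; the sketch only shows the arithmetic of the definition.)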

 

If the content of Table 1 is right, we can conclude with reasonable confidence that

 

(1) Statistical entropy S and Shannon entropy H can be viewed as instantiations or TOKENS of the more abstract definition of information given in the legend to Table 1, called the "Unified Theory of the Amount of Information" (UTAI), which may be viewed as the information TYPE.

 

(2)  Although both H and S share the same name "entropy", their meanings are not the same in that, e.g., S in isolated systems increases with time and temperature but H does not.  In other words, S obeys the Second Law of thermodynamics but H does not.  This is demonstrated in the thought experiment called the "Bible test" [2, see Footnote c in Table 4.3].

 

(3)  Information can be thought of as resulting from a selection process characterized by the ratio B/C, where B = the number of all possible choices and C = the number of choices actually selected.  (For example, selecting C = 1 message out of B = 8 equally likely messages yields I = A log(8/1), i.e., 3 bits with A = 1 and a base-2 logarithm.)

 

(4) Many have suggested that information has three distinct aspects -- quantity, meaning, and value.  UTAI deals only with the AMOUNT of information, not with its meaning or its value.

 

(5)  There are many kinds of information, just as there are many kinds of energy (chemical, electrical, gravitational, kinetic, potential, nuclear, solar, electromagnetic, etc.). Hence we can speak of Boltzmann's S as "molecular information", Shannon's H as "probability-dependent information (?)", and I_P as the Planckian information.  The meanings of these kinds of information would depend critically on the detailed mechanism of selection operating at the message source.

 

(6)  More generally, "information" can be defined as the correlation between the source (the 'object' in the language of Peircean semiotics) and the receiver (the 'interpretant') of a communication system.  The message carried by the messenger ('sign' or 'representamen') in the communication system can be identified with "information".  The net result of such a mediated process can be described as 'information flow' from the source to the receiver.
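
(As an aside on how the "correlation" in (6) might be quantified: one standard candidate, though not necessarily the one intended here, is Shannon's mutual information between the source variable X and the received variable Y. A small sketch with an invented joint distribution:)

```python
# Sketch: mutual information as one possible measure of source-receiver
# correlation. The joint distribution p(x, y) below is invented purely
# for illustration; nothing in the thread commits to these numbers.
import numpy as np

# p[x, y]: joint probability of source symbol x and received symbol y.
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])

px = p.sum(axis=1)  # marginal distribution of the source
py = p.sum(axis=0)  # marginal distribution of the receiver

# I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) * p(y)) )
mi = sum(p[i, j] * np.log2(p[i, j] / (px[i] * py[j]))
         for i in range(2) for j in range(2) if p[i, j] > 0)
print(f"I(X;Y) = {mi:.3f} bits")  # about 0.278 bits of correlation here
```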

 

(7) Just as the Peircean sign is an irreducible triad (i.e., it cannot be defined without all three nodes, object, representamen, and interpretant, connected by the three edges representing, I suggest, 'natural process', 'mental process', and 'information flow'), so I maintain that 'information' is another irreducible triad (of source, messenger, and receiver).

 

(8) The UTAI may be considered the 'quantitative functor' connecting the mathematical aspects of communication and semiotics.

 

(9) I predict that there is a 'qualitative functor' (based on the assumed principle of quantity-quality complementarity) that connects the qualitative aspects of communication and semiotics, and this qualitative functor may be identified with natural and formal languages.

 

Any questions, comments, or corrections would be appreciated.

 

All the best.

 

Sung

 

 

Reference:
   [1] Ji, S. (2015).  Planckian distributions in molecular machines, living cells: The wave-particle duality in biomedical sciences.  Proceedings of the International Conference on Biology and Biomedical Engineering, Vienna, March 15-17, 2015, pp. 115-137.  Uploaded to ResearchGate in March 2015.

   [2] Ji, S. (2012).  The Information-Entropy Relations. In: Molecular Theory of the Living Cell: Concepts, Molecular Mechanisms, and Biomedical Applications.  Springer, New York, pp. 97-101.
 
Sungchul Ji, Ph.D.

Associate Professor of Pharmacology and Toxicology
Department of Pharmacology and Toxicology
Ernest Mario School of Pharmacy
Rutgers University
Piscataway, N.J. 08855
732-445-4701

www.conformon.net
 
 