I am inclined to agree, Jerry, but I think your concept of entropy is too 
narrow. Thermodynamics has been subsumed under statistical mechanics, which is 
both more general and more powerful. Boltzmann grounded it in what he called 
the "complexions" of a system, by which he meant the distinct microscopic 
configurations compatible with a given macrostate. The entropy of a macrostate 
is maximal when all of its complexions are equally likely, in accord with the 
famous Boltzmann equation.
Heat capacities and the like can in principle be calculated from these (Maxwell 
made some pretty good progress during his lifetime), as can other special 
cases. But I would submit that anything that has Boltzmann complexions will 
have an entropy with all of the properties that are general to entropy (1st 
Law, 2nd Law, 3rd Law in particular). This is not true of "entropies" that are 
not based on physical complexions, most notoriously the Shannon entropy.
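
To make the maximality point concrete: Boltzmann's equation reads S = k log W, 
where W is the number of equally likely complexions. A minimal Python sketch 
(the function name and the four-outcome distributions are my own toy choices, 
not anything from Boltzmann or Shannon) illustrates the formal property the 
two quantities share, namely that the measure peaks at the uniform 
distribution:

    import math

    def entropy(p):
        # H(p) = -sum(p_i * log p_i), natural log, with 0 log 0 = 0
        return -sum(pi * math.log(pi) for pi in p if pi > 0)

    print(entropy([0.25, 0.25, 0.25, 0.25]))  # log 4 ~ 1.386, the maximum
    print(entropy([0.7, 0.1, 0.1, 0.1]))      # ~ 0.940, strictly smaller
    print(entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0, the minimum

Multiplying the uniform-case value by Boltzmann's constant k recovers 
S = k log W for W = 4. The three laws, of course, are nowhere in this 
arithmetic; that is precisely the gap at issue.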

However, I fear the battle to restrict 'entropy', at the very least, to cases 
that involve the three laws of thermodynamics (which, incidentally, are only 
true statistically) was lost a long time ago.

Regards,
John

-----Original Message-----
From: Jerry LR Chandler [mailto:jerry_lr_chand...@me.com] 
Sent: April 6, 2015 8:14 PM
To: A. Mani; Peirce List
Subject: Re: [PEIRCE-L] Re: What is information and how is it related to 
'entropy' ?

Dear Professor Mani:

Your post is an excellent example of how a unique scientific term, coined for 
an exact reason to be consistent with a particular theory, changes its meaning 
when adjectives that demand a separate meaning are added.

> neighbourhood systems,

> extensions to fuzzy sets

Neither of these meanings is related to thermodynamics. They are probably not 
related to temporal direction either, if my reading of your usage of:
> contribute to specific perspectives on understanding the ontology of 
> information semantics relative to the systems


is correct.

To understand these usages, one must have a grasp of the essential relations 
between mathematics and thermodynamics. Neither of these two terms relates to 
the essential nature of either heat capacity or temporal direction, does it?

The persistent attempt to extend the concept of entropy into a driving force 
for evolution/emergence is simply beyond the pale of scientific meaning. In 
this case, after the context of heat capacity and temporal direction is 
sacrificed, mathematics itself is left behind. Or do you see this otherwise?

My plea is simple.

If scientists wish to denote a new concept, they should coin a new word, or at 
least a phrase that places the meaning in context.

Of course, this plea will often fall on deaf ears.

My question to you is:

Is it possible to use a crisp form of hybrid logic to separate your meanings of 
entropy from thermodynamic entropy?

Cheers

Jerry




On Apr 6, 2015, at 3:06 PM, A. Mani wrote:

> On Sat, Apr 4, 2015 at 6:06 PM, Jon Awbrey <jawb...@att.net> wrote:
>> From a mathematical point of view, an "entropy" or "uncertainty" measure is
>> simply a measure on distributions that achieves its maximum when the
>> distribution is uniform. It is thus a measure of dispersion or uniformity.
>> 
>> Measures like these can be applied to distributions that arise in any given
>> domain of phenomena, in which case they have various specialized meanings
>> and implications.
>> 
>> When it comes to applications in communication and inquiry, the information
>> of a sign or message is measured by its power to reduce uncertainty.
>> 
>> The following essay may be useful to some listers:
>> 
>> http://intersci.ss.uci.edu/wiki/index.php/Semiotic_Information
> 
> 
> Adding to the discussion
> 
> 
> "entropy" has been extended to neighbourhood systems, granulations
> with the intent of capturing roughness and information uncertainty in
> rough set theory. There are extensions to fuzzy sets as well. These
> measures essentially contribute to specific perspectives of
> understanding the ontology of information semantics relative the
> systems
> 
> The measures implicitly assume a frequentist position - the
> probabilistic connections are not good enough. When fuzzy granulations
> are used, then the interpretation (by analogy with probabilistic
> idealisation) breaks down further.
>
> Regards
> 
> A. Mani
> 
> 
> 
> Prof(Miss) A. Mani
> CU, ASL, AMS, ISRS, CLC, CMS
> HomePage: http://www.logicamani.in
> Blog: http://logicamani.blogspot.in/
> http://about.me/logicamani
> sip:girlprofes...@ekiga.net
> 



-----------------------------
PEIRCE-L subscribers: Click on "Reply List" or "Reply All" to REPLY ON PEIRCE-L 
to this message. PEIRCE-L posts should go to peirce-L@list.iupui.edu . To 
UNSUBSCRIBE, send a message not to PEIRCE-L but to l...@list.iupui.edu with the 
line "UNSubscribe PEIRCE-L" in the BODY of the message. More at 
http://www.cspeirce.com/peirce-l/peirce-l.htm .



