SM is statistical mechanics. I don’t recall Peirce ever discussing it, though 
it was well known in his time, and it was proven beyond a doubt by Einstein’s 
explanation of Brownian motion in 1906. Before that, many French theorists 
rejected it because atoms and molecules were not observable.

I think that for some time now most physicists have agreed that order emerges 
from disorder, along the lines outlined by Prigogine (he won the Nobel prize, 
after all). David Layzer’s 1990 Cosmogenesis makes the idea pretty clear on a 
cosmic scale. I applied the idea to biological information systems in the first 
paper in the journal Biology and Philosophy in 1986. At the time it was a bit 
adventurous, but not especially so. Entropy production is behind the formation 
of order; order doesn’t just happen on its own. The chance aspects of entropy 
production are crucial to the emergence of order, but the overall trend is 
always toward increasing disorder. Personally, I think that all thirdness 
originates this way, through symmetry breaking, and I wrote an article on that, 
“Information Originates in Symmetry Breaking” (1996). I did not see it as 
confirming Peirce’s ideas about habit formation, and I still suspect that he 
simply goofed on this whole issue for lack of an understanding of SM.

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Clark Goble [mailto:cl...@lextek.com]
Sent: Thursday, 06 April 2017 7:39 PM
To: Peirce-L <PEIRCE-L@list.iupui.edu>
Subject: Re: [PEIRCE-L] Sign as Triad vs. Correlate of Triadic Relation (Was 
semantic problem with the term)


> On Apr 6, 2017, at 12:03 AM, John Collier <colli...@ukzn.ac.za> wrote:
>
> There is still an understanding gap between QM and SM, largely due to the 
> fact that the theory of QM is deterministic. I have heard good scientists say 
> that QM is the basis of entropy, but I don’t find their arguments sound.

I’d tend to agree that reconciling QM with TD hasn’t been well thought through. 
I’m not sure that entails that TD doesn’t apply (not that you are making that 
claim; I’m just emphasizing the distinction).

What do you mean by SM?

> I don’t think I agree with Edwina that firstness is entropic, though in some 
> cases it can be.

I took her to be making that claim only in a narrow area of inquiry.

> I think it is important to distinguish between chance and randomness. Peirce 
> focuses on chance. Chance events can be deterministic on the larger scale, 
> such as when we have a chance meeting with a friend in the store. Nothing in 
> what determines that I will be in the store at that time is coordinated with 
> my friend’s determinants, except that these determinants become coordinated 
> when we meet. Without both stories together, the meeting is chance; but it is 
> not random in the technical sense, since the stories together can be 
> compressed to mark our meeting. I call situations like this relative 
> randomness: two histories are not individually sufficient to predict a common 
> event; neither contains enough information to compute this event, but the two 
> stories together do, assuming determinism.

This is more or less what I was getting at. The combination of chance and 
determinism can lead to unique situations, as Peirce argues happens with mind. 
I want to address Jon’s point about distinguishing between chance and what we 
might call variants of agency. I think a fair bit of work has been done on that 
in the free will literature. I’m not sure, though, that Peirce draws the 
distinctions we’ve seen in the last 20-30 years of that literature. (Not that 
we should expect him to.)

I’ll probably not get to Jon’s answer until later though.

> In any case, I don’t see the divergence Clark apparently sees in the use of 
> the concept of entropy.
>

I’m not quite sure what you mean by that. I was just speaking of how, for 
Peirce, the universe crystallizes into a system of higher information than was 
there at the beginning. Peirce’s solution is just to say that TD applies only 
to the determinate part of a system. That is, he doesn’t see entropy as a 
universal law, but as a much more limited one. Is that more or less what you’re 
agreeing with, or are you agreeing with me that such a claim is problematic for 
most physicists? Could you clarify a bit here?

