On Saturday, July 13, 2002, at 03:33  PM, Hal Finney wrote:

> Causality is a fascinating topic and one I hope to learn more about,
> perhaps via the Pearl book Tim mentioned or another that someone might
> recommend.  Here are some very random and disorganized thoughts on the
> topic.

I may order the Jim Joyce book that Wei Dai is reading...the material 
overlaps a lot of the Pearl material, based on the Amazon summary and 
sample pages, but it never hurts to have two views of the same material. 
BTW, Pearl has a discussion of Newcomb's Paradox and Joyce even devotes 
an entire chapter to it (based on the TOC Amazon shows).

(I haven't commented on Newcomb's Paradox. I read Martin Gardner's 
column on Robert Nozick's treatment of NP when it first came out, around 
1973. Fascinating then, fascinating now. But there are a thousand views 
of Newcomb's Paradox in the naked city, to coin a phrase.)

> Googling on "causality" led to a brief web page that summarized
> 2000 years of philosophical thought in a few paragraphs,
> http://www.helsinki.fi/~mqsalo/documents/causalit.htm.  Generally there
> seem to be two schools of thought; one is that causality is an artifact
> of our minds' efforts to understand and interpret the world; the other
> is that causality has objective reality.

One reason I like the recent scientific papers on causality, light 
cones, universes, toposes, etc. is to move beyond the b.s. college bull 
sessions about determinacy and suchlike.

> The modern philosopher's definition of causality has always struck me as
> somewhat backwards: "The cause of any event is a preceding event without
> which the event in question would not have occurred."  This means that
> "A causes B" is equivalent to A happening before B, plus the logical
> proposition, "if B then A".  You'd think that A causes B would mean "if
> A then B".  But philosophers interpret it to mean "if not-A then not-B",
> as stated above (I've read this elsewhere also), which is equivalent to
> "if B then A".

That's the contrapositive, always true as stated. ("If not-A then not-B" 
is logically the same as "if B then A.")
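A quick Python sketch (my own toy check, not from Pearl or Joyce) that exhaustively verifies the equivalence Hal states, and that the "heavy rains, no flooding" case is indeed consistent with it:

```python
from itertools import product

def implies(p, q):
    # Material implication: "if p then q" is false only when p holds and q fails.
    return (not p) or q

# Over all four truth assignments, "if not-A then not-B" agrees with "if B then A".
for a, b in product([False, True], repeat=2):
    assert implies(not a, not b) == implies(b, a)

# The assignment A=True, B=False (heavy rains, no flooding) satisfies the
# definition, which is exactly the oddity Hal points out below.
print(implies(not True, not False))  # True
```

So the philosopher's definition really does tolerate rains without flooding, as Hal says.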

> So if we say "heavy rains cause flooding" we mean that if there were no
> heavy rains then there would be no flooding; or equivalently, if there
> is flooding then there must have been heavy rains.  This is consistent
> with there being heavy rains and no flooding.  It really doesn't make
> sense to me.

Pearl spends a lot of time discussing these sorts of issues.

I regret that I can't summarize in a few paragraphs what it clearly 
takes Pearl much discussion to explicate.

> The problem with judging causality is the well known cautionary rule
> that "correlation is not causation".  Just because A and B always
> occur together, and A comes before B, that doesn't mean that A causes B.
> This is a common danger that scientific researchers have to guard 
> against.
> It's easy to observe correlation but hard to determine what is the
> true cause.  This is a reason to view causality as just a matter of our
> perceptions.

I don't think so. Causality is a lot more than just a matter of our 
perceptions.

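A small simulation (my toy example, with made-up probabilities) of why correlation alone misleads: here A and B are both driven by a hidden common cause C, so they co-occur far more often than chance, yet neither causes the other.

```python
import random

random.seed(0)
n = 10_000
c = [random.random() < 0.5 for _ in range(n)]       # hidden common cause
a = [ci or (random.random() < 0.05) for ci in c]    # A mostly tracks C
b = [ci or (random.random() < 0.05) for ci in c]    # B mostly tracks C

pa = sum(a) / n
pb = sum(b) / n
p_both = sum(ai and bi for ai, bi in zip(a, b)) / n
print(f"P(A)P(B) = {pa * pb:.3f}, P(A and B) = {p_both:.3f}")
# P(A and B) far exceeds P(A)P(B): strong correlation, no arrow A -> B.
```

Intervening on A (forcing it on or off) would leave B's frequency unchanged, which is roughly how Pearl's framework distinguishes the two cases.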
> A strange aspect of causality is that to a large degree our laws of
> physics appear to be time symmetric.  This means that you can reverse 
> the
> time sequence of events and get a physically valid (although possibly
> unlikely) scenario.  If that is so, then if A causes B in the forward
> direction, can we with equal validity say that B causes A when we look
> at things backwards?  In that case the causal roles are fundamentally
> arbitrary.

Only in fairly simple mechanical situations. A two urn experiment with 
one urn filled with white balls and the other urn filled with black 
balls is the classic illustration of increasing entropy expressed in 
terms of "more ways to have things mixed than to have things all of one 
color." Locally, with just a couple of balls, the process is largely 
reversible. But not globally.
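The two-urn picture can be sketched in a few lines of Python (parameters are mine, chosen just for illustration): start fully sorted, repeatedly swap a random ball between urns, and watch the system drift toward mixed states, simply because there are vastly more of them.

```python
import random

random.seed(1)
N = 50
urn0 = ["W"] * N   # urn 0 starts all white
urn1 = ["B"] * N   # urn 1 starts all black

for _ in range(5000):
    i, j = random.randrange(N), random.randrange(N)
    urn0[i], urn1[j] = urn1[j], urn0[i]   # swap one ball between the urns

black_in_0 = urn0.count("B")
print(black_in_0)  # typically near N/2 = 25: mixed states dominate
```

Each individual swap is perfectly reversible; it's the count over all arrangements that makes a spontaneous return to the sorted state astronomically unlikely.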

There's a large literature on why time reversal is meaningful locally, 
but not globally. The usual example is gas expanding from a cylinder to 
fill a room, versus the reversed image of the gas moving back into the 
cylinder.

(A couple of books: "The Physics of Time Asymmetry," P.C.W. Davies, 
1974, 1977, and "Asymmetries in Time," Paul Horwich, 1987.)

> Of course in most situations we have a strong arrow of time based on
> the growth of entropy which allows us to break this apparent symmetry.
> But this is not always the case.  Systems in thermodynamic equilibrium,
> if considered in isolation, have no arrow of time.

Yes, they look like random things bouncing around, which is what they 
are.  Most "interesting things" involve temperature differences, energy 
inputs, life surrounded by nonlife, and such.

I don't claim to know the nature of time, or what produces the "arrow of 
time." It's not a resolved question.

Several of the "physics of information" conference papers, especially 
ones by W. Zurek and Charles Bennett, are useful.

(But, before I tackle the issue at a deeper level, I'm trying to learn 
the language of what I think temporal evolution is: time's arrow being 
one of the arrows of category theory, hence my interest, or one 
additional reason for it.)

> We can look at a
> microscopic part of such a system and see fluctuations which appear to
> be describable in causal terms.  For example, a temporary void forms
> randomly, causing matter at the edges to move towards the center.
> Or in reverse we say that matter moved away from the center, causing a
> void to form.  Both are equally valid ways of describing the situation,
> indicating that there is no true causality.

Physicists would say that a lot of these explanations of random 
phenomena in terms of "voids causing matter to move..." are just abuses 
of language. Comparable to financial market babblers saying things like 
"And then at 2 pm the market sensed an oversold situation and shifted 
its buying to tech stocks, causing a late rally in the NASDAQ." 
Nonsense, mostly.

(Rest of Hal's post, no time to think about and respond to tonight.)

--Tim May
(.sig for Everything list background)
Corralitos, CA. Born in 1951. Retired from Intel in 1986.
Current main interest: category and topos theory, math, quantum reality, 
Background: physics, Intel, crypto, Cypherpunks
