Sorry about the delayed response -- got busy over the holidays and kind
of neglected the email.
It certainly seems clear that you're right.
Now I'm sitting here, trying to recall the details of what Shoenfield
did with this that convinced me there was some unprovable assumption
buried in there. I think it was in his 'Mathematical Logic' that I
encountered the idea that the rules of inference were something which
needed assuming, but unfortunately I lent my copy to someone a couple
of years back and then I moved out of the country. Sigh...
Horace Heffner wrote:
Following is a slightly more formal version of the proof.
On Dec 19, 2007, at 8:57 AM, Stephen A. Lawrence wrote (in the WAY_OT:
"The Mindless crap shoot of evolution" thread):
Something I encountered with surprise when I first studied logic is
that the primary rule of inference:
((A => B) & A) => B
is an assumption, not a theorem.
This depends on the definition of A => B. As used in the above
assertion, (A=>B) has a truth value, otherwise the operation "&" is not
closed. Thus the proposition A=>B can be defined as a function:
(A=>B) == f(A,B)
where f(A,B) can be defined by:
A  B  f(A,B)
F  F    T
F  T    T
T  F    F
T  T    T
or more formally, we can define that
(A=>B) == f(A,B) == (A' + B) == ((not A) or B),
which we can be more comfortable about by looking at an expanded table:
A  B  f(A,B)  A'  (A' + B)
F  F    T     T      T
F  T    T     T      T
T  F    F     F      F
T  T    T     F      T
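The table above can also be checked mechanically. Here is a minimal
Python sketch of the same comparison (the function name `implies` is
mine, not part of the proof):

```python
# Check that f(A, B), defined row by row in the truth table,
# agrees with (not A) or B on all four rows.
def implies(a, b):
    """Material implication A => B, defined as (not A) or B."""
    return (not a) or b

# Rows of the table: (A, B, f(A, B))
table = [
    (False, False, True),
    (False, True,  True),
    (True,  False, False),
    (True,  True,  True),
]

for a, b, expected in table:
    assert implies(a, b) == expected, (a, b)
print("f(A, B) and (not A) or B agree on all four rows")
```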
where we see the f(A,B) and (A' + B) columns are identical. Now we
apply the definitions and some readily proved common theorems in
order to provide a more formal proof, i.e. one that does not require
inspecting a table.
Now, if ((A => B) & A) => B is true for every choice of (A,B) then, by
definition, the proposition
((A => B) & A) => B == f([(A=>B) & A], B)
must be true for every possible choice of A,B.
Substituting (A'+B) for (A=>B) we have:
f([(A=>B) & A], B) = f([(A'+B) & A], B)
Distributing A:
f([(A'+B) & A], B) = f([(A' & A) + (A & B)], B)
But (A' & A) = F, by the law of non-contradiction, so
f([(A' & A) + (A & B)], B) = f([F + (A & B)], B)
but F + X = X for all X, so
f([F + (A & B)], B) = f((A & B) , B)
Again applying the definition of f(X,Y) == (X' + Y) we substitute to obtain:
f((A & B) , B) = (A & B)' + B
By De Morgan's law (A & B)' = A' + B', so, substituting:
(A & B)' + B = (A' + B') + B
Reassociating:
(A' + B') + B = A' + (B' + B)
But (B' + B) = T by the law of the excluded middle, so
A' + (B' + B) = A' + T
but X + T = T for any X, so
A' + T = T
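Each identity used in the chain above can be spot-checked over all
Boolean values. A small Python sketch (variable names are mine):

```python
# Verify the three key identities used in the derivation,
# for every choice of A and B.
for a in (False, True):
    for b in (False, True):
        not_a, not_b = not a, not b
        # De Morgan's law: (A & B)' = A' + B'
        assert (not (a and b)) == (not_a or not_b)
        # Law of non-contradiction: A' & A = F
        assert (not_a and a) == False
        # Law of the excluded middle: B' + B = T
        assert (not_b or b) == True
print("all identities hold")
```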
We have thus proven (by the transitive property of equivalence
relations) that the proposition
f([(A=>B) & A], B) = T
is true for every choice of (A,B), so back substituting
((A => B) & A) => B == f([(A=>B) & A], B) (which can be substituted
in either direction because equivalence relations are symmetric) we
have the proposition
((A => B) & A) => B = T
i.e. the proposition ((A => B) & A) => B is true for every possible
choice of A,B.
Q.E.D.
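The conclusion can also be confirmed by brute force over all four
truth assignments; a quick Python sketch (the name `implies` is mine):

```python
# Brute-force check that ((A => B) & A) => B is a tautology.
def implies(a, b):
    # Material implication, defined as (not A) or B.
    return (not a) or b

tautology = all(
    implies(implies(a, b) and a, b)
    for a in (False, True)
    for b in (False, True)
)
print(tautology)  # True: the proposition holds for every choice of A, B
```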
Horace Heffner
http://www.mtaonline.net/~hheffner/