On 29 jun, 02:13, "Jesse Mazer" <[EMAIL PROTECTED]> wrote:
> LauLuna wrote:
> >For any Turing machine there is an equivalent axiomatic system;
> >whether we could construct it or not, is of no significance here.
> But for a simulation of a mathematician's brain, the axioms wouldn't be
> statements about arithmetic which we could inspect and judge whether they
> were true or false individually, they'd just be statements about the initial
> state and behavior of the simulated brain. So again, there'd be no way to
> inspect the system and feel perfectly confident the system would never
> output a false statement about arithmetic, unlike in the case of the
> axiomatic systems used by mathematicians to prove theorems.

Yes, but that is not the point. For any Turing machine that performs mathematical tasks there is also an equivalent mathematical axiomatic system; if we are sound Turing machines, then we could never know that mathematical system to be sound, even though its axioms are the same ones we accept.
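The equivalence being invoked can be made concrete. Any machine that enumerates mathematical statements determines a formal system whose theorems are exactly the machine's outputs (informally, Craig's trick: take each enumerated sentence as an axiom). A minimal sketch, with a toy stand-in machine of my own invention rather than anything from the thread:

```python
from itertools import count, islice

def machine_outputs():
    """A toy Turing-machine stand-in: enumerates true addition facts,
    one sum at a time, in order of the total."""
    for total in count(0):
        for m in range(total + 1):
            yield f"{m}+{total - m}={total}"

# Craig's trick, informally: take each enumerated sentence as an axiom.
# The resulting axiomatic system proves exactly what the machine outputs,
# so machine and system are "equivalent" in the sense discussed above.
axioms = list(islice(machine_outputs(), 10))
print(axioms)
```

The point of the argument is that for a machine simulating a mathematician, this derived axiom set need not be surveyable in the way ordinary arithmetic axioms are, even though (by the equivalence) it proves the same theorems.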

And the impossibility has to be a logical impossibility, not merely a technical or physical one, since it follows from Gödel's theorem. That's a bit odd, isn't it?


You received this message because you are subscribed to the Google Groups 
"Everything List" group.