On 25 Jul 2011, at 21:48, Craig Weinberg wrote:
On Jul 25, 1:57 pm, Bruno Marchal <marc...@ulb.ac.be> wrote:
On 25 Jul 2011, at 15:23, Craig Weinberg wrote:
If they can only function for a few minutes, then that function may
not be 'normal' to anything except us as distantly removed observers.
This is like saying that a plane which crashes at 10 pm was not really
flying before the crash, or that a car which would break down at 120 km/h is not really riding when at a lower speed.
Not exactly. I'm saying that a plane that is crashing might be a sign
that the engines don't work.
That is beside the point. It does not change the fact that the plane
was flying before crashing, just as the neurons did their job correctly
before the trouble came from the absence of a nucleus (in that thought
experiment).
If it's an experimental plane that has
never flown before, it might be a sign that the theory behind the
design can never work.
That is another thought experiment. Here the plane was flying. The
fact that it crashes obviously shows that there is a problem with the
plane, but not that it was not flying before the crash.
If you copy the shape of a plane and observe
that the turbines spin because the plane is going fast through the
air, you might assume that the plane doesn't need engines at all,
build a plane without them, drop it from 30,000 feet, and assume that it's going to fly.
You assume, or talk as if you were assuming, that the level is infinitely low.
Is the substitution level of fire infinitely low? Think of
consciousness like fire. It's a potential that already exists in many
materials under specific conditions, but it cannot be emulated by a computer.
But then you have already introduced something infinite in the body.
You are saying that *all* finite approximating emulations will be zombies.
The whole premise of a substitution level for consciousness
presumes something that I reject, because I see that awareness cannot
be anything other than an inherent potential of all physical matter,
You see that? Or do you assume that?
so it's just a matter of deciding how closely you want that
awareness to be to our own. You might be able to synthesize emotion-
level experiences on silicon,
Even with comp, it is a manner of speaking to say that we synthesize
emotion. The machine will just make it possible for them to be
relatively manifested, in the same way that a computer does not synthesize numbers.
but I have no reason to assume that.
A reason is to avoid arbitrary infinities. In 18th-century
France, Mechanism was considered synonymous with rationalism (Diderot).
Instead I see that thought is relatively easy to mimic with 3-p a-
signifying symbol manipulations, whereas understanding and feeling
cannot be accessed symbolically, because they are 1p phenomena which do
not arise from a function but are the interior sense of a physical substance.
But if you "simulate" arithmetical self-reference, you do get 1p
phenomena which are per se not simulable, because they rely on the
whole nature of the arithmetical reality.
You have a good intuition here, but you are not applying it validly.
Indeed, only in that case can you affirm systematically
(to block the consequences of the thought experiment) that "there
may be a lot more input which we have no way to understand from our
perceptual distance which gets amputated".
To make the level infinitely low is a way to introduce an infinite
complexity which, if well chosen, can contradict the "natural"
infinities we get from the computationalist assumption. The "well
chosen" can be very complex. You might need to diagonalize against the
whole of computer science.
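(As an aside, the diagonalization alluded to above is Cantor-style: given any enumeration of total functions, one can construct a function that differs from every function in the list, so no enumeration captures them all. A minimal illustrative sketch, not from the original email; the names `diagonalize`, `fs`, and `g` are only for illustration:

```python
# Cantor-style diagonalization: given an enumeration f_0, f_1, ...
# of total functions, build g(n) = f_n(n) + 1, which differs from
# every f_i at input i, so g cannot appear anywhere in the list.

def diagonalize(enumeration):
    """Return a function g that differs from enumeration[i] at input i."""
    def g(n):
        return enumeration[n](n) + 1
    return g

# A toy finite "enumeration" standing in for f_0, f_1, f_2, ...
fs = [lambda n: 0, lambda n: n, lambda n: n * n]

g = diagonalize(fs)
for i, f in enumerate(fs):
    assert g(i) != f(i)  # g escapes every function in the list
```

The same argument, applied to an enumeration of all partial computable functions, is what makes "diagonalizing against the whole of computer science" such a strong requirement.)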
All that for not bringing a steak to my son-in-law, who survived an
otherwise fatal brain cancer with a computer?
It's not that the level is infinitely low, it's that there is no level at all.
This is equivalent.
Everything from chlorophyll to starfish to Britney Spears has
the same essential ability to be what it is
OK. I mean I can accept that.
and sense what it is to be
If chlorophyll can sense what it is, why do I need a brain to know
when I am hungry? Why wouldn't my stomach be enough?
It does not make much sense to me to say that a chlorophyll molecule
can sense what it is. I have no doubt that it can interact with light,
and sense it in some weak sense, but I don't see anything like an
ability of self-reference.
A computer is no different, except that the software you
impose on it is not its own software, so it has no idea that its
normal routines are being hijacked for a purpose it can never
understand. It doesn't mean you can't run a simulation of Britney
Spears on a chip, and it may fool everyone including Britney Spears
and your son-in-law, but the chip has no idea what Britney Spears is.
What makes you sure that my brain has any idea who I am?
Anyway, no level means you need infinities to singularize identity.
You also get zombies for *all* finite approximations of those
infinities. But then why not: comp cannot be proved; it only entails
ad hoc infinities and zombies.
You received this message because you are subscribed to the Google Groups
"Everything List" group.
To post to this group, send email to email@example.com.