Hi Stathis,

RE: Zombie Room
The zombie room is now in a paper on solipsism, currently in review, and I
expect it will be rejected in due course! :-) Over XMAS I hope to catch up on
all my mail. It's proven to be a really useful cross-modal thought
experiment because it renders a human 'methodologically zombied' without
having to assume physiological alteration (just life-long entrapment!).

The room, including its inhabitant Marvin, is a zombie from the outside.
If the 'room' were actually inside a giant lifelike robot human, its
animation by the zombie room would be pathetic. The robot would have no
clue where it was, would learn nothing, and would look nothing remotely
like normal. Meanwhile Marvin
inside can do perfectly good 'zombie room' science.

RE: Computer Pain
There's a whole axis of modelling orthogonal to the soma membrane which
gets statistically abstracted out by traditional Hodgkin/Huxley models. The
neuron becomes geometry-less (except when the HH model is made into
'cable'/compartmental equivalents for longitudinal transmission). The new
modelling I am doing undoes what HH did. There's nothing wrong with the HH
approach - it just throws away all the experience components, which are
physics occurring at right angles to the HH propagation.
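To make the abstraction concrete, here is a minimal sketch of the classic single-compartment Hodgkin/Huxley model (standard squid-axon parameters, forward-Euler integration). This is the geometry-less picture described above: the membrane is a point, and everything spatial about the field is averaged into conductances. It is an illustration of the textbook model, not of my new modelling.

```python
import math

# Single-compartment Hodgkin-Huxley model, standard squid-axon parameters.
C_m = 1.0                             # membrane capacitance, uF/cm^2
g_Na, g_K, g_L = 120.0, 36.0, 0.3    # max conductances, mS/cm^2
E_Na, E_K, E_L = 50.0, -77.0, -54.387  # reversal potentials, mV

# Voltage-dependent gate rate functions (1/ms, V in mV).
def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - math.exp(-(V + 40.0) / 10.0))
def beta_m(V):  return 4.0 * math.exp(-(V + 65.0) / 18.0)
def alpha_h(V): return 0.07 * math.exp(-(V + 65.0) / 20.0)
def beta_h(V):  return 1.0 / (1.0 + math.exp(-(V + 35.0) / 10.0))
def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - math.exp(-(V + 55.0) / 10.0))
def beta_n(V):  return 0.125 * math.exp(-(V + 65.0) / 80.0)

def simulate(I_ext=10.0, t_max=50.0, dt=0.01):
    """Membrane-voltage trace (mV) under constant injected current."""
    V = -65.0
    m, h, n = 0.05, 0.6, 0.32        # approximate resting gate values
    trace = []
    for _ in range(int(t_max / dt)):
        I_Na = g_Na * m**3 * h * (V - E_Na)
        I_K = g_K * n**4 * (V - E_K)
        I_L = g_L * (V - E_L)
        m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
        h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
        n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
        V += dt * (I_ext - I_Na - I_K - I_L) / C_m
        trace.append(V)
    return trace

trace = simulate()
print(max(trace))   # spikes overshoot above 0 mV
```

Note that nothing in this code refers to the field in the space around the membrane; that is precisely the axis being abstracted away.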

The electric field, which is massive and intricately choreographed, cannot
be eliminated. It must be fully expressed in space just as a neuron does.
All the chemical complexities don't matter except insofar as they serve to
manipulate the electric field in space. My EC calculus says why 'it's like
something' to be these fields (again, I am still working on this!).
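As a toy illustration of a field being "fully expressed in space", here is the standard volume-conductor (point-source) approximation for the extracellular potential, phi(r) = sum_i I_i / (4*pi*sigma*|r - r_i|). The conductivity and currents below are order-of-magnitude placeholders, not fitted values, and this is a generic textbook approximation rather than the EC calculus itself.

```python
import math

SIGMA = 0.3  # extracellular conductivity, S/m (illustrative value)

def potential(r, sources):
    """Superpose point-source contributions at observation point r.
    sources: list of ((x, y, z) in metres, current in amps)."""
    phi = 0.0
    for pos, I in sources:
        phi += I / (4.0 * math.pi * SIGMA * math.dist(r, pos))
    return phi

# Two equal-and-opposite sources (a current dipole): the potential is
# non-zero almost everywhere in the surrounding space.
sources = [((0.0, 0.0, 0.0), 1e-9), ((0.0, 0.0, 1e-4), -1e-9)]
print(potential((1e-3, 0.0, 0.0), sources))
```

On the plane equidistant from both sources the two contributions cancel exactly, which is a quick sanity check on the superposition.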

In my model, for very specific reasons of physics, computationalism,
functionalism, representationalism, eliminativism are all false. This does
not mean that there is no 'abstract computation', no 'functional
structure', no 'representation'.... it just means that these things are
not _causal_ of phenomenal consciousness - they merely serve to manipulate
it appropriately for the purposes of cognition.

So - 'action potentials' will be an emergent feature of the physics, not
modelled. All the chemistry manipulating the membrane conductance (and
therefore its effectiveness as a dielectric) is eliminable in favour of
simpler gating mechanisms. You do not have to have a cytoskeleton - merely
a dielectric. Synapses are all irrelevant. They are constructed so as not
to interfere with the overall soma field expression (by the way -
astrocytes are more important in this than neurons!)...... except that in
some heavy mass-synaptic firing conditions - maybe in the cerebellum there
may be field effects in the dendritic trees that are 'like something'....
not sure yet.... but my bet is no.
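A hypothetical sketch of what "simpler gating mechanisms" could mean: replace the full kinetic chemistry with a single voltage-dependent conductance whose only job is to modulate the membrane's effectiveness as a dielectric. The Boltzmann parameters below are placeholders of my own choosing, not measured values and not a claim about the actual model.

```python
import math

def gate_open_fraction(V_mV, V_half=-40.0, slope=5.0):
    """Steady-state open probability of a two-state voltage sensor
    (Boltzmann form). V_half and slope are illustrative placeholders."""
    return 1.0 / (1.0 + math.exp(-(V_mV - V_half) / slope))

def membrane_conductance(V_mV, g_max=36.0, g_leak=0.3):
    """Total membrane conductance (mS/cm^2): leak plus gated component.
    Higher conductance means a leakier, less effective dielectric."""
    return g_leak + g_max * gate_open_fraction(V_mV)

print(membrane_conductance(-80.0))  # near rest: mostly leak
print(membrane_conductance(0.0))    # depolarised: gate largely open
```

The point of the reduction is that, on this view, only the resulting conductance (and hence the field geometry) matters, not the chemical machinery that produces it.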

Also, by implementing 'virtual circuits' orthogonal to the membrane,
dependent on synchrony/asynchrony in 3D space, the soma fields have a role
in learning: they maintain channels down which chemistry (along with
chemistry from other sources) flows to modify synapses and build new
neural hardware/shape changes.
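One conventional way to quantify the synchrony/asynchrony that would gate such virtual circuits is a population phase-coherence index (the Kuramoto order parameter). This is a generic measure offered as an illustration, not something taken from the model; the phase values are made up.

```python
import cmath
import math

def order_parameter(phases):
    """Return R in [0, 1]: 1 = perfect phase synchrony, ~0 = asynchrony.
    phases: iterable of oscillator phases in radians."""
    phases = list(phases)
    z = sum(cmath.exp(1j * p) for p in phases) / len(phases)
    return abs(z)

synchronous = [0.10, 0.12, 0.09, 0.11]                    # tightly clustered
asynchronous = [0.0, math.pi / 2, math.pi, 3 * math.pi / 2]  # evenly spread
print(order_parameter(synchronous))    # near 1
print(order_parameter(asynchronous))   # near 0
```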

So overall: unless the computational substrate has this stuff built into
it there will be no experiences. End of story. My PhD will eventually
contribute to an experiment to prove this. I have to build new chips though.

Nature has made a wonderful, amazing piece of kit in the neuron/astrocyte.
We have barely begun to get at what it is really doing. Unjustified
computationalist/functionalist hubris and assumption based on simplistic
models from >50 years ago will not do.

Which is why I find the idea of computer pain so odd.


Colin Hales

> Hi Colin,
> I thought you'd react in this way. It is a prediction
> of computationalism that running certain lines of code
> should generate pain (and every other type of
> experience). I realise it seems absurd when put like
> this, but there you have it.
> I very much doubt that a superficial or top-down
> copy of an organism in pain
> (which would be very easy to build) would actually
> experience pain, but a bottom-up copy, an emulation
> of the individual neurons which resulted in
> behaviour similar to the original organism...
> I find it as difficult to imagine such a being not
> being conscious as a fellow organic human not being
> conscious. But I certainly don't expect the "pain code"
> for such a being to be anything like what you have
> indicated below.
> If you believe that computer emulation of neural tissue
> behaviour will fail, at which step do you think it will
> fail? The action potential, the cytoskeleton,
> the effect of the neurotransmitters at the synapses, or where?
> Also, you never explained if Marvin + machine is a zombie or not.
> Stathis Papaioannou

 You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]