On Fri, Aug 19, 2011 at 11:39 PM, Craig Weinberg <whatsons...@gmail.com> wrote:

> 1. Visual qualia does not 'occur in' the visual cortex. Visual qualia
> overlaps phenomenologically with the electromagnetic activity in the
> cortex (which is really spread out in different areas of the brain),
> but if visual qualia were actually present 'in' the brain in any way,
> then we would have no trouble simply scooping it out with a scalpel.
> That's a fundamental insight. If you don't accept that as fact, then
> we have nothing to discuss.

Qualia are not present as a substance which can be scooped out,
although scooping out a particular part of the brain will destroy the
qualia, and hence you can figure out which parts of the brain are
involved in their production.

> 2. There is no such thing, objectively, as 'information'. It has no
> properties, so to say that the 'visual cortex, which is fed
> information from downstream structures such as the retina and optic
> nerve' is really a way of saying 'we have no idea how visual qualia is
> transmitted, but we know that it must be, because the visual cortex is
> informed by the retina'.

Many people have no idea how machines work, but they can easily figure
out what parts of the machine are essential in their function. For
example, if you unplug the radio it won't work, so electric power is
important in radio reception. However, if you replace the AC power
with a battery the radio will work again, so it can't be that AC power
is essential for radio operation. I suppose you would say that the
battery isn't *really* functionally equivalent, since it contains
chemicals and lacks any AC hum; which is true, but irrelevant.

> 3. To say that there is 'no visual qualia' if the cortex is destroyed
> is accurate, but only accurate from the subject's perspective.
> Objectively there was never any qualia there to begin with. The retina
> still responds to photostimulation, which in your view, it seems would
> have to be the same thing as visual qualia. If you are going to
> attribute thermal sense to a thermostat, you must also attribute sight
> to the retina, no? I imagine you'll say that the qualia is produced by
> the complex interaction of the visual cortex, which supports my view,
> that the thin bandwidth of the raw sensation makes not just for
> different intensity of sense but a deeper quality of its experience.

If my optic nerves are cut I am blind, so if my retina continues to
have qualia by itself (which is not impossible) I don't know about it.
"I" am the person who speaks and acts consciously, which appears to
require my cerebral cortex.

>> Any substitution will of course affect qualia IF it affects function.
> Qualia has no objective function, which is why it is not objectively
> detectable. That is the reason.

But it is subjectively detectable. My qualia change if my brain
changes, since my brain is generating the qualia. It is also
objectively detectable by assumption: if you point a gun at someone
and they change their behaviour, you assume that the gun has changed
their visual qualia.

>> The lens in the eye is replaced in cataract surgery and that does not
>> affect visual qualia at all, because the artificial lens is
>> functionally equivalent. The artificial lens is not functionally
>> identical under *all* circumstances, since it is not identical to the
>> natural lens; for example, it won't go cloudy with prolonged
>> ultraviolet light exposure. However, it *is* functionally identical as
>> far as normal vision is concerned, and that is the thing we are
>> concerned with. An artificial neuron is more difficult to make than an
>> artificial lens or artificial joint, but in principle it is no
>> different.
> You keep going back to that, but it's not true. It's wall buffaloes.
> "We’re like cavemen that, you draw a buffalo on the wall and you say,
> ok I’m getting good at this just give me another week, bring me some
> meat, and pretty soon I’ll make buffaloes come out of that wall."
> http://www.youtube.com/watch?v=fTd8I0DoHNk (57:16). You can't make a
> movie so complete that it replaces the audience. There is a
> fundamental ontological difference between altering the substance of
> what we ARE compared to altering the substance of what we USE.

You *can* start with paintings and end up arbitrarily close to the
real thing. You go from painting to sculpture to clockwork to
computer-controlled to computer-controlled with emulation of the
buffalo brain if you want to reproduce buffalo consciousness. If you
are more interested in eating the buffalo than its intellect you will
have to make muscles out of artificial protein. And so on: whatever
aspect of the real buffalo you want to replicate, in theory you can.

>>It just has to slot into the network of neurons in a way
>> that is close enough to a natural neuron. It does not have to behave
>> exactly the same as the neuron it is replacing, since even the same
>> neuron changes from moment to moment, just close enough.
> That view is not supported by neuroscience. There are no slots in the
> network of the brain. It's a family, a dynamic community that has a
> way of keeping out parasites and foreign biota.

I'm sure you can imagine a way of doing it. For example, an
intelligent nanobot could enter the neuron, study it closely for a
period of time, then gradually replace its various cellular processes
with more efficient non-biological ones.

>> I am unclear as to what exactly you think the artificial neuron would
>> do. If it replicated 99.99% of the behaviour of the biological neuron
>> do you think it would result in a slight change in consciousness, such
>> as things looking a little bit fuzzy around the edges if the entire
>> visual cortex were replaced, or do you think there would be total
>> blindness because machines can't support qualia?
> Let's say that 100% of the behavior is only 50% of what a neuron is.
> The other 50% is what is doing the behaving and the reasons it has for
> behaving that way. It doesn't matter how great your I/O handlers are
> if the computer is made of ground beef.

So are you agreeing that the artificial neuron can in theory
replicate almost 100% of the behaviour of the biological neuron? What
would that be like if your entire visual cortex were replaced?

>> I have asked several times if you believe that IF the behaviour of a
>> device were exactly the same as a biological neuron THEN you would
>> deduce that it must also have the consciousness of a neuron, and you
>> haven't answered.
> I keep answering but you aren't able to let go of your view long
> enough to consider mind. The answer is that biological 'function' is
> inseparable from BUT NOT IDENTICAL TO awareness. Awareness is not
> 'caused by' biological function - biology is necessary but not
> sufficient to describe sense. Biology is built on the interaction of
> passive objects only, so there is no room for the ordinary experience
> of interiority that we have to arise from or exist within other
> phenomena in the universe.

It seems to me that biology is sufficient since if you exactly
replicate the biology, you would replicate awareness. However, I don't
think biology is necessary, since as explained if you replace the
brain with a non-biological equivalent awareness would continue.

>  It's hugely anthropocentric to say "We are the magic monkeys that
> think we feel and see when of course we could only be pachinko
> machines responding to complex billiard ball like particle impacts".

Of course that's what we are. This is completely obvious to me and I
have to make a real effort to understand how you could think otherwise.

>>I've asked the same question differently: could an
>> omnipotent being make a device that behaves just like a neuron
> No. There is no such thing as something that 'behaves *just like* a
> neuron' that is not, in fact, a neuron.
>  (but
>> isn't a biological neuron) while lacking consciousness, and you
>> haven't answered that either. A simple yes/no would do.
> This part of the question is invalidated by the answer to the first
> part. I've already answered it. It's like saying 'If God could make an
> apple that was exactly like an orange, would it lack orange flavor'.
> It's a non sequitur.

So, you agree: any device that acts just like a neuron has to be a
neuron. It doesn't have to look like a neuron, it could be a different
colour for example, it just has to behave like a neuron. Right?

>> So you say. We assume that you are right and see where it leads. It
>> leads to contradiction, as you yourself admit, but then you avoid
>> discussing how to avoid the contradiction.
> What contradiction? I think your view leads to contradiction.
> Thermostats that can feel but retinas that can't see? It's
> de-anthropomorphic prejudice.

I'm agnostic about whether thermostats can feel or retinas can see on their own.

>> If our brain chemistry were the same then we would have the same
>> opinions; but since our brain chemistry is different we have different
>> opinions. Moreover, my opinions change because my brain chemistry
>> changes.
> So you think that you in fact have no control over your opinion, and
> what we are doing here is a meaningless sideshow to some more
> important and causally efficacious coordination of neurotransmitter
> diffusion-reuptake gradients. To me that is really bending over
> backward to put the cart before the horse... or does it make more
> sense to just say
> "CCK, cholecystokinin; CCK8, COOH-terminal CCK-octapeptide; PC,
> phosphatidylcholine; DAG, sn-1,2-diacylglycerol; PI,
> phosphatidylinositol; PA, phosphatidic acid; PIP, phosphatidylinositol
> 4-phosphate; PIP2, phosphatidylinositol 4,5-bisphosphate; PE,
> phosphatidylethanolamine; PS, phosphatidylserine; TG, triacylglycerol;
> protein kinase C, Ca2+-activated, phospholipid-dependent protein
> kinase; protein kinase A, cyclic AMP-dependent protein kinase; TPA,
> 12-O-tetradecanoylphorbol-13-acetate; IP3, inositol trisphosphate;
> HEPES, 4-(2-hydroxyethyl)-1-piperazineethanesulfonic acid; EGTA,
> [ethylenebis(oxyethylenenitrilo)]tetraacetic acid"
> instead?

If they are my opinions then I have control over them, don't I? I
think you may be referring to the compatibilism/incompatibilism
debate. If you choose to say you lack free will under determinism,
then you can say that - it's just a matter of semantics. But then,
would you be happier saying that you have free will if your choices
are random rather than determined? Or do you have some third option,
neither random nor determined?

>> Of course it does. It has absolutely everything to do with it. If you
>> drank 10 liters of water the sodium concentration in your brain would
>> fall and you would go into a coma. If you had a non-material soul that
>> was unaffected by biochemistry you would have been OK.
> The mind supervenes upon the body and brain, but the brain also
> supervenes upon the mind. Consciousness is 100% real, but only 50% of
> it can be described in exclusively physical terms - like mass,
> density, scale, etc. Those terms do not apply to the phenomenology of
> experience. If it was all biochemical, you could not read something
> that 'makes you mad', since the ink of the writing doesn't enter your
> bloodstream and cross the blood brain barrier.

Consciousness is 100% real but none of it can be described in purely
physical terms. However, the physical processes which generate it can
be described.

>> > If he breaks his artificial knee, he won't experience the same pain as
>> > if he broke his original knee. That is a function. The difference
>> > means that it does not 'function normally'. He can take a 3/4" drill
>> > and put a hole straight through his patella five inches and not feel a
>> > damn thing. That is not 'functioning normally'. Why not admit it?
>> Similarly with an artificial brain component of the right design, if
>> the circulation is cut off he won't have a stroke. But that does not
>> mean that he can't have normal cognition.
> So you admit, finally, that a replacement part cannot be assumed to
> 'function normally' just because it appears *to us*  to do what the
> thing it replaces used to do.

If it's not exactly the same then obviously there will be differences,
in composition and also in function. But you don't seem to get that we
are concerned with *relevant* differences. If a person with a brain
prosthesis can have a normal conversation with you for an hour on a
wide range of topics, showing humour and emotion and creativity, that
would say something about how well the prosthesis was working. So what
if the prosthesis is a different colour and weighs a little bit more
compared to the original?

>> >> No, the radioactive decay is truly random,
>> > What makes you an authority on what is truly random?
>> Look it up yourself.
> I mean how do you know that the idea of randomness corresponds to
> anything that can exist? Randomness is a category of information,
> which I think doesn't physically exist.

There is no way, even if you have complete knowledge of starting
conditions, that you can predict when an atom will decay. Perhaps
"indeterminate" is a better term than random.

>> For example, do you imagine that an ion gate in
>> a cell membrane can open apparently without being triggered by any
> change in the physical conditions such as binding of neurotransmitter?
>> If so, it should be evident experimentally; can you cite any papers
>> showing such amazing results?
> You're citing the limitation of your own 50% correct view as a virtue.
> The fact is that changes in our cell membranes of our brains are
> triggered by our thoughts, and thoughts are triggered by neurological
> changes as well. It. is. bidirectional. I know it seems bizarre, just
> as Galileo's understanding of astronomy contradicted the naive realism
> that says we should fly off the Earth if it were moving, but I am not
> willing to discard the reality of consciousness in favor of the
> reality of biology. They can and do coexist quite peacefully. Your
> brain doesn't have a problem with you controlling what you think and
> say, it's only your conditioning and stubbornness that prevents you
> from seeing it as it is, an unassailable and ordinary fact.

If thoughts are generated by biochemical reactions in the brain and
only biochemical changes in the brain then there is a deterministic
chain of biochemical events from one brain state to another. That is,
we can say,

(a) Release of dopamine from Neuron A triggers an action potential in
Neuron B, which causes Muscle C to contract, which causes Hand D to
rise; or

(b) Release of dopamine from Neuron A generates a desire to lift one's
hand up, the dopamine then triggers an action potential in Neuron B
which is experienced as the intention of lifting one's hand up, and
Neuron B stimulates Muscle C to contract which is experienced as one's
hand actually rising.

Stathis Papaioannou

You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.