On Aug 6, 6:27 am, Stathis Papaioannou <stath...@gmail.com> wrote:
> On Fri, Aug 5, 2011 at 11:52 AM, Craig Weinberg <whatsons...@gmail.com> wrote:
> >> But the part of your brain that is doing the doubting, which might be
> >> normal,
>
> > If the part of your brain doing the doubting is 'normal' enough for
> > you to experience doubt, then you are conscious to that extent. It's
> > very straightforward. There is no point in overthinking it.
>
> Yes.
>
> >> could be fed signals about perceptual data (perception
> >> involves consciousness by definition) from non-conscious machinery.
> >> The technical issues re this machinery are irrelevant: it is only
> >> necessary to consider that it is *possible* to provide the appropriate
> >> electrochemical signals without also providing consciousness (the
> >> hypothesis entertained in order to disprove it).
>
> > Consciousness isn't provided. It's not a service. It's like saying
> > that mass is being provided to an object.
>
> My position is that consciousness occurs necessarily if the sort of
> activity that leads to intelligent behaviour occurs.

Consciousness is the same thing as that which 'leads to intelligent
behavior' for the subjective perspective (which makes your position
tautological, that consciousness occurs if consciousness occurs) but
for the objective perspective, there is no such thing as observable
behavior that is intrinsically intelligent, only behavior which
reminds one of their own intelligent motives. Let's call the
subjective view of one's own behaviors 'motives' for clarity, and the
objective view of 'intelligent seeming' as 'anthropomorphic behavior',
or more universally, isomorphic phenomenology.

> This is not
> immediately obvious, at least to me. I assume therefore that it is not
> true: that it is possible to have intelligent behaviour (or
> neuron-like behaviour) without consciousness. This assumption is then
> shown to lead to absurdity.

What absurdity? A cartoon of a neuron has neuron-like behavior, and
it's clearly not intelligent. At what point does a cartoon improve
enough that it becomes conscious? To me, that shows the absurdity of
the assumption that you can't have something that behaves like a
physical neuron without there also being such a thing as
consciousness. Of course you can make any physical design of surfaces
and mechanical relations between them without some feeling entity
appearing to enjoy your simulation.

> >>You would thus believe
> >> you had perceptions when in fact you had none.
>
> > Signals from an artificial source are still perceptions. When I play
> > an mp3 it comes from a speaker rather than a miniature musician. I can
> > tell the difference, and I can hear the song. There is no mystery
> > there, it's just how sense works. It's not a substance that travels
> > through space like a projectile, it's the 'wholes that are pulled
> > through the holes'.
>
> Perceptions are generally believed to occur in brain tissue, and
> probably cortical rather than subcortical tissue, not in the
> environment or in the sensory organs.

It's all perception. Every limbic impulse, every thought, need, idea,
or intuition is perception. You are assuming that the sense organs can
possibly sense something without perceiving it, and then pass on that
non-sense to the brain which makes sense out of it, making perception
a solipsistic simulation bearing no legitimate resemblance to what is
being sensed. I used to make that assumption, but since brain tissue
is nothing more than neurons, and neurons are nothing more than
matured stem cells, there is no reason to imagine some special dance
they do in the brain occurs nowhere else in the universe. It's
everywhere, on every level, but it's private - only to be shared
amongst isomorphic phenomena.

>If a part of your cortex
> associated with a certain type of perception or cognition is removed,
> that perception or cognition is eliminated. (Other brain tissue can
> take on the lost function but this process requires remodelling).

Oh definitely, removing part of your brain is going to remove the part
of *you* that perceives it, but it's not going to stop your sense
organs from perceiving it. Try thinking of it the other way. If the
sense organ had nothing to do with perception then the brain would
simply remodel the eye out of an empty socket in the skull with a lens
on it.

Yes, perception occurs in the brain - which is why you can numb pain
with narcotics without affecting pain receptors - but perception also
occurs everywhere in the nervous system, which is why you can use a
local anesthetic on the nociceptors without affecting the brain. If it
were truly all in the brain, it would compensate for the missing
signals from your numb finger and perceive pain anyhow while it was
being operated on, just as an optical illusion compensates for
paradoxical inputs with a sensory simulation.

> >> that could be the case
> >> now: you could be completely blind, deaf, lacking in emotion but you
> >> behave normally and don't realise that anything is wrong. Please think
> >> about this paragraph carefully before replying to it - it is
> >> essentially the whole argument and you seem to have misunderstood it.
>
> > I have thought about it many times. Like 25 years ago. It's the
> > reductio ad absurdum of materialism. You can't seem to let go of the
> > idea that perception is perception whether it happens completely
> > within your own dreamworld, through the tailpipe of some computerized
> > lawnmower, or a crystal clear presentation of external realities. It.
> > makes. no. difference. Having a thought, any thought, any experience
> > whatsoever and being aware of that fact is consciousness. Period. It
> > doesn't matter if you have a brain or not, or what other people
> > observe of your behavior. Unless you are talking about a medical
> > definition of being conscious as far as exhibiting signs of responsiveness
> > to the outside world, which is something else entirely. That would
> > only be relevant for something which we assume to be capable of
> > consciousness in the first place.
>
> I'm talking about subjective experience, perceptions, qualia,
> understanding, feelings. These things cannot be observed from the
> outside, as opposed to associated behaviours which are observable. But
> there is a big conceptual problem if it is possible to make brain
> components that perform just the mechanical function without also
> replicating consciousness. Sometimes you say it would be too difficult
> to create such components, which is irrelevant to the argument. Other
> times you say that there would be a change in perception, but then
> don't seem to understand that it would be impossible to notice such a
> change given that the part of the brain that does the noticing gets
> normal inputs by definition.

I'm consistent in my position, you're just not seeing why the premise
is fallacious from the beginning. There is no such thing as:

1. behaviors associated with qualia
2. mechanical functions that replicate consciousness
3. normal inputs

1. If I experience qualia, like color, then I can associate the
conditions in my brain, my retina, exterior light meters, etc. with
the production of color. If, however, I'm blind, then I cannot
associate those conditions with anything. It's not symmetric. A camera
can't necessarily see just because it takes pictures that we can see.
We see through the lens, through the pictures, and through the JPEG
pixels. The pixels don't see, the monitor doesn't see, the lens
doesn't see. These are just devices we can use to reflect our sight.
What they 'see' is likely totally different.

2. Consciousness isn't a special logical design that turns inanimate
objects and circuits into something that can feel. Matter feels
already - or detects/reacts. Consciousness is just the same principle
run through multiple organic elaborations so that it feels as the
interior of an organism rather than just the interior of cells or
molecules. It scales up.

3. We've done this to death. You're just not understanding what I'm
saying about the YouTubes and the voicemails. Fooling a neuron for a
while doesn't mean that you can rely on its being fooled forever, and
even if you could, it still doesn't mean that a counterfeit neuron
will provide genuine neuron interiority when scaled up to the level
of the entire brain.

> > I just want to know how many times I can repeat the phrase 'there is
> > no such thing as 'behaving as if it were conscious' before you
> > acknowledge the meaning of it.
>
> No, I don't understand how you could possibly not understand the
> phrase. "Here's Joe, he behaves as if he's conscious, but I can't be
> sure he is conscious because I'm not he".

That's what I'm saying - you can't be sure he is conscious because
you're not him. So why do you keep wanting to claim that there is some
kind of normal, conscious 'behavior' which can be emulated with
confidence that Joe2 is as good as Joe?

> >>would in
> >> fact be conscious. I think it would,
>
> > You are free to think whatever you want under my model. If you're
> > right about consciousness being just a brain behavior, then you
> > can only think what your neurology makes you think. In that case
> > you might as well stop reading because there's no point in imagining
> > you can have an opinion about anything.
>
> I'm not saying consciousness is just a brain behaviour, I'm saying
> consciousness is generated by a brain behaviour, and if you copy the
> behaviour in a different substrate you will also copy the
> consciousness. And of course I can only think what my neurology makes
> me think, and my neurology can only do what the laws of physics
> necessitate that it do.

That's the problem: your neurology can only do what the laws of
physics allow it to do, but your copy can do whatever you program your
copymaking algorithms to do. That's an ontological difference. Your
copy of the laws of physics isn't physical, which is why if you copy
the behavior of fire it won't burn anything. I agree that if you could
copy the brain behavior into a different brain (ideally, of course -
practically, transplanting the behavior of one group of millions of
neurons to a different brain wouldn't work any more than taking the
blueprints of every building in New York City to Sumatra is going to
make Indonesian New Yorkers) that potential to experience the sense
and motives of the original would be reproduced *to the extent that
the host brain can support it*. You can play a color movie on a black
and white monitor, but the pattern of the DVD alone, even though it
codes for color, can't change the monitor. For the same reason you
can't just copy 'behavior' into any old substance, it has to be
something that the brain does - live, breathe, feel like an animal's
brain in an animal's world.

> >> otherwise it is possible that I
> >> am currently deluded about being conscious, which is absurd.
>
> > I've gone over and over and over this. A puppet 'behaves like it is
> > conscious' when it is being manipulated by a puppeteer. Does that mean
> > the puppet is conscious? You're just affirming a fallacious initial
> > assumption that consciousness is a behavior rather than an elaboration
> > of an intrinsic quality of all matter. There is certainly a range of
> > awareness in matter, so that behavior can generally indicate what
> > level of awareness something is capable of experiencing, but that goes
> > out the window when you are talking about intentionally simulating the
> > kinds of behaviors we would tend to associate with beings similar to
> > ourselves.
>
> I understand what you're saying and assume for the purposes of
> argument that it is true - that consciousness is substrate-dependent
> and hence an implementation of a brain component in a different
> substrate will therefore be unconscious or differently conscious. But
> then if this component is inserted into your brain you will be
> differently conscious without realising that you are differently
> conscious - otherwise the brain component is not really functioning
> normally, which is the initial assumption.

If you accept my argument that substrate-dependence is true, then you
have to reject your initial assumption that a different substrate can
ever function 'normally'. The plastic neuron fails for the same reason
the plastic brain fails. The other neurons know it's an imposter. They
might still make sweet sweet love to the blowup doll, but that just
means they're lonely ;)

>So I ask you again, yes or
> no, IF A BRAIN COMPONENT SLOTS IN AND FUNCTIONS NORMALLY FROM A
> MECHANISTIC PERSPECTIVE DOES THAT MEAN THE WHOLE BRAIN'S CONSCIOUSNESS
> WILL NECESSARILY ALSO BE NORMAL?

Heh, no need to get ALL CAPPY about it, but I hope the above helps
explain my position. We can't ever be sure that a brain component can
function "normally" unless it is made of brain. Hell, it might not
even function exactly like the original if it's transplanted from an
identical twin's brain. We just don't know. My position is that the
closer the thing is physically and logically to a brain, the more like
a brain it could be, but logic alone will not give an inorganic brain
feeling, and organic matter alone will not give tissue human logic.

> > Being 'deluded about being conscious' is a true non-sequitur. Delusion
> > is consciousness too. A brick cannot be deluded. A computer cannot be
> > deluded. A brain cannot be deluded. A person can be deluded - because
> > they are the cumulatively entangled sensorimotive interior of a human
> > brain, which contains many ambiguous and conflicting fugues of
> > significance, organized dynamically and hierarchically through
> > metaphor and association, image, instinct, etc on many different
> > levels of awareness above and below the conscious threshold.
>
> I agree you can't be deluded about your consciousness, but if
> consciousness is substrate-dependent then you CAN be deluded about
> your consciousness. That's why consciousness can't be
> substrate-dependent!

No, you can't be deluded about your consciousness, regardless of
whether it is substrate-dependent or not. You can be deluded about
other things, whether substrate-dependent or not, but in neither case
can you think that you are thinking without thinking.

Craig

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.