On Aug 1, 8:07 pm, Stathis Papaioannou <stath...@gmail.com> wrote:

> 1. You agree that it is possible to make something that behaves as if
> it's conscious but isn't conscious.

Noooo. I've been trying to tell you that there is no such thing as
behaving as if something is conscious. The phrase doesn't mean anything,
because consciousness isn't a behavior; it's a sensorimotive
experience which sometimes drives behaviors.

If you accept that, then it follows that whether or not someone is
convinced of the consciousness of something outside of themselves
depends entirely upon the observer. Some people may not even be able
to accept that certain people are conscious... it used to be thought
that infants weren't conscious. In my theory I get into this area a
lot and use terms such as Perceptual Relativity Inertial Frame (PRIF)
to help illustrate how perception might be better understood
(http://s33light.org/post/8357833908).

How consciousness is inferred is a special case of PR Inertia, which I
think is based on isomorphism. In the most primitive case, the more
something resembles what you are, in physical scale, material
composition, appearance, etc., the more likely you are to identify it
as being conscious. The more time you have to observe and relate to
the object, the more your PRIF accumulates sensory details which
augment your sense-making of the thing, and context, familiarity,
interaction, and expectations grow to overshadow the primitive
detection criteria. You learn that a Skype video of someone is a way
of seeing and talking to a person, not a hallucination or a talking
demon in your monitor.

So if we build something that behaves like Joe Lunchbox, we might be
able to fool strangers who don't interact with him; an improved
version might fool strangers with limited interaction but not
acquaintances; the next version might fool everyone through hours of
casual conversation, yet Mrs. Lunchbox cannot be fooled at all; and so
on. There is not necessarily a possible substitution level which will
satisfy all possible observers and interactors (pets, doctors, etc.),
and there is not necessarily a substitution level which will satisfy
any particular observer indefinitely. Some observers may just think
that Joe is not feeling well. If the observers were told that one
person in a lineup was an android, they might be more likely to
identify Joe as the one.

In any case, none of this has anything to do with whether or not the
thing is actually conscious, which is the only important aspect of
this line of thinking. We already have simulations of people - movies,
TV, blow-up dolls, sculptures, etc. Computer sims add another layer of
realism to these without adding any reality of awareness.

> 2. Therefore it would be possible to make a brain component that
> behaves just like normal brain tissue but lacks consciousness.

Probably not. Brain tissue may not be any less conscious than the
brain as a whole. What looks like normal behavior to us might mask the
difference between cricket chirps and a symphony, and we wouldn't
know.

> 3. And since such a brain component behaves normally, the rest of the
> brain should behave normally when it is installed.

The community of neurons may graciously integrate the chirping
sculpture into their midst, but that doesn't mean they are fooled, and
it doesn't mean that the rest of the orchestra can be replaced with
sculptures.

> 4. So it is possible to have, say, half of your brain replaced with
> unconscious components and you would both behave normally and feel
> that you were completely normal.

It's possible for half of your cortex to be missing and for you to
still behave and feel relatively normally.

http://www.newscientist.com/article/dn17489-girl-with-half-a-brain-retains-full-vision.html
http://www.pnas.org/content/106/31/13034

> If you accept the first point, then points 2 to 4 necessarily follow.
> If you see an error in the reasoning can you point out exactly where
> it is?

If you see an error in my reasoning, please do the same.

Craig
