On Jul 30, 5:18 pm, Jason Resch <jasonre...@gmail.com> wrote:

> So would an accurate reproduction of your brain processes, assuming you are
> capable of lying and ignoring.

No, it's not my brain lying or ignoring, it's the other side of the
brain - the self, which is not directly accessible from the exterior
(it is indirectly accessible through interaction with the environment
though).

> > > The system which responds in the same way you do can offer its opinion on
> > > anything.
>
> > It's a fantasy. No system can respond exactly the way you do unless it
> > is you.
>
> Explain what you mean by "you".  If you mean the processes, relation, and
> information which define me, then I would agree.  If you mean the atoms,
> molecules, and wet organic tissues I disagree.

You are not just information, or processes, or relations. You are what
a screaming, crying, crapping organism of the Homo sapiens species on
Earth grows up to be. A person. Indivisible. Concrete. Spatiotemporal,
electromagnetic, sensorimotive. All of it. How much of it is necessary
to be "you"? The more isomorphic the better, obviously. Since we don't
see disembodied versions of you hovering around, and don't see silicon
emulations of you cropping up by themselves, we can assume that there
may be a good reason for this. Since I can give your wet organic
tissues a few crumbs of alkaloid substance which will utterly
incapacitate your 'processes, relations, and information' whether you
like it or not, we can assume that your atoms and molecules bear more
than a casual role in perpetuating this identity simulation you
currently enjoy.

> In this example, the frying pan is not what causes Elmer Fudd's depiction to
> appear to react.

Exactly! The state of your brain is not what causes you to react
either. It can cause you to react, but you can cause your brain to
react - as it does when you push your intention to move your hand down
your spine and out your arm. You can wire someone up like a marionette
and make them behave how you want them to behave but it's not going to
create an experience of self-generating motivation. Same with a
simulated brain. You can program it to act like a brain, but it has no
subjective content.

> The ability to form an arithmetic compression is one definition of
> intelligence.

> > It still uses a strategy
> > which requires no understanding. It just methodically compares the
> > positions of each piece to the positions which will result from each
> > scenario that can be simulated.
>
> How do people play chess?

They can do it methodically or they can do it irrationally. They can
take risks that make no sense - intimidate an opponent. All kinds of
ways. Most of all, they do it for the pleasure of playing and
thinking. Something that Deep Blue is incapable of.


> > It has no idea that it's playing a
> > game, it experiences no difference between winning and not winning.
>
> This is an assumption of yours.  If you saw a paralyzed person, you might
> likewise conclude they have no inner experience based on what you could
> observe.

No, because I know that the exterior behavior of a living organism
does not necessarily correspond to their internal state. If they
aren't dead, they could be having an inner experience. If they are in
a coma, their body, or tissues of their body, are still having an
experience. When the body is completely dead, the molecules experience
decomposition.

With a machine, it has an experience of being turned on which unites
the circuit through all of the branching paths of the
microelectronics. That's probably all it has. It's like one big
semiconducting molecule that has different parts of its circuit open
at different times. I don't feel guilty about turning it off or
recycling it when I'm done with it.

> > > What in the human brain is not mechanical?
>
> > The experience of the human brain, or any other physical phenomenon is
> > not mechanical.
>
> Does experience affect how the brain works or is it an epiphenomenon?

Of course. It is an epiphenomenon and it is a phenomenon.

> If it is not an epiphenomenon, how is it physically/mechanically
> realized such that it can have physical effects on how the brain works?

Sensorimotive experience is involuted electromagnetic energy. Your
intention to move your hand is shared with the cells of your nervous
system, who share it with your muscles as electromagnetic energy. Your
muscles are as much motor-biased as your nervous system is sense-biased,
so its experience is small on the sensorimotive, big on the
electromagnetic. Basically your nervous system is the mechanical i/o
between sensorimotive phenomena and electromagnetic phenomena - even
though in a molecule they would be the same, in an organism the
sensorimotive and electromagnetic are dimorphized through the
specialization of the tissues re: the body as a single organism.

> > The notion of a machine is as a logical abstraction
> > which can be felt or felt through, but it has no feeling itself.
>
> This is your hope.

Why would I hope that?

> > The actions themselves can be emulated but the way that the actions
> > are strung together can reveal the holes in the fabric of the
> > mechanism. When you make a phone call, how long does it take for you
> > to be able to tell whether it's a voicemail system vs a live person?
> > Sometimes you can hear recording noise even before the voice begins.
> > Occasionally you could be fooled for a few seconds. It's not that the
> > system breaks down, it's that the genuine person will sooner or later
> > pick up on subtle cues about the system which give it away.
>
> Any discrete simulation can be made 2^512 times more precise by just using
> 512 times more memory.  Even if the brain were continuous rather than
> discrete, there is no limit to how accurately its behavior could be
> simulated.  If you have 512 decimal places to work with, it is not clear at
> all that one would ever detect these subtle cues.
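(As an aside on the precision point itself: the mechanics of the quoted claim are easy to sketch. This is a minimal Python illustration using the standard-library decimal module, assuming nothing beyond the "512 decimal places" figure quoted above; it just shows that working precision can be raised arbitrarily so that rounding error falls below any fixed detection threshold.)

```python
from decimal import Decimal, getcontext

# Raise the working precision to 512 decimal places, matching the
# figure quoted above.
getcontext().prec = 512

# A step that is inexact in decimal (1/3), then undone. At 512 digits
# the residual error is on the order of 10**-512.
step = Decimal(1) / Decimal(3)
err = abs(Decimal(1) - step * 3)

print(err < Decimal(10) ** -500)  # True: error far below 1e-500
```

Whether such numerical fidelity settles anything about subjectivity is, of course, exactly what is in dispute below.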

It doesn't matter, there still is no subjectivity there. Sooner or
later some subjectivity will be called for in a given situation that
may be noticed by someone interacting with it. You would have to load
the simulation with every possibility of a human life on Earth for all
time.

> A-life can and does.  Download smart sweepers and see it happen before your
> eyes:http://www.ai-junkie.com/files/smart_sweepers.zip

I know all about that kind of programming. It's not valid, for the same
reason that a Mickey Mouse made of people isn't valid.

> > You just said they can't unless they are programmed with that
> > capability. That's not 'by themselves'.
>
> Human beings did not design human brains.  They were programmed by years of
> evolution.

Are you saying that evolution will begin programming computers without
human intermediaries at some point?

> > You can emulate the alphabet of neuron behaviors but that doesn't mean
> > it can use those behaviors in the way that live neurons can.
>
> Why couldn't they?

Because it's not alive. It doesn't care.

> > For the same reason that hydrogen can't be gold and inorganic matter
> > can't be alive.
>
> A lump of coal and a glass of water aren't alive until they are organized
> appropriately, what makes them "organic" is a matter of their organization
> and relationships between each other.

It's not just the organization. That's what I'm pointing out. You
cannot organize silicon into DNA or ammonia into water. They are
different things. It's the inherent capacity of the thing to be
organized in that way. It's that finite quality which is actually what
prevents comp from emulating fire or consciousness. Math isn't finite.
It has no way to access the experience of having to be one thing and
not everything and anything.


> So long as this organization and the
> relationships are preserved, the pattern and its complexity are preserved.
>
> > The computer has no high level complexity,
>
> I think that is an absurd statement.  Is not a program running a
> protein-folding simulation more complex than a computer with a newly
> formatted hard drive and all zeros in its memory?

If you unplug the monitor nobody will know the difference. The
evaluation of complexity is ours, it doesn't reside in the computer.
The computer might have a more clickety clickety experience in the
protein-folding sim. More heat. More circuitry opening and closing.

> > This universe could no more be simulated on a computer than it could
> > on a network of plumbing.
>
> If one can build a Turing machine out of plumbing it could simulate anything
> any other computer can.

I understand that position, I just reject it on the grounds that
experience cannot be simulated.

> > Fire isn't made of plumbing.
>
> Fire is made of atoms swapping places and moving around quickly and emitting
> photons.  These are all discrete particles with understandable properties
> and known relations between them.  Thus these same relations can be
> replicated in any computer so that burning may take place.

This is wrong though. You cannot make a program that will burn. It's a
fantasy. You're just following the logic of the position to its
absurd conclusion and deciding that it's not absurd.

> Also, are you sure you have never heard of Searle?  He also used the
> plumbing example as 
> well:http://www.google.com/#sclient=psy&hl=en&source=hp&q=john+searle+wate...

That's funny. I have heard of Searle but not read him. I saw him in a
YouTube video but his position seemed too reactionary to me. It sounded like
he was saying what you are, that consciousness is mechanical.

> > Because it is simulated only through our interpretation.
>
> A tree falling has to be heard for it to produce a sound?

Of course. A human sound, anyway.

> I can understand
> that, but what if there is a process which can hear within the simulation?

It won't hear anything, it will just have eardrum-shaped contours
that are synced to the algorithms of air pressure compression. It
will look like our bodies look when we hear but there is no sound.

> > That's tautology. It's like saying 'if you couldn't see the inside of
> > your stomach, you wouldn't know how to digest food'. Qualia is not
> > necessary to the function of the brain.
>
> I think you need to "see" (having qualia) in order to see.  In other words,
> Zombies are not possible.  You cannot have something which performs as
> though it can see if it has no visual experience.

What do you mean by performing as though it can see? "The Star-nosed
Mole can detect, catch and eat food faster than the human eye can
follow (under 300 milliseconds).[1]" http://en.wikipedia.org/wiki/Blind_animals

Saying 'Zombies are not possible' doesn't mean anything. It just means
that you want to believe that internal experiences like our own simply
come with the universe automatically. Anything shaped like an eye can
see, or like an eardrum can hear. It's not true. It's not completely
untrue - some shapes have functions due to their shape, but other
things have functions in spite of their shape. Adrenaline is not
shaped like excitement.

> > Indeed you can lose
> > consciousness every night and your brain has no problem surviving
> > without your experience of it.
>
> You might have trouble running away from a lion while sleeping though.

You might have trouble running away from a lion while wide awake too.

> > Your body can find food and reproduce
> > without there being any sense of color or shape, odor, flavor, etc.
>
> It seems it would be significantly harder to find food if we had no sense of
> shape or color (is that a strawberry or a rock?).  You would expend a lot of
> energy chasing phantom food.

Nope. It's easy. Bacteria eat. The Star-nosed Mole eats. It's a just-so
story. http://en.wikipedia.org/wiki/Just-so_story

> > It
> > could detect and utilize it's surroundings mechanically, like Deep
> > Blue without ever having to feel that it is a thing.
>
> I doubt you could have something with the same level of survivability as a
> human that did not have experience.

Your doubts are unfounded. You are looking at what exists and deciding
that there must be some reason why it could not be any other way.

> So you also doubt zombies?  Then you should not be worrying about processes
> which behave as humans that have no inner life.

Zombies are just a useful philosophical idea.

> > Exactly. It's different. Can't do the same things with it.
>
> No, but you could use it to build a universal machine which can emulate any
> definable process.

It's also a philosophical idea. It's wildly overreaching IMO and based
upon the assumption that interiority does not exist, which I think is
obsolete.

Craig

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.