On Sat, Jul 30, 2011 at 5:45 PM, Craig Weinberg <whatsons...@gmail.com> wrote:
> On Jul 30, 5:18 pm, Jason Resch <jasonre...@gmail.com> wrote:
> > So would an accurate reproduction of your brain processes be, assuming you
> > are capable of lying and ignoring.
> No, it's not my brain lying or ignoring, it's the other side of the
> brain - the self, which is not directly accessible from the exterior
> (it is indirectly accessible through interaction with the environment).
Just as you say there is an inside and outside part of the mind/brain, I
believe the same is true of Turing machines.
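To make that analogy concrete, here is a toy sketch of my own (a hypothetical example, not anything from your post): a Turing machine's current state is internal and never appears on the tape, while an outside observer only ever sees the tape's contents.

```python
# A minimal Turing machine sketch: the internal state (the "inside") is
# distinct from the tape (the "outside" an observer sees). This toy machine
# flips every bit on the tape until it reads a blank, then halts.

def run_tm(tape):
    """Flip bits left-to-right; halt on blank ('_')."""
    tape = list(tape)
    state, head = "flip", 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else "_"
        if symbol == "_":
            state = "halt"            # internal decision, invisible on the tape
        else:
            tape[head] = "1" if symbol == "0" else "0"
            head += 1                 # move right to the next cell
    return "".join(tape)

print(run_tm("0110"))  # -> 1001
```

Nothing in the output string records what state the machine was in; the "inside" is only indirectly accessible through behavior, just as you say of the self.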
> > > > The system which responds in the same way you do can offer its opinion
> > > > on anything.
> > > It's a fantasy. No system can respond exactly the way you do unless it
> > > is you.
> > Explain what you mean by "you". If you mean the processes, relations, and
> > information which define me, then I would agree. If you mean the atoms,
> > molecules, and wet organic tissues, I disagree.
> You are not just information, or processes, or relations. You are what
> a screaming, crying, crapping organism of the Homo sapiens species on
> Earth grows up to be.
How I respond to questions presented to me has little to do with whether I
am a screaming, crapping organism from Earth, and more to do with how my
brain is configured.
> A person. Indivisible. Concrete. Spatiotemporal,
> electromagnetic, sensorimotive. All of it. How much of it is necessary
> to be "you"? The more isomorphic the better, obviously. Since we don't
> see disembodied versions of you hovering around, and don't see silicon
> emulations of you cropping up by themselves, we can assume that there
> may be a good reason for this. Since I can give your wet organic
> tissues a few crumbs of alkaloid substance which will utterly
> incapacitate your 'processes, relations, and information' whether you
> like it or not, we can assume that your atoms and molecules bear more
> than a casual role in perpetuating this identity simulation you
> currently enjoy.
> > In this example, the frying pan is not what causes Elmer Fudd's depiction
> > to appear to react.
> Exactly! The state of your brain is not what causes you to react
> either. It can cause you to react, but you can cause your brain to
> react - as it does when you push your intention to move your hand down
> your spine and out your arm. You can wire someone up like a marionette
> and make them behave how you want them to behave but it's not going to
> create an experience of self-generating motivation. Same with a
> simulated brain. You can program it to act like a brain, but it has no
> subjective content.
> > The ability to form an arithmetic compression is one definition of
> > intelligence.
> > > It still uses a strategy
> > > which requires no understanding. It just methodically compares the
> > > positions of each piece to the positions which will result from each
> > > scenario that can be simulated.
> > How do people play chess?
> They can do it methodically or they can do it irrationally. They can
> take risks that make no sense - intimidate an opponent. All kinds of
> ways. Most of all, they do it for the pleasure of playing and
> thinking. Something that Deep Blue is incapable of.
Isn't it interesting to you just how much computers are capable of? There
are millions of different programs you can install on your computer which
enable it to act like a different kind of tool, device, or machine. Deep Blue
might not be capable of pleasure, but that doesn't mean no computer can be.
> > > It has no idea that it's playing a
> > > game, it experiences no difference between winning and not winning.
> > This is an assumption of yours. If you saw a paralyzed person, you might
> > likewise conclude they have no inner experience based on what you could
> > observe.
> No, because I know that the exterior behavior of a living organism
> does not necessarily correspond to their internal state. If they
> aren't dead, they could be having an inner experience. If they are in
> a coma, their body, or tissues of their body, are still having an
> experience. When the body is completely dead, the molecules experience
> With a machine, it has an experience of being turned on which unites
> the circuit through all of the branching paths of the
When I use the term machine I don't mean the low level circuits, but the
software. What does the software feel? It depends on what the program is.
Programs have no relation to, or dependence on, their underlying hardware.
> That's probably all it has. It's like one big
> semiconducting molecule that has different parts of its circuit open
> at different times. I don't feel guilty about turning it off or
> recycling it when I'm done with it.
> > > > What in the human brain is not mechanical?
> > > The experience of the human brain, or any other physical phenomenon is
> > > not mechanical.
> > Does experience affect how the brain works or is it an epiphenomenon?
> Of course. It is an epiphenomenon and it is a phenomenon.
> If it
> > is not a epiphenomenon, how is it physically/mechanically realized such
> > that it can have physical effects on how the brain works?
> Sensorimotive experience is involuted electromagnetic energy. Your
> intention to move your hand is shared with the cells of your nervous
> system, who share it with your muscles as electromagnetic energy. Your
> muscles are as much motor-biased as your nervous system is sense-
> biased so its experience is small on the sensorimotive, big on the
> electromagnetic. Basically your nervous system is the mechanical i/o
> between sensorimotive phenomena and electromagnetic phenomena - even
> though in a molecule they would be the same, in an organism the
> sensorimotive and electromagnetic are dimorphisized through the
> specialization of the tissues re: the body as a single organism.
> > > The notion of a machine is as a logical abstraction
> > > which can be felt or felt through, but it has no feeling itself.
> > This is your hope.
> Why would I hope that?
Perhaps you hope to have some servile intelligent machines take care of you
when you are older. ;-)
> > > The actions themselves can be emulated but the way that the actions
> > > are strung together can reveal the holes in the fabric of the
> > > mechanism. When you make a phone call, how long does it take for you
> > > to be able to tell whether it's a voicemail system vs a live person?
> > > Sometimes you can hear recording noise even before the voice begins.
> > > Occasionally you could be fooled for a few seconds. It's not that the
> > > system breaks down, it's that the genuine person will sooner or later
> > > pick up on subtle cues about the system which give it away.
> > Any discrete simulation can be made 2^512 times more precise by adding just
> > 512 bits more memory. Even if the brain were continuous rather than
> > discrete, there is no limit to how accurately its behavior could be
> > simulated. If you have 512 decimal places to work with, it is not clear at
> > all that one would ever detect these subtle cues.
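A quick sketch of the arithmetic behind that claim (illustrative only): each extra bit of state doubles the number of distinguishable values over the same interval, so 512 extra bits multiply the resolution by 2^512.

```python
from fractions import Fraction

# Sketch of the precision claim: each extra bit doubles the number of
# distinguishable values, so adding 512 bits makes the smallest step
# 2**512 times finer. Exact rationals avoid any float rounding.
bits = 8
step = Fraction(1, 2**bits)                # smallest step with 8 bits
finer_step = Fraction(1, 2**(bits + 512))  # smallest step after adding 512 bits
print(step / finer_step == 2**512)         # -> True
```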
> It doesn't matter, there still is no subjectivity there. Sooner or
> later some subjectivity will be called for in a given situation that
> may be noticed by someone interacting with it. You would have to load
> the simulation with every possibility of a human life on Earth for all
Deep Blue did not have to be loaded with every possible chess game to play
at the Grandmaster level; its programming could provide an answer to any
possible chess board position. Likewise, software could offer an answer to
any given question without having to load it with pre-canned answers.
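The rule-versus-lookup point can be shown with a toy (a hypothetical Nim player of my own, nothing to do with Deep Blue's actual code): one recursive rule answers any position it is handed, with no table of pre-canned answers anywhere.

```python
from functools import lru_cache

# Toy Nim: players alternately take 1-3 stones; taking the last stone wins.
# No position is stored in advance; one recursive rule evaluates them all.

@lru_cache(maxsize=None)
def wins(stones):
    """True if the player to move can force a win from this position."""
    return any(take <= stones and not wins(stones - take) for take in (1, 2, 3))

def best_move(stones):
    """Return a winning take if one exists, else take 1."""
    for take in (1, 2, 3):
        if take <= stones and not wins(stones - take):
            return take
    return 1

print(best_move(7))  # -> 3 (leaving a multiple of 4 is the winning strategy)
```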
Your position sounds like that of Descartes.
Is your belief that no non-biological machine could ever pass the Turing
test?
> > A-life can and does. Download smart sweepers and see it happen before your
> > eyes: http://www.ai-junkie.com/files/smart_sweepers.zip
> I know all about that kind of programming. It's not valid for the same
> reason as Mickey Mouse made of people isn't valid.
I don't see what leads you to this conclusion.
> > > You just said they can't unless they are programmed with that
> > > capability. That's not 'by themselves'.
> > Human beings did not design human brains. They were programmed by years of
> > evolution.
> Are you saying that evolution will begin programming computers without
> human intermediaries at some point?
I was saying it already has. Biological brains are computers. We didn't
have to program ourselves to provide meaning. Likewise, an a-lifeform which
evolves doesn't need to be viewed on a monitor by a human for it to have a
life of its own.
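As a hedged sketch of what "evolution programming a computer" means here (a toy genetic algorithm of my own, not the smart-sweepers code itself, which evolves neural-net weights the same general way): random variation plus selection improves fitness with no human writing the answer in.

```python
import random

# Toy genetic algorithm: evolve random bitstrings toward all-ones.
# Nobody specifies the solution; selection plus mutation finds it.
random.seed(1)
LENGTH, POP, GENS = 20, 30, 60

def fitness(genome):
    return sum(genome)  # count of 1-bits; higher is fitter

def mutate(genome):
    return [bit ^ (random.random() < 0.02) for bit in genome]  # rare bit flips

pop = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP)]
start_best = max(map(fitness, pop))
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]                                   # selection
    pop = parents + [mutate(random.choice(parents)) for _ in parents]

end_best = max(map(fitness, pop))
print(start_best, end_best)  # best fitness never decreases: parents survive unmutated
```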
> > > You can emulate the alphabet of neuron behaviors but that doesn't mean
> > > it can use those behaviors in the way that live neurons can.
> > Why couldn't they?
> Because it's not alive. It doesn't care.
According to all the standard definitions of life, a-life is alive.
> > > For the same reason that hydrogen can't be gold and inorganic matter
> > > can't be alive.
> > A lump of coal and a glass of water aren't alive until they are organized
> > appropriately; what makes them "organic" is a matter of their organization
> > and relationships between each other.
> It's not just the organization. That's what I'm pointing out. You
> cannot organize silicon into DNA or ammonia into water. They are
> different things. It's the inherent capacity of the thing to be
> organized in that way.
What about a program which simulates all the elements of the periodic table,
or is such a thing not possible?
If it is possible, then what about building such a life form using simulated
elements?
If that is not possible, because you say the computer is made of silicon,
then what about if the computer were made of carbon?
(An interesting side note: modern CPUs are actually built using 50% of the
elements in the periodic table.)
> It's that finite quality which is actually what
> prevents comp from emulating fire or consciousness.
So you are postulating some infinity in the mind or in physics?
> Math isn't finite.
> It has no way to access the experience of having to be one thing and
> not everything and anything.
Within math exist finite objects.
> > So long as this organization and the
> > relationships are preserved, the pattern and its complexity are preserved.
> > > The computer has no high level complexity,
> > I think that is an absurd statement. Is not a program running a
> > protein-folding simulation more complex than a computer with a newly
> > formatted hard drive and all zeros in its memory?
> If you unplug the monitor nobody will know the difference.
It is like saying if you cut someone's spinal cord so they are locked-in, no
one will know the difference. Those inside will know the difference.
> The evaluation of complexity is ours; it doesn't reside in the computer.
> The computer might have a more clickety clickety experience in the
> protein-folding sim. More heat. More circuitry opening and closing.
Complexity (aka Entropy) is an objective property.
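One concrete, observer-independent measure is Shannon entropy (an illustrative sketch, and only one of several notions of complexity): freshly zeroed memory scores 0 bits per byte whether or not a monitor is attached.

```python
import math
from collections import Counter

# Shannon entropy in bits per symbol: an objective measure that does not
# depend on anyone watching. All-zero memory scores 0; varied data scores more.

def entropy(data: bytes) -> float:
    counts = Counter(data)
    n = len(data)
    h = -sum(c / n * math.log2(c / n) for c in counts.values())
    return h + 0.0  # normalize -0.0 to 0.0

print(entropy(b"\x00" * 64))      # -> 0.0  (freshly zeroed memory)
print(entropy(bytes(range(64))))  # -> 6.0  (64 distinct byte values)
```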
> > > This universe could no more be simulated on a computer than it could
> > > on a network of plumbing.
> > If one can build a Turing machine out of plumbing, it could simulate
> > anything any other computer can.
> I understand that position, I just reject it on the grounds that
> experience cannot be simulated.
> > > Fire isn't made of plumbing.
> > Fire is made of atoms swapping places and moving around quickly and
> > photons. These are all discrete particles with understandable properties
> > and known relations between them. Thus these same relations can be
> > replicated in any computer so that burning may take place.
> This is wrong though. You cannot make a program that will burn.
What do you expect to see, the computer spontaneously bursting into flames
while the program is simulating fire?
When programs simulate life forms, little critters don't come crawling out of
the computer case. Ink and paper don't come out of the computer when
running Microsoft Word. (Unless you hit the print button, of course.)
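To make "burning in a computer" concrete, here is a toy cellular-automaton sketch of my own (under the assumption that simulating fire means replicating its relations, not its heat): ignition spreads to adjacent fuel, burning cells burn out, and nothing in the room gets warm.

```python
# Toy 1-D fire automaton: the *relations* of fire are replicated in symbols.
# F = fuel, * = burning, . = ash. Burning cells burn out; fuel next to fire ignites.

def step(cells):
    nxt = []
    for i, c in enumerate(cells):
        if c == "*":
            nxt.append(".")                            # burning cell burns out
        elif c == "F" and "*" in cells[max(0, i - 1):i + 2]:
            nxt.append("*")                            # fuel beside fire ignites
        else:
            nxt.append(c)
    return "".join(nxt)

row = "FF*FF"
for _ in range(3):
    print(row)
    row = step(row)
print(row)  # -> ..... (everything has burned to ash)
```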
> It's a
> fantasy. You're just following the logic of the position to its
> absurd conclusion and deciding that it's not absurd.
You are sticking to your hypothesis despite the fact that it leads to absurd
conclusions, like the rejection of the Church-Turing Thesis.
( http://mathworld.wolfram.com/Church-TuringThesis.html provides a good
overview.)
> > Also, are you sure you have never heard of Searle? He also used the
> > plumbing example as well:
> That's funny. I have heard of Searle but not read him. I saw him on a
> YouTube but his position seemed too reactionary to me. It sounded like
> he was saying what you are, that consciousness is mechanical.
> > > Because it is simulated only through our interpretation.
> > A tree falling has to be heard for it to produce a sound?
> Of course. A human sound anyways.
> > I can understand
> > that, but what if there is a process which can hear within the simulation?
> It won't hear anything, it will just have ear drum shaped contours
> that are synched to the algorithms of air pressure compression. It
> will look like our bodies look when we hear but there is no sound.
> > > That's tautology. It's like saying 'if you couldn't see the inside of
> > > your stomach, you wouldn't know how to digest food'. Qualia is not
> > > necessary to the function of the brain.
> > I think you need to "see" (having qualia) in order to see. In other words,
> > zombies are not possible. You cannot have something which performs as
> > though it can see if it has no visual experience.
> What do you mean by performing as though it can see? "The Star-nosed
> Mole can detect, catch and eat food faster than the human eye can
> follow (under 300 milliseconds)."
If you take the word "see" to apply to other senses, then let me
re-write my sentence:
"You cannot have something which performs as though it has senses, if it has
Saying 'Zombies are not possible' doesn't mean anything.
It means that if software exists which behaves like a human, it would have
consciousness like a human.
> It just means
> that you want to believe that internal experiences like our own simply
> come with the universe automatically. Anything shaped like an eye can
> see, or like an eardrum can hear. It's not true.
You need more than an eye to see, you need a mind to interpret the
information as a visual experience.
> It's not completely
> untrue - some shapes have functions due to their shape, but other
> things have functions in spite of their shape. Adrenaline is not
> shaped like excitement.
> > > Indeed you can lose
> > > consciousness every night and your brain has no problem surviving
> > > without your experience of it.
> > You might have trouble running away from a lion while sleeping though.
> You might have trouble running away from a lion while wide awake too.
Yes, but you will be more effective if you are able to see.
(The original point was that a person without any experience of sight could
not survive as well.)
> > > Your body can find food and reproduce
> > > without there being any sense of color or shape, odor, flavor, etc.
> > It seems it would be significantly harder to find food if we had no sense of
> > shape or color (is that a strawberry or a rock?). You would expend a lot of
> > energy chasing phantom food.
> Nope. It's easy. Bacteria eat. The Star-Nose Mole eats. It's a just-so
> story. http://en.wikipedia.org/wiki/Just-so_story
It's obvious that people's survival success depends on their ability to see.
> > > It
> > > could detect and utilize its surroundings mechanically, like Deep
> > > Blue without ever having to feel that it is a thing.
> > I doubt you could have something with the same level of survivability as a
> > human that did not have experience.
> Your doubts are unfounded. You are looking at what exists and deciding
> that there must be some reason why it could not be any other way.
If a blind rabbit and a sighted-rabbit were born in the wild, which would
you bet has a better chance on surviving to adulthood?
> > So you also doubt zombies? Then you should not be worrying about machines
> > which behave as humans but have no inner life.
> Zombies are just a useful philosophical idea.
I agree. They are very useful for exposing flaws in theories of mind.
> > > Exactly. It's different. Can't do the same things with it.
> > No, but you could use it to build a universal machine which can emulate any
> > definable process.
> It's also a philosophical idea. It's wildly overreaching IMO and based
> upon the assumption that interiority does not exist, which I think is false.
Interiority exists in software. Software can have a first-person experience
and a third-person definition.
You received this message because you are subscribed to the Google Groups
"Everything List" group.