On Jul 30, 10:10 pm, Jason Resch <jasonre...@gmail.com> wrote:
> On Sat, Jul 30, 2011 at 5:45 PM, Craig Weinberg <whatsons...@gmail.com>wrote:

> Just as you say there is an inside and outside part of the mind/brain, I
> believe the same is true of Turing machines.

A Turing machine, in and of itself, is conceptual. It has no
interior. It's an idea, like Mickey Mouse. Mickey Mouse does different
things and interacts with his environment in different ways, but he
has no inside part. Every part of him is exposed and accessible.

> How I respond to questions presented to me has little to do with whether I
> am a screaming, crapping organism from Earth, and more to do with how my
> brain is configured.

It depends on what the questions are. If you are born in certain areas
of the Earth, there is a high probability that you will only be able
to answer questions in a language common to that region - regardless
of how your brain is configured.

> Isn't it interesting to you just how much computers are capable of?  There
> are millions of different programs you can install on your computer which
> enable it to act like a different kind of tool, device, machine.

Absolutely. That's why it's unnecessary and patronizing to exaggerate
their capabilities.

> Deep Blue
> might not be capable of pleasure, but that doesn't mean no computer can
> experience pleasure.

If you make a computer out of specialized living cells, it can
experience pleasure, like the brain does.

> When I use the term machine I don't mean the low level circuits, but the
> software.  What does the software feel?

Software doesn't feel. It's a complicated cartoon which we can
control.

> It depends on what the program is.
> Programs have no relation or dependence on their underlying hardware.

Programs have to be rewritten to run under a different OS, let alone
on different hardware. Try to run a program from 2011 on hardware from
1995. Programs can only run if they are written specifically and
precisely to address the hardware. That's all a program is:
instructions for the hardware.
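
Even in a 'portable' language the platform shows through; the script
itself still has to branch on what it is running on. A minimal sketch
in Python (the app name and paths here are hypothetical, purely to
illustrate the point):

    import os
    import sys

    # The "same" program takes a different path on a different OS:
    if sys.platform.startswith("win"):
        config_dir = os.path.join(os.environ["APPDATA"], "myapp")
    else:
        config_dir = os.path.expanduser("~/.config/myapp")

    print(config_dir)

And underneath that, the interpreter itself is a different binary,
compiled specifically for each instruction set and OS.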

> Deep Blue did not have to be loaded with every possible chess game to play
> at the Grandmaster level, its programming could provide an answer to any
> possible chess board position.  Likewise, software could offer an answer to
> any given question without having to load it with pre-canned answers.

You're saying that all questions are equivalent, and that only the
extent of the set makes the computation seem impossible. That's never
been my objection. Questions about chess board positions are a
specific kind of computable question. Questions like 'how do you feel'
or 'what do you want' cannot be computed or emulated. They can be
impersonated, but not answered honestly by silicon.
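
To make the distinction concrete: a chess engine derives an answer for
any position by search, not by lookup. Something like this minimal
minimax sketch in Python (hypothetical, not Deep Blue's actual code;
the moves, apply_move, and evaluate functions are assumed
game-specific inputs):

    def minimax(position, depth, maximizing, moves, apply_move, evaluate):
        # Score ANY position by searching ahead - no stored answers.
        legal = moves(position)
        if depth == 0 or not legal:
            return evaluate(position)
        scores = [minimax(apply_move(position, m), depth - 1,
                          not maximizing, moves, apply_move, evaluate)
                  for m in legal]
        return max(scores) if maximizing else min(scores)

'How do you feel' has no evaluate() function, which is exactly the
point.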

> Your position sounds like that of Descartes:
> http://plato.stanford.edu/entries/turing-test/
>
> Is your belief that no non-biological machine could ever pass the Turing
> test?

I think that the whole Turing test line of thinking is a waste of
time. Whether or not someone can be fooled is not up to the machine,
it's up to the person observing the machine. Some will get it wrong,
some will get it right. To European colonists, indigenous people did
not pass the Turing test, and were exterminated and enslaved because
of it. It says nothing about subjectivity or understanding on the part
of the machine being tested. This is no accident; it's the way I think
subjectivity works. It's based on isomorphism. If your exterior
matches an exterior I am familiar with, I will tend to assume that
your interior matches my idea of what usually lies behind such an
exterior. I can be wrong, especially when the exterior is designed
explicitly to deceive and impersonate.

> > > A-life can and does.  Download smart sweepers and see it happen
> > > before your eyes:
> > > http://www.ai-junkie.com/files/smart_sweepers.zip
>
> > I know all about that kind of programming. It's not valid for the same
> > reason as Mickey Mouse made of people isn't valid.
>
> I don't see what leads you to this conclusion.

It's just shadows. There is no substance to feel anything.
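
To be specific about what's under the hood there: the 'evolution' in
that kind of demo is a loop of arithmetic on weight vectors, roughly
like this minimal genetic-algorithm sketch in Python (hypothetical;
the fitness function is an assumed input, and this is not the actual
smart sweepers code):

    import random

    def evolve(fitness, pop_size=50, genome_len=10, generations=100):
        # Selection plus mutation on lists of numbers - nothing more.
        pop = [[random.uniform(-1, 1) for _ in range(genome_len)]
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)      # fittest first
            survivors = pop[:pop_size // 2]
            children = [[g + random.gauss(0, 0.1)
                         for g in random.choice(survivors)]
                        for _ in range(pop_size - len(survivors))]
            pop = survivors + children
        return pop[0]

Numbers being sorted and nudged. Nothing in that loop is a candidate
for feeling anything.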

> I was saying it already has.  Biological brains are computers.

Brains are biological entities which do some computing. They also do
feeling and experiencing. Silicon chips are inorganic non-entities
with which we do computing, but their feeling and experience is likely
limited to mineral levels of elaboration: more detection than feeling.

> We didn't
> have to program ourselves to provide meaning.  Likewise, an a-lifeform which
> evolves doesn't need to be viewed on a monitor by a human for it to have a
> life of its own.

It doesn't have a life of its own even if a human looks at it on a
monitor. It's just a person looking at a monitor and imagining that
the meaningless blinking pixels are doing something, just as when your
mouse cursor moves across the screen you imagine a thing moving rather
than a set of pixels changing color and brightness in one area of the
screen after another. Pattern is in the eye of the beholder.

> According to all the standard definitions of life, a-life is alive.

What standard definition might that be? Any definition of life which
does not include the desire to feel better and to avoid pain and death
is radically insufficient. Life is an experience of being alive, much
more than any third-party measurement of growth or reproduction.

> What about a program which simulates all the elements of the periodic table,
> or is such a thing not possible?

Simulate them in what way? Water that a virtual person can drink? I
can do that with a comic book character.

> If it is possible, then what about building such a life form using simulated
> carbon?

Then you get cartoons that you can imagine are alive if you want to.

> If that is not possible, because you say the computer is made of silicon,
> then what about if the computer was made of carbon?

It's never been a problem with silicon; it's that the experiences of
living organisms come from the experiences of living cells, which come
from organic molecules, which do not contain silicon. If a computer is
made of cells, then sure, it can feel. Good luck getting it to do what
you want.

> > It's that finite quality which is actually what
> > prevents comp from emulating fire or consciousness.
>
> So you are postulating some infinity in the mind or in physics?

No, no. I'm postulating that the quality of being forced to be one
particular kind of thing is what life and experience are all about.
Since comp is not that - since it is that which can emulate - it
cannot emulate its opposite. It can't emulate those qualities which
arise from something being able to be only exactly one thing. In a
human, those qualities scale up to your identity: you can only be
exactly you. A computation can't make itself into an uncomputation
that will forever be only you, so the experience is permanently
inaccessible to it.

> Within math exist finite objects.

Maybe. If so, by definition they cannot simulate anything. To simulate
you need variables, and elemental properties are invariable.

> > If you unplug the monitor nobody will know the difference.

> It is like saying if you cut someone's spinal cord so they are locked-in, no
> one will know the difference.  Those inside will know the difference.

Do you think that your Excel spreadsheet knows whether or not the
monitor is plugged in?

> > The
> > evaluation of complexity is ours, it doesn't reside in the computer.
> > The computer might have a more clickety clickety experience in the
> > protein-folding sim. More heat. More circuitry opening and closing.
>
> Complexity (aka Entropy) is an objective property.

There is no objective difference between high entropy and low entropy
except through our interpretation of their relation to each other.
Entropy is just a way of quantifying our evaluation of certain kinds
of patterns and their behavior.
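
Even the textbook measure bears this out: Shannon entropy is only
defined relative to a chosen alphabet of symbols, so the same data
scores differently depending on how we choose to carve it up. A
minimal sketch in Python (illustrative only):

    import math
    from collections import Counter

    def shannon_entropy(symbols):
        # H = sum over the chosen alphabet of p * log2(1/p)
        counts = Counter(symbols)
        total = sum(counts.values())
        return sum((c / total) * math.log2(total / c)
                   for c in counts.values())

    data = "abababab"
    print(shannon_entropy(data))                   # read as chars: 1.0
    pairs = [data[i:i + 2] for i in range(0, len(data), 2)]
    print(shannon_entropy(pairs))                  # read as pairs: 0.0

Same string, two different 'objective' entropies; the difference is
our choice of how to read it.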

>
> > This is wrong though. You cannot make a program that will burn.
>
> What do you expect to see, the computer spontaneously burst into flames
> while the program is simulating fire?

Not unless the assertion that fire can be emulated were true. If it
were, then of course I would expect the emulation to burn as fire
burns, flame as fire flames, illuminate as fire illuminates.
Otherwise, you're just cherry-picking the easy parts of fire to
emulate and disqualifying the rest. Say instead 'what did you expect
to see, a program simulating fire?' and you can see how much more
sense that makes.

> When programs simulate life forms little critters don't come crawling out of
> the computer case.  Ink and paper don't come out of the computer when
> running Microsoft Word. (Unless you hit the print button of course)

Right, because the simulation is only partial. Consciousness is not
included because the experience cannot simulate the experiencer.

> > It's a
> > fantasy. You're just following the logic of the position to it's
> > absurd conclusion and deciding that it's not absurd.
>
> You are sticking to your hypothesis despite that it leads to absurd
> conclusions, like the rejection of the Church-Turing Thesis.
> (http://mathworld.wolfram.com/Church-TuringThesis.html provides a good
> definition)

C-T is false because it assumes that there is a difference between
'real world computation' and Turing machine equivalents. Computation
is not what makes the world. The world makes computation. All
computation is the same, yes, but computation is a subjective
simulation of objective patterns. It's not an object.

> > What do you mean by performing as though it can see? "The Star-nosed
> > Mole can detect, catch and eat food faster than the human eye can
> > follow (under 300 milliseconds).[1]"
> > http://en.wikipedia.org/wiki/Blind_animals
>
> If you take the word "see" to apply to other senses, then let me
> re-write my sentence:
> "You cannot have something which performs as though it has senses, if it has
> no sensations"

If you rewrite your sentence as 'You cannot have something which
performs as though it has senses if it has no sensorimotive detection',
then I agree. There is, however, the matter of quality of perception.
Seeing is not the same as tapping in front of you with a cane, but
both satisfy the sentence. A computer definitely needs circuitry that
detects keystrokes and mouse clicks. It taps its cane until a
millivolt or whatever makes it jump.

> > It just means
> > that you want to believe that internal experiences like our own simply
> > come with the universe automatically. Anything shaped like an eye can
> > see, or like an eardrum can hear. It's not true.
>
> You need more than an eye to see, you need a mind to interpret the
> information as a visual experience.

Right. You need a mind that speaks 'vision'. Silicon doesn't speak
that. Vision comes from the sea and sun. Photosynthesis. Predation.
http://news.nationalgeographic.com/news/2011/05/110502-sea-urchins-eyes-science-animals/

> Yes, but you will be more effective if you are able to see.
> (The original point was that a person without any experience of sight could
> not survive as well)

You would be way more effective if you just had telepathy with lions,
or omniscience, teleportation, or time travel. The fact that vision is
helpful doesn't make it suddenly available as a phenomenon out of
nowhere.

> It's obvious that people's survival success depends on their ability to see.

Since most animals can see, it's not really worth mentioning. People's
survival success depends on their ability to pee too, but we don't
have to have a special urethrolfactory sense to detect
urination-related threats.

> > Your doubts are unfounded. You are looking at what exists and deciding
> > that there must be some reason why it could not be any other way.
>
> If a blind rabbit and a sighted-rabbit were born in the wild, which would
> you bet has a better chance on surviving to adulthood?

Again, sight is not necessary for the survival of animals or any
organism. Once sight exists, it confers advantages that outweigh its
disadvantages (seeing can get you into trouble too; seeing tasty eggs
in an eagle's nest could present a fatal risk). Why don't plants need
to see? They can't even move, so it would behoove them to be able to
see so that they could pretend to shrivel up when an herbivore
approaches.

> > Zombies are just a useful philosophical idea.
>
> I agree.  They are very useful for exposing flaws in theories of mind.

Or justifying wishful thinking.

> Interiority exists in software.  Software can have a first person experience
> and a third-person definition.

You're saying that when a program runs it can be considered to have
its own internally consistent world. I say that world is in our
imagination, not a phenomenon that exists in any other way. A chip can
have a first-person experience of software - because that's what
software is, instructions to hardware - but the chip doesn't
experience what we experience; it just gets binary blips. When we
connect a monitor to it, we can see some of those blips which we have
formatted to be seen on a monitor. The monitor doesn't see the
pattern, though. Nor does the video card or the software that handles
them. They see nothing and will spew out any binary pattern that's fed
into them.

Craig
