On Jan 18, 9:56 pm, Jason Resch <jasonre...@gmail.com> wrote:
> On Tue, Jan 17, 2012 at 2:20 PM, Craig Weinberg <whatsons...@gmail.com>wrote:
>
> > On Jan 17, 12:51 am, Jason Resch <jasonre...@gmail.com> wrote:
> > > On Mon, Jan 16, 2012 at 10:29 PM, Craig Weinberg <whatsons...@gmail.com
> > >wrote:
>
> > > > That's what I'm saying though. A Turing machine cannot be built in
> > > > liquid, gas, or vacuum. It is a logic of solid objects only. That
> > > > means it's repertoire is not infinite, since it can't simulate a
> > > > Turing machine that is not made of some simulated solidity.
>
> > > Well you're asking for something impossible, not something impossible to
> > > simulate, but something that is logically impossible.
>
> > We can simulate logical impossibilities graphically though (Escher,
> > etc). My point is that a Turing machine is not even truly universal,
> > let alone infinite. It's an object oriented syntax that is limited to
> > particular kinds of functions, none of which include biological
> > awareness (which might make sense since biology is almost entirely
> > fluid-solution based.)
>
> But its not entirely free of solids.  You can build a computer out of
> mostly fluids and solutions too.

Yes, but you can't grow a computer by watering it or kill it by
depriving it of water. Biology is synonymous with water as far as we
know.

>
>
>
> > > Also, something can be infinite without encompassing everything.  A line
> > > can be infinite in length without every point in existence having to lie
> > on
> > > that line.
>
> > If that's what you meant though, it's not saying much of anything
> > about the repertoire. A player piano has an infinite repertoire too.
> > So what?
>
> A piano cannot tell you how any finite process will evolve over time.

I agree, computers are impressive, although it's not real time, it's
generic theoretical time, and they're not real processes, they're
models of our assumptions about real processes.

>
>
>
> > > > > To date, there is nothing we
> > > > > (individually or as a race) has accomplished that could not in
> > principle
> > > > > also be accomplished by an appropriately programed Turing machine.
>
> > > > Even if that were true, no Turing machine has ever known what it has
> > > > accomplished,
>
> > > Assuming you and I aren't Turing machines.
>
> > It would be begging the question otherwise.
>
> All known biological processes are Turing emulable.

Is Jason Resch a biological process? Where can your full name at birth
be found in your biology?

>
>
>
> > > > so in principle nothing can ever be accomplished by a
> > > > Turing machine independently of our perception.
>
> > > Do asteroids and planets exist "out there" even if no one perceives them?
>
> > They don't need humans to perceive them to exist, but my view is that
> > gravity is evidence that all physical objects perceive each other. Not
> > in a biological sense of feeling, seeing, or knowing, but in the most
> > primitive forms of collision detection, accumulation, attraction to
> > mass, etc.
>
> If atoms can perceive gravitational forces, why can't computers perceive
> their inputs?

They perceive the push and pull of current and the instinct to
restrain or allow that flow. That is the level of perception of an
electronic computer. The parts of a mechanical computer perceive the
force of the changes to their mass and supply the appropriate physical
response. A human being has many orders of magnitude higher
proliferation, development, and integration of sense channels - in the
chemical, biological, zoological, and anthropological frequency
ranges.

>
>
>
> > > > What is an
> > > > 'accomplishment' in computational terms?
>
> > > I don't know.
>
> > > > > > You can't build it out of uncontrollable living organisms.
> > > > > > There are physical constraints even on what can function as a
> > simple
> > > > > > AND gate. It has no existence in a vacuum or a liquid or gas.
>
> > > > > > Just as basic logic functions are impossible under those ordinary
> > > > > > physically disorganized conditions, it may be the case that
> > awareness
> > > > > > can only develop by itself under the opposite conditions. It needs
> > a
> > > > > > variety of solids, liquids, and gases - very specific ones. It's
> > not
> > > > > > Legos. It's alive. This means that consciousness may not be a
> > concept
> > > > > > at all - not generalizable in any way. Consciousness is the
> > opposite,
> > > > > > it is a specific enactment of particular events and materials. A
> > brain
> > > > > > can only show us that a person is a live, but not who that person
> > is.
> > > > > > The who cannot be simulated because it is an unrepeatable event in
> > the
> > > > > > cosmos. A computer is not a single event. It is parts which have
> > been
> > > > > > assembled together. It did not replicate itself from a single
> > living
> > > > > > cell.
>
> > > > > > > > You can't make a machine that acts like a person without
> > > > > > > > it becoming a person automatically. That clearly is ridiculous
> > to
> > > > me.
>
> > > > > > > What do you think about Strong AI, do you think it is possible?
>
> > > > > > The whole concept is a category error.
>
> > > > > Let me use a more limited example of Strong AI.  Do you think there
> > is
> > > > any
> > > > > existing or past human profession that an appropriately built android
> > > > > (which is driven by a computer and a program) could not excel at?
>
> > > > Artist, musician, therapist, actor, talk show host, teacher,
> > > > caregiver, parent, comedian, diplomat, clothing designer, director,
> > > > movie critic, author, etc.
>
> > > What do you base this on?  What is it about being a machine that
> > precludes
> > > them from fulfilling any of these roles?
>
> > Machines have no feeling. These kinds of careers rely on sensitivity
> > to human feeling and meaning. They require that you care about things
> > that humans care about. Caring cannot be programmed.
>
> A program model of a psychologist's biology will tell you exactly what the
> psychologist will do and say in any situation.

No, the program will fail to be perceived as authentic. It is a way to
try to simulate caring, but it is not caring. The program feels
nothing, so it will not be able to respond freely in the moment. The
psychologist's biology will not tell anyone what they will do and say
in any situation. There are words that the psychologist will use in
the future which have not been invented yet, new therapies and
articles to read. Will the program simulate the future of psychology
and language as well? This idea of biology as a static template of
human behavior is a leftover fairy tale of the 19th century.

>
> > That is the
> > opposite of caring, because programming requires no investment by the
> > programmed. There is no subject in a program, only an object
> > programmed to behave in a way that seems like it could be a subject in
> > some ways.
>
> If there is no subject in the emulation of the psychologist's biology, then
> it is a zombie.

No, it's an inanimate object. A zombie arises from the expectation
that it should have a subject in the first place. Of course it has no
subject, it's a machine. It's a stapler with a face painted on it.

>The evolution of the program can be used to drive the
> servos and motors in the android, and it will behave indistinguishably.

Indistinguishably to whom? Are these magical servos and motors that
smell and sound like a living person? That show up on an X-Ray as
bones and organs?

>
>
>
> > > Also, although their abilities are limited, the below examples certainly
> > > show that computers are making inroads along many of these lines of work,
> > > and will only improve overtime as computers become more powerful.
>
> > Many professions would be much better performed by a computer. Human
> > oversight might be desirable for something like surgery, but I would
> > probably go with the computer over a human surgeon.
>
> > > Artist and Musician: Computer generated music has been around since at
> > > least the 60s:http://www.youtube.com/watch?v=X4Neivqp2K4
>
> > Yep, 47 years since then and still no improvement whatsoever. Based on
> > that I think we cannot assume that computer generated music will
> > improve significantly over time as computers become more powerful.
> > They can just make more realistic music sound just as bad.
>
> Until computers have greater power than the parts of the human brain
> involved in these skills, and we understand those mechanisms (or reverse
> engineer/copy them) AI will lag behind human ability.

AI for trivial applications has progressed, but its quality of
understanding or feeling has not progressed in any way. AI remains
paper thin and stiff with rigor machina.

>
>
>
> > > Therapist: ELIZA, the computer psychologist has been around since 1964:
> >http://nlp-addiction.com/eliza/
>
> > Again, no improvement in almost 50 years. Does anyone use ELIZA for
> > psychology? No. It's utterly useless except as a novelty and
> > linguistics demonstration.
>
> > > Teacher:http://en.wikipedia.org/wiki/Rosetta_Stone_%28software%29
>
> > It's not a teacher, it's a computer assisted learning regimen. An
> > exercise machine is not the same thing as a personal trainer or a
> > coach.
>
> > > Caregiver: The Japanese are actively researching and developing
> > caregiving
> > > robots to take care of their aging population:
> >http://web-japan.org/trends/09_sci-tech/sci100225.html
>
> > That doesn't mean that they will excel at being caregivers.
>
> > > Comedian: "What kind of murderer has moral fiber?" — "A cereal killer."
> > > This joke was written by a computer. (
> >http://www.newscientist.com/article/dn1719)
> > > Movie Critic:http://www.netflixprize.com/
>
> > Again, generating a sophomoric pun (in a sea of garbage jokes) is not
> > the same thing as 'excelling at being a comedian.' All of these
> > examples reveal the utter failure of computation to get passed square
> > one in any of these areas. It is obvious to me that the failure is
> > rooted in precisely the failure of computation to simulate awareness
> > beyond a trivial form of sophistication. Limited capacities for
> > simulating trivial music, conversation, humor, compassion are
> > radically overestimated, even though there has been no sign of
> > progress at all since the beginning of computing.
>
> > > > >  Could
> > > > > there be a successful android surgeon, computer programmer,
> > psychologist,
> > > > > lawyer, etc.
>
> > > > I would say there could be very successful android surgeons, less so
> > > > computer programmers and lawyers because there is an element of
> > > > creativity there,
>
> > > Computers have demonstrated creativity:
> >http://www.mendeley.com/research/automated-design-previously-patented...
>
> > link doesn't come up.
>
> Sorry, it was 
> incomplete:http://www.mendeley.com/research/automated-design-previously-patented...

Same truncated link. Bit.ly it?

>
>
>
> > > > and not so much for a psychologist, because the job
> > > > requires the understanding of feeling, which is not possible for a
> > > > computer executed in material that cannot feel like an animal feels.
>
> > > But a computer program will have the same output (outwardly visible
> > > behavior) regardless of its substrate.  Clearly the material on which the
> > > Turing machine is executed cannot have any effect on its performance.
>
> > If that were the case then a Turing machine should be executable as a
> > truck load of live hamsters or a dense layer of fog.
>
> It has to be a Turing machine before it gains all the powers of every other
> Turing machine.  Not everything is a Turing machine, but any Turing
> machine, regardless of its substrate is equally capable.

You're overlooking the fact that the Turing machine-ness owes its
existence entirely to the physical qualities of its substrate. They
are all equally capable, but also equally incapable.
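
For what it's worth, the substrate-independence claim we keep circling
can be made concrete with a toy interpreter. This is just an
illustrative sketch I put together (the increment machine below is a
hypothetical example, not anyone's canonical construction): the same
transition table produces the same tape whether the loop runs on
silicon, relays, or pencil and paper - which is all "equally capable"
ever meant.

```python
# Minimal Turing machine interpreter. The transition table maps
# (state, symbol) -> (symbol to write, head move, next state).
def run_turing_machine(table, tape, state="start", pos=0, max_steps=1000):
    cells = dict(enumerate(tape))  # sparse tape; unwritten cells are blank "_"
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, "_")
        write, move, state = table[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Hypothetical example machine: increment a binary number.
# Scan right to the end of the input, then carry back from the right.
INCREMENT = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "R", "halt"),
    ("carry", "_"): ("1", "R", "halt"),
}

print(run_turing_machine(INCREMENT, "1011"))  # 1011 + 1 = 1100
```

The table is the whole machine; the hardware executing the loop never
appears in it.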

>
> > The fact that it
> > cannot work that way is evidence that the material does relate to the
> > ability of a Turing machine to perform even basic functions.
>
> > Art, music, comedy, compassion, etc are not 'output'.
>
> Niether are the nerve impulses from your spinal cord art, music, comedy,
> compassion, etc.  But the output can be used to control a body or other
> mechanism to express any and all of those things.

That's the neuron doctrine, but I don't think it's true. Nerve
impulses are just traffic signals. They are not the signifying agents
of the psyche. The psyche is the phenomenology of an entire human
life - that is what discovers comedy and art: not the output of a
generic pattern of impulses, but the actual sense of the tissues of
the body and the autobiographical story they tell as a whole, as well
as the stories of the people of the entire society. The brain is just
the aggregator of the nervous system's sense, which is the aggregator
of the sense of the body as a whole in relation to its world.

>
> > They are
> > experiences which can be shared. A Turing machine can't experience
> > anything by itself, it is only the substrate that experiences.
>
> > If a
> > > Turing machine run on carbon makes a better psychologist, then that same
> > > program executed on a silicon Turing machine will be just as successful.
>
> > The machine exploits the common sense of object oriented substrates.
> > It doesn't matter whether it runs on silicon or boron or gadolinium,
> > because any sufficiently polite solid material will do. None of them
> > make a good psychologist. For that you need something that neurons run
> > on themselves.
>
> You need to believe a psychologist is capable of hyper-computation for this
> position to be consistent.
>

No, because computation is only one channel of sense. It's the machine
that is hypo-sentient. A psychologist isn't a hyper computer, they are
a living being that has computational capacities as well as visual,
aural, tactile, emotional, intuitive, verbal, etc capacities. Animal
capacities.

>
>
> > > > Until silicon can feel proud and ashamed, it won't be any good at
> > > > psychology.
>
> > > Unless there is something about psychologists that is infinite, then
> > there
> > > is no externally visible behavior a psychologist is capable of that the
> > > android controlled by a Turing machine could not also do.
>
> > A keyboard can be programmed to type any sentence. Does that mean it
> > is Shakespeare? A Turing machine can only impersonate intelligence
> > trivially,
>
> Again different human skills require different levels of computational
> resources.  Do you think Deep Blue can only play chess at a trivial level?

Yes. Deep Blue doesn't know that it is even playing chess. It's like a
Rubik's Cube that remembers how to get back to the ideal arrangement
regardless of how scrambled you make it. It's just executing a program
and has no investment in it. Of course it's not a trivial
accomplishment for the programmers of Deep Blue, and for computer
science in general - that is very real and non-trivial - but no, Deep
Blue doesn't know or care whom it beats at the game.

>
> > it can't embody it authentically.
>
> If something behaves intelligently it is intelligent.
>
> > It's not about matching
> > behaviors, it's about having the sensitivity and feeling to know when
> > and why the behaviors are appropriate. It's about originating new
> > behaviors that are significant improvements over previous approaches.
>
> > > > > Or do you believe there is some inherent limitation of
> > > > > computers that would prevent them from being capable in one of these
> > > > > roles?  If so please provide an example.
>
> > > > Computers are inherently limited by their material substrate. A
> > > > mechanism of electronic silicon will never know what it is to feel
> > > > pain, fear, pleasure, etc. Any role which emphasizes a talent for
> > > > feeling and understanding would fail to be fulfilled by the promise of
> > > > disembodied recursive enumeration.
>
> > > Do you think something have to feel to perfectly act as though it is
> > > feeling?  Actors can pretend to suffer if their role is to be tortured
> > in a
> > > movie, yet they feel no pain.
>
> > They aren't feeling pain at the moment, but they are capable of
> > experiencing pain, therefore they can fake it with feeling.
>
> An actors performance comes down to how they move their muscles, there is a
> limited number of muscles in a human body,

That's a reductionist assumption. It's seductive, I agree, but no, an
actor's image is a gestalt of semantic cues that has little to do with
muscles and everything to do with their capacity to reveal interior
experiences. It's like saying that there is a limited number of
windows in a building so that limits what can be seen through them.
Sense is all about specular reflection, not literal correspondence.

>and a finite number of ways in
> which an actor can move them in any performance of finite length.  These
> movements could be replicated even by a process which has never felt pain,
> and do so as convincingly as any actor.

Yes, they are called movies. The projector and screen feel nothing. If
you make a movie starring androids, you might have some success in
fooling some audience members for some time, but ultimately people
won't feel the same about it. We can't even make CGI animation that is
convincing as non-CGI. It might be great looking, but the water feels
the same as fire. The landscapes feel sterile. We have senses about
realism which go beyond what we know about sense. Ways that our sense
channels are integrated detect inconsistencies at a subconscious
level.

>
>
>
> > >  If you are into sci-fi, you should watch the
> > > recent (not 1970s) Battlestar Galactica series.  Among other things, it
> > > explores a racism against machines who in all respects look act and
> > behave
> > > like humans.
>
> > Yeah I have watched a lot of that BSG. I like how the cylons are
> > monotheistic and humans are pagan.  It's a good show. I would agree,
> > if it were the case that AI robots were indistinguishable to us that
> > it would be a valid philosophical issue. My view though is that there
> > are some good reasons that will never be the case.
>
> If we scanned brain images into computers powerful enough to run them, and
> they always broke down or failed to function, I would consider that
> evidence against computationalism.  The fact that the current computers of
> our time have not equaled or surpassed us in every respect is no more
> evidence against computationalism, then would it have been in the 1900's
> that no man-made mechanical machine could convincingly behave like a
> human.  We know roughly the computational power of the human brain, and
> computers of today are somewhere between that of an insect and that of a
> mouse.  Once we reach the computational power of a mouse brain, then in 7
> years we will reach the power of a cat brain, and 7 years later the power
> of a human brain (Assuming Moore's law of doubling computational power for
> the same price each year).

Computing power without feeling doesn't scale up qualitatively. You
just have more insects acting like a cat. To get a cat, you need
something that can feel like a cat.

>
> > As the AI horizon
> > continues to recede infinitely, even in the face of ever faster
> > hardware and more bloated software, we will continue to have to deal
> > with actual racism rather than theoretical anthropism.
>
> 20 years ago would you have been surprised to learn that a computer would
> beat the leading Jeapordy champions, or that we would have self-driving
> cars before flying cars?

No, I would be disappointed. By 2012, are you kidding? My 1992 self
expected to be uploaded into a computer by now. Computer technology
seemed much more promising to me 20 and 30 years ago. It was fun then.
The actual computers were fun. Not just using them to look at things
and talk to people, but just playing around with them when they were
completely open and accessible was so freeing. Now they are just a sad
consolation prize in the face of a civilization in steep decline.


>
> > If the cylons
> > were genetically engineered beings instead, well, that's a different
> > story entirely. Living creatures matter, programs don't (except to the
> > living creatures that use them).
>
> Are you afraid to burn coal in your stove out of concern that the material
> will sense being burned?

Haha. It's not the carbon alone that makes organisms live. Carbon is a
necessary but not sufficient ingredient.

Craig

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.
