On 19 Jan 2012, at 03:56, Jason Resch wrote:
On Tue, Jan 17, 2012 at 2:20 PM, Craig Weinberg
On Jan 17, 12:51 am, Jason Resch <jasonre...@gmail.com> wrote:
> On Mon, Jan 16, 2012 at 10:29 PM, Craig Weinberg
> > That's what I'm saying though. A Turing machine cannot be built in
> > liquid, gas, or vacuum. It is a logic of solid objects only. That
> > means its repertoire is not infinite, since it can't simulate a
> > Turing machine that is not made of some simulated solidity.
> Well, you're asking for something impossible: not something hard to
> simulate, but something that is logically impossible.
We can simulate logical impossibilities graphically though (Escher,
etc). My point is that a Turing machine is not even truly universal,
let alone infinite. It's an object-oriented syntax that is limited to
particular kinds of functions, none of which include biological
awareness (which might make sense, since biology is almost entirely
liquid).
But it's not entirely free of solids. You can build a computer out
of mostly fluids and solutions, too.
I agree. Even with gas in some volume.
> Also, something can be infinite without encompassing everything. A line
> can be infinite in length without every point in existence having
to lie on
> that line.
If that's what you meant though, it's not saying much of anything
about the repertoire. A player piano has an infinite repertoire too.
A piano cannot tell you how any finite process will evolve over time.
Yes. Craig argues that machines cannot think by pointing at his fridge.
> > > To date, there is nothing we
> > > (individually or as a race) have accomplished that could not in
> > > principle also be accomplished by an appropriately programmed Turing
> > > machine.
> > Even if that were true, no Turing machine has ever known what it
> > has accomplished,
> Assuming you and I aren't Turing machines.
It would be begging the question otherwise.
All known biological processes are Turing emulable.
> > so in principle nothing can ever be accomplished by a
> > Turing machine independently of our perception.
> Do asteroids and planets exist "out there" even if no one perceives them?
They don't need humans to perceive them to exist, but my view is that
gravity is evidence that all physical objects perceive each other. Not
in a biological sense of feeling, seeing, or knowing, but in the most
primitive forms of collision detection, accumulation, and attraction.
If atoms can perceive gravitational forces, why can't computers
perceive their inputs?
> > What is an
> > 'accomplishment' in computational terms?
> I don't know.
> > > > You can't build it out of uncontrollable living organisms.
> > > > There are physical constraints even on what can function as an
> > > > AND gate. It has no existence in a vacuum or a liquid or gas.
> > > > Just as basic logic functions are impossible under those
> > > > physically disorganized conditions, it may be the case that life
> > > > can only develop by itself under the opposite conditions. It takes a
> > > > variety of solids, liquids, and gases - very specific ones, not
> > > > Legos. It's alive. This means that consciousness may not be mechanical
> > > > at all - not generalizable in any way. Consciousness is the opposite:
> > > > it is a specific enactment of particular events and materials. A brain
> > > > can only show us that a person is alive, but not who that person is.
> > > > The who cannot be simulated because it is an unrepeatable event in the
> > > > cosmos. A computer is not a single event. It is parts which are
> > > > assembled together. It did not replicate itself from a single
> > > > cell.
> > > > > > You can't make a machine that acts like a person without
> > > > > > it becoming a person automatically. That clearly is not true to
> > > > > > me.
> > > > > What do you think about Strong AI; do you think it is possible?
> > > > The whole concept is a category error.
> > > Let me use a more limited example of Strong AI. Do you think there is
> > > any existing or past human profession that an appropriately built
> > > android (which is driven by a computer and a program) could not excel
> > > in?
> > Artist, musician, therapist, actor, talk show host, teacher,
> > caregiver, parent, comedian, diplomat, clothing designer,
> > movie critic, author, etc.
> What do you base this on? What is it about being a machine that prevents
> them from fulfilling any of these roles?
Machines have no feeling. These kinds of careers rely on sensitivity
to human feeling and meaning. They require that you care about things
that humans care about. Caring cannot be programmed.
A program model of a psychologist's biology will tell you exactly
what the psychologist will do and say in any situation.
But Craig might be right. Caring and many other things can be Turing
emulable, yet not programmable. If artificial machines evolve, they
might indeed not care about what humans care about, especially if we
dismiss them as strangers, foreigners, or slaves.
That is the
opposite of caring, because programming requires no investment by the
programmed. There is no subject in a program, only an object
programmed to behave in a way that seems like it could be a subject.
If there is no subject in the emulation of the psychologist's
biology, then it is a zombie. The evolution of the program can be
used to drive the servos and motors in the android, and it will
behave just as the psychologist would.
> Also, although their abilities are limited, the below examples
> show that computers are making inroads along many of these lines
> and will only improve over time as computers become more powerful.
Many professions would be much better performed by a computer. Human
oversight might be desirable for something like surgery, but I would
probably go with the computer over a human surgeon.
> Artist and Musician: Computer-generated music has been around since at
> least the 60s: http://www.youtube.com/watch?v=X4Neivqp2K4
Yep, 47 years since then and still no improvement whatsoever. Based on
that I think we cannot assume that computer generated music will
improve significantly over time as computers become more powerful.
They can just make more realistic-sounding music that is just as bad.
Until computers have greater power than the parts of the human brain
involved in these skills, and we understand those mechanisms (or
reverse engineer/copy them), AI will lag behind human ability.
Perhaps. It is also possible that we will transform ourselves into
computers more quickly than hand-made computers evolve enough to
match us.
When you look at how education investment and quality have decreased
over the last 50 years, it looks like today's machines might already
be cleverer than our future kids. The singularity point might get
closer not by machines evolving, but by humans regressing.
> Therapist: ELIZA, the computer psychologist, has been around since
> the 60s.
Again, no improvement in almost 50 years. Does anyone use ELIZA for
psychology? No. It's utterly useless except as a novelty and a
curiosity.
It's not a teacher, it's a computer assisted learning regimen. An
exercise machine is not the same thing as a personal trainer or a
teacher.
> Caregiver: The Japanese are actively researching and developing
> robots to take care of their aging population.
That doesn't mean that they will excel at being caregivers.
> Comedian: "What kind of murderer has moral fiber?" — "A cereal killer."
> This joke was written by a computer. (http://www.newscientist.com/article/dn1719)
> Movie Critic:http://www.netflixprize.com/
Again, generating a sophomoric pun (in a sea of garbage jokes) is not
the same thing as 'excelling at being a comedian.' All of these
examples reveal the utter failure of computation to get past square
one in any of these areas. It is obvious to me that the failure is
rooted in precisely the failure of computation to simulate awareness
beyond a trivial form of sophistication. Limited capacities for
simulating trivial music, conversation, humor, compassion are
radically overestimated, even though there has been no sign of
progress at all since the beginning of computing.
> > > Could
> > > there be a successful android surgeon, computer programmer,
> > > lawyer, etc.
> > I would say there could be very successful android surgeons,
> > computer programmers and lawyers because there is an element of
> > creativity there,
> Computers have demonstrated creativity: http://www.mendeley.com/research/automated-design-previously-patented
The link doesn't come up.
Sorry, it was incomplete:
> > and not so much for a psychologist, because the job
> > requires the understanding of feeling, which is not possible for a
> > computer executed in material that cannot feel like an animal can.
> But a computer program will have the same output (outwardly visible
> behavior) regardless of its substrate. Clearly the material on which a
> Turing machine is executed cannot have any effect on its computations.
If that were the case then a Turing machine should be executable as a
truckload of live hamsters or a dense layer of fog.
It has to be a Turing machine before it gains all the powers of
every other Turing machine. Not everything is a Turing machine, but
any Turing machine, regardless of its substrate, is equally capable.
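Substrate-independence can be made concrete with a minimal simulator: the machine's behavior is fixed entirely by its transition table and tape, not by what the table happens to be implemented in. A sketch (the bit-flipping machine is a hypothetical example, not from this thread):

```python
# Minimal Turing machine simulator. The output depends only on the
# transition table and the tape contents, never on the substrate running it.
def run_tm(table, tape, state="start", head=0, max_steps=1000):
    cells = dict(enumerate(tape))  # sparse tape; blank cells read as "_"
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, "_")
        write, move, state = table[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Hypothetical example table: flip every bit, halt at the first blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_tm(flip, "1011"))  # -> 0100
```

Whether `run_tm` is realized in silicon, relays, or anything else that can hold and update these states, the same table on the same tape yields the same result; that is the sense in which any Turing machine is "equally capable."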
The fact that it
cannot work that way is evidence that the material does relate to the
ability of a Turing machine to perform even basic functions.
Art, music, comedy, compassion, etc. are not 'output'.
Neither are the nerve impulses from your spinal cord art, music,
comedy, compassion, etc. But the output can be used to control a
body or other mechanism to express any and all of those things.
They are experiences which can be shared. A Turing machine can't experience
anything by itself; it is only the substrate that experiences.
> If a
> Turing machine run on carbon makes a better psychologist, then the same
> program executed on a silicon Turing machine will be just as good.
The machine exploits the common sense of object oriented substrates.
It doesn't matter whether it runs on silicon or boron or gadolinium,
because any sufficiently polite solid material will do. None of them
make a good psychologist. For that you need something that neurons run
on.
You need to believe a psychologist is capable of hyper-computation
for this position to be consistent.
That's what we have been telling Craig since the beginning.
> > Until silicon can feel proud and ashamed, it won't be any good at
> > psychology.
> Unless there is something about psychologists that is infinite, there
> is no externally visible behavior a psychologist is capable of that an
> android controlled by a Turing machine could not also do.
A keyboard can be programmed to type any sentence. Does that mean it
is Shakespeare? A Turing machine can only impersonate intelligence
Again, different human skills require different levels of
computational resources. Do you think Deep Blue can only play chess
at a trivial level?
it can't embody it authentically.
If something behaves intelligently it is intelligent.
Yes. Note that something intelligent does not necessarily behave
intelligently. In fact intelligence is a prerequisite for being
stupid. Pebbles are not stupid.
It's not about matching
behaviors, it's about having the sensitivity and feeling to know when
and why the behaviors are appropriate. It's about originating new
behaviors that are significant improvements over previous approaches.
> > > Or do you believe there is some inherent limitation of
> > > computers that would prevent them from being capable in one of
> > > roles? If so please provide an example.
> > Computers are inherently limited by their material substrate. A
> > mechanism of electronic silicon will never know what it is to feel
> > pain, fear, pleasure, etc. Any role which emphasizes a talent for
> > feeling and understanding would fail to be fulfilled by the
> > disembodied recursive enumeration.
> Do you think something has to feel to act perfectly as though it is
> feeling? Actors can pretend to suffer if their role is to be
tortured in a
> movie, yet they feel no pain.
They aren't feeling pain at the moment, but they are capable of
experiencing pain, therefore they can fake it with feeling.
An actor's performance comes down to how they move their muscles;
there is a limited number of muscles in a human body, and a finite
number of ways in which an actor can move them in any performance of
finite length. These movements could be replicated even by a
process which has never felt pain, and do so as convincingly as any
actor.
> If you are into sci-fi, you should watch the
> recent (not 1970s) Battlestar Galactica series. Among other things, it
> explores racism against machines who in all respects look and act
> like humans.
Yeah I have watched a lot of that BSG. I like how the cylons are
monotheistic and humans are pagan. It's a good show. I would agree,
if it were the case that AI robots were indistinguishable from us, that
it would be a valid philosophical issue. My view though is that there
are some good reasons that will never be the case.
If we scanned brain images into computers powerful enough to run
them, and they always broke down or failed to function, I would
consider that evidence against computationalism. The fact that the
current computers of our time have not equaled or surpassed us in
every respect is no more evidence against computationalism than it
would have been in the 1900s that no man-made mechanical machine
could convincingly behave like a human. We know roughly the
computational power of the human brain, and computers of today are
somewhere between that of an insect and that of a mouse. Once we
reach the computational power of a mouse brain, then in 7 years we
will reach the power of a cat brain, and 7 years later the power of
a human brain (assuming Moore's law of doubling computational power
for the same price each year).
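As a sanity check on that doubling arithmetic, a short sketch (the ~100x ratios between mouse, cat, and human brain power are rough placeholders from the argument above, not measurements):

```python
import math

# Years of annual doubling needed to close a given capability gap
# (Moore's-law-style growth; the ratios here are illustrative only).
def years_to_reach(target_ratio, doublings_per_year=1.0):
    return math.ceil(math.log2(target_ratio) / doublings_per_year)

# If each step (mouse -> cat -> human) were roughly a 100x jump in
# computational power, yearly doubling covers it in about 7 years,
# since 2**7 = 128 > 100.
print(years_to_reach(100))  # -> 7
```

This is just the observation that seven doublings give a factor of 128, which is where the "7 years per step" figure in the argument comes from.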
As the AI horizon
continues to recede infinitely, even in the face of ever faster
hardware and more bloated software, we will continue to have to deal
with actual racism rather than theoretical anthropism.
20 years ago would you have been surprised to learn that a computer
would beat the leading Jeopardy champions, or that we would have
self-driving cars before flying cars?
Sure. I have been treated as a complete moron for having said that
computers would be able to play chess and to do symbolic
derivation and integration. The dogma, when I was young, was that
computers are only number-crunching machines, capable of doing only
numerical calculations and nothing else.
If the cylons
were genetically engineered beings instead, well, that's a different
story entirely. Living creatures matter, programs don't (except to the
living creatures that use them).
Are you afraid to burn coal in your stove out of concern that the
material will sense being burned?
Yes. Craig's "theory" is a bit frightening in this respect. But
of course that is not an argument. Craig might accuse you of wishful
thinking.
You received this message because you are subscribed to the Google Groups
"Everything List" group.