On Aug 23, 7:18 pm, meekerdb <meeke...@verizon.net> wrote:
> On 8/23/2011 3:36 PM, Craig Weinberg wrote:
> > On Aug 23, 5:58 pm, meekerdb <meeke...@verizon.net> wrote:
> >> On 8/23/2011 2:13 PM, Craig Weinberg wrote:
> >>> The basic difference is the ability to feel. Literally proving it
> >>> would require a brain implant that remotes to the device, but I would
> >>> be very impressed if a machine could convincingly answer personal
> >>> questions like 'what do you want' or 'what's bothering you', and if it
> >>> could continue to converse fluently about those answers and reveal a
> >>> coherent personality which was not preconfigured in the software.
> >> "Not preconfigured in software" sounds like an escape clause.  Your use
> >> of speech was preconfigured in the software of your brain.  All infants
> >> learn to speak the language they hear - and if they don't hear any, they
> >> make one up.
> > Right. Making one up = not preconfigured. If a machine can make a
> > coherent identity up for itself with a point of view without having
> > any templates to choose from, then I would be impressed. Note that
> > infants making up their own language don't wind up with a mix of
> > French, Chinese, and Braille. Let a machine tell me what it wants or
> > how it feels without a programmer telling it how it might answer.
> But there are strong similarities in all languages, including made up
> ones.

There are strong similarities in identities too, so a computer should
have no trouble discovering its own if its experiences were like
ours - but they aren't.

>  So what makes you think evolution hasn't programmed how you
> feel?

It has programmed how we feel, but it hasn't programmed *that* we
feel. Evolution has evolved DNA from simpler molecules and atoms. The
experiences of those things evolved as well. We are the recapitulated
experience of that evolution of experience, of how experience has
shaped experience-ability. We mistake our intellectual experience for
a universal sense, and when we impose it on matter of lesser ability,
all that matter can do is what we tell it to do, thereby reflecting
our own sense back to us rather than causing it to learn how to feel
like us or understand what we understand.

> And if it has, why deny consciousness to a machine programmed by a
> human to want certain things?

I'm not denying it; I'm pointing out that we're fooling ourselves if
we think that's possible. It is no different from making a movie with
cartoons that say they want certain things. The cartoon has no
consciousness - that would be absurd. It's the consciousness of the
cartoonist which is refracted to us through the sense we can make of
the cartoon images. The cartoon itself isn't even an image; it's
meaningless shaded regions. It means nothing to a spider or a plant;
it's a purely human-to-human text. A computer is the same thing: a
human-to-human text processor. A text is not an interpreter, it's just
that which is interpreted by an interpreter.

> What difference does the
> desire comes from?

It makes a difference if you are talking about replacing your brain
with something that keeps your body alive and pretends to be you. It
makes a difference if you are talking about making something to use as
a perpetual servant. Otherwise it's fine. We imagine that fictional
characters are real all the time; I don't have a problem with that at
all - it's fun.
