On Sep 14, 7:07 pm, meekerdb <meeke...@verizon.net> wrote:
> On 9/14/2011 3:04 PM, Craig Weinberg wrote:
> > On Sep 14, 3:16 pm, Evgenii Rudnyi<use...@rudnyi.ru>  wrote:
> >> Even a better example would be Big Dog:
> >>http://www.youtube.com/watch?v=W1czBcnX1Ww
> >> Here it would be hard to say that Big Dog does not perceive. Yet the
> >> question remains whether this perception is still unconscious, or whether
> >> one can already find some elements of conscious perception.
> > I would say that there is still no perceiver in Big Dog. There are
> > sophisticated algorithms coordinating its inputs and outputs, but
> > there is no experience of those algorithms. Whatever map it constructs
> > is primitive, generic, and bears only a superficial resemblance to the
> > world with which we observe it interacting.
> And what resemblance is there between the world you observe and the world I 
> observe, or a tiger
> observes?  Big Dog's map is based on GPS, so it probably knows where it is and 
> which way is
> north better than you do.

It definitely knows some things better than I do, but not things that
I care about, and not even things that it cares about. A tiger lives
in a world much more integrated with ours than Big Dog's. We both relate
to food and respond to threats, etc. We can hear and see some of the
same things. Big Dog would not know whether it was on the moon or at
the bottom of a river. It just knows how weight on legs can try to
move without falling over, and it knows about those things only as
abstractions. It doesn't feel what it looks to us like it feels. It's
just running a program.

> > It doesn't care where it goes
> > or whether or not it has to struggle to walk.
> You presume too much.  I'd bet it does 'care'.  It certainly doesn't try to go 
> through tree
> trunks.

I don't think it can care. It tries to go forward, and if it can't go,
it goes somewhere else. It won't mind if you put it into a locked room
for 100 years while it bumps into the wall over and over. It won't
ever figure out that it's alone or not alone, trapped or free. It
won't evolve or reproduce or have an identity crisis.

> > Since it is designed
> > specifically to simulate familiar actions of a quadruped, it plays on
> > our HADD and prognosia sympathies, not much different from a stuffed
> > animal, just to a greater extent.
> Like our sympathies for featherless bipeds.

Sympathy for other living organisms is what sympathy is. Machines have
no sympathy of their own, so sympathy toward them is fanciful.

> >> Then we can ask ourselves the same for insects, and finally go onward
> >> along the tree of life.
> > Insects have a nervous system, but one made of few neurons. My guess is
> > that their qualia are, as a rule of thumb, orders of magnitude less
> > significant than our own, though some insects might accumulate more
> > significant qualia through experience than others, just as some mammals
> > might. A nervous system, I think, functions as an organism within an
> > organism, which might be a minimum requirement for awareness of awareness
> > (consciousness). It's probably a pretty shallow consciousness, not
> > enough that would even get our attention. Their own lives might not
> > matter very much to them, but more than a cell's, and much more than a
> > molecule's.
> So why can't you admit as much about a computer like Watson?  It's an order 
> of magnitude
> more complex than an insect nervous system.  Oh yeah, I forgot --- it's not 
> made of the
> right stuff.

Watson does have a molecular-level awareness. I'm not denying that. It
knows about volts and watts and switches of switches of switches. It
doesn't know about Jeopardy, though -- no more than the characters in a
movie know about the audience in the movie theater.


You received this message because you are subscribed to the Google Groups 
"Everything List" group.