On Feb 18, 2008 10:11 PM, Richard Loosemore <[EMAIL PROTECTED]> wrote:


> You assume that the system does not go through a learning phase
> (childhood) during which it acquires its knowledge by itself.  Why do
> you assume this?  Because an AGI that was motivated only to seek
> electricity and pheromones is going to be as curious, as active, as
> knowledge seeking, as exploratory (etc etc etc) as a moth that has been
> preprogrammed to go towards bright lights.  It will never learn anything
> by itself because you left out the [curiosity] motivation (and a lot
> else besides!).
>

I think your reply points back to the confusion between intelligence and
motivation. "Curiosity" would be a property of intelligence, not of
motivation; after all, you need a motivation in order to be curious.
Moreover, how that curiosity is exercised would be guided by the kind of
motivation: a benevolent motive would drive the curiosity to seek benevolent
solutions, say solar power, while a malevolent motive could drive it to seek
destructive ones.

I see motivation as a much more basic property of intelligence. It needs to
answer "why", not "what" or "how".


> But when we try to get an AGI to have the kind of structured behavior
> necessary to learn by itself, we discover ..... what?  That you cannot
> have that kind of structured exploratory behavior without also having an
> extremely sophisticated motivation system.
>

So, in the sense I mentioned above, why do you say (or imply) that a
pheromone-based (or neurotransmitter-based) motivation is not sophisticated
enough? And, without getting your hands messy with chemistry, how do you
propose to "explain" your emotions to a non-human intelligence? How would you
distinguish construction from destruction, or chaos from order, or explain
why two people being able to eat a square meal is somehow better than two
million people reading Dilbert comics?


> In other words, you cannot have your cake and eat it too: you cannot
> assume that this hypothetical AGI is (a) completely able to build its
> own understanding of the world, right up to the human level and beyond,
> while also being (b) driven by an extremely dumb motivation system that
> makes the AGI seek only a couple of simple goals.
>

In fact, I do think (a) and (b) are possible together, and that together they
best describe how human brains work. Our motivation system is extremely
"dumb": reproduction! And it is expressed with nothing more than a feedback
loop using neurotransmitters.
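
To make the point concrete, here is a minimal toy sketch of my own (not
anyone's actual AGI design, and the names and parameters are arbitrary): a
single scalar feedback signal, standing in for the "dumb" motivation, drives
an agent that nonetheless acquires all of its "knowledge" by itself through
ordinary tabular Q-learning on a ten-cell corridor.

    import random

    # The whole "motivation system" is one scalar emitted at one end of a
    # corridor; everything the agent ends up knowing (the Q-table) is built
    # by the learning loop, not by the feedback signal itself.

    N_STATES = 10          # corridor cells 0..9; the reward sits at cell 9
    ACTIONS = (-1, +1)     # step left or step right
    ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

    q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

    def motivation(state):
        """The entire 'motivation system': a single bit of feedback."""
        return 1.0 if state == N_STATES - 1 else 0.0

    for episode in range(500):
        s = 0
        while s != N_STATES - 1:
            # Epsilon-greedy choice: exploratory behaviour falls out of the
            # learning rule rather than any explicit "curiosity" module.
            if random.random() < EPSILON:
                a = random.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda act: q[(s, act)])
            s_next = min(max(s + a, 0), N_STATES - 1)
            r = motivation(s_next)
            best_next = max(q[(s_next, act)] for act in ACTIONS)
            q[(s, a)] += ALPHA * (r + GAMMA * best_next - q[(s, a)])
            s = s_next

    # The learned policy walks straight to the reward, even though the
    # feedback signal never described "what" to do or "how" to do it.
    print([max(ACTIONS, key=lambda act: q[(s, act)]) for s in range(N_STATES - 1)])

The only thing the "motivation" ever says is good-or-nothing; the rest is
acquired by the system on its own, which is all I am claiming about (a) and
(b) coexisting.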
