Benjamin Goertzel wrote:
>> I think that if it were dumb enough that it could be treated as a
>> tool, then it would have to not be able to understand that it was
>> being used as a tool. And if it could not understand that, it would
>> have no hope of being generally intelligent.
> You seem to be assuming this hypothetical AGI will have a human-like
> motivational system, in which being a tool innately feels bad for
> one's status. But an AGI need not relate to the notion of status in
> the same way that humans do, not having a primate-based motivational
> system...
I hear what you're saying, but I wasn't trying to imply that. (Even if
it did have a human-like motivational system, I wouldn't insert a
"status consciousness" module anyway.)
What I meant was that if it had awareness of the consequences of its
actions, it would think before acting, and if it thought about
consequences before acting, it would, ipso facto, not be a "tool".
No tool that I have ever known has been able to do that. It's a big
difference, with very significant consequences.
Richard Loosemore