For example, the statement that "An AI is an agent that tries to satisfy
a set of goals" (or some such wording) seems to make the AI endeavor
look like it starts from a clear conceptual beginning that DOES NOT
REFER to any attempt to copy human intelligence.  This is why the
agent-talk definition is required.  It is a political tool, if you will.


No.

It is possible to define intelligence in an abstract way that is not closely
coupled to human intelligence, even though such a definition obviously
uses concepts created by humans, based partly on introspection.
(Similarly, we can define gravity in a way that is not closely coupled
to the Earth in particular, even though our language for discussing
gravity was created based on our experiences on Earth.)

Hutter, Legg and I have done this already.  Others have too.
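To give a flavor of what such an abstract definition looks like, here is a sketch of Legg and Hutter's universal intelligence measure, which scores an agent pi by its expected reward across all computable environments, weighted by their simplicity:

```latex
% Universal intelligence of an agent \pi (Legg & Hutter):
% E is the set of computable reward-summable environments,
% K(\mu) is the Kolmogorov complexity of environment \mu,
% and V_\mu^\pi is the expected total reward of \pi in \mu.
\Upsilon(\pi) \;=\; \sum_{\mu \in E} 2^{-K(\mu)} \, V_\mu^\pi
```

Nothing in this formula refers to humans; the human element enters only through the choice of concepts (environments, rewards, simplicity) used to frame it.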

But, pragmatically, if someone created an AI that had nothing to do with
human intelligence, we wouldn't necessarily even be able to recognize
that it was intelligent!

I am explicitly trying to copy many of humans' intelligent behaviors, and
many aspects of human cognitive architecture and dynamics ... even
though I am not in toto trying to build an artificial human...

-- Ben G

-----
This list is sponsored by AGIRI: http://www.agiri.org/email