It would be nice to have a universal definition of general intelligence, but I 
don't think we even share enough common intuition about what is intelligent or 
what is general.
 
Instead, what we seem to have is, for example, a definition based on uncertain 
reasoning from somebody building a reasoning engine based on uncertainty; a 
definition based on goal achievement from somebody building an architecture 
with goal achievement as a major feature; and so on.
 
There seem to be too many viewpoints... a thermostat may or may not be 
intelligent, but it is not "generally intelligent".  But is a cat generally 
intelligent?  If so, which is more generally intelligent:  a cat or a giraffe?  
If not, why not?
 
Rather than try to come up with universally accepted definitions for a concept 
that we all view differently, perhaps any proposed AGI (or AGI-like) path could 
put forward its perceived endpoint:  that is, imagine the system you'd like to 
build...
 
One example might be:
 
* Working up from the physical and mental capabilities of simple animals to 
eventually achieve human-like abilities (with various flavors of brain 
inspiration)
 
 - we could debate whether it's a good idea to proceed this way, and we'd have 
differing opinions about the method's likelihood of success, but we'd agree that 
the target is generally intelligent by all of our intuitive senses.
 
Another example might be:
 
* Pass the Turing Test by building gradually more and more capable chat bots.
 
  - most of us on this list would probably expect this development path to get 
stuck in local maxima very early (hence the Loebner Prize and its laughingstock 
status), but the end result would be generally accepted as generally 
intelligent.
 
Other possibilities could be somewhat different.  For example, one thing not 
usually discussed very much is WHY we are trying to build AGI in the first 
place.  One end goal might look something like this:
 
I want an intelligence to which I can give instructions such as "figure out a 
minimum-cost way to build a { rocket ship to Mars / whole-body disease scanner 
/ nanofabricator / ... }".  This system would have to design factories and 
parts, arrange shipments, assign tasks to humans and robots, etc.  For a wide 
enough set of inputs and object specification methods, we'd probably call it an 
AGI.
 
The point is that maybe we don't need a definition of intelligence; all we need 
is a vision of an endpoint and (the really interesting bit) the steps we'll 
take to get there.
 
 
-----
This list is sponsored by AGIRI: http://www.agiri.org/email