Pei Wang wrote:
Let's not confuse two statements:

(1) To be able to use a natural language (so as to pass the Turing
Test) is not a necessary condition for a system to be intelligent.

(2) A true AGI should have the potential to learn any natural language
(though not necessarily to the level of native speakers).

I agree with both of them, and I don't think they contradict each other.

"Natural" language isn't. Humans have one specific idiosyncratic built-in grammar, and we might have serious trouble learning to communicate in anything else - especially if the language was being used by a mind quite unlike our own. Even a "programming language" is still something that humans made, and how many people do you know who can *seriously*, not-jokingly, think in syntactical C++ the way they can think in English?

I certainly think that something could be humanish-level intelligent in terms of optimization ability, and yet be unable to learn English, if it had a sufficiently alien cognitive architecture - nor would we be able to learn its language.

Of course you can't be superintelligent and unable to speak English - *that* wouldn't make any sense. I assume that's what you mean by "true AGI" above.

--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
