--- On Sun, 10/19/08, Dr. Matthias Heger <[EMAIL PROTECTED]> wrote:
> Every email program can receive meaning, store meaning, and express it
> outwardly in order to send it to another computer. It can even do so
> without any loss of information. In this respect it already outperforms
> humans, who have no conscious access to the full meaning (information)
> in their brains.

Email programs do not store meaning; they store data. The email program has no 
understanding of the data it stores, so this is a poor analogy. 
 
> The only thing which needs much intelligence, from today's point of view,
> is the learning of the process of outwardly expressing meaning, i.e. the
> learning of language. The understanding of language itself is simple.

Isn't the *learning* of language the entire point? If you don't have an answer 
for how an AI learns language, you haven't solved anything. The understanding 
of language only seems simple from the point of view of a fluent speaker. 
Fluency, however, should not be confused with a lack of intellectual effort; 
rather, it is a state in which the effort involved is automatic and beyond 
awareness.

> To show that intelligence is separate from language understanding, I have
> already given the example that a person could have spoken with Einstein
> without having the same intelligence. Another example is people who cannot
> hear or speak but are intelligent. Their only problem is getting knowledge
> from other humans, since language is the common social communication
> protocol for transferring knowledge from brain to brain.

Einstein had to express his (non-linguistic) internal insights in natural 
language and in mathematical language.  In both modalities he had to use his 
intelligence to make the translation from his mental models. 

Deaf people speak in sign language, which differs from spoken language only in 
superficial ways. This does not tell us much about language that we didn't 
already know. 

> In my opinion, language is overestimated in AI for the following reason:
> when we think, we believe that we think in our language. From this we
> conclude that our thoughts are inherently structured by linguistic
> elements. And if our thoughts are so deeply connected with language, then
> it is a small step to conclude that our whole intelligence depends
> inherently on language.

It is surely true that much, if not most, of our cognitive processing is not at 
all linguistic, and that much of it happens beyond our awareness. However, 
language is a necessary tool, for humans at least, to acquire a competent 
conceptual framework, even if that framework ultimately transcends the 
linguistic dynamics that helped develop it. Without language it is hard to see 
how humans could develop self-reflectivity. 

Terren
